SAP Fiori for iOS ARKit

What's New


Contains a few, but easily adoptable, BREAKING CHANGES


  • 🎸 Loading directly from a usdz File (fc14e55)
  • 🎸 Decoding JSON in RealityComposerStrategy (#47) (6c14176)
  • 🎸 Loading from Reality File and SceneLoadable (c25ac1a)

SAP Fiori for iOS ARKit


This project is a SwiftUI implementation of the Augmented Reality (AR) patterns in the SAP Fiori for iOS Design Guidelines.

Currently supported:

AR Annotations


Annotations refer to Cards that match a corresponding Marker located relative to an image or object in the real world. To view annotations in the world view, the user scans the image/object with the AR Scanner.

3D modeling is not required to represent AR annotations as the respective controls (ARScanView, MarkerView and CardView) are implemented with SwiftUI in this package.

An app developer needs to provide a scene of markers relative to an Image or Object anchor. Such scene creation is possible with Apple's Reality Composer tool.

Depending on how the scene is stored (.rcproject, .reality or .usdz files), the app developer has to specify an appropriate loading strategy to populate the scene and the associated card data.

Cards and Markers support SwiftUI ViewBuilder to allow custom design.

Reality Composer

Composing the scene

  1. Open the Reality Composer app and create a scene with an image or object anchor
  2. Choose an image or scan an object and give the scene a name e.g. ExampleScene
  3. Place spheres in the desired positions
  4. Preview in AR to fine tune
  5. Name the spheres with a type that conforms to LosslessStringConvertible
  6. The name of the sphere will correspond to the CardItemModel id
  7. Export the scene depending on the chosen supported loading strategy
    • Export the scene as .usdz file (Enable usdz export in preferences or iOS app settings)
    • Export the scene as a .reality file
    • Save the entire project as an .rcproject with a single scene
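Steps 5 and 6 can be illustrated with a small, self-contained sketch. `cardID(fromEntityName:as:)` is a hypothetical helper (not part of FioriARKit) showing how a sphere's name round-trips into a CardItemModel id via LosslessStringConvertible:

```swift
// Hypothetical helper (not part of FioriARKit): converts an entity (sphere) name
// into the card model's id type. LosslessStringConvertible guarantees the
// conversion is an exact round-trip of the string representation.
func cardID<ID: LosslessStringConvertible>(fromEntityName name: String, as type: ID.Type) -> ID? {
    ID(name)
}

// A sphere named "42" matches a card model whose id type is Int:
let intID = cardID(fromEntityName: "42", as: Int.self)           // 42
// String ids pass through unchanged:
let stringID = cardID(fromEntityName: "pump-1", as: String.self) // "pump-1"
// Names that cannot be parsed into the id type yield nil:
let badID = cardID(fromEntityName: "pump-1", as: Int.self)       // nil
```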


  • Reality Composer is required to scan an object when choosing an Object Anchor.
  • Scanning an object requires using an iOS device in the Reality Composer app
  • The spheres are for scene creation and will be invisible in the ARCards scene


Data Consumption


A loading strategy accepts an array of elements, each element conforming to the CardItemModel protocol, to populate card-related data. The id property of the model has to correspond to the name of the Entity (sphere) from Reality Composer.


Each of the loading strategies also has an initializer that accepts data represented by a JSON array.

// JSON key/value:
"id": String,
"title_": String,
"descriptionText_": String?,
"detailImage_": Data?, // base64 encoding of Image
"actionText_": String?,
"icon_": String? // systemName of SFSymbol

Loading Strategies

The supported loading strategies (UsdzFileStrategy, RealityFileStrategy, and RCProjectStrategy) require, in addition to the scene and card-related data, information about the anchor used for detecting a scene. Using an Image anchor requires the app developer to provide anchorImage and its physicalWidth as initializer parameters. For an Object anchor the anchorImage and physicalWidth parameters can be nil.

The scene can be represented in different file types and each strategy requires different data and setup.

  • USDZ Strategy: Requires a URL path to the .usdz file
  • Reality Strategy: Requires a URL path to the .reality file and the name of the scene
  • RCProject Strategy: Requires the name of the .rcproject file and the name of the scene


  • The RCProject strategy requires that the .rcproject file is part of the application bundle so that it is already available at build time. Drag the file into Xcode to include it.

Example Usage: Creating the ContentView and loading the data

import SwiftUI
import FioriARKit

struct FioriARKitCardsExample: View {
    @StateObject var arModel = ARAnnotationViewModel<DecodableCardItem>()

    var body: some View {
        // Initializes an AR experience with a Scanning View flow, showing Markers and Cards upon anchor discovery
        // - Parameters:
        //   - arModel: The View Model which handles the logic for the AR Experience
        //   - image: The image which will be displayed in the Scanning View
        //   - cardAction: Card Action
        SingleImageARCardView(arModel: arModel, image: Image("qrImage"), cardAction: { id in
            // action to pass to corresponding card from the CardItemModel id
        })
        .onAppear(perform: loadInitialData)
    }

    // Example using a `UsdzFileStrategy` to populate scene-related information (stored in a .usdz file,
    // which could have been fetched from a remote server at runtime) as well as card-related information
    // (stored in a .json file, which could likewise have been fetched from a remote server)
    func loadInitialData() {
        let usdzFilePath = FileManager.default.getDocumentsDirectory().appendingPathComponent(FileManager.usdzFiles).appendingPathComponent("ExampleRC.usdz")
        guard let anchorImage = UIImage(named: "qrImage"),
              let jsonUrl = Bundle.main.url(forResource: "Tests", withExtension: "json") else { return }
        do {
            let jsonData = try Data(contentsOf: jsonUrl)
            let strategy = try UsdzFileStrategy(jsonData: jsonData, anchorImage: anchorImage, physicalWidth: 0.1, usdzFilePath: usdzFilePath)
            arModel.load(loadingStrategy: strategy)
        } catch {
            print(error)
        }
    }
}

Requirements

  • iOS 14 or higher
  • Xcode 12 or higher
  • Reality Composer 1.1 or higher
  • Swift Package Manager


Download and Installation

The package is intended for consumption via Swift Package Manager.

  • To add to your application target, navigate to the Project Settings > Swift Packages tab, then add the repository URL.
  • To add to your framework target, add the repository URL to your Package.swift manifest.

In both cases, xcodebuild tooling will manage cloning and updating the repository to your app or framework project.
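For the framework-target route, the dependency declaration might look like the following sketch; the repository URL placeholder and the version requirement are illustrative assumptions:

```swift
// swift-tools-version:5.3
import PackageDescription

// Sketch of a framework target's Package.swift consuming FioriARKit via SPM.
// "<repository URL>" stands in for the actual repository URL; the version
// requirement is an illustrative assumption.
let package = Package(
    name: "MyFramework",
    platforms: [.iOS(.v14)],
    dependencies: [
        .package(name: "FioriARKit", url: "<repository URL>", .upToNextMajor(from: "1.0.0"))
    ],
    targets: [
        .target(name: "MyFramework", dependencies: ["FioriARKit"])
    ]
)
```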


As an umbrella product, FioriARKit currently contains everything the package has to offer. As the package evolves, it could be split into multiple products for different use cases.


Limitations

Key gaps present at the time of the open-source project launch:

  • An authoring flow for pinning/editing an annotation in app
  • An Annotation Loading Strategy which loads an array of positions for annotations relative to the detected image/object
  • While Reality Composer is useful for scene creation, the scene can be edited programmatically, but those changes cannot be saved back to the file

Known Issues

See Limitations.

How to obtain support

Create a GitHub issue to report a bug, file a feature request, or ask a question.


Contributing

If you want to contribute, please check the Contribution Guidelines.

To-Do (upcoming changes)

See Limitations.


Examples

Functionality can be further explored with a demo app, which is already part of this package (Apps/Examples/Examples.xcodeproj).


  • Swift Tools 5.3.0


Last updated: Mon Oct 18 2021 00:51:05 GMT-0500 (GMT-05:00)