# Spark iOS SDK
The Spark iOS SDK enables you to run your WASM models on the iOS platform.
## Getting started
The SDK is distributed as a Swift package. For now, we provide a copy of the package as source code, so you can add it to your Xcode project as a local dependency; in the future, it will be available directly through Swift Package Manager.
## Creating an instance of the SDK
First, create an instance of the SDK factory:

```swift
let factory = SparkSDKFactory()
```
The SDK does not contain any WASM models by default; these must be provided to the factory.
To do this, use the Impex command line tool to obtain the models as a zip file. Unzip this and add the files to your Xcode project. Then, obtain the directory where the models are stored as a string. For example, this code gets the location of a folder in the project called `assets`:

```swift
let path = Bundle.main.bundlePath
let modelsPath = "\(path)/assets"
```
Then, pass this to the factory, along with a completion handler. Because the factory has to do some setup work behind the scenes, a completion handler is used so that the calling function is not blocked. When everything is ready, or setup has failed, the completion handler will be called with a `Result<SparkSDK, Error>` value. The completion handler is always called: if setup failed, the result will be an error with some information on what went wrong; if setup succeeded, your `Result` will contain a `SparkSDK` instance.
```swift
let modelsUrl = URL(fileURLWithPath: modelsPath)
factory.requestSDK(
    modelsPath: modelsUrl.absoluteString,
    onSDKReady: { sdkResult in
        // handle sdkResult here
    }
)
```
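The body of `onSDKReady` can unpack the `Result` with a `switch`. A minimal sketch, assuming the SDK instance is stored in a property named `sparkSDK` (the property name is illustrative, not part of the SDK):

```swift
factory.requestSDK(
    modelsPath: modelsUrl.absoluteString,
    onSDKReady: { sdkResult in
        switch sdkResult {
        case .success(let sdk):
            // Store the instance so it can be used for execution requests later.
            self.sparkSDK = sdk
        case .failure(let error):
            // Setup failed; the error describes what went wrong.
            print("SDK setup failed: \(error)")
        }
    }
)
```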
## Using the SDK to process data
Once an instance of the SDK is created, you are ready to pass data to it for execution. The data will be processed by the models that were loaded when creating the SDK instance. The interface presented by the SDK for executing a model request intentionally resembles the web equivalent. To pass data to the SDK, you will need to provide two things:

- The data itself, defined as a `[String: Any]` dictionary. This must include your inputs and your request metadata, similar to the web version. The SDK will convert this to JSON, so it must contain only types that can be represented as JSON.
- A request ID, which can be any unique `String`. We recommend using a UUID for this. The request ID will be included with the response.
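Because the SDK converts the dictionary to JSON, it can be worth verifying that the dictionary is JSON-serializable before executing a request. Foundation's `JSONSerialization.isValidJSONObject` performs this check; a small sketch (the payload contents are illustrative):

```swift
import Foundation

let payload: [String: Any] = [
    "request_data": ["inputs": ["Input": 1]]
]

// Values such as Date or custom structs would fail this check,
// because they have no direct JSON representation.
if JSONSerialization.isValidJSONObject(payload) {
    print("Payload is valid JSON")
} else {
    print("Payload contains values that cannot be represented as JSON")
}
```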
## Observing data results
The SDK exposes a `Publisher` called `executionResponses`, which will contain the results of all data execution requests. It's recommended to start observing this publisher as early as possible, once the SDK instance has been created. It will publish instances of `Result<ExecutionResponse, Error>`. When a data execution request fails, an error result will be published. When a data execution request succeeds, a success result will be published, containing an `ExecutionResponse`. This `ExecutionResponse` will contain the request ID along with the response data generated by the models.
```swift
resultCancellable = sparkSDK?.$executionResponses.sink(receiveValue: { value in
    print("Result response: \(value)")
    switch value {
    case .success(let executionResult):
        // Get the requestId and result value:
        print("Request \(executionResult.requestId) produced \(executionResult.result)")
    case .failure(let error):
        // Error handling
        print("Execution failed: \(error)")
    }
})
```
## Executing a data request
Typically, the data request JSON will have this structure:

```json
{
    "request_data": {
        "inputs": {
            // More inputs can be added in this object
            "Input": 1
        }
    },
    "request_meta": {
        "version_id": "0211e8f0-9988-4514-a761-9782db6700ce",
        "call_purpose": "Spark - API Tester",
        "source_system": "SPARK",
        "correlation_id": "",
        "requested_output": null,
        "service_category": ""
    }
}
```
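As a sketch, this payload could be built in Swift as a `[String: Any]` dictionary, with `NSNull()` standing in for JSON `null` (values copied from the example above):

```swift
import Foundation

let inputData: [String: Any] = [
    "request_data": [
        "inputs": [
            "Input": 1
        ]
    ],
    "request_meta": [
        "version_id": "0211e8f0-9988-4514-a761-9782db6700ce",
        "call_purpose": "Spark - API Tester",
        "source_system": "SPARK",
        "correlation_id": "",
        "requested_output": NSNull(),
        "service_category": ""
    ]
]
```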
Your input data must match this structure when encoded as JSON. Passing the data to the SDK, along with the request ID, looks like this (the request ID is a unique string used to track the result in the publisher):
```swift
let requestId = UUID().uuidString
sparkSDK.execute(requestId: requestId, input: inputData)
```
The `execute` call will complete without returning a value, because the execution responses are published via the publisher described above.