AuroraToolkit

1.0.0

AuroraToolkit enables AI-driven workflows, context management, and seamless integration with on-device ML and large language models (LLMs).

What's New

1.0.0 Release

2025-11-13T13:47:05Z

AuroraToolkit v1.0.0: Production-Ready AI/ML Integration Toolkit for iOS and macOS Development 🎉 🚀 ✨

AuroraToolkit is a comprehensive suite of tools designed to simplify the integration of AI capabilities into your iOS and macOS applications. With support for multiple LLM providers (Anthropic Claude, Google Gemini, OpenAI ChatGPT, Ollama, and Apple Foundation Models), on-device ML services, declarative workflow orchestration, and intelligent domain routing, AuroraToolkit provides everything you need to build production-ready AI-powered features. This 1.0.0 release represents 14+ months of development and brings full Swift 6 compatibility, comprehensive documentation, and a stable API surface ready for production use.

Key Features

  • Multi-LLM Support
    Unified interface for Anthropic, Google, OpenAI, Ollama, and Apple Foundation Models with intelligent routing and fallback strategies
  • On-Device ML Services
    Native support for classification, intent extraction, embeddings, semantic search, and more using Core ML and Natural Language frameworks
  • Declarative Workflows
    Define complex AI workflows declaratively, similar to SwiftUI, with support for sequential and parallel task execution
  • Convenience APIs
    Simplified top-level APIs (LLM.send(), ML.classify(), Tasks.analyzeSentiment(), etc.) for common operations
  • Model Parameter Support
    Specify custom models in all LLM convenience methods for fine-tuned control over AI behavior
  • Intelligent Domain Routing
    Automatically select the best LLM service for each request using CoreML-based classification, regex rules, or confidence-based routing
  • Context Management
    Built-in conversation management with automatic summarization and context window optimization
  • Swift 6 Compatible
    Supports Swift 5.5 and later, and builds cleanly under Swift 6 strict concurrency checking thanks to actor-based state management
  • Production-Ready
    Comprehensive testing, error handling, thread-safe design, and stable API surface
  • Well-Documented
    Complete Swift-DocC documentation with examples, platform availability notes, and thread-safety guidance

Full Changelog: 0.9.6...1.0.0

AuroraToolkit

AuroraToolkit is a suite of tools designed to simplify the integration of AI capabilities into your projects. This package offers robust support for AI-driven workflows, including task orchestration, workflow management, on-device ML services, and seamless integration with large language models (LLMs) like Anthropic Claude, Google Gemini, OpenAI ChatGPT, open-source models via Ollama, and Apple's Foundation Models. Its modular architecture empowers developers to customize, extend, and integrate with external services effortlessly.

The AuroraToolkit main package is organized into several modules to enhance flexibility and maintainability:

  • AuroraCore: The foundational library for workflow orchestration, utilities, and declarative task management.
  • AuroraLLM: A dedicated package for integrating large language models (LLMs) such as Anthropic, Google, OpenAI, Ollama, and on-device Apple Foundation Models.
  • AuroraML: On-device ML services (classification, intent extraction, tagging, embedding, semantic search) and corresponding Workflow tasks.
  • AuroraTaskLibrary: A growing collection of prebuilt, reusable tasks designed to accelerate development.
  • AuroraExamples: Practical examples demonstrating how to leverage the toolkit for real-world scenarios.

Whether you're building sophisticated AI-powered applications or integrating modular components into your workflows, AuroraToolkit provides the tools and flexibility to bring your ideas to life.

Quick Start

import AuroraLLM

// Send a message using the default service (Apple Foundation Model if available)
let response = try await LLM.send("What is machine learning?")
print(response)

For more examples, see the Usage section below.

Features

  • Modular Architecture: Organized into distinct modules (Core, LLM, ML, TaskLibrary) for flexibility and maintainability
  • Declarative Workflows: Define workflows declaratively, similar to SwiftUI, for clear task orchestration
  • Multi-LLM Support: Unified interface for Anthropic, Google, OpenAI, Ollama, and Apple Foundation Models
  • On-Device ML: Native support for classification, embeddings, semantic search, and more using Core ML
  • Intelligent Routing: Domain-based routing to automatically select the best LLM service for each request
  • Convenience APIs: Simplified top-level APIs (LLM.send(), ML.classify(), etc.) for common operations
  • Swift 6 Compatible: Supports Swift 5.5 and later, and builds cleanly under Swift 6 strict concurrency checking thanks to actor-based state management
  • Production Ready: Comprehensive testing, error handling, thread-safe design, and stable API surface
  • Comprehensive Testing: Full test coverage including integration tests across all modules

Modules

AuroraCore

The foundational library providing workflows, task orchestration, and utility functions. Includes a declarative workflow system with support for asynchronous execution, parallel processing, and dynamic task groups.

AuroraLLM

Unified interface for managing multiple LLM services (Anthropic, Google, OpenAI, Ollama, Apple Foundation Models). Features intelligent domain-based routing, context management, streaming support, and convenience APIs. Includes native support for on-device Apple Foundation Models (iOS 26+/macOS 26+) and CoreML-based domain routing.

AuroraML

On-device ML services powered by Apple's Natural Language and Core ML frameworks. Provides classification, intent extraction, tagging, embedding generation, and semantic search capabilities.
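The feature list above names an `ML.classify()` convenience API for this module. As a minimal sketch of how an on-device classification call might look (the exact signature and result type are assumptions, not confirmed API):

```swift
import AuroraML

// Hedged sketch: ML.classify() is named in the feature overview, but the
// parameter and return shapes shown here are illustrative assumptions.
let labels = try await ML.classify("The battery drains much faster after the update")
for label in labels {
    print(label)  // e.g. a predicted category, possibly with a confidence score
}
```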

AuroraTaskLibrary

Prebuilt, reusable tasks for common operations including JSON/RSS parsing, URL fetching, sentiment analysis, language detection, keyword extraction, and context summarization.
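The feature overview also names a `Tasks.analyzeSentiment()` convenience method for this module. A hedged sketch of what calling a prebuilt task might look like (argument and result shapes are assumptions):

```swift
import AuroraTaskLibrary

// Illustrative only: Tasks.analyzeSentiment() is listed among the convenience
// APIs above, but its exact signature is an assumption here.
let sentiment = try await Tasks.analyzeSentiment("I love how simple this API is!")
print(sentiment)  // e.g. a positive/negative label for the input text
```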

AuroraExamples

Practical examples demonstrating real-world usage patterns including multi-model management, declarative workflows, domain routing, and ML+LLM hybrid pipelines.

Installation

Swift Package Manager

To integrate AuroraToolkit into your project using Swift Package Manager, add the following to the dependencies array of your Package.swift file:

.package(url: "https://github.com/AuroraToolkit/AuroraToolkit.git", from: "1.0.0")

Then add the desired modules as dependencies to your target. For example:

.target(
    name: "YourTarget",
    dependencies: [
        .product(name: "AuroraCore", package: "AuroraToolkit"),
        .product(name: "AuroraLLM", package: "AuroraToolkit"),
        .product(name: "AuroraML", package: "AuroraToolkit"),
        .product(name: "AuroraTaskLibrary", package: "AuroraToolkit")
    ]
),

You can include only the modules you need in your project to keep it lightweight and focused.

Usage

Basic LLM Usage

import AuroraLLM

// Simple convenience API (uses Apple Foundation Model if available)
let response = try await LLM.send("What is machine learning?")
print(response)

// Use a specific service
let claudeResponse = try await LLM.anthropic.send("Explain quantum computing")
let appleResponse = try await LLM.foundation?.send("What are the privacy benefits of on-device AI?")
let geminiResponse = try await LLM.google.send("Summarize the benefits of renewable energy")
let openaiResponse = try await LLM.openai.send("Write a haiku about coding")

// Specify a custom model
let ollamaResponse = try await LLM.ollama.send("Hello", model: "gemma3:1b")
let customResponse = try await LLM.send("Hello", to: LLM.openai, model: "gpt-4")

// Streaming responses
try await LLM.stream("Tell me a story") { partial in
    print(partial, terminator: "")
}

// Streaming with custom model
try await LLM.stream("Tell me a story", model: "llama3", maxTokens: 2048) { partial in
    print(partial, terminator: "")
}

Workflows

import AuroraCore

let workflow = Workflow(name: "Example Workflow") {
    Workflow.Task(name: "Task_1") { _ in
        return ["result": "Task 1 completed"]
    }
    Workflow.Task(name: "Task_2") { inputs in
        return ["result": "Task 2 completed"]
    }
}

await workflow.start()
print("Result: \(workflow.outputs["Task_2.result"] as? String ?? "")")
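The workflow system also advertises parallel execution and dynamic task groups. A minimal sketch, assuming a `Workflow.TaskGroup` container with a parallel mode (the type name, initializer, and `.parallel` mode are assumptions based on the feature list, not confirmed API):

```swift
import AuroraCore

let parallelWorkflow = Workflow(name: "Parallel Fetch") {
    // Assumed TaskGroup construct for running independent tasks concurrently.
    Workflow.TaskGroup(name: "Fetches", mode: .parallel) {
        Workflow.Task(name: "Fetch_A") { _ in ["result": "A done"] }
        Workflow.Task(name: "Fetch_B") { _ in ["result": "B done"] }
    }
}

await parallelWorkflow.start()
```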

Advanced: Domain Routing

AuroraToolkit supports multiple domain routing strategies to automatically select the best LLM service for each request:

import AuroraLLM

// Logic-based routing (regex rules)
let router = LogicDomainRouter(
    name: "Privacy Router",
    supportedDomains: ["private", "public"],
    rules: [
        .regex(name: "Email", pattern: #"[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,}"#,
               domain: "private", priority: 100)
    ],
    fallbackDomain: "public"
)

// Register router with manager
let manager = LLMManager()
manager.registerDomainRouter(router)

For more advanced examples including CoreML-based routing and dual router strategies, see the full documentation.
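With a router registered, requests are presumably dispatched through the manager so the matched domain picks the service. A hedged sketch of that step (the request type and send method names below are illustrative assumptions; the README only shows router registration):

```swift
// Illustrative only: LLMRequest/LLMMessage and routeAndSend(_:) are assumed
// names, not confirmed API. The router above would match the email address
// and route this request to the "private" domain's service.
let request = LLMRequest(messages: [
    LLMMessage(role: .user, content: "Contact me at jane@example.com about the report")
])
let response = try await manager.routeAndSend(request)
```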

Testing

AuroraToolkit includes comprehensive unit and integration tests. Tests run with Ollama by default (no API keys required). For testing other services, configure API keys via environment variables:

export ANTHROPIC_API_KEY="your-key"
export OPENAI_API_KEY="your-key"
export GOOGLE_API_KEY="your-key"

Important: Never commit API keys to the repository. See CONTRIBUTING.md for detailed testing setup instructions.

Documentation

AuroraToolkit uses Swift-DocC for comprehensive, interactive documentation. View documentation by opening the .doccarchive files in Xcode:

open docs/AuroraCore.doccarchive
open docs/AuroraLLM.doccarchive
open docs/AuroraML.doccarchive
open docs/AuroraTaskLibrary.doccarchive

For contributors: See CONTRIBUTING.md for documentation generation instructions.

Future Ideas

  • Multimodal LLM support: Enable multimodal LLMs for use cases beyond plain text
  • Advanced Workflow templates: Prebuilt workflow templates for common AI tasks (summarization, Q&A, data extraction)
  • Agent support: Intelligent agents that can reason, plan, and execute complex multi-step tasks
  • Tool calling / Function calling: Enable LLMs to call external tools and functions (calendar, weather, file system, APIs, etc.)
  • Structured data extraction: Type-safe extraction of structured data from LLM responses using Swift types (similar to Apple's @Generable macro)

Contributing

Contributions are welcome! Please feel free to submit a pull request or open an issue. For more details on how to contribute, please refer to the CONTRIBUTING.md file.

Code of Conduct

We expect all participants to adhere to our Code of Conduct to ensure a welcoming and inclusive environment for everyone.

License

AuroraToolkit is released under the Apache 2.0 License.

Contact

For any inquiries or feedback, please reach out to us at aurora.toolkit@gmail.com.

Description

  • Swift Tools 5.10.0