SwiftLlama

0.2.0

A Swift Wrapper for llama.cpp
ShenghaiWang/SwiftLlama

What's New

2024-05-09T21:28:45Z

Basic session support
Added a method that returns the response as a plain string, without streaming

SwiftLlama

SwiftLlama is a wrapper around the llama.cpp package. The purpose of this repo is to provide a swiftier API for Swift developers.

Install

.package(url: "https://github.com/ShenghaiWang/SwiftLlama.git", from: "0.2.0")
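In a full `Package.swift`, the package also needs to be wired into a target. A minimal manifest sketch (the target name `MyApp` is a placeholder, and the product name is assumed to match the package name):

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyApp", // placeholder target name
    dependencies: [
        .package(url: "https://github.com/ShenghaiWang/SwiftLlama.git", from: "0.2.0")
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            // Product name assumed to be "SwiftLlama"
            dependencies: [.product(name: "SwiftLlama", package: "SwiftLlama")]
        )
    ]
)
```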

Usage

1 Initialise SwiftLlama with the model path.

let swiftLlama = try SwiftLlama(modelPath: path)

2 Call it

Call without streaming

let response: String = try await swiftLlama.start(for: prompt)

Using AsyncStream for streaming

for try await value in await swiftLlama.start(for: prompt) {
    result += value
}
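The streaming loop can be exercised without loading a model by substituting any `AsyncThrowingStream` of tokens. A self-contained sketch, where the mock stream below stands in for `swiftLlama.start(for: prompt)` (assumption: the real stream yields `String` tokens):

```swift
// Mock token stream standing in for swiftLlama.start(for: prompt).
let stream = AsyncThrowingStream<String, Error> { continuation in
    for token in ["Swift", "Llama", " streams"] {
        continuation.yield(token)
    }
    continuation.finish()
}

// Accumulate tokens exactly as in the SwiftLlama usage above.
var result = ""
for try await value in stream {
    result += value
}
print(result) // → "SwiftLlama streams"
```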

Using Combine publisher for streaming

await swiftLlama.start(for: prompt)
    .sink { _ in
        // handle completion
    } receiveValue: { [weak self] value in
        self?.result += value
    }.store(in: &cancellable)

Test projects

The demo video in the repository shows the command line app running with the Llama 3 model.

For using it in an iOS or macOS app, please refer to the TestProjects folder.

Supported Models

In theory, it should support all the models that llama.cpp supports. However, the prompt format may need to be adjusted for some models.
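For instruction-tuned models such as CodeLlama, adjusting the prompt format usually means wrapping the user message in the model's chat template. A hedged sketch of the Llama-2-style `[INST]` format (`instructPrompt` is a hypothetical helper, not part of the SwiftLlama API, and SwiftLlama's internal formatting may differ):

```swift
// Llama-2 / CodeLlama instruct template (sketch only; hypothetical
// helper, not part of the SwiftLlama API).
func instructPrompt(system: String, user: String) -> String {
    "<s>[INST] <<SYS>>\n\(system)\n<</SYS>>\n\n\(user) [/INST]"
}

let prompt = instructPrompt(system: "You are a helpful coding assistant.",
                            user: "Write a function that reverses a string.")
print(prompt)
```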

If you want to test it out quickly, please use this model: codellama-7b-instruct.Q4_K_S.gguf

Contributions are welcome!

Description

  • Swift Tools 5.9.0

Last updated: Mon Dec 16 2024 03:35:53 GMT-1000 (Hawaii-Aleutian Standard Time)