Framework for iOS and Mac Catalyst to dither images and videos.
Dithering is the process of adding noise to an image so that we perceive it as more colorful than it really is.
This image has only four colors: black, white, cyan, and magenta.
Check out the demo application for iOS and macOS.
- Installation
- Usage
- Dithering methods
- Built-in palettes
- Creating your own palette
- Video Dithering Engine
To use this package in a SwiftPM project, you need to set it up as a package dependency:
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyPackage",
    dependencies: [
        .package(
            url: "https://github.com/Eskils/DitheringEngine",
            .upToNextMinor(from: "1.7.0") // or .upToNextMajor
        )
    ],
    targets: [
        .target(
            name: "MyTarget",
            dependencies: [
                .product(name: "DitheringEngine", package: "DitheringEngine")
            ]
        )
    ]
)
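Once the package is added as a dependency of your target, import the module where you need it (the module name is assumed to match the product name):
// In any Swift file of MyTarget:
import DitheringEngine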
The engine works on CGImages for still images, and on video URLs or AVAssets for video.
Supported dithering methods are:
- Threshold
- Floyd-Steinberg
- Atkinson
- Jarvis-Judice-Ninke
- Bayer (Ordered dithering)
- White noise (Ordered dithering)
- Noise (Ordered dithering)
NOTE: The ordered dither methods are computed on the GPU using Metal by default. You can specify to run them on the CPU if desired.
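For example, to force CPU execution for Bayer dithering, set the performOnCPU flag documented in the settings tables below. This is only a sketch: it assumes the settings initializer exposes that flag as a parameter, so check BayerSettingsConfiguration in your version.
// Sketch only: performOnCPU is documented as a setting on the ordered-dither
// configurations; the exact initializer signature is an assumption here.
let cpuBayerSettings = BayerSettingsConfiguration(performOnCPU: true)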
Supported out-of-the-box palettes are:
- Black & white
- Grayscale
- Quantized color
- CGA
- Apple II
- Game Boy
- Custom (your own palette)
Example usage (image):
// Create an instance of DitheringEngine
let ditheringEngine = DitheringEngine()
// Set input image
try ditheringEngine.set(image: inputCGImage)
// Dither to quantized color with 5 bits using Floyd-Steinberg.
let cgImage = try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .quantizedColor,
withDitherMethodSettings: FloydSteinbergSettingsConfiguration(direction: .leftToRight),
withPaletteSettings: QuantizedColorSettingsConfiguration(bits: 5)
)
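The call returns a CGImage. If you want to display the result in UIKit, you can, for example, wrap it in a UIImage:
import UIKit

// Wrap the dithered CGImage for display, e.g. in a UIImageView.
let outputImage = UIImage(cgImage: cgImage)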
Example usage (video):
// Create an instance of VideoDitheringEngine
let videoDitheringEngine = VideoDitheringEngine()
// Create a video description
let videoDescription = VideoDescription(url: inputVideoURL)
// Set preferred output size.
videoDescription.renderSize = CGSize(width: 320, height: 568)
// Dither to quantized color with 5 bits using Floyd-Steinberg.
videoDitheringEngine.dither(
videoDescription: videoDescription,
usingMethod: .floydSteinberg,
andPalette: .quantizedColor,
withDitherMethodSettings: FloydSteinbergSettingsConfiguration(direction: .leftToRight),
andPaletteSettings: QuantizedColorSettingsConfiguration(bits: 5),
outputURL: outputURL,
progressHandler: progressHandler, // Optional block to receive progress.
completionHandler: completionHandler
)
Here is an overview of the available dithering methods.
Threshold gives the nearest match of the color in the image to the color in the palette without adding any noise or improvements.
Token: .threshold
Settings: EmptyPaletteSettingsConfiguration
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .threshold,
andPalette: .cga,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: CGASettingsConfiguration(mode: .palette0High)
)
Floyd-Steinberg dithering spreads the error from reducing the color of a pixel to the neighbouring pixels—yielding an image looking close to the original in areas of fine detail (e.g. grass and trees) and with interesting artifacts in areas of little detail (e.g. the sky).
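For reference, the classic Floyd-Steinberg kernel spreads the quantization error of each pixel to four neighbours in sixteenths. This is only an illustration of the weights, not part of the DitheringEngine API:
// Offsets are relative to the pixel being quantized; the weights sum to 1,
// so the full error is diffused to the neighbouring pixels.
let floydSteinbergKernel: [(dx: Int, dy: Int, weight: Double)] = [
    (dx:  1, dy: 0, weight: 7.0 / 16.0),
    (dx: -1, dy: 1, weight: 3.0 / 16.0),
    (dx:  0, dy: 1, weight: 5.0 / 16.0),
    (dx:  1, dy: 1, weight: 1.0 / 16.0),
]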
Token: .floydSteinberg
Settings: FloydSteinbergSettingsConfiguration
FloydSteinbergDitheringDirection:
- .leftToRight
- .rightToLeft
- .topToBottom
- .bottomToTop
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .cga,
withDitherMethodSettings: FloydSteinbergSettingsConfiguration(direction: .leftToRight),
withPaletteSettings: CGASettingsConfiguration(mode: .textMode)
)
Atkinson dithering is a variant of Floyd-Steinberg dithering, and works by spreading error from reducing the color of a pixel to the neighbouring pixels. Atkinson spreads over a larger area, but does not distribute the full error—making colors matching the palette have less noise.
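For reference, Atkinson's kernel gives 1/8 of the error to each of six neighbours, so only 6/8 of the error is diffused. Again, this is only an illustration of the weights, not part of the API:
// Six neighbours each receive 1/8 of the error; the remaining 2/8 is dropped,
// which is why areas that already match the palette pick up less noise.
let atkinsonKernel: [(dx: Int, dy: Int, weight: Double)] = [
    (dx:  1, dy: 0, weight: 1.0 / 8.0),
    (dx:  2, dy: 0, weight: 1.0 / 8.0),
    (dx: -1, dy: 1, weight: 1.0 / 8.0),
    (dx:  0, dy: 1, weight: 1.0 / 8.0),
    (dx:  1, dy: 1, weight: 1.0 / 8.0),
    (dx:  0, dy: 2, weight: 1.0 / 8.0),
]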
Token: .atkinson
Settings: EmptyPaletteSettingsConfiguration
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .atkinson,
andPalette: .cga,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: CGASettingsConfiguration(mode: .textMode)
)
Jarvis-Judice-Ninke dithering is a variant of Floyd-Steinberg dithering, and works by spreading error from reducing the color of a pixel to the neighbouring pixels. This method distributes the error over a larger area and therefore leaves a smoother look to your image.
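For reference, the Jarvis-Judice-Ninke kernel spreads the error over two rows and five columns, with weights given in 48ths (illustration only, not part of the API):
// Each entry is a numerator over 48. The current pixel sits at the centre of
// the first row; the zeros cover it and the pixels to its left, which receive no error.
let jarvisJudiceNinkeWeights: [[Double]] = [
    [0, 0, 0, 7, 5],
    [3, 5, 7, 5, 3],
    [1, 3, 5, 3, 1],
]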
Token: .jarvisJudiceNinke
Settings: EmptyPaletteSettingsConfiguration
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .jarvisJudiceNinke,
andPalette: .cga,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: CGASettingsConfiguration(mode: .textMode)
)
Bayer dithering is a type of ordered dithering which adds a precalculated threshold to every pixel, baking in a special pattern.
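For reference, the standard 4x4 Bayer threshold matrix looks like this (illustration only; the engine builds its own matrix based on thresholdMapSize):
// Dividing each entry by 16 yields the per-pixel thresholds that produce the
// characteristic crosshatch pattern.
let bayer4x4: [[Int]] = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]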
Token: .bayer
Settings: BayerSettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
thresholdMapSize | Int | 4 | Specifies the size of the square threshold matrix. Default is 4x4. |
performOnCPU | Bool | false | Determines whether to perform the computation on the CPU. If false, the GPU is used for quicker performance. |
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .bayer,
andPalette: .cga,
withDitherMethodSettings: BayerSettingsConfiguration(),
withPaletteSettings: CGASettingsConfiguration(mode: .mode5High)
)
White noise dithering adds random noise to the image when converting to the selected palette, leaving a grained and messy look to your image.
Token: .whiteNoise
Settings: WhiteNoiseSettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
thresholdMapSize | Int | 7 | Specifies the size of the square threshold matrix. Default is 128x128. |
performOnCPU | Bool | false | Determines whether to perform the computation on the CPU. If false, the GPU is used for quicker performance. |
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .whiteNoise,
andPalette: .apple2,
withDitherMethodSettings: WhiteNoiseSettingsConfiguration(),
withPaletteSettings: Apple2SettingsConfiguration(mode: .hiRes)
)
You can provide your own noise texture to sample when performing ordered dithering.
This image is dithered using a blue noise pattern — leaving a grained, organic look.
Token: .noise
Settings: NoiseDitheringSettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
noisePattern | CGImage? | nil | Specifies the noise pattern to use for ordered dithering. |
performOnCPU | Bool | false | Determines whether to perform the computation on the CPU. If false, the GPU is used for quicker performance. |
Example:
let noisePatternImage: CGImage = ...
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .noise,
andPalette: .gameBoy,
withDitherMethodSettings: NoiseDitheringSettingsConfiguration(noisePattern: noisePatternImage),
withPaletteSettings: EmptyPaletteSettingsConfiguration()
)
Here is an overview of the built-in palettes:
A palette with two colors: black and white.
Token: .bw
Settings: EmptyPaletteSettingsConfiguration
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .bw,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: EmptyPaletteSettingsConfiguration()
)
A palette with all shades of gray.
Token: .grayscale
Settings: QuantizedColorSettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
bits | Int | 0 | Specifies the number of bits to quantize to. The number of bits can be between 0 and 8. The number of shades of gray is given by 2^n where n is the number of bits. |
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .grayscale,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: QuantizedColorSettingsConfiguration(bits: 2)
)
A palette with quantized bits for the color channel. Specify the number of bits to use for color—from 0 to 8. The number of colors is given by 2^n where n is the number of bits.
Token: .quantizedColor
Settings: QuantizedColorSettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
bits | Int | 0 | Specifies the number of bits to quantize to. The number of bits can be between 0 and 8. The number of colors is given by 2^n where n is the number of bits. |
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .quantizedColor,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: QuantizedColorSettingsConfiguration(bits: 2)
)
A palette with the old-school CGA palettes. CGA was a graphics card introduced in 1981 with the ability to display colour on the IBM PC. It used a 4-bit interface (Red, Green, Blue, Intensity), giving a total of 16 possible colors. Due to limited video memory, however, the most common resolution of 320x200 would only allow four colors on screen simultaneously. In this mode, the developer could choose from four palettes, with beautiful colour combinations such as black, cyan, magenta and white, or black, green, red and yellow.
Token: .cga
Settings: CGASettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
mode | CGAMode | .palette1High | Specifies the graphics mode to use. Each graphics mode has a unique set of colors. The one with the most colors is .textMode. |
CGAMode includes modes such as .palette0High, .palette1High, .mode5High, and .textMode.
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .cga,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: CGASettingsConfiguration(mode: .palette1High)
)
The Apple II was one of the first personal computers with color. Technical challenges related to reducing cost enabled two modes for graphics—a high resolution mode with six colors, and a low resolution mode with 16 colors.
Token: .apple2
Settings: Apple2SettingsConfiguration
Name | Type | Default | Description |
---|---|---|---|
mode | Apple2Mode | .hiRes | Specifies the graphics mode to use. Each graphics mode has a unique set of colors. |
Apple2Mode:
Name | Num. Colors |
---|---|
.hiRes | 6 colors |
.loRes | 16 colors |
Note: The 16 colors of the Apple2 Lo-Res palette are different from CGA’s text mode palette.
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .atkinson,
andPalette: .apple2,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: Apple2SettingsConfiguration(mode: .hiRes)
)
An old-school palette of four green shades, as on the Game Boy's monochrome display.
Token: .gameBoy
Settings: EmptyPaletteSettingsConfiguration
Example:
let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
let cgImage = try ditheringEngine.dither(
usingMethod: .atkinson,
andPalette: .gameBoy,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: EmptyPaletteSettingsConfiguration()
)
You can create your own palettes using the appropriate APIs.
A palette is represented with the BytePalette structure, which can be constructed from a lookup table (LUT) or from a collection of colors (LUTCollection). The most useful is perhaps the LUTCollection.
If you have an array of UIColors making up the palette, you first need to extract the color values into a list of SIMD3<UInt8>s. This can be done as follows:
let entries = colors.map { color in
    var redNormalized: CGFloat = 0
    var greenNormalized: CGFloat = 0
    var blueNormalized: CGFloat = 0
    color.getRed(&redNormalized, green: &greenNormalized, blue: &blueNormalized, alpha: nil)

    // Scale the normalized components to 0–255 and clamp before converting to UInt8.
    let red = UInt8(min(max(redNormalized * 255, 0), 255))
    let green = UInt8(min(max(greenNormalized * 255, 0), 255))
    let blue = UInt8(min(max(blueNormalized * 255, 0), 255))

    return SIMD3(x: red, y: green, z: blue)
}
After this, you can make a LUTCollection, and from it a palette:
let collection = LUTCollection<UInt8>(entries: entries)
let palette = BytePalette.from(lutCollection: collection)
When dithering an image, choose the .custom palette and provide your palette in the CustomPaletteSettingsConfiguration:
try ditheringEngine.dither(
usingMethod: .floydSteinberg,
andPalette: .custom,
withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
withPaletteSettings: CustomPaletteSettingsConfiguration(palette: palette)
)
Full example:
let entries = colors.map { color in
    var redNormalized: CGFloat = 0
    var greenNormalized: CGFloat = 0
    var blueNormalized: CGFloat = 0
    color.getRed(&redNormalized, green: &greenNormalized, blue: &blueNormalized, alpha: nil)

    // Scale the normalized components to 0–255 and clamp before converting to UInt8.
    let red = UInt8(min(max(redNormalized * 255, 0), 255))
    let green = UInt8(min(max(greenNormalized * 255, 0), 255))
    let blue = UInt8(min(max(blueNormalized * 255, 0), 255))

    return SIMD3(x: red, y: green, z: blue)
}

let collection = LUTCollection<UInt8>(entries: entries)
let palette = BytePalette.from(lutCollection: collection)

let ditheringEngine = DitheringEngine()
try ditheringEngine.set(image: inputCGImage)
try ditheringEngine.dither(
    usingMethod: .floydSteinberg,
    andPalette: .custom,
    withDitherMethodSettings: EmptyPaletteSettingsConfiguration(),
    withPaletteSettings: CustomPaletteSettingsConfiguration(palette: palette)
)
In addition to DitheringEngine dithering images, VideoDitheringEngine exists to dither videos. The VideoDitheringEngine works by applying a palette and dither method to every frame in the video. You may also choose to resize the video as part of this process.
Example usage:
// Create an instance of VideoDitheringEngine
let videoDitheringEngine = VideoDitheringEngine()
// Create a video description
let videoDescription = VideoDescription(url: inputVideoURL)
// Set preferred output size.
videoDescription.renderSize = CGSize(width: 320, height: 568)
// Dither to quantized color with 5 bits using Floyd-Steinberg.
videoDitheringEngine.dither(
videoDescription: videoDescription,
usingMethod: .floydSteinberg,
andPalette: .quantizedColor,
withDitherMethodSettings: FloydSteinbergSettingsConfiguration(direction: .leftToRight),
andPaletteSettings: QuantizedColorSettingsConfiguration(bits: 5),
outputURL: outputURL,
progressHandler: progressHandler,
completionHandler: completionHandler
)
Using an ordered dither method is faster, and tends to give the best result for video, since the pattern does not “move” from frame to frame (error diffusion can flicker like static noise).
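For example, the call above can be reused with an ordered method such as Bayer, keeping the variables from the example. The palette and settings here are only illustrative:
videoDitheringEngine.dither(
    videoDescription: videoDescription,
    usingMethod: .bayer,
    andPalette: .gameBoy,
    withDitherMethodSettings: BayerSettingsConfiguration(),
    andPaletteSettings: EmptyPaletteSettingsConfiguration(),
    outputURL: outputURL,
    progressHandler: progressHandler,
    completionHandler: completionHandler
)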
By default, the final video has a frame rate of 30. You may adjust this by providing a frame rate when initializing VideoDitheringEngine; the final frame rate will be less than or equal to the specified value:
VideoDitheringEngine(frameRate: Int)
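For example:
// Target at most 24 frames per second in the output video.
let videoDitheringEngine = VideoDitheringEngine(frameRate: 24)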
By default, video frames are rendered concurrently. You can disable this behaviour, or change the number of frames processed simultaneously, using the numberOfConcurrentFrames property.
Setting this to 1 will effectively disable concurrent frame processing. A higher number will be faster if the CPU has enough cores to handle the load, but will also use more memory.
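A minimal sketch, assuming the property is set before calling dither:
var videoDitheringEngine = VideoDitheringEngine()
// Process up to four frames at a time; set this to 1 to disable concurrent processing.
videoDitheringEngine.numberOfConcurrentFrames = 4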
When dithering a video, you may provide options for how the video should be processed. The following options are available:
- precalculateDitheredColorForAllColors: Makes an indexed map of all colors to their dithered color. This adds an increased wait time in the beginning, but might be faster with large LUTCollections (e.g. CGA) and longer videos. It is ignored for palettes backed by a LUT (e.g. Quantized Color), which are already index based.
- removeAudio: Does not transfer audio from the original video.
You set the video you want to use as input through the VideoDescription type. This is a convenient wrapper around AVAsset and lets you set the preferred output size.
Properties
Name | Type | Default | Description |
---|---|---|---|
renderSize | CGSize? { get set } | nil | Specifies the size at which to render the final dithered video. |
framerate | Float? { get } | nominalFrameRate | Returns the number of frames per second. Nil if the asset does not contain video. |
transform | CGAffineTransform? { get } | preferredTransform | The transform (orientation, scale) of the video. |
duration | TimeInterval { get } | duration.seconds | Returns the duration of the video. |
sampleRate | Int? { get } | naturalTimeScale | Returns the number of audio samples per second. Nil if the asset does not contain audio. |
size | CGSize? { get } | naturalSize | Returns the size of the video. Nil if the asset does not contain video. |
Methods
/// Reads the first frame in the video as an image.
func getPreviewImage() async throws -> CGImage
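For example, you can inspect the source and grab a preview frame before dithering, using the properties and method above (called from an async context):
let videoDescription = VideoDescription(url: inputVideoURL)

// Inspect the source video.
if let size = videoDescription.size {
    print("Source size: \(size), duration: \(videoDescription.duration) s")
}

// Read the first frame, e.g. to show a thumbnail while the video is processed.
let previewImage = try await videoDescription.getPreviewImage()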