
Running Stable Diffusion on Apple Silicon with Core ML

April 26, 2026 · 5:13 PM

Thanks to Apple engineers, you can now run Stable Diffusion on Apple Silicon using Core ML. Apple has released a repository with conversion scripts and inference code based on the 🧨 Diffusers library from Hugging Face. The team at Hugging Face has converted the official Stable Diffusion checkpoints and made them available on the Hugging Face Hub.

Update: A few weeks after this post, Hugging Face released a native Swift app, available on the Mac App Store, along with the source code for other projects.

Available Checkpoints

The converted checkpoints (for example, apple/coreml-stable-diffusion-v1-4, used in the examples below) are published on the Hub in multiple variants to suit different hardware. Core ML supports CPU, GPU, and Apple's Neural Engine (NE), with options to split computation across devices for better performance.

Performance Notes

Key variants:

  • Attention type: original (CPU/GPU only) vs split_einsum (all compute units). original may be faster on some devices.
  • Format: packages for Python, compiled for Swift. Compiled models split the UNet into multiple files for iOS/iPadOS compatibility.
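The two choices combine into four subfolders per checkpoint repo (see the tree below). As a small illustrative sketch, a helper of our own (not part of any library) can map a choice of attention type and format to the matching folder path:

```python
# Each checkpoint repo on the Hub crosses two attention types with two
# formats, giving four subfolders. This helper is illustrative only; the
# folder names come from the repo layout shown in this post.

ATTENTION_TYPES = ("original", "split_einsum")
FORMATS = ("packages", "compiled")  # packages -> Python, compiled -> Swift

def variant_path(attention: str, fmt: str) -> str:
    """Return the Hub subfolder for a given attention type and format."""
    if attention not in ATTENTION_TYPES:
        raise ValueError(f"unknown attention type: {attention!r}")
    if fmt not in FORMATS:
        raise ValueError(f"unknown format: {fmt!r}")
    return f"{attention}/{fmt}"

print(variant_path("original", "packages"))      # original/packages
print(variant_path("split_einsum", "compiled"))  # split_einsum/compiled
```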

Testing on a MacBook Pro with M1 Max (32 GPU cores, 64 GB RAM), the best results came from using original attention, all compute units, and macOS Ventura 13.1 Beta 4 (22C5059b). This generated one image in 18 seconds.

Note: macOS Ventura 13.1 is required for Apple's implementation. Older versions may produce black images and run noticeably slower.

Example repo structure:

coreml-stable-diffusion-v1-4
├── README.md
├── original
│   ├── compiled
│   └── packages
└── split_einsum
    ├── compiled
    └── packages

Core ML Inference in Python

Prerequisites

pip install huggingface_hub
pip install git+https://github.com/apple/ml-stable-diffusion

Download the Model

from huggingface_hub import snapshot_download
from pathlib import Path

repo_id = "apple/coreml-stable-diffusion-v1-4"
variant = "original/packages"

model_path = Path("./models") / (repo_id.split("/")[-1] + "_" + variant.replace("/", "_"))
snapshot_download(repo_id, allow_patterns=f"{variant}/*", local_dir=model_path, local_dir_use_symlinks=False)
print(f"Model downloaded at {model_path}")

Inference

Use Apple's Python script:

python -m python_coreml_stable_diffusion.pipeline --prompt "a photo of an astronaut riding a horse on mars" -i models/coreml-stable-diffusion-v1-4_original_packages -o </path/to/output/image> --compute-unit ALL --seed 93

Options: --compute-unit can be ALL, CPU_AND_GPU, CPU_ONLY, or CPU_AND_NE. If using a different model, specify its Hub ID with --model-version.
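If you prefer to drive the script from Python, one option is to assemble the same command and launch it with subprocess. This is a sketch of our own glue code: the module name and flags are the ones from the command above, while the helper function is purely illustrative.

```python
import subprocess
from pathlib import Path

def build_pipeline_cmd(prompt: str, model_dir: Path, output_dir: Path,
                       compute_unit: str = "ALL", seed: int = 93) -> list:
    """Assemble the python_coreml_stable_diffusion.pipeline invocation
    using the same flags as the command shown above."""
    return [
        "python", "-m", "python_coreml_stable_diffusion.pipeline",
        "--prompt", prompt,
        "-i", str(model_dir),
        "-o", str(output_dir),
        "--compute-unit", compute_unit,
        "--seed", str(seed),
    ]

cmd = build_pipeline_cmd(
    "a photo of an astronaut riding a horse on mars",
    Path("models/coreml-stable-diffusion-v1-4_original_packages"),
    Path("outputs"),
)
print(" ".join(cmd))
# To actually run it (requires the Prerequisites above to be installed):
# subprocess.run(cmd, check=True)
```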

Core ML Inference in Swift

Download

Download the compiled variant. Example for original/compiled:

from huggingface_hub import snapshot_download
from pathlib import Path

repo_id = "apple/coreml-stable-diffusion-v1-4"
variant = "original/compiled"

model_path = Path("./models") / (repo_id.split("/")[-1] + "_" + variant.replace("/", "_"))
snapshot_download(repo_id, allow_patterns=f"{variant}/*", local_dir=model_path, local_dir_use_symlinks=False)
print(f"Model downloaded at {model_path}")

Inference

Use the Swift package in Apple's repository. Its README shows how to run the bundled StableDiffusionSample command-line tool with swift run, pointing --resource-path at the compiled model folder downloaded above.

Bring Your Own Model

Convert custom models using Apple's conversion script. For example:

python -m python_coreml_stable_diffusion.torch2coreml --convert-unet --convert-text-encoder --convert-vae-decoder --convert-safety-checker --model-version runwayml/stable-diffusion-v1-5 -o ./output

Use --chunk-unet for iOS compatibility.
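The conversion command can likewise be assembled programmatically, which makes the iOS-specific flag a simple conditional. The flags below are the ones from the command above; the helper itself is a hypothetical sketch, not part of Apple's package.

```python
def build_convert_cmd(model_version: str, out_dir: str,
                      chunk_unet: bool = False) -> list:
    """Assemble the torch2coreml invocation shown above, optionally
    chunking the UNet for iOS/iPadOS compatibility."""
    cmd = [
        "python", "-m", "python_coreml_stable_diffusion.torch2coreml",
        "--convert-unet", "--convert-text-encoder",
        "--convert-vae-decoder", "--convert-safety-checker",
        "--model-version", model_version,
        "-o", out_dir,
    ]
    if chunk_unet:
        cmd.append("--chunk-unet")  # splits the UNet into multiple files
    return cmd

print(" ".join(build_convert_cmd("runwayml/stable-diffusion-v1-5", "./output")))
```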

Next Steps

Explore the Hugging Face Swift app or the Apple repository for more details.