Release: Wed, May 13, 2026

Ollama 0.30.0

This version of Ollama changes the architecture to directly support llama.cpp instead of building on top of GGML, and adds compatibility with the GGUF file format. MLX is used to accelerate model inference on Apple Silicon.
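GGUF compatibility means a local GGUF file can be imported directly. A minimal sketch using Ollama's Modelfile import path (the file and model names below are hypothetical):

```shell
# Write a Modelfile pointing at a local GGUF file (name is hypothetical)
cat > Modelfile <<'EOF'
FROM ./my-model.Q4_0.gguf
EOF

ollama create my-model -f Modelfile   # register the model from the GGUF file
ollama run my-model                   # then chat with it locally
```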

While in pre-release, we'd love feedback on:

  • Performance improvements or degradation
  • Errors or crashes that did not previously occur
  • Memory utilization improvements or degradation

Known issues

  • laguna-xs.2 is not supported yet on this pre-release
  • llama3.2-vision is not supported yet on this pre-release

Installing

Mac/Linux

curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.30.0-rc15 sh
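As a quick sanity check (not part of the release notes), the installed version can be confirmed with the CLI's version flag:

```shell
ollama -v   # prints the client version, and the server version if it is running
```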
