<?xml version="1.0" encoding="UTF-8" ?>
<rss version="2.0">
  <channel>
    <title>Freedom Tech Daily — AI</title>
    <link>https://news.bitcoiner.guide</link>
    <description>AI releases on Freedom Tech Daily.</description>
    <item>
      <title>llama.cpp b9143</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-14-llama-cpp/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-14-llama-cpp/</guid>
      <pubDate>Thu, 14 May 2026 01:47:18 GMT</pubDate>
      <description>macOS Apple Silicon (arm64); macOS Apple Silicon (arm64, KleidiAI enabled); macOS Intel (x64); iOS XCFramework.</description>
    </item>
    <item>
      <title>Ollama 0.23.4</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-13-ollama-0-23-4/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-13-ollama-0-23-4/</guid>
      <pubDate>Wed, 13 May 2026 23:50:49 GMT</pubDate>
      <description>The ollama launch opencode command now supports vision models with image inputs; fixed formatting of Claude tool results when using local image paths.</description>
    </item>
    <item>
      <title>LocalAI 4.2.4</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-13-localai-4-2-4/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-13-localai-4-2-4/</guid>
      <pubDate>Wed, 13 May 2026 22:54:22 GMT</pubDate>
      <description>fix(distributed): cascade-clean stale node models rows + filter routing by healthy status; fix(http): honor X-Forwarded-Prefix when proxy strips the prefix; fix(agentpool): close truncate-then-read race in agent jobs.json persistence; fix(middleware): parse OpenAI-spec tool choice in /v1/chat/comple</description>
    </item>
    <item>
      <title>Ollama 0.30.0</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-13-ollama-0-30-0/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-13-ollama-0-30-0/</guid>
      <pubDate>Wed, 13 May 2026 16:52:48 GMT</pubDate>
      <description>This version of Ollama changes the architecture to support llama.cpp directly instead of building on top of GGML, and adds compatibility with the GGUF file format.</description>
    </item>
    <item>
      <title>Hugging Face Transformers Patch 5.8.1</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-13-hugging-face-transformers-patch-5-8-1/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-13-hugging-face-transformers-patch-5-8-1/</guid>
      <pubDate>Wed, 13 May 2026 03:21:23 GMT</pubDate>
      <description>This release mainly fixes the Deepseek V4 integration.</description>
    </item>
    <item>
      <title>vLLM 0.20.2</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-10-vllm-0-20-2/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-10-vllm-0-20-2/</guid>
      <pubDate>Sun, 10 May 2026 07:39:12 GMT</pubDate>
      <description>This release features 6 commits from 6 contributors (0 new)!</description>
    </item>
    <item>
      <title>LangChain 1.2.18</title>
      <link>https://news.bitcoiner.guide/posts/2026-05-08-langchain-1-2-18/</link>
      <guid>https://news.bitcoiner.guide/posts/2026-05-08-langchain-1-2-18/</guid>
      <pubDate>Fri, 08 May 2026 13:59:43 GMT</pubDate>
      <description>release(langchain): 1.2.18 (#37250); revert: feat(langchain): ls agent type tag on create agent calls (#37249); chore(langchain-classic): deprecate hub, limit loads/dumps (#37234); refactor(langchain-classic): retarget deprecations to create agent, other chores (#37164); chore(langchain,langchain-class</description>
    </item>
  </channel>
</rss>
