Zig-based Ollama alternative for running LLMs locally. Built on top of llama.cpp.zig bindings

ai cli cpp igllama inference library llama llama-cpp llamacpp llm ollama zig
1 open issue needs help. Last updated: Feb 24, 2026



AI Summary: The `igllama pull` command's progress bar renders as garbled characters (mojibake) on standard Windows terminals. The tool emits multi-byte UTF-8 characters for the bar, but Windows consoles often decode output using a legacy code page (e.g. CP437 or CP1252) rather than UTF-8, so each byte of a multi-byte glyph is shown as a separate wrong character. The expected behavior is a visually clean progress bar: proper Unicode when the console supports it, or an ASCII fallback when it does not.

Complexity: 3/5
Labels: bug, enhancement, help wanted, good first issue
