A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop tool approval, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.

agentic-ai ai command-line-tool generative-ai linux llm local-llm macos mcp mcp-client mcp-server model-context-protocol ollama open-source pypi-package sse stdio streamable-http tool-management windows
2 open issues need help · Last updated: Jan 13, 2026
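The "human-in-the-loop" feature listed above means the client asks the user to approve each tool call the model requests before executing it. A minimal sketch of that pattern, assuming a simple tool registry and an approval callback — all names here are hypothetical illustrations, not this project's actual API:

```python
# Sketch of the human-in-the-loop tool-approval pattern: a model-requested
# tool call runs only after an approval callback (in a TUI, a user prompt)
# says yes. `TOOLS` and `run_tool_call` are hypothetical names.
from typing import Any, Callable, Dict

TOOLS: Dict[str, Callable[..., Any]] = {
    "add": lambda a, b: a + b,  # stand-in for a real MCP tool
}

def run_tool_call(
    name: str,
    args: Dict[str, Any],
    approve: Callable[[str, Dict[str, Any]], bool],
) -> Dict[str, Any]:
    """Execute a model-requested tool only if the user approves it."""
    if name not in TOOLS:
        return {"status": "error", "detail": f"unknown tool: {name}"}
    if not approve(name, args):
        return {"status": "denied", "tool": name}
    return {"status": "ok", "result": TOOLS[name](**args)}

# Auto-approve for demonstration; an interactive client would prompt instead.
print(run_tool_call("add", {"a": 2, "b": 3}, approve=lambda n, a: True))
```

In the actual client, the approval callback would render the tool name and arguments in the TUI and wait for the user's confirmation keystroke before proceeding.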

Python