Local-first hybrid CLI search engine (BM25 + vector) for personal knowledge bases and agentic workflows. Runs entirely on-device via node-llama-cpp.

0 stars · 0 forks · 0 watchers · TypeScript · MIT License
cli knowledge-base llm local-first markdown mcp-server node-llama-cpp rag search-engine semantic-search vector-search
15 open issues need help · Last updated: Mar 18, 2026

Open Issues Need Help


Labels: documentation · good first issue · triage · status/ready · type/documentation · difficulty/good-first · area/config · area/docs

Labels: documentation · good first issue · triage · status/ready · type/documentation · difficulty/good-first · area/cli · area/docs

Labels: good first issue · triage · status/ready · difficulty/good-first · type/testing · area/packaging · area/testing

Labels: documentation · good first issue · triage · status/ready · type/documentation · difficulty/good-first · area/config · area/docs

Labels: enhancement · help wanted · triage · status/ready · type/feature · difficulty/intermediate · area/config · area/cli


AI Summary: This issue proposes adding a new `kindx doctor` command to provide a centralized diagnostic report for the KINDX system. The command would consolidate troubleshooting information by reporting on sqlite-vec availability, active backend mode, remote API configuration, MCP authentication status, and the health of daemon/watch processes. This aims to streamline support, accelerate onboarding, and simplify issue triage by offering a single entry point for critical system diagnostics.

Complexity: 5/5
Labels: enhancement · help wanted · triage · status/ready · difficulty/advanced · type/feature · area/cli · area/cross-cutting
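A minimal sketch of what such a diagnostic report could look like. The `DoctorCheck` shape, check names, and output format below are illustrative assumptions, not the project's actual design:

```typescript
// Hypothetical sketch of a `kindx doctor` report; the DoctorCheck shape
// and the check names are assumptions, not the repo's real API.
interface DoctorCheck {
  name: string;
  ok: boolean;
  detail: string;
}

function formatDoctorReport(checks: DoctorCheck[]): string {
  // One line per check: a pass/fail marker, the check name, and a detail.
  return checks
    .map((c) => `${c.ok ? "ok " : "FAIL"}  ${c.name}: ${c.detail}`)
    .join("\n");
}

// Example data covering the diagnostics the issue lists.
const report = formatDoctorReport([
  { name: "sqlite-vec", ok: true, detail: "extension loaded" },
  { name: "backend", ok: true, detail: "mode=local (node-llama-cpp)" },
  { name: "remote API", ok: false, detail: "no endpoint configured" },
  { name: "mcp auth", ok: true, detail: "token present" },
  { name: "daemon/watch", ok: true, detail: "running" },
]);
console.log(report);
```

Keeping each check a plain data record makes the command easy to extend and to render as JSON for triage tooling.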


AI Summary: This issue addresses the problem of model-backed tests failing noisily on local developer machines that lack the necessary `node-llama-cpp` runtime capabilities (e.g., Metal/GPU context), as they currently only skip in CI environments. This creates a poor local development experience by generating confusing errors and obscuring the actual health of the repository. The proposed solution is to implement a reusable capability probe that allows these specific tests to cleanly skip on unsupported local environments, improving contributor feedback.

Complexity: 4/5
Labels: enhancement · help wanted · triage · status/ready · type/testing · difficulty/advanced · area/remote-backend · area/testing
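The proposed probe could be sketched like this. `probeModelRuntime` and its loader callback are hypothetical names; a real version would attempt to create a minimal node-llama-cpp context inside the callback:

```typescript
// Hedged sketch of a reusable capability probe. The probe runs the loader
// once per process and caches the result, so every model-backed test can
// consult it cheaply and skip instead of failing.
let runtimeProbeResult: boolean | undefined;

function probeModelRuntime(load: () => void): boolean {
  if (runtimeProbeResult !== undefined) return runtimeProbeResult; // cached
  try {
    load(); // e.g. attempt to create a minimal llama context
    runtimeProbeResult = true;
  } catch {
    runtimeProbeResult = false; // native bindings / Metal / GPU unavailable
  }
  return runtimeProbeResult;
}

// A test harness would then skip rather than error, e.g.:
//   if (!probeModelRuntime(loadLlama)) { test.skip(); return; }
```

Caching matters here: probing a model runtime can be slow, and repeating it per test would multiply the cost.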


AI Summary: This issue proposes adding a new CI job in GitHub Actions to run existing packaging and install smoke checks, which are currently not exercised. The goal is to catch install and packaging regressions that can slip past current CI steps like `npm test` and `npm pack --dry-run`. It will leverage existing `specs/smoke-install.sh` and `specs/Containerfile` to validate a practical install/runtime flow.

Complexity: 3/5
Labels: enhancement · help wanted · triage · status/ready · type/ci · difficulty/intermediate · area/ci · area/packaging
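Under the assumptions in the summary, the new job might look roughly like this GitHub Actions fragment. The step details are guesses; only the `specs/smoke-install.sh` and `specs/Containerfile` paths come from the issue:

```yaml
# Hedged sketch of the proposed CI job; step order and Node version
# are assumptions, not the repo's actual workflow.
smoke-install:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-node@v4
      with:
        node-version: 22
    - run: npm ci
    - run: npm pack
    # Install the packed tarball and exercise the CLI end to end:
    - run: bash specs/smoke-install.sh
    # A follow-up could repeat the check in a clean container
    # built from specs/Containerfile.
```

Running the smoke script against the packed tarball (not the source tree) is what catches the packaging regressions that `npm test` and `npm pack --dry-run` miss.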


AI Summary: The current system loads `index.yml` without validation, leading to silent misconfigurations, confusing runtime behavior, and issues with legacy array-shaped collections. This issue proposes adding schema validation to `index.yml` upon loading, ensuring invalid structures, unknown keys, and incorrect types fail fast with clear error messages, while also specifically addressing legacy collection formats. The goal is to improve error handling and configuration reliability.

Complexity: 3/5
Labels: enhancement · help wanted · triage · status/ready · type/tooling · difficulty/intermediate · area/config · area/testing
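A fail-fast validator along these lines would cover the cases the issue names. The key names (`collections`) are taken from the summary, but the exact schema below is an illustrative assumption:

```typescript
// Hedged sketch of schema validation for a parsed index.yml. Returns a list
// of human-readable errors; an empty list means the config is valid.
function validateIndexConfig(raw: unknown): string[] {
  const errors: string[] = [];
  if (typeof raw !== "object" || raw === null || Array.isArray(raw)) {
    return ["index.yml root must be a mapping"];
  }
  const cfg = raw as Record<string, unknown>;
  // Unknown keys fail fast instead of being silently ignored.
  for (const key of Object.keys(cfg)) {
    if (key !== "collections") errors.push(`unknown key: ${key}`);
  }
  const cols = cfg["collections"];
  if (Array.isArray(cols)) {
    // Legacy array-shaped collections: reject with a migration hint.
    errors.push("legacy array-shaped `collections`; use a mapping of name -> settings");
  } else if (typeof cols !== "object" || cols === null) {
    errors.push("`collections` must be a mapping");
  }
  return errors;
}
```

Collecting all errors in one pass (rather than throwing on the first) gives users a complete picture of a broken config.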


AI Summary: This issue proposes adding a quickstart guide to the README for setting up a remote LLM backend, specifically for services like Ollama, LM Studio, and other OpenAI-compatible endpoints. The goal is to improve user discoverability and ease of use for this crucial fallback option, as the current documentation only lists environment variables without a clear setup example.

Complexity: 1/5
Labels: documentation · help wanted · triage · status/ready · type/documentation · difficulty/beginner · area/remote-backend · area/docs
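The requested quickstart might take roughly this shape. The environment-variable names below are hypothetical placeholders (the issue says the real ones are listed in the docs); the endpoint URLs are the standard defaults for Ollama and LM Studio:

```shell
# Hypothetical variable names -- consult the project's docs for the real ones.
# Point the remote backend at Ollama's OpenAI-compatible endpoint
# (Ollama serves on port 11434 by default):
export KINDX_REMOTE_BASE_URL="http://localhost:11434/v1"
export KINDX_REMOTE_MODEL="nomic-embed-text"
# The same shape works for LM Studio (default http://localhost:1234/v1)
# or any other OpenAI-compatible server; then run searches as usual, e.g.:
#   kindx search "my query"
```

A copy-pasteable block like this is exactly what the issue argues the README lacks: the variables exist, but no worked example ties them together.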


AI Summary: This GitHub issue requests an update to the `README`'s data-storage/schema documentation, which is currently outdated. It depicts old table structures and implies a `collections` table, while the system now uses SQLite tables like `content` and `documents` alongside YAML for collection configuration. The goal is to refresh this documentation to accurately reflect the current implementation defined in `engine/repository.ts`.

Complexity: 1/5
Labels: documentation · help wanted · triage · status/ready · type/documentation · difficulty/beginner · area/config · area/docs


AI Summary: This issue highlights that YAML-defined collection paths do not correctly expand the `~` (tilde) character to the user's home directory, leading to misleading examples in `sample-catalog.yml`. The problem causes broken configurations for users following the sample. The proposed solution is to implement tilde expansion for these paths and update the sample catalog or related documentation to reflect the supported behavior.

Complexity: 2/5
Labels: bug · help wanted · triage · status/ready · difficulty/beginner · type/bug · area/config · area/indexing
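The fix amounts to a small path helper applied before collection paths are resolved. `expandTilde` is an illustrative name, not the repo's actual function:

```typescript
import * as os from "node:os";
import * as path from "node:path";

// Minimal sketch of tilde expansion: a leading `~` becomes the user's home
// directory; everything else passes through unchanged.
function expandTilde(p: string): string {
  if (p === "~") return os.homedir();
  if (p.startsWith("~/")) return path.join(os.homedir(), p.slice(2));
  return p; // `~user` forms and non-tilde paths are left untouched
}
```

Applying this at config-load time keeps the rest of the indexer working with plain absolute paths, and makes the `~/...` examples in `sample-catalog.yml` behave as users expect.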


AI Summary: This GitHub issue proposes adding `--pattern` as a backward-compatible alias for the existing `collection add --mask` CLI flag. The primary goal is to resolve a terminology mismatch where the CLI uses "mask" but documentation and user understanding refer to "patterns." This change aims to improve consistency, learnability, and documentation across the product by aligning the terminology.

Complexity: 1/5
Labels: enhancement · help wanted · triage · status/ready · difficulty/beginner · type/feature · area/config · area/cli
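One low-risk way to add the alias is to normalize argv before the parser sees it, so `--mask` remains the canonical flag internally. `normalizeArgs` is illustrative; the real implementation would depend on the CLI framework in use:

```typescript
// Hedged sketch of a backward-compatible alias: rewrite `--pattern` to the
// existing `--mask` flag before parsing, so both spellings keep working.
function normalizeArgs(argv: string[]): string[] {
  return argv.map((arg) => {
    if (arg === "--pattern") return "--mask";
    if (arg.startsWith("--pattern=")) {
      return "--mask=" + arg.slice("--pattern=".length);
    }
    return arg; // all other arguments pass through untouched
  });
}
```

This keeps help text and documentation free to lead with "pattern" while existing scripts that pass `--mask` continue to work unchanged.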


AI Summary: This GitHub issue requests the addition of new regression tests to verify the correct behavior of `collection include` and `exclude` commands, specifically how they affect default queries. The current CLI implementation toggles an `includeByDefault` setting, but lacks end-to-end tests to ensure collections are properly omitted or re-included. The goal is to add command-line tests to prevent silent regressions and confirm that explicit `--collection` filters still override default behavior.

Complexity: 1/5
Labels: good first issue · triage · status/ready · difficulty/good-first · type/testing · area/config · area/testing
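The behavior under test can be sketched with a toy resolver. The `Collection` shape and `resolveCollections` below are illustrative stand-ins for the repo's real types, but they capture the two properties the tests must pin down:

```typescript
// Hedged model of the expected semantics: an explicit --collection filter
// always wins; otherwise only collections with includeByDefault are searched.
interface Collection {
  name: string;
  includeByDefault: boolean;
}

function resolveCollections(all: Collection[], explicit?: string[]): string[] {
  if (explicit && explicit.length > 0) {
    // Explicit filter overrides default inclusion entirely.
    return all.filter((c) => explicit.includes(c.name)).map((c) => c.name);
  }
  return all.filter((c) => c.includeByDefault).map((c) => c.name);
}

const cols: Collection[] = [
  { name: "notes", includeByDefault: true },
  { name: "archive", includeByDefault: false }, // as after `collection exclude`
];
```

End-to-end tests would assert the same two invariants through the CLI: an excluded collection vanishes from default queries, and `--collection archive` still reaches it.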
