Open Issues Need Help
Local-first hybrid CLI search engine (BM25 + Vector) for personal knowledge bases and agentic workflows. Runs entirely on-device via node-llama-cpp.
AI Summary: This issue proposes adding a new `kindx doctor` command to provide a centralized diagnostic report for the KINDX system. The command would consolidate troubleshooting information by reporting on sqlite-vec availability, active backend mode, remote API configuration, MCP authentication status, and the health of daemon/watch processes. This aims to streamline support, accelerate onboarding, and simplify issue triage by offering a single entry point for critical system diagnostics.
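A diagnostics command like this typically aggregates independent probes into one report. The sketch below is illustrative only: the check names, shapes, and output format are assumptions, not the project's actual API; the issue specifies only which areas (sqlite-vec, backend mode, remote API, MCP auth, daemon/watch) the report should cover.

```typescript
// Sketch of a diagnostics aggregator for a hypothetical `kindx doctor` command.
// Check names and result shapes are illustrative, not the project's actual API.
interface DoctorCheck {
  name: string;
  run: () => { ok: boolean; detail: string };
}

function runDoctor(checks: DoctorCheck[]): string[] {
  return checks.map((check) => {
    try {
      const result = check.run();
      const status = result.ok ? "OK" : "FAIL";
      return `${status}  ${check.name}: ${result.detail}`;
    } catch (err) {
      // A crashing probe degrades to a report line instead of aborting the run.
      return `FAIL  ${check.name}: ${(err as Error).message}`;
    }
  });
}

// Example probes mirroring the areas the issue lists (all stubbed here):
const report = runDoctor([
  { name: "sqlite-vec", run: () => ({ ok: true, detail: "extension loadable" }) },
  { name: "backend mode", run: () => ({ ok: true, detail: "local (node-llama-cpp)" }) },
  { name: "daemon", run: () => { throw new Error("not running"); } },
]);
```

Isolating each probe behind a `try`/`catch` is the key design point: a broken subsystem is exactly when the doctor command must still produce output.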
AI Summary: This issue addresses the problem of model-backed tests failing noisily on local developer machines that lack the necessary `node-llama-cpp` runtime capabilities (e.g., Metal/GPU context), as they currently only skip in CI environments. This creates a poor local development experience by generating confusing errors and obscuring the actual health of the repository. The proposed solution is to implement a reusable capability probe that allows these specific tests to cleanly skip on unsupported local environments, improving contributor feedback.
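One way to structure such a probe is a memoized check that attempts runtime initialization once and converts any failure into a clean skip signal. This sketch injects the loader so the pattern is self-contained; the real probe would attempt to initialize node-llama-cpp (e.g. create a minimal context) instead of the stub shown here.

```typescript
// Capability-probe sketch: model-backed tests call the probe once and skip
// cleanly when the runtime (e.g. node-llama-cpp with a Metal/GPU context) is
// unavailable. The loader is injected here to keep the sketch self-contained.
type Loader = () => void;

function makeCapabilityProbe(load: Loader): () => boolean {
  let cached: boolean | undefined;
  return () => {
    if (cached === undefined) {
      try {
        load();         // real probe: attempt minimal runtime initialization
        cached = true;
      } catch {
        cached = false; // unsupported environment: tests should skip, not error
      }
    }
    return cached;
  };
}

// Simulated unsupported environment:
const canRunModels = makeCapabilityProbe(() => { throw new Error("no Metal context"); });
const skipped = !canRunModels(); // a test harness would map this to a skip
```

Caching the result matters because initialization may be slow or noisy; every model-backed test can then consult the probe without re-triggering the failure.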
AI Summary: This issue proposes adding a new CI job in GitHub Actions to run existing packaging and install smoke checks, which are currently not exercised. The goal is to catch install and packaging regressions that can slip past current CI steps like `npm test` and `npm pack --dry-run`. It will leverage existing `specs/smoke-install.sh` and `specs/Containerfile` to validate a practical install/runtime flow.
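A workflow for this could look roughly like the fragment below. Only `specs/smoke-install.sh` and `specs/Containerfile` come from the issue; the job name, trigger events, runner, and the choice of `podman` (suggested by the `Containerfile` naming convention, though `docker build` would work the same way) are all assumptions.

```yaml
# Hypothetical GitHub Actions fragment; only the two specs/ paths are from the issue.
name: smoke-install
on: [push, pull_request]
jobs:
  smoke:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build container from specs/Containerfile
        run: podman build -f specs/Containerfile -t kindx-smoke .
      - name: Run packaging/install smoke checks
        run: bash specs/smoke-install.sh
```

Running the smoke script inside a freshly built container is what catches the class of regressions `npm test` and `npm pack --dry-run` miss: a clean environment with no preinstalled dev dependencies.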
AI Summary: The current system loads `index.yml` without validation, leading to silent misconfigurations, confusing runtime behavior, and issues with legacy array-shaped collections. This issue proposes adding schema validation to `index.yml` upon loading, ensuring invalid structures, unknown keys, and incorrect types fail fast with clear error messages, while also specifically addressing legacy collection formats. The goal is to improve error handling and configuration reliability.
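The shape of such a validator might look like the sketch below. The actual `index.yml` schema is not specified in the issue, so the field names here are hypothetical; the three behaviors it demonstrates — rejecting unknown keys, rejecting wrong types with a clear message, and normalizing a legacy array-shaped `collections` into the current map shape — are the ones the issue asks for.

```typescript
// Illustrative fail-fast validator for an index.yml-like config.
// The schema (a single `collections` mapping of name -> { path }) is a
// hypothetical stand-in; only the required behaviors come from the issue.
interface Collection { path: string }
interface IndexConfig { collections: Record<string, Collection> }

function validateIndexConfig(raw: unknown): IndexConfig {
  if (typeof raw !== "object" || raw === null) {
    throw new Error("index.yml: top level must be a mapping");
  }
  const obj = raw as Record<string, unknown>;
  for (const key of Object.keys(obj)) {
    if (key !== "collections") {
      throw new Error(`index.yml: unknown key "${key}"`);
    }
  }
  let collections = obj.collections;
  // Legacy shape: a plain array of { name, path } entries -> normalize to a map.
  if (Array.isArray(collections)) {
    const map: Record<string, Collection> = {};
    for (const entry of collections as Array<{ name: string; path: string }>) {
      map[entry.name] = { path: entry.path };
    }
    collections = map;
  }
  if (typeof collections !== "object" || collections === null) {
    throw new Error('index.yml: "collections" must be a mapping or list');
  }
  return { collections: collections as Record<string, Collection> };
}
```

Normalizing the legacy shape at the load boundary means the rest of the engine only ever sees one canonical structure.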
AI Summary: This issue proposes adding a quickstart guide to the README for setting up a remote LLM backend, specifically for services like Ollama, LM Studio, and other OpenAI-compatible endpoints. The goal is to improve user discoverability and ease of use for this crucial fallback option, as the current documentation only lists environment variables without a clear setup example.
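A remote-backend quickstart usually boils down to resolving a base URL and model name from the environment, falling back to fully local operation when nothing is set. The variable names below (`KINDX_REMOTE_BASE_URL`, `KINDX_REMOTE_MODEL`) are hypothetical placeholders — the README quickstart would document the project's real ones.

```typescript
// Sketch of resolving a remote OpenAI-compatible backend from environment
// variables. The variable names are hypothetical, not the project's own.
interface RemoteBackend { baseUrl: string; model: string }

function resolveRemoteBackend(env: Record<string, string | undefined>): RemoteBackend | null {
  // e.g. http://localhost:11434/v1 for Ollama, http://localhost:1234/v1 for LM Studio
  const baseUrl = env.KINDX_REMOTE_BASE_URL;
  if (!baseUrl) return null; // no remote configured -> stay fully local
  return {
    baseUrl: baseUrl.replace(/\/+$/, ""), // tolerate trailing slashes
    model: env.KINDX_REMOTE_MODEL ?? "default",
  };
}
```

Returning `null` rather than throwing keeps the remote path strictly opt-in, matching the issue's framing of remote backends as a fallback option.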
AI Summary: This GitHub issue requests an update to the `README`'s data-storage/schema documentation, which is currently outdated. It depicts old table structures and implies a `collections` table, while the system now uses SQLite tables like `content` and `documents` alongside YAML for collection configuration. The goal is to refresh this documentation to accurately reflect the current implementation defined in `engine/repository.ts`.
AI Summary: This issue highlights that YAML-defined collection paths do not correctly expand the `~` (tilde) character to the user's home directory, leading to misleading examples in `sample-catalog.yml`. The problem causes broken configurations for users following the sample. The proposed solution is to implement tilde expansion for these paths and update the sample catalog or related documentation to reflect the supported behavior.
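The fix itself is small; a minimal sketch of the expansion rule follows. It handles the `~` and `~/...` forms the sample catalog implies, and deliberately leaves `~user` forms untouched, since per-user expansion is a separate, platform-specific concern.

```typescript
import { homedir } from "node:os";
import { join } from "node:path";

// Tilde expansion for YAML-defined collection paths: "~" and "~/..." expand
// to the current user's home directory; anything else passes through as-is.
function expandTilde(p: string): string {
  if (p === "~") return homedir();
  if (p.startsWith("~/")) return join(homedir(), p.slice(2));
  return p;
}
```

Applying this once, at the point where collection paths are loaded from YAML, keeps the rest of the engine working with absolute paths only.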
AI Summary: This GitHub issue proposes adding `--pattern` as a backward-compatible alias for the existing `collection add --mask` CLI flag. The primary goal is to resolve a terminology mismatch where the CLI uses "mask" but documentation and user understanding refer to "patterns." This change aims to improve consistency, learnability, and documentation across the product by aligning the terminology.
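One low-risk way to add such an alias is to normalize `argv` before the option parser sees it, so existing `--mask` handling stays untouched. The flag spellings come from the issue; the rewrite step itself is an illustrative sketch, not the project's actual CLI wiring (many parsers, e.g. Commander, also support aliases natively).

```typescript
// Argv normalization sketch: accept --pattern (and --pattern=VALUE) as a
// backward-compatible alias for the existing --mask flag.
function normalizeArgs(argv: string[]): string[] {
  return argv.map((arg) => {
    if (arg === "--pattern") return "--mask";
    if (arg.startsWith("--pattern=")) return "--mask=" + arg.slice("--pattern=".length);
    return arg;
  });
}
```

Because the alias is rewritten to the canonical flag up front, help text, validation, and downstream code all keep a single source of truth.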
AI Summary: This GitHub issue requests the addition of new regression tests to verify the correct behavior of `collection include` and `exclude` commands, specifically how they affect default queries. The current CLI implementation toggles an `includeByDefault` setting, but lacks end-to-end tests to ensure collections are properly omitted or re-included. The goal is to add command-line tests to prevent silent regressions and confirm that explicit `--collection` filters still override default behavior.
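The selection rule those tests should pin down can be sketched as a pure function. The `includeByDefault` field name follows the issue; the surrounding types and the function itself are illustrative, not the project's actual implementation.

```typescript
// Sketch of the query-time selection rule the regression tests should cover:
// with no explicit filter, only includeByDefault collections are searched;
// an explicit --collection filter overrides the default set entirely.
interface CollectionCfg { name: string; includeByDefault: boolean }

function selectCollections(all: CollectionCfg[], explicit?: string[]): string[] {
  if (explicit && explicit.length > 0) {
    return all.filter((c) => explicit.includes(c.name)).map((c) => c.name);
  }
  return all.filter((c) => c.includeByDefault).map((c) => c.name);
}
```

Expressing the rule as a pure function is also what makes the requested end-to-end tests cheap to mirror with unit tests: both layers can assert the same excluded/re-included/explicit-override cases.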