Open Issues Need Help
How is prompt caching handled in the plugin? (11 days ago)
AI Summary: The user asks how the plugin handles prompt caching when using OpenRouter as a provider. Some hosted models require caching to be enabled explicitly to reduce token costs, which matters for note-taking workloads that resend large, mostly unchanged context. The core question is whether the plugin enables prompt caching by default.
Complexity: 2/5
Labels: good first issue, question
Repository: THE Copilot in Obsidian (TypeScript)
Tags: #ai #aiagent #chatgpt #copilot #obsidian-plugin
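For context on the question above: OpenRouter passes Anthropic's explicit prompt caching through its OpenAI-compatible chat API via `cache_control` breakpoints on message content parts, while some other providers cache automatically. A minimal sketch of what opting in could look like (the request shape follows OpenRouter's documented API; the helper name and model choice are illustrative, not the plugin's actual code):

```typescript
// Content part in OpenRouter's multi-part message format. For Anthropic
// models, a part carrying cache_control marks a caching breakpoint.
type ContentPart = {
  type: "text";
  text: string;
  cache_control?: { type: "ephemeral" };
};

type Message = { role: "system" | "user"; content: ContentPart[] };

// Build a chat request whose large note context is marked cacheable, so
// repeated questions against the same note can reuse the cached prefix
// instead of paying full input-token cost each time.
function buildCachedRequest(noteText: string, question: string) {
  const messages: Message[] = [
    {
      role: "system",
      content: [
        { type: "text", text: "You answer questions about the user's note." },
        // Breakpoint: the prefix up to and including this part is cached.
        { type: "text", text: noteText, cache_control: { type: "ephemeral" } },
      ],
    },
    { role: "user", content: [{ type: "text", text: question }] },
  ];
  return { model: "anthropic/claude-3.5-sonnet", messages };
}
```

A plugin would send this body to the provider's `/chat/completions` endpoint; whether caching actually engages then depends on the upstream model's minimum cacheable-prefix size.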
Add Kilo Gateway as an LLM Provider (26 days ago)
Labels: good first issue
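Assuming Kilo Gateway exposes an OpenAI-compatible endpoint, supporting it would largely amount to registering one more provider entry pointing at that endpoint. A sketch under that assumption (field names and the base URL are placeholders, not the plugin's actual settings schema):

```typescript
// Illustrative provider-registry entry for an OpenAI-compatible gateway.
interface ProviderConfig {
  id: string;
  displayName: string;
  baseUrl: string;      // OpenAI-compatible /v1 root (placeholder URL below)
  apiKeyEnvVar: string; // where the user's key is read from
}

const kiloGateway: ProviderConfig = {
  id: "kilo-gateway",
  displayName: "Kilo Gateway",
  baseUrl: "https://example.invalid/v1", // replace with the real gateway URL
  apiKeyEnvVar: "KILO_API_KEY",
};

// OpenAI-compatible providers share the same route layout, so request
// construction can be generic over the config.
function chatCompletionsUrl(p: ProviderConfig): string {
  return `${p.baseUrl}/chat/completions`;
}
```

With this shape, the existing OpenAI-style request code can serve the new provider unchanged; only the base URL and key wiring differ.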