Releases · bramses/chatgpt-md
2.3.1
2.2.3
Reintroduces real streaming for Ollama
Full Changelog: 2.2.1...2.2.3
2.2.2
Full Changelog: 2.2.1...2.2.2
2.2.1
Fixes handling of Perplexity's changed citations API response
Full Changelog: 2.2.0...2.2.1
2.2.0
v2.2.0 (April 12, 2024) - URL Parameter Refactoring
Service Configuration
- Service-Specific URLs: Each AI service now has its own dedicated URL parameter in both global settings and note frontmatter (see the frontmatter sketch after this list):
  - openaiUrl for the OpenAI API
  - openrouterUrl for OpenRouter.ai
  - ollamaUrl for Ollama
- Automatic Settings Migration: Added migration logic to standardize service URLs for better consistency and compatibility
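As an illustrative sketch of the per-note override, a note's frontmatter could pin one of these URLs. The key names come from this release; the example values are assumed defaults and are not stated in the changelog.

```yaml
---
# Hypothetical note frontmatter using the 2.2.0 service-specific URL keys.
# Key names (openaiUrl, openrouterUrl, ollamaUrl) are from this release;
# the values below are illustrative defaults, not confirmed by the changelog.
openaiUrl: https://api.openai.com
openrouterUrl: https://openrouter.ai
ollamaUrl: http://localhost:11434
---
```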
Feature Enhancements
- Perplexity Source Citations: Added support for web source citations when using Perplexity models via OpenRouter.ai. Access models like openrouter@perplexity/llama-3.1-sonar-small-128k-online and openrouter@perplexity/llama-3.1-sonar-large-128k-online without needing a Perplexity Pro subscription.
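A minimal sketch of selecting one of these models in a note, assuming the plugin reads a model key from frontmatter (the identifier is quoted from the release note above; the key name itself is an assumption):

```yaml
---
# Illustrative frontmatter for Perplexity citations via OpenRouter.ai.
# The "model" key name is assumed; the identifier is taken from the release note above.
model: openrouter@perplexity/llama-3.1-sonar-small-128k-online
---
```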
Bug Fixes & Improvements
- Fixed Ollama Streaming: Resolved CORS issues when using Ollama on mobile devices with an alternative request method
- Better Error Handling: Improved feedback for service type and API key validation issues
- System Commands Fix: Fixed system commands defined in notes' frontmatter not being applied (see the sketch after this list)
- Consistent API Endpoints: Simplified URL and endpoint usage across all services
- Template Organization: Templates now appear in alphabetical order in the template suggest modal (fixes #91)
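For context, a minimal sketch of a frontmatter-level system command; the system_commands key name is an assumption based on the plugin's conventions, not something this release confirms.

```yaml
---
# Hypothetical per-note system command (key name assumed, not confirmed by this release).
system_commands:
  - You are a concise technical assistant.
---
```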
Mobile Enhancements
- Improved Mobile Support: Enhanced Ollama support on mobile with a CORS-friendly request method
- URL Priority Logic: Refined URL priority for fetching available models
Code Maintenance
- Removed Unused Services: Removed deprecated services for a cleaner codebase
- Dependency Updates: Updated project dependencies to latest versions
v2.1.4 (Beta) - Endpoint Handling
Improvements
- Simplified URL and Endpoint Usage: Streamlined the way service endpoints are handled across the application
- Model Fetching Priority: Improved URL priority for fetching available models
v2.1.3 (Beta) - System Commands Update
Enhancements
- System Command Handling: Fixed system commands from settings not being applied correctly
- Code Cleanup: Removed unused services and improved code organization
- Template Improvements: Added alphabetical ordering to templates in the template suggestion modal
- Settings Framework: Introduced a robust settings migration framework
- Citation Support: Added support for Perplexity response citations
Full Changelog: 2.1.2...2.2.0
2.1.5
Full Changelog: 2.1.4...2.1.5
2.1.4
Full Changelog: 2.1.3...2.1.4
2.1.3
2.1.2
What's Changed
- fixes default front matter collection order by @DenizOkcu in #140
- removes external links from fetching linked content for context by @DenizOkcu in #141
Full Changelog: 2.1.0...2.1.2
2.1.0
What's New in v2.1.0 🚀
- We've added support for OpenRouter.ai as an LLM provider in our Obsidian plugin. Set an OpenRouter.ai API key in the settings, and you can access a wide range of models like Gemini, DeepSeek, Llama, Perplexity, and many more (full list: https://openrouter.ai/models).
- The new command ChatGPT MD: Select Model allows you to select from all available LLMs (OpenAI, Ollama, OpenRouter.ai) and set the current model for your note (a sketch of the resulting frontmatter follows).
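As a sketch of what selecting a model might leave in a note, assuming the command records its choice in frontmatter: the openrouter@ prefix follows the identifier format shown in later releases, while the key name and the specific model are illustrative.

```yaml
---
# Hypothetical frontmatter after running "ChatGPT MD: Select Model" (key name assumed).
model: openrouter@deepseek/deepseek-chat  # illustrative OpenRouter.ai model identifier
---
```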
Full Changelog: 2.0.1...2.1.0