🚀 A seamless integration of ChatGPT, OpenRouter.ai, and local LLMs via Ollama into Obsidian.
- Perplexity Source Citations: Added support for source citations when using Perplexity models via OpenRouter.ai (`openrouter@perplexity/llama-3.1-sonar-small-128k-online`, `openrouter@perplexity/llama-3.1-sonar-large-128k-online`). Get web sources for your queries without needing a Perplexity Pro subscription - pay only for the tokens you use via OpenRouter.ai.
- Improved URL Configuration: Each AI service now has its own dedicated URL parameter in settings and frontmatter:
  - `openaiUrl` for the OpenAI API
  - `openrouterUrl` for OpenRouter.ai
  - `ollamaUrl` for Ollama
- Enhanced Mobile Support: Fixed Ollama streaming without CORS issues on mobile devices
- Improved System Commands: Fixed missing system commands from notes' frontmatter
- Template Organization: Templates are now ordered alphabetically in the template suggest modal
- Settings Migration: Added automatic migration of service URLs for better consistency
Get started in just a few simple steps:
- Install ChatGPT MD: Go to `Settings > Community Plugins > Browse`, search for `ChatGPT MD` and click `Install`.
- Add your OpenAI API key: In the plugin settings, add your OpenAI API key and/or install Ollama and local LLMs of your choice.
- Start chatting: Use the `ChatGPT MD: Chat` command (`cmd + p` or `ctrl + p`) to start a conversation from any note.
💡 Pro tip: Set up a hotkey for the best experience! Go to `Settings > Hotkeys`, search for `ChatGPT MD: Chat` and add your preferred keybinding (e.g., `cmd + j`).
Start chatting, and don't worry too much about the more advanced features; they will come naturally :-)
- Interactive conversations: Engage directly with ChatGPT, OpenRouter.ai models, and Ollama from any Markdown note, edit questions or responses on the fly, and continue the chat seamlessly.
- Privacy & Zero API Costs: Use local LLMs via Ollama, keeping your chats on your computer and avoiding API costs.
- Multiple AI Providers: Choose from OpenAI, OpenRouter.ai (with access to models like Gemini, Claude, DeepSeek, Llama, and Perplexity), or local models via Ollama.
- System Commands: Instruct the LLM via system commands to get the best possible answers.
- Link Context: Provide Markdown or Wiki links to any other note in your vault for added context during conversations.
- Per-note Configuration: Override default settings via frontmatter for individual notes using params from the OpenAI, OpenRouter.ai, or Ollama APIs.
- Markdown Support: Enjoy full rendering of lists, code blocks, and more from all responses.
- Minimal Setup: Use your OpenAI API key or OpenRouter.ai API key, or install any LLM locally via Ollama.
- Comment Blocks: Ignore parts of your notes using comment blocks.
- Chat Templates: Use and share frontmatter templates for recurring scenarios. Explore chatgpt-md-templates.
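A chat template is just a note whose frontmatter captures a recurring setup. A hypothetical sketch of a "code review" template (the parameter values here are illustrative, not defaults):

```yaml
---
system_commands: ['Act as a senior JavaScript developer reviewing code for bugs and style.']
model: gpt-4o
temperature: 0.2
max_tokens: 1000
---
```

Saving this in your `Chat Template Folder` makes it available to the `New Chat From Template` command.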
ChatGPT MD
- stores data only locally in your vault, with zero tracking and no third-party integrations except direct calls to the AI APIs (OpenAI, OpenRouter.ai).
- lets you use Ollama, a local LLM installation, for offline conversation-based knowledge exploration.
The plugin comes with a well-balanced pre-configuration to get you started immediately.
You can change the global settings or use local parameters in any note via frontmatter (start typing `---` on the first line of your note to add properties):
```yaml
---
system_commands: ['I am a helpful assistant.']
temperature: 0.3
top_p: 1
max_tokens: 300
presence_penalty: 0.5
frequency_penalty: 0.5
stream: true
stop: null
n: 1
model: gpt-4o-mini
# Service-specific URLs (optional, will use global settings if not specified)
openaiUrl: https://api.openai.com
# openrouterUrl: https://openrouter.ai
# ollamaUrl: http://localhost:11434
---
```
💡 Pro tip: Increase `max_tokens` to a higher value, e.g. `4096`, for more complex tasks like reasoning, coding, or text creation.
The default model `gpt-4o-mini` is a good compromise between fast and cheap responses. Change it if you have more complex needs.

You can set and change the model for each request in your note by specifying the `model` property via frontmatter.

For OpenAI models:
```yaml
---
model: gpt-4o
system_commands: [act as a senior javascript developer]
---
```
Prefix it with `local@` for local LLMs via Ollama:
```yaml
---
model: local@gemma2:27b
temperature: 1
---
```
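OpenRouter.ai models follow the same pattern with the `openrouter@` prefix, as seen in the Perplexity model names above. A sketch (the chosen model is illustrative):

```yaml
---
model: openrouter@perplexity/llama-3.1-sonar-small-128k-online
temperature: 0.5
---
```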
AI responses keep the model name in the response title for future reference.
You can list your installed Ollama model names from your terminal via `ollama list`, or find the available OpenAI model names online on the OpenAI models page.
Each AI service has its own dedicated URL parameter that can be configured globally in settings or per-note via frontmatter:
```yaml
---
# For OpenAI
openaiUrl: https://api.openai.com
# For OpenRouter
openrouterUrl: https://openrouter.ai
# For Ollama
ollamaUrl: http://localhost:11434
---
```
The default URLs are:
- OpenAI: `https://api.openai.com`
- OpenRouter: `https://openrouter.ai`
- Ollama: `http://localhost:11434`
Note: Previous versions used a single `url` parameter, which is now deprecated. Please update your templates and notes to use the service-specific URL parameters.
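For example, a note that previously pointed the deprecated `url` parameter at a local Ollama server would migrate like this (the server address is the Ollama default; adjust to your setup):

```yaml
# Before (deprecated):
---
url: http://localhost:11434
model: local@gemma2:27b
---

# After:
---
ollamaUrl: http://localhost:11434
model: local@gemma2:27b
---
```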
Run commands from Obsidian's command palette via `cmd + p` or `ctrl + p` and start typing `chatgpt`, or set hotkeys (a chat-command hotkey is highly recommended for effortless chats; I use `cmd + j`, which works fantastically because your index finger is already resting on that key).
- Chat: Parse the file and interact with ChatGPT. Assign a hotkey, e.g. `cmd + j`.
- New Chat with Highlighted Text: Start a chat using highlighted text and default frontmatter in the `Chat Folder`.
- New Chat From Template: Create chats from templates in the `Chat Template Folder`.
- Infer Title: Automatically generate a note title based on the note's content. Configurable to auto-run after 4+ messages.
- Add Comment Block: Insert comment blocks for parts of your note that should be ignored.
- Select Model: Choose from all available LLMs (OpenAI, OpenRouter.ai, Ollama) and set the current model for your note.
- Clear Chat: Remove all messages while retaining frontmatter.
- Stop Streaming (Desktop Only): Halt ongoing streams if necessary.
- Add Divider: Insert horizontal rulers to organize content visually.
Use the `ChatGPT MD: Chat` command from the Obsidian command palette (`cmd + p` or `ctrl + p`) to start a conversation from any note.
Yes, you should! Go to `Settings > Hotkeys`, search for `ChatGPT MD: Chat` and add your preferred keybinding (e.g., `cmd + j`).
You can use OpenAI's GPT-3 and GPT-4 models, various models through OpenRouter.ai (like Claude, Gemini, DeepSeek, Llama, and Perplexity), or any model you have installed via Ollama. DeepSeek-r1:7b works great for reasoning locally via Ollama.
Ensure your custom API adheres to OpenAI's API specification (Azure's hosted endpoints do, for example). Consult your provider for API key management details.
In the plugin settings, add your OpenAI API key and/or install Ollama and local LLMs of your choice.
The single `url` parameter is now deprecated. In v2.2.0 and higher, we've introduced service-specific URL parameters: `openaiUrl`, `openrouterUrl`, and `ollamaUrl`. This allows for more flexibility and clarity when configuring different services. Please update your templates and notes accordingly.
🎉 Enjoy exploring the power of ChatGPT MD in your Obsidian vault!
Pull requests, bug reports, and all other forms of contribution are welcome and highly encouraged!
Bram created ChatGPT MD in March 2023. He lives in NYC and is building Your Commonbase (a self-organizing scrapbook with zero-stress storing, searching, and sharing). His personal website and newsletter are located at bramadams.dev.
Deniz joined Bram in 2024 to continue development. He works at a gaming company in Germany and uses AI heavily in his work and private life. Say "hi" on Bluesky: Deniz
Happy writing with ChatGPT MD! 💻 🎉