It's here! How To Run Local AI With Obsidian 🤖 Copilot Plugin + LM Studio 📝
In this video, I walk through how to set up a local AI model (LLM & Text Embedding) AND how to connect that locally running model to Obsidian Copilot & Smart Connections Plugins.
These local models help you surface insights through related notes (suggested links) and let you chat with your vault. Copilot specifically has a "Vault Q&A" mode that lets you ask questions of your entire vault, which is surprisingly powerful.
I use LM Studio in this video and I'm writing a guide that will explain how you can also use Ollama (it's just a bit more involved).
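For anyone who wants a preview of what's happening under the hood: LM Studio exposes an OpenAI-compatible local server, and plugins like Copilot talk to it with standard chat-completion requests. Here's a minimal sketch of such a request payload, assuming LM Studio's default port (1234); the "local-model" name is a placeholder, since LM Studio serves whichever model you have loaded.

```python
import json

# LM Studio's local server speaks the OpenAI-compatible API.
# Assumption: default base URL from the "Local Server" tab.
BASE_URL = "http://localhost:1234/v1"

# A minimal chat-completion payload, as a plugin like Copilot would send.
# "local-model" is a placeholder; LM Studio uses the currently loaded model.
payload = {
    "model": "local-model",
    "messages": [
        {"role": "user", "content": "Summarize my note on local AI."}
    ],
    "temperature": 0.7,
}

print(json.dumps(payload, indent=2))

# To actually send it (requires the server running in LM Studio):
#   curl http://localhost:1234/v1/chat/completions \
#     -H "Content-Type: application/json" \
#     -d '{"model": "local-model", "messages": [...]}'
```

In the plugins, this just means pointing the API base URL at `http://localhost:1234/v1` instead of OpenAI's servers.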
Please let me know if you have any questions/feedback by commenting on the video itself 😊
Wanderloots