Luca Liu @luca_datateam

Chat With Local LLM Model In Obsidian Using Copilot Plugin


Introduction

In Why You Should Try a Local LLM Model—and How to Get Started, I walked through setting up a local LLM with LM Studio.
In this article, I will show you how to chat with that local model directly in Obsidian.
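Before wiring up Obsidian, it's worth confirming that the LM Studio server is actually running. Here is a minimal sketch, assuming LM Studio's default local server address (http://localhost:1234; adjust the port if you changed it in LM Studio's server settings), that lists the models the server exposes through its OpenAI-compatible API:

```python
# Quick check that the LM Studio local server is reachable before
# configuring Obsidian. Assumes the default address http://localhost:1234.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:1234/v1/models") as resp:
    models = json.load(resp)

# Print the identifiers of the models the server currently exposes.
for model in models["data"]:
    print(model["id"])
```

If this prints at least one model identifier, the server side is ready and you can move on to the Obsidian setup.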

Method

Obsidian’s Copilot plugin lets you connect to a custom model, so you can use AI-generated insights directly within your markdown workspace. Here’s a step-by-step guide to setting it up; the sketch after Step 1 shows the kind of request the plugin sends under the hood.

Step 1: Install the Copilot Plugin in Obsidian

Open Obsidian and go to Settings > Community Plugins.
Enable Community Plugins if you haven’t already.
Search for “Copilot” in the community plugin browser, then install and enable it.
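Once the plugin is installed, connecting it to your local model boils down to pointing it at LM Studio's OpenAI-compatible chat endpoint. A minimal sketch of the kind of request involved, again assuming the default address; the model name below is a placeholder for whichever model you loaded in LM Studio:

```python
# The Copilot connection is essentially an OpenAI-style chat completion
# request against the local server. Sketch only: "local-model" is a
# placeholder name, and http://localhost:1234 is LM Studio's default address.
import json
import urllib.request

payload = {
    "model": "local-model",  # placeholder; LM Studio serves the loaded model
    "messages": [
        {"role": "user", "content": "Summarize this note in one sentence."}
    ],
}
req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["choices"][0]["message"]["content"])
```

If this returns a completion, the same base URL and model name are what you will enter when adding the custom model in Copilot's settings in the later steps.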