How to Choose a Local AI in DEVONthink

The DEVONtechnologies Blog

November 11, 2025 — Jim Neumann

[Screenshot: LM Studio’s server settings]

If you want to use AI in DEVONthink, you can choose between server-based models (the big commercial ones) and local models. Whether for privacy reasons or just the novelty of it, running AI locally on your Mac may be something you’re considering. Here are some thoughts on choosing a local model.

First, you’ll need an AI application for hosting and running the model. The two directly supported in DEVONthink are LM Studio and Ollama. If you go to either site, look fo…
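Both LM Studio and Ollama work the same way from a client’s perspective: they run a small HTTP server on your Mac that speaks an OpenAI-compatible API, and DEVONthink connects to it like any other chat client. As a minimal sketch of what such a request looks like, the snippet below builds the JSON body an OpenAI-style chat endpoint expects. The port numbers and the model name are assumptions based on each app’s commonly documented defaults, not anything DEVONthink-specific:

```python
import json

# Assumed default local endpoints (check each app's server settings):
#   LM Studio's built-in server typically listens on port 1234,
#   Ollama on port 11434. Both expose /v1/chat/completions.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> str:
    """Build the JSON body for an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,  # the name of a model you've downloaded, e.g. "llama3"
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return json.dumps(payload)

# Example: inspect the request body a client like DEVONthink would POST.
body = build_chat_request("llama3", "Summarize this document in one sentence.")
print(body)
```

Because both apps share this API shape, switching between them is mostly a matter of pointing DEVONthink at a different host and port in its AI settings.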
