
How I do local AI

I explain how I run my AI assistant and share some tips on using desktop apps.

In this episode, Unkle Bonehead discusses the concept of “Digital Liberation” and how it can be achieved through self-hosted AI assistants. They share their personal experience of switching from cloud-based AI services to locally hosted alternatives. The conversation covers tips for experimenting with locally hosted AI, including being mindful of app removals and exploring how features differ across apps. Unkle Bonehead also describes their current setup, which combines Ollama, Open Web UI, Fabric, and the SystemSculpt AI plugin for Obsidian.
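
If you want a feel for how the pieces of a stack like this fit together, here is a minimal Python sketch (an illustration only, not the exact workflow from the episode). It assumes Ollama is serving on its default port (11434) and that a model has already been pulled; the model name and prompt are placeholders.

```python
import json
import urllib.request

# Assumes Ollama is running locally on its default port.
# "llama3" is a placeholder for whatever model you have pulled
# with `ollama pull <model>`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one complete response instead of a token stream
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize why self-hosting an AI assistant matters."))
```

Front ends like Open Web UI typically point at this same local endpoint, which is what keeps the whole assistant on your own machine instead of someone else's cloud.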

This post is licensed under CC BY 4.0 by the author.