Tech Rants

Running AI Locally: Phi-3 with Alpaca & Ollama on Linux



In this episode, I walk you through how to run Microsoft’s Phi-3, a small but capable open language model, entirely offline on Linux. Using Alpaca (a Flatpak GUI) together with Ollama, I show how easy it is to load Phi-3 locally — no API keys, no subscriptions, just fast and private AI.


Whether you're a developer, AI enthusiast, or curious about open-source tools, this setup is perfect for getting started with local LLMs.


🔧 Tools featured:


Ollama (https://ollama.com)


Alpaca GUI (Flatpak)


Phi-3 Mini model (by Microsoft)
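
The setup described above boils down to a few terminal commands. This is a minimal sketch based on the standard Ollama install script and Flathub naming — the model tag (`phi3`) and the Alpaca app ID (`com.jeffser.Alpaca`) are my assumptions; see the video for the exact steps:

```shell
# Install Ollama on Linux via its official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download the Phi-3 Mini model (tag assumed; Ollama's library also lists size-specific tags)
ollama pull phi3

# Quick sanity check from the terminal before switching to the GUI
ollama run phi3 "Say hello in one sentence."

# Install the Alpaca GUI from Flathub (app ID assumed)
flatpak install flathub com.jeffser.Alpaca
```

Once Ollama is serving the model locally, Alpaca connects to it and gives you a chat interface with no network round-trip.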


📺 Full YouTube tutorial: https://youtu.be/JMZ5llbvZQ8

💼 Need help? I offer setup & AI tutoring: https://ojamboservices.com/contact


#LocalAI #Phi3 #Ollama #Linux #Flatpak #OpenSource #ArtificialIntelligence #AIPodcast #TechTools #LLM


Tech Rants, by Edward Ojambo