- Install Ollama on Windows
- Common Windows Errors & Fixes
- Install Ollama on macOS (M1, M2, M3, Intel)
- Common macOS Errors & Fixes
- Install Ollama on Linux (Ubuntu, Debian, Arch, Mint)
- Common Linux Errors & Fixes
- Download Your First Model
- FAQ when installing Ollama on macOS, Windows, Linux
- Conclusion

After understanding what Ollama is and how it works, the next question is obvious:
How do you install it?
Luckily, installing Ollama is one of the easiest things you’ll ever do. Whether you’re using Windows, a MacBook, or a Linux system, the setup takes just a few minutes.
In this guide, I’ll walk you through each platform with simple steps and fixes for common errors.
Let’s get started.
Install Ollama on Windows
Ollama released official Windows support recently, and installation is now super simple.
Step 1: Download the Windows installer
Go to the official page:
👉 https://ollama.com/download
Click the Windows (.exe) installer.
Step 2: Run the installer
- Double-click the downloaded .exe file
- Click Next → Next → Install
- Finish the setup
Step 3: Open PowerShell or Command Prompt
Windows Key → type "PowerShell" → Enter
Step 4: Verify installation
ollama --version
If you see a version number, you’re ready to use Ollama.
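The same checks work in either PowerShell or Command Prompt; `ollama list` shows any models you have already downloaded:

```shell
# Confirm the CLI is reachable, then list installed models
ollama --version
ollama list
```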
Common Windows Errors & Fixes
1. “Ollama is not recognized”
Cause: The installer updated PATH, but your open terminal session hasn’t picked it up.
Fix: Close and reopen PowerShell; if the command is still not found, restart your system.
2. Model download stuck at 0%
Fix:
- Check internet connection
- Try running PowerShell as Administrator
- Disable VPN
3. Permission errors
Run PowerShell as Administrator and try again.
Install Ollama on macOS (M1, M2, M3, Intel)
Ollama works beautifully on Macs — especially Apple Silicon chips.
Step 1: Install using Homebrew (recommended)
If you have Homebrew installed:
brew install ollama
Step 2: Start Ollama
ollama serve
Step 3: Verify installation
ollama --version
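With `ollama serve` running, you can sanity-check the server from a second terminal window. By default it listens on localhost port 11434:

```shell
# The root endpoint answers with a short status message
curl http://localhost:11434/
```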
Alternate installation method (without Homebrew)
Download the macOS installer from:
https://ollama.com/download
Double-click the .dmg file and drag Ollama into Applications.
Common macOS Errors & Fixes
brew: command not found
Install Homebrew first:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
Ollama not responding
sudo pkill -f ollama
ollama serve
Slow model download
Slow downloads are usually a network issue rather than an Ollama problem.
Try a different Wi-Fi network or a mobile hotspot.
Install Ollama on Linux (Ubuntu, Debian, Arch, Mint)
Ollama provides a simple one-line script for Linux.
Step 1: Install curl (if missing)
sudo apt install curl -y
(On Arch-based systems, use sudo pacman -S curl instead.)
Step 2: Run the official install script
curl -fsSL https://ollama.com/install.sh | sh
Step 3: Verify installation
ollama --version
Common Linux Errors & Fixes
1. Permission denied
sudo ollama serve
2. Firewall blocking model download
sudo ufw allow out 443
3. GPU issues
Ollama falls back to CPU mode unless you have a supported dedicated GPU with drivers installed (NVIDIA with CUDA, or AMD with ROCm).
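One more Linux tip: on most distros the official install script also registers a systemd service so Ollama starts in the background automatically. Assuming the service is named `ollama` (as the official script sets up), you can manage and troubleshoot it with systemctl:

```shell
# Check, restart, and read recent logs for the background service
systemctl status ollama
sudo systemctl restart ollama
journalctl -u ollama --no-pager | tail -n 20
```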
Download Your First Model
After installation, download a model to test everything.
ollama pull llama3.1
Then run:
ollama run llama3.1
You’ll now see a local chat interface similar to ChatGPT, but running entirely on your computer.
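Beyond the interactive chat, the same model can answer one-off prompts from the command line, or be called through Ollama’s local REST API (served on port 11434 by default):

```shell
# One-off prompt: prints the answer and exits
ollama run llama3.1 "Summarize what Ollama does in one sentence."

# REST API: "stream": false returns a single JSON response
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Hello!",
  "stream": false
}'
```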
FAQ when installing Ollama on macOS, Windows, Linux
1. Is Ollama free?
Yes, completely free.
2. Does Ollama work offline?
Yes. After downloading a model, you don’t need internet.
3. Can it run on low-end laptops?
Yes, but choose smaller models, such as 3B or 7B parameter versions.
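For example, you can pull a smaller variant by tag. The tag below is just an illustration; browse https://ollama.com/library for the models and tags currently available:

```shell
# Smaller models need less RAM and run faster on modest hardware
# (tag shown is an example; check the library for current ones)
ollama pull llama3.2:3b
ollama run llama3.2:3b
```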
Conclusion
Installing Ollama is quick and beginner-friendly no matter which operating system you use.
In just a few minutes, you can set up your own offline AI assistant that runs privately on your own machine instead of relying on cloud-based tools.