After understanding what Ollama is and how it works, the next question is obvious:

How do you install it?

Luckily, installing Ollama is one of the easiest things you’ll ever do. Whether you’re using Windows, a MacBook, or a Linux system, the setup takes just a few minutes.

In this guide, I’ll walk you through each platform with simple steps and fixes for common errors.

Let’s get started.

Install Ollama on Windows

Ollama released official Windows support recently, and installation is now super simple.

Step 1: Download the Windows installer

Go to the official page:
👉 https://ollama.com/download

Click the Windows (.exe) installer.

Step 2: Run the installer

  • Double-click the downloaded .exe file
  • Click Next → Next → Install
  • Finish the setup

Step 3: Open PowerShell or Command Prompt

Press the Windows key, type "PowerShell", and hit Enter.

Step 4: Verify installation

Bash
ollama --version

If you see a version number, you’re ready to use Ollama.
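If you prefer to check from a script (for example, as part of an automated setup), Python's standard library can tell you whether the ollama binary landed on your PATH. This is just an optional sanity-check sketch, not part of the official installer:

```python
import shutil

def find_ollama(binary="ollama"):
    """Return the full path to `binary` if it is on PATH, else None."""
    return shutil.which(binary)

path = find_ollama()
if path:
    print(f"Ollama found at: {path}")
else:
    print("Ollama is not on PATH - restart your terminal or reinstall.")
```

If this prints the "not on PATH" message even after installing, a terminal restart usually fixes it (see the error fixes below).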

Common Windows Errors & Fixes

1. “Ollama is not recognized”

Cause: PATH not updated yet
Fix: Close and reopen PowerShell, or restart your system so the updated PATH is picked up.

2. Model download stuck at 0%

Fix:

  • Check internet connection
  • Try running PowerShell as Administrator
  • Disable VPN

3. Permission errors

Run PowerShell as Administrator and try again.

Install Ollama on macOS (M1, M2, M3, Intel)

Ollama works beautifully on Macs — especially Apple Silicon chips.

Step 1: Install via Homebrew

If you have Homebrew installed:

Bash
brew install ollama

Step 2: Start Ollama

Bash
ollama serve
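Under the hood, ollama serve starts a local HTTP server, by default on port 11434. If you want a script to check whether the server is up, a plain socket probe is enough; here's a small illustrative sketch:

```python
import socket

def ollama_running(host="127.0.0.1", port=11434, timeout=1.0):
    """Return True if something is accepting connections on the Ollama port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("Ollama server up:", ollama_running())
```

This only confirms that the port is open; it doesn't verify that the process listening is actually Ollama.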

Step 3: Verify installation

Bash
ollama --version

Alternate installation method (without Homebrew)

Download the macOS installer from:
https://ollama.com/download
Double-click the .dmg file and drag Ollama into Applications.

Common macOS Errors & Fixes

brew: command not found

Install Homebrew first:

Bash
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

Ollama not responding

Bash
sudo pkill -f ollama
ollama serve

Slow model download

Slow downloads are usually a network issue rather than an Ollama problem.
Try switching to a different WiFi network or a mobile hotspot.

Install Ollama on Linux (Ubuntu, Debian, Arch, Mint)

Ollama provides a simple one-line install script for Linux.

Step 1: Install curl (if missing)

Bash
sudo apt install curl -y

(On Arch-based systems, use: sudo pacman -S curl)

Step 2: Run the official install script

Bash
curl -fsSL https://ollama.com/install.sh | sh

Step 3: Verify installation

Bash
ollama --version

Common Linux Errors & Fixes

1. Permission denied

Bash
sudo ollama serve

2. Firewall blocking model download

Bash
sudo ufw allow out 443

3. GPU issues

Ollama falls back to CPU mode unless it detects a supported GPU; for NVIDIA cards you'll need the proprietary driver and CUDA libraries installed.

Download Your First Model

After installation, download a model to test everything.

Bash
ollama pull llama3.1

Then run it:

Bash
ollama run llama3.1

You’ll now see a local chat interface similar to ChatGPT, but running entirely on your computer.
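Besides the interactive chat, Ollama also exposes a REST API on localhost (port 11434 by default), which is handy for scripting. Here's a minimal Python sketch; the payload format matches Ollama's /api/generate endpoint, but the actual call of course only works once the server is running and the model has been pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(model, prompt):
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model, prompt):
    """Send a one-shot prompt to a locally running Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and llama3.1 pulled):
# print(ask("llama3.1", "Say hello in one sentence."))
```

Setting "stream" to False returns the full response in one JSON object instead of a stream of partial tokens, which keeps the client code simple.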

FAQ: Installing Ollama on macOS, Windows, and Linux

1. Is Ollama free?

Yes, completely free.

2. Does Ollama work offline?

Yes. After downloading a model, you don’t need internet.

3. Can it run on low-end laptops?

Yes, but choose smaller models, such as 3B or 7B variants.
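As a rough rule of thumb, Ollama's README suggests about 8 GB of RAM for 7B models, 16 GB for 13B, and 32 GB for 33B. You can turn that guidance into a quick picker; this is just a toy sketch of the heuristic:

```python
def suggest_model_size(ram_gb):
    """Rough heuristic mapping available RAM (GB) to a comfortable model size.

    Thresholds follow Ollama's README guidance:
    8 GB -> 7B, 16 GB -> 13B, 32 GB -> 33B.
    """
    if ram_gb >= 32:
        return "33B"
    if ram_gb >= 16:
        return "13B"
    if ram_gb >= 8:
        return "7B"
    return "3B or smaller"

print(suggest_model_size(8))  # prints 7B
```

Actual requirements vary with quantization level and context size, so treat these numbers as a starting point, not a hard rule.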

Conclusion

Installing Ollama is quick and beginner-friendly no matter which operating system you use.
In just a few minutes, you can set up your own offline AI assistant that runs entirely on your machine and keeps your data private, unlike cloud-based tools.
