In earlier articles, we installed Ollama and learned how to run models from the terminal. Now it’s time to take the next step: using Ollama with Python. This is where things get really powerful. By connecting Python with Ollama, you can build:

  • Local chatbots
  • Document summarizers
  • AI tools without internet
  • Offline coding assistants
  • Mini apps for personal use

And the best part?

No API keys, no billing, no rate limits. Everything runs locally.

Let’s begin.

Prerequisite

  • pip must be installed on your machine

Install the Official Ollama Python Package

Open your terminal and run:

Bash
pip install ollama

This installs the official Ollama Python client. It works on all major operating systems:

  • Windows
  • macOS
  • Linux

Verify That Ollama Is Running

Before using Python, make sure Ollama is active.

Bash
ollama serve

Or check with:

Bash
ollama list

If a list of models appears, you’re ready.

Your First Python Script Using Ollama

Create a new file named hello_ollama.py:

Python
from ollama import Client

client = Client()

response = client.chat(model='llama3.1', messages=[
    {"role": "user", "content": "Hello! How are you?"}
])

print(response['message']['content'])

Run it:

Bash
python hello_ollama.py

You will see a friendly reply just like ChatGPT, but offline.

How the Python Client Works

Ollama uses a simple message structure:

Python
messages = [
  {"role": "user", "content": "question here"},
  {"role": "assistant", "content": "previous answer"}
]

This allows conversation memory.
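To see how that memory accumulates, here is a minimal sketch in plain Python with no model call. Each turn appends a user message and the assistant’s reply, and the whole list is sent back to the model on the next request. The helper name add_turn is just for illustration:

```python
# A minimal sketch of conversation memory: each turn appends a
# user message and the assistant's reply to a running list.
history = []

def add_turn(history, user_text, assistant_text):
    """Append one user/assistant exchange to the running history."""
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_text})

add_turn(history, "What is Python?", "A programming language.")
add_turn(history, "Who created it?", "Guido van Rossum.")

# Four messages now: the follow-up question only makes sense
# because the first exchange is still in the list.
print(len(history))  # 4
```

Because the full list is resent on every call, the model can resolve references like “it” in the second question.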

Example: Build a Local Text Summarizer

Create a file named summarizer.py

Python
from ollama import Client

client = Client()

text = """
Artificial intelligence is transforming industries by enabling automation,
predictive analytics, and personalized solutions...
"""

prompt = f"Summarize this text in simple words:\n\n{text}"

response = client.chat(model='llama3.1', messages=[
    {"role": "user", "content": prompt}
])

print("\nSummary:\n", response["message"]["content"])

Run:

Bash
python summarizer.py

Example: Build a Simple Coding Helper

Python
from ollama import Client
client = Client()

question = "Write a Python function to reverse a string."

response = client.chat(model="phi3", messages=[
    {"role": "user", "content": question}
])

print(response["message"]["content"])

Phi-3 is smaller and faster than Llama, and often performs well on coding tasks.

Load Text From a File and Summarize It

Python
from ollama import Client
client = Client()

with open("notes.txt", "r") as f:
    content = f.read()

prompt = "Summarize the following text:\n" + content

response = client.chat(model="llama3.1", messages=[
    {"role": "user", "content": prompt}
])

print(response["message"]["content"])

Perfect for:

  • Meeting notes
  • College notes
  • Blog drafts
  • Research documents
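One caveat: a very large file can exceed the model’s context window. A simple workaround is to split the text into chunks and summarize each one separately. The chunk_text helper below is a hypothetical sketch, and the 4000-character budget is an assumption, not an Ollama limit:

```python
def chunk_text(text, max_chars=4000):
    """Split text into chunks of roughly max_chars characters,
    breaking on blank lines (paragraph boundaries) where possible."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        # Start a new chunk if adding this paragraph would overflow.
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

# Summarize each chunk with client.chat(), then summarize the summaries.
sample = "\n\n".join(f"Paragraph {i}." for i in range(10))
print(len(chunk_text(sample, max_chars=40)))  # 4
```

A paragraph longer than the budget still becomes its own oversized chunk; for most notes and drafts that is rare enough to ignore.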

Common Errors & Fixes

Error: “Connection refused”

Ollama isn’t running.

Bash
ollama serve

Error: “Model not found”

Bash
ollama pull llama3.1

Problem: Slow responses

Use a smaller model:

Bash
ollama pull phi3

Error: Python can’t find the package

Bash
pip install --upgrade pip
pip install ollama

Mini Project: Create Your First Local Chatbot

Create a new file named chatbot.py

Python
from ollama import Client
client = Client()

print("Local AI Chatbot (Type 'exit' to quit)\n")

history = []

while True:
    user_input = input("You: ")

    if user_input.lower() == "exit":
        break

    history.append({"role": "user", "content": user_input})

    response = client.chat(model="llama3.1", messages=history)

    answer = response["message"]["content"]
    print("Bot:", answer)

    history.append({"role": "assistant", "content": answer})

Run:

Bash
python chatbot.py

You’ve now built your own local ChatGPT.
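One refinement worth making: the history list grows forever, and every message is resent on each call, so long chats get slower. A simple fix is to keep only the most recent turns. The trim_history helper and the 10-turn cap below are illustrative assumptions, not Ollama requirements:

```python
def trim_history(history, max_turns=10):
    """Keep only the last max_turns user/assistant pairs
    (two messages per turn)."""
    return history[-max_turns * 2:]

# In the chatbot loop, you would trim before each request:
#     response = client.chat(model="llama3.1",
#                            messages=trim_history(history))
long_history = [{"role": "user", "content": f"msg {i}"} for i in range(30)]
print(len(trim_history(long_history)))  # 20
```

The trade-off: the bot forgets anything older than the window, so pick a cap that fits your model’s context size.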

FAQ

Can I use multiple models?

Yes, simply set:
model="mistral" or model="phi3"

Does Python work offline?

Yes, after you download the model.

Is it faster than cloud APIs?

For small & medium models — yes.

Conclusion

Using Ollama with Python opens the door to unlimited offline AI experiments. You can build chatbots, summarizers, coding helpers, and personal tools all without depending on cloud services.
