Darshit's blogs

Run AI Models on Your Laptop Using Ollama


You’ve probably seen or used tools like ChatGPT. But most of these tools need an internet connection and run on cloud servers. What if you could run something similar right on your laptop, with no internet and no cloud?

That’s where Ollama helps. It makes it super easy to run powerful AI models directly on your system (Mac, Windows, or Linux) without any complicated setup.


What’s Ollama?

In simple words, Ollama is a small tool you install on your laptop. Once it’s installed, you can download and run different AI models. These models work like ChatGPT: they can answer your questions, help with writing, explain code, and more.

Once a model is downloaded, it works even without internet. So it’s fast, private, and under your control.


Installing Ollama (very easy)

Here’s how to get started:

  1. Go to this site: ollama.com/download

  2. Download the installer for your OS (Mac, Windows, or Linux)

  3. Install it like any normal software

  4. After installing, open your terminal and type:

    ollama --help
    

If you see a list of commands, everything is set up.
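If you’d rather check from a script instead of eyeballing the terminal, here’s a tiny Python sketch (standard library only) that just confirms the ollama binary is on your PATH:

```python
import shutil

def ollama_installed() -> bool:
    # shutil.which does the same PATH lookup as `which ollama` in the terminal
    return shutil.which("ollama") is not None

print("installed" if ollama_installed() else "not found - see ollama.com/download")
```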


Try a Model (like ChatGPT)

Now let’s try a model. A good one to start with is Llama 3.

Just open terminal and type:

ollama run llama3

That’s it. The first time, it’ll download the model (around 4–8 GB depending on the model size). After that, you can type anything and get replies right away.

Example:

>>> Who is India’s PM?
Narendra Modi.

Want to quit? Just type /bye (or press Ctrl + D).
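You don’t have to use the chat prompt interactively. The `ollama run` command also accepts the prompt as an argument, so a script can ask one-shot questions. Here’s a small Python sketch that shells out to it; the `ask` helper is just a name I’ve chosen for illustration:

```python
import shutil
import subprocess

def ask(model: str, prompt: str) -> str:
    """One-shot query: runs `ollama run MODEL PROMPT` and returns the printed reply."""
    if shutil.which("ollama") is None:
        return "ollama not installed"
    result = subprocess.run(
        ["ollama", "run", model, prompt],
        capture_output=True, text=True,
    )
    return result.stdout.strip()

print(ask("llama3", "Who is India's PM?"))
```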


Models You Can Explore

There are many models available in Ollama. Here are a few good ones based on what you want to do:

  • llama3 – Very good for normal conversation and general Q&A
  • mistral – Fast and works well for day-to-day questions
  • gemma – Google’s open model, works smoothly
  • deepseek-coder – For coding tasks, explaining code, writing scripts
  • deepseek-coder:instruct – Same as above, but better at following instructions
  • qwen2.5-coder – Great for multilingual and technical coding help
  • codellama – Another option for code-related queries
  • phi3 – Lightweight, works even on low-end systems

If you’re curious, you can see all models here: 👉 https://ollama.com/search


A Few Things to Keep in Mind

  • Try to have at least 8 GB of RAM (16 GB is better)
  • Models take disk space — from 2 GB to 10 GB depending on what you use
  • Once downloaded, a model runs without internet
  • If you want, you can also connect it to your own apps through its local API
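About that last point: when Ollama is running, it serves a local HTTP API (on port 11434 by default), so your own apps can talk to it. Here’s a minimal Python sketch, standard library only, posting to the /api/generate endpoint. The field names follow Ollama’s API, but treat the details as a sketch rather than a reference:

```python
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def generate(model: str, prompt: str) -> str:
    """Send one prompt to the local Ollama server and return the reply text."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # get one complete JSON reply instead of a stream
    }).encode()
    req = request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"},
    )
    try:
        with request.urlopen(req, timeout=60) as resp:
            return json.loads(resp.read())["response"]
    except (error.URLError, OSError):
        return "Ollama server not reachable - is the app running?"

print(generate("llama3", "Explain recursion in one line."))
```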