Let's play locally with some of the smallest LLMs available. We'll go through Ollama server installation, model setup, and code integration.
Ollama Installation
To install the Ollama server, your entry point to open-source LLMs, run:
curl -fsSL https://ollama.com/install.sh | sh
If you're a poor macOS or Windows user, download the installer from https://ollama.com/download :-)
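Once it's installed, the server listens on port 11434 by default. A quick sanity check is to curl it; it should reply with a plain "Ollama is running" message:
curl http://localhost:11434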
Model Setup
Now let's download and run a model you're interested in playing with.
I will try two small models: Google's Gemma 3 and the reasoning-focused DeepSeek R1.
Gemma 3 (1B parameters; 815 MB):
ollama run gemma3:1b
When it finishes downloading the model, it will ask for your prompt. To test its reasoning, you may try something like "what would happen if people stopped blogging?".
DeepSeek R1 (1.5B parameters; 1.1 GB):
ollama run deepseek-r1:1.5b
Try the same prompt, "what would happen if people stopped blogging?", here and compare the results.
Subjectively, the two models give noticeably different responses, and I'd rather keep both around for my research.
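To see which models you've downloaded so far, run:
ollama list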
By the way, you don't have to host the models locally: you can install Ollama on a VPS with a better GPU and more RAM if you need to.
Integration
If you want to use local models in your TS/JS app, consider using the Vercel AI SDK with the ollama-ai-provider package, like so:
// index.ts
import { generateText } from "ai";
import { ollama } from "ollama-ai-provider";

async function main() {
  const prompt = "what would happen if people stopped blogging?";

  // Ask Gemma 3 first.
  const { text: responseGemma } = await generateText({
    model: ollama("gemma3:1b"),
    prompt,
  });
  console.log(responseGemma);

  console.log("\n\n----------\n\n");

  // Then ask DeepSeek R1 the same prompt and compare.
  const { text: responseDeepSeek } = await generateText({
    model: ollama("deepseek-r1:1.5b"),
    prompt,
  });
  console.log(responseDeepSeek);
}

main().catch(console.error);
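To try it, install the dependencies and run the script. I'm assuming Node 18+ here and using tsx as a convenient runner; any TypeScript setup will do:
npm install ai ollama-ai-provider
npx tsx index.ts
And if you host Ollama on a VPS as mentioned above, the provider can be pointed at a remote server via createOllama. This is just a sketch; the host below is a placeholder for your own machine:
// remote.ts
import { createOllama } from "ollama-ai-provider";

// Point the provider at a remote Ollama server instead of localhost
// (the default baseURL is http://localhost:11434/api).
const remoteOllama = createOllama({
  baseURL: "http://my-vps.example.com:11434/api", // placeholder host
});

// Then use it like the default export: remoteOllama("gemma3:1b")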
The Vercel AI SDK also supports text streaming and ships built-in frontend components, which can come in handy in a real app. If you want to go deeper and develop your own tool for the Vercel AI SDK, I've got a good example of it in my post on the @suiware blog.
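As a taste of streaming, here's a minimal sketch using streamText from the same "ai" package; it prints tokens as they arrive (assuming the async-iterable textStream exposed by recent SDK versions):
// stream.ts
import { streamText } from "ai";
import { ollama } from "ollama-ai-provider";

async function main() {
  // streamText starts generating right away and exposes the output
  // as an async-iterable stream of text chunks.
  const result = await streamText({
    model: ollama("gemma3:1b"),
    prompt: "what would happen if people stopped blogging?",
  });

  // Print chunks as they arrive instead of waiting for the full answer.
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}

main().catch(console.error);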
That's it for now: a quick experiment that, I think, opens the door to many opportunities you can now explore yourself.