
Your own AI: Running LLMs locally via Ollama - A simple guide for people who don't like to share (their data)


What: Session
When: 12:00 PM, Wednesday 25 Feb 2026 (25 minutes)
Theme:
Speaker:
If you’d rather not share your data with mysterious servers in faraway places, this session might be for you. We’ll start by clarifying what “open source” means and highlighting a few open-source LLMs. You’ll learn the key differences between locally run and cloud-based models, and why those differences matter for privacy and security. Then we’ll introduce Ollama, a simple tool that makes it easy to download small language models and run them locally. We’ll wrap up with a step-by-step demo on installing Ollama on Windows and running your first small, local model.
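For a preview of the hands-on portion, the demo steps above might look something like the following on a Windows machine. This is a sketch, not the session's exact script: it assumes winget is available (the graphical installer from ollama.com works just as well), and `llama3.2` stands in for whichever small model is used in the session.

```shell
# Install Ollama via winget (package ID assumed; the graphical
# installer from ollama.com is an alternative)
winget install Ollama.Ollama

# Download a small open model to your machine
# (llama3.2 is just one example of a compact model)
ollama pull llama3.2

# Confirm the model was downloaded and is available locally
ollama list

# Chat with the model - everything runs on your own hardware,
# no data leaves your computer
ollama run llama3.2
```

Once `ollama run` starts, you get an interactive prompt in the terminal; type a question and the model answers locally, with no network round-trip to a cloud provider.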
