
Introduction to Running Local LLMs - Ollama Basics (Step 2)

I re-recorded the Meetup segment on Ollama Basics. If you plan to run LLMs locally, Ollama is a key technology. Make sure you are able to download it, install it, and run a model. Don’t hesitate to seek help if you run into problems.

Some basic ollama commands to master:

  • ollama list

  • ollama pull <model name>

  • ollama run <model name>
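The commands above can be tried in sequence from a terminal. A minimal sketch, assuming Ollama is installed and on your PATH (the model name llama3.2 is just an example; substitute any model from the Ollama library):

```shell
# Run the basic commands only if the ollama binary is available
if command -v ollama >/dev/null 2>&1; then
  ollama list          # show models already downloaded to this machine
  ollama pull llama3.2 # download a model without starting a chat session
  ollama run llama3.2  # start an interactive chat; type /bye to exit
else
  echo "ollama is not installed"
fi
```

Note that `ollama run` will pull the model automatically if it is not already downloaded, so `ollama pull` is optional when you just want to chat.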

Also, make sure you begin to learn the terminal shell environment.
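Since Ollama is driven from the command line, a few everyday shell commands are worth practicing first. A short sketch of the basics (works in bash or zsh on macOS and Linux):

```shell
pwd            # print the current working directory
ls             # list files in the current directory
mkdir -p demo  # create a directory (-p: no error if it already exists)
cd demo        # change into that directory
cd ..          # go back up one level
```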
