Introduction to Running Local LLMs - Install and Set Up MSTY (Step 3)

Alfred Essa

Jun 20, 2025

Transcript

After you have installed Ollama, the next step is to install and set up MSTY. The open-source application can be downloaded from:

https://msty.app
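The video itself walks through the installer, but as a rough sketch of the setup flow: MSTY can use a locally running Ollama server as its model provider, so it is useful to confirm that Ollama is reachable before configuring MSTY. The snippet below queries Ollama's default local endpoint (assumed here to be `http://localhost:11434` with the `/api/tags` route) and lists any models already pulled. The endpoint, port, and model name are assumptions for illustration and are not shown in the video.

```python
# Minimal sketch (not from the video): check that a local Ollama server is
# reachable before pointing MSTY at it as a local model provider.
# Assumes Ollama's default endpoint http://localhost:11434 and /api/tags route.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/tags"  # default Ollama port; adjust if yours differs

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        models = json.load(resp).get("models", [])
    if models:
        print("Ollama is running. Models available to MSTY:")
        for m in models:
            print(" -", m.get("name"))
    else:
        print("Ollama is running, but no models are pulled yet "
              "(for example, run `ollama pull llama3` first).")
except OSError as exc:
    print("Could not reach Ollama at localhost:11434 -- is it installed and running?", exc)
```

If the script lists your models, MSTY should be able to detect the same local Ollama instance during its setup.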