How to run an LLM locally with Ollama
Today we will try running an LLM locally on a MacBook Pro with Apple Silicon. To do this we will use Ollama, since it is one of the easiest tools for the job. First, download the right installer for your OS from ollama.com. The installer also sets up the ollama CLI command, which you can use to run LLMs […]
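Once the CLI is installed, a typical first session looks something like the sketch below. The model name used here (llama3.2) is just an example; any model from the Ollama library can be substituted.

```shell
# Check that the CLI was installed correctly
ollama --version

# Download a model from the Ollama library (example model name)
ollama pull llama3.2

# Start an interactive chat session with the model
ollama run llama3.2

# List the models you have downloaded locally
ollama list
```

The `run` command drops you into an interactive prompt; type your question and press Enter, or exit with /bye.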