Running models locally
There are lots of generative AI models you can run locally instead of using online services. Here are a couple of options!
Running generative models locally
All the big models run on external servers and are usually only available through a (paid) account.
Running Large Language Models locally
Ollama is currently a popular option for running LLMs locally. With the newer versions you can d...
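As a rough sketch of what talking to a local model looks like: once Ollama is running (it listens on http://localhost:11434 by default) and you have pulled a model, you can send prompts to its local REST API from a few lines of Python. The model name below (llama3.2) is just an example; use whatever model you have pulled.

```python
# Minimal sketch: prompt a locally running Ollama instance via its REST API.
# Assumes Ollama is installed and running, and that a model such as "llama3.2"
# has been pulled (e.g. with `ollama pull llama3.2`) -- swap in any model you have.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "llama3.2") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
    }).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("Explain in one sentence why someone might run an LLM locally."))
```

Nothing in this exchange leaves your machine, which is exactly the point of running the model locally.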
Experiment with Dutch local LLMs
MacWhisper (Mac only) is a great local transcription tool that converts audio to text. It also ha...
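MacWhisper is a graphical app, but the same kind of local speech-to-text can also be scripted on any platform with the open-source Whisper model it builds on. A minimal sketch, assuming the openai-whisper Python package and ffmpeg are installed, and using a hypothetical "interview.mp3" as the input file:

```python
# Minimal sketch of local audio transcription with the open-source Whisper model.
# Assumes `pip install openai-whisper` and ffmpeg are available; "interview.mp3"
# is a stand-in for your own audio file.
import whisper

model = whisper.load_model("base")          # small, fast model; larger models are more accurate
result = model.transcribe("interview.mp3")  # the audio never leaves your machine
print(result["text"])
```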