Experiment with Dutch local LLMs
MacWhisper is a great local transcription tool that converts audio to text. It also has an option to add LLMs through Ollama so that you can summarize your transcripts, which would make it a complete suite for recording and summarizing meetings running only local models. For English this seems to work rather well, but local LLMs are known to be not all too reliable for Dutch. So we ran an experiment: summarizing transcripts of the HKU en AI podcast.
Setup
Setting up LLMs in MacWhisper is quite straightforward: install Ollama, install your model, open MacWhisper and go to Global (the settings menu), then AI, Services. If Ollama is running, you should be able to select all installed models in MacWhisper by clicking Ollama under Add another service. Once you've made a transcript, you can interact with Ollama under the AI tab (three stars) at the top right.
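For reference, the command-line side of that setup can be sketched as follows. The model name is just an example; any model from the Ollama library should work, and on macOS you can also install the Ollama app from ollama.com instead of using Homebrew:

```shell
# Install Ollama (macOS, via Homebrew -- the desktop app works too)
brew install ollama

# Start the server if it isn't already running
brew services start ollama

# Pull a model to test with (example: Llama 3.2)
ollama pull llama3.2

# Confirm the model is installed and the server is reachable
ollama list
```

Once `ollama list` shows your model, MacWhisper should pick it up under Add another service.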
Testing Dutch in MacWhisper (May '25)
Interacting with any model through MacWhisper in Dutch gives strange results. Replies are often in English, or seem to ignore the prompt completely.
- Gemma 3 and Mistral give fairly accurate summaries, but in English only.
- DeepSeek goes completely off the rails.
- Llama 3.2 gives a very short summary that misses key points.
Changing prompts or moving to chat mode does not seem to improve anything.
Testing Dutch in LLM directly (May '25)
I thought MacWhisper might be interfering in some way (as I could not get the LLM to react to anything other than 'summarize'), so I moved to Open WebUI. This way I could still interact with the LLM and add a text file as input; the text file was the transcript exported from MacWhisper.
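The same direct interaction can also be scripted against Ollama's HTTP API (it listens on localhost:11434 by default), which makes it easier to pin the prompt language than a GUI does. A minimal sketch, where the model name and the Dutch prompt wording are illustrative, not the exact ones used in the experiment:

```python
import json

# Ollama's generate endpoint on its default local port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(transcript: str, model: str = "gemma3") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint,
    asking explicitly for a Dutch-language summary."""
    prompt = (
        # "Summarize the following transcript in Dutch. "
        # "Answer exclusively in Dutch."
        "Vat het volgende transcript samen in het Nederlands. "
        "Antwoord uitsluitend in het Nederlands.\n\n" + transcript
    )
    # stream=False asks for a single complete response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

body = build_request("…transcript uit MacWhisper…")
payload = json.dumps(body, ensure_ascii=False)
# Send `payload` as a POST body with e.g. urllib.request or the `requests`
# library while the Ollama server is running.
```

Putting the language instruction directly in the prompt, twice, is the kind of thing a GUI's fixed 'summarize' action does not let you control.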
While this did improve the interaction with the LLMs, as I could ta