Running models locally

All the big models run on external servers and are usually only available through a (paid) account. There are, however, alternatives that you can run locally on your own machine. Installing these usually involves a complex Python setup, but there is a trend towards 'one-click installers' that get you up and running relatively painlessly. Below you can find a list of some of these. Note: most of these models need a recent, powerful Nvidia graphics card or Apple M1/M2 silicon to run.

Image generation on your own computer

Stable Diffusion WebUI by AUTOMATIC1111: https://github.com/AUTOMATIC1111/stable-diffusion-webui (Win, Linux, Mac)

Easy Diffusion: https://github.com/easydiffusion/easydiffusion (Win, Linux, Mac)

DiffusionBee: https://diffusionbee.com/ (Mac)
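
To give a sense of what these installers set up behind the scenes, here is a minimal sketch of local image generation using the Hugging Face diffusers library in Python. The model checkpoint and prompt are only examples and are not tied to any of the tools above; you would need `diffusers`, `transformers` and `torch` installed, plus enough GPU (or Apple silicon) memory.

```python
# Minimal sketch: local image generation with Stable Diffusion via the
# Hugging Face diffusers library. Assumes `pip install diffusers transformers torch`.
import torch
from diffusers import StableDiffusionPipeline

# Example checkpoint; any compatible Stable Diffusion checkpoint works.
model_id = "runwayml/stable-diffusion-v1-5"

# Pick whatever hardware is available: Nvidia GPU, Apple silicon, or CPU.
if torch.cuda.is_available():
    device = "cuda"
elif torch.backends.mps.is_available():
    device = "mps"
else:
    device = "cpu"

pipe = StableDiffusionPipeline.from_pretrained(model_id).to(device)

# Generate one image from a text prompt and save it to disk.
image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```

The one-click tools listed above wrap this kind of pipeline in a graphical interface and handle model downloads and hardware selection for you.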

Text generation ('ChatGPT')

Text generation web UI by oobabooga (runs Llama 2 and other open models): https://github.com/oobabooga/text-generation-webui (Win, Linux, Mac)
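
Under the hood, a tool like this loads an open language model and generates text from a prompt. Below is a minimal sketch using the Hugging Face transformers library; the model name is only an example of a small, openly available model (Llama 2 itself requires accepting Meta's license on Hugging Face first).

```python
# Minimal sketch: local text generation with the Hugging Face transformers
# library. Assumes `pip install transformers torch`.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example model; swap in any local LLM
)

# Generate a continuation of the prompt, capped at roughly 100 new tokens.
result = generator(
    "Explain in one sentence why people run language models locally:",
    max_new_tokens=100,
    do_sample=True,
)
print(result[0]["generated_text"])
```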