Running models locally
All the big models run on external servers and are usually only available through a (paid) account. There are some alternatives that you can run locally on your own machine. Installing these usually involves complex installation procedures in Python, but there's a trend towards 'one-click installers' that get you set up relatively painlessly. Note: most of these models need a (recent and beefy) Nvidia graphics card or Apple M1/M2 silicon to run.
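If you want to check whether your machine has suitable hardware, a quick way to do so (assuming you have Python and the PyTorch library installed, which most of these tools use under the hood) is:

import torch

# Check for a supported GPU before installing any of the tools below
if torch.cuda.is_available():
    print("Nvidia GPU found:", torch.cuda.get_device_name(0))
elif torch.backends.mps.is_available():
    print("Apple Silicon GPU (M1/M2) available via Metal")
else:
    print("No supported GPU found; models will run (slowly) on the CPU")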
Below you can find some simple installers for various generative AIs. There's also Pinokio, which strives to be something of an AI app store and can also automate processes between different AI types. You can find it here: https://pinokio.computer/
Image generation on your own computer
Stable Diffusion WebUI by AUTOMATIC1111: https://github.com/AUTOMATIC1111/stable-diffusion-webui (Win, Linux, Mac)
Easy Diffusion: https://github.com/easydiffusion/easydiffusion (Win, Linux, Mac)
DiffusionBee: https://diffusionbee.com/ (Mac)
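If you'd rather skip the installers and work in Python directly, image generation also takes only a few lines with Hugging Face's diffusers library. A minimal sketch (the model name and the "cuda" device are example assumptions, adjust them to your own hardware):

import torch
from diffusers import StableDiffusionPipeline

# Download a Stable Diffusion checkpoint from the Hugging Face hub (several GB on first run)
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint; any compatible model works
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # use "mps" on Apple M1/M2 silicon

# Generate one image from a text prompt and save it to disk
image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")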
Text generation ('ChatGPT')
Text Generation WebUI by oobabooga (runs Llama 2 and other open models): https://github.com/oobabooga/text-generation-webui (Win, Linux, Mac)
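As with image generation, you can also run a (small) language model locally from Python with Hugging Face's transformers library. A minimal sketch, using the TinyLlama chat model as an example assumption (swap in any checkpoint your hardware can handle):

from transformers import pipeline

# Load a small chat model locally (downloads a couple of GB on first run)
generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example model
    # device=0,  # uncomment to run on the first GPU instead of the CPU
)

prompt = "Explain in one sentence what a diffusion model is."
result = generator(prompt, max_new_tokens=60)
print(result[0]["generated_text"])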