Run Stable Diffusion (V2) in 3 steps with Brev
The legends over at StabilityAI have outdone themselves yet again! In this guide, we'll show you how to run Stable Diffusion V2 out of the box with Brev.

"Snoop dogg showing off his Stable Diffusion V2 instance"
To get started, hit this link to create a new Brev instance:
1. Sign up for an account.
2. You'll be redirected to an instance creation page pre-configured with the defaults you need. (We recommend sticking with the default GPU - it's the cheapest one that'll work with Stable Diffusion.)
3. At the bottom, add a payment method. We give you 30 minutes free; we just need to make sure people don't abuse our systems.
4. Hit create!
Open your new Brev instance:
brev open stable-diffusion-v2 --wait
If you don't have the Brev CLI, you can install it here.
This may take about 5 minutes to run (models are quite big these days, unfortunately 😅). Once done, it'll open VSCode inside the repo. We've pre-loaded the SD-768 checkpoint in the weights directory, so you're ready to go!
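If you want to double-check that the checkpoint is in place before running anything, list the weights directory from the VSCode terminal (the 768-v-ema.ckpt file should be on the order of 5 GB):
ls -lh weights/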
Running the model:
To activate the conda environment, run:
conda activate ldm
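The ldm environment comes pre-built on your instance, but if you ever need to recreate it from scratch (assuming the standard Stability-AI stablediffusion repo layout, which ships an environment.yaml that defines the ldm environment), you can run:
conda env create -f environment.yaml
conda activate ldm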
In your VSCode terminal, run the command below, replacing "Snoop Dogg on Mars" with whatever prompt you want:
python scripts/txt2img.py --prompt "Snoop Dogg on Mars" --ckpt weights/768-v-ema.ckpt --config configs/stable-diffusion/v2-inference-v.yaml --H 768 --W 768
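If you want more control over generation, the script exposes a few extra flags, e.g. --n_samples (images per batch), --steps (sampling steps), and --seed (for reproducible results). These flag names come from the upstream Stability-AI stablediffusion repo, so double-check what's available on your instance with python scripts/txt2img.py --help. For example:
python scripts/txt2img.py --prompt "Snoop Dogg on Mars" --ckpt weights/768-v-ema.ckpt --config configs/stable-diffusion/v2-inference-v.yaml --H 768 --W 768 --n_samples 1 --steps 50 --seed 42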
And boom! Give the model a couple of minutes to load and generate, and you'll find the images it produced in the outputs folder.
You can open them with:
code outputs/txt2img-samples/samples/00000.png
code outputs/txt2img-samples/samples/00001.png
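If you've run a few generations and aren't sure which files are which, you can list everything the script has produced so far:
ls outputs/txt2img-samples/samples/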
If you thought that was easy, check out our other docs. We've been building the easiest way to develop on the cloud.