Run Stable LM with Brev

In this guide, we show you how to run Stability AI's new LLM, StableLM, with Brev! As always, we've pre-configured the environment for you, so there's no setup required.

  1. Sign up for an account

  2. You'll be redirected to an environment creation page pre-configured with the defaults you need. (We recommend sticking with the 150GB of disk space we've set.)

  3. At the bottom, add a payment method. We give you 30 minutes free, but we need to make sure people don't abuse our systems 🙂

  4. Hit create!

Open your new Brev environment:

brev open stable-lm-template --wait

If you don't have the Brev CLI, you can install it here.

Running the model:

Once VS Code opens, run this:


The model will then run with the system_prompt (which describes how the model should behave) and the prompt (the text you want the model to respond to). You can change both in the file.
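To give a sense of how those two pieces fit together: StableLM's tuned checkpoints expect the system, user, and assistant turns to be joined with special delimiter tokens before generation. Here's a minimal sketch of that formatting, assuming the `<|SYSTEM|>`/`<|USER|>`/`<|ASSISTANT|>` tokens used by the stablelm-tuned-alpha models (the variable names mirror the guide's description, not the template file itself):

```python
# Sketch: how system_prompt and prompt combine for StableLM tuned models.
# Token names follow the stabilityai/stablelm-tuned-alpha model card;
# the exact strings in the Brev template file may differ.

system_prompt = (
    "<|SYSTEM|># StableLM Tuned (Alpha version)\n"
    "- StableLM is a helpful and harmless open-source AI language model.\n"
)
prompt = "What is your favorite programming language?"

# The model generates a continuation after the <|ASSISTANT|> marker.
full_prompt = f"{system_prompt}<|USER|>{prompt}<|ASSISTANT|>"

print(full_prompt)
```

In the template, a string like `full_prompt` is what gets tokenized and passed to the model's `generate` call, so editing `system_prompt` changes the model's persona while editing `prompt` changes the question it answers.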

Happy LLMing!