Run Stable LM with Brev
In this guide, we show you how to run Stable LM, StabilityAI's new LLM, with Brev! As always, we've pre-configured the environment for you, so you won't have to do any setup at all!
1. To get started, hit this link to create a new Brev environment.
2. Sign up for an account.
3. It'll redirect you to an environment creation page pre-configured with the defaults you need. (We recommend sticking with the 150 GB of disk space we've set.)
4. At the bottom, add a payment method. We give you 30 minutes free, but we need to make sure people don't abuse our systems 🙂
5. Hit create!
Open your new Brev environment:
brev open stable-lm-template --wait
If you don't have the Brev CLI, you can install it here.
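On macOS, the install is typically a one-line Homebrew command; treat the exact tap name below as an assumption and check the install docs for the current command:

brew install brevdev/homebrew-brev/brev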
Running the model:
Once VSCode opens, run this:
python script.py
The model will then run with the system_prompt (which describes how the model should behave) and the prompt (the text you want the model to respond to). You can change both in the script.py file.
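If you're curious what a script like that is doing under the hood, here's a minimal sketch using Hugging Face's transformers library. The checkpoint name, prompts, and generation settings below are assumptions for illustration, not the template's actual contents:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint; the template may use a different Stable LM variant.
model_name = "stabilityai/stablelm-tuned-alpha-7b"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

# system_prompt tells the model how to behave; prompt is the text you want it to respond to.
system_prompt = "You are a helpful and harmless AI assistant."
prompt = "Write a haiku about GPUs."

inputs = tokenizer(system_prompt + "\n\n" + prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Swapping in a different system_prompt or prompt is all it takes to change the behavior or the text being generated.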
Happy LLMing!