Let's have you set up LLMstudio
Install the latest version of LLMstudio using pip or conda.
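A minimal sketch of the pip route (the PyPI package name `llmstudio` is an assumption; check the project's install docs for the conda channel):

```shell
# Install LLMstudio from PyPI (package name assumed to be `llmstudio`)
pip install llmstudio
```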
Install bun if you want to use the UI.
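One common way to install bun is via its official install script (verify against bun's own documentation for your platform):

```shell
# Download and run bun's official installer script
curl -fsSL https://bun.sh/install | bash
```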
Create a `.env` file at the same path from which you'll run LLMstudio.
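A sketch of what that `.env` might contain — the variable names below are assumptions; include only the keys for the providers you intend to call:

```
# Hypothetical provider keys — replace with your own
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
```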
Now you should be able to run LLMstudio using the following command.
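A sketch of that command, assuming the CLI exposes a `server` subcommand (only the `--ui` flag is confirmed by the surrounding text):

```shell
# Start the LLMstudio server; --ui also serves the web UI
llmstudio server --ui
```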
When the `--ui` flag is set, you'll be able to access the UI at http://localhost:3000.
Using it in a Python notebook is also fairly simple! Just run the following cell:
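A minimal sketch of such a cell — the import path, the `LLM` class, the `provider/model` identifier, and the `chat` method are all assumptions based on common LLMstudio usage; consult the project's API docs for the exact interface:

```python
# Assumed import path and class name for the LLMstudio Python client
from llmstudio import LLM

# "openai/gpt-4o" is a hypothetical provider/model identifier
llm = LLM("openai/gpt-4o")

# Send a prompt and print the response (method name `chat` is an assumption)
response = llm.chat("Hello! Who are you?")
print(response)
```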
The output will be a ChatCompletion object that follows the OpenAI format.