Make chat calls using your LLM.

Parameters

The llm.chat method accepts the following parameters.
| Parameter | Type | Description |
| --- | --- | --- |
| input | str | The input message to send to the chat model. |
| is_stream | bool | Whether to stream the response instead of returning it all at once. |
| **kwargs | dict | Additional parameters to pass to the chat model. |
Refer to your provider-specific documentation for additional kwargs you can use.
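As a hedged illustration, extra keyword arguments are forwarded to the underlying chat model. The names below (`temperature`, `max_tokens`) are OpenAI-style assumptions and may differ for other providers:

```python
# Hypothetical kwargs: OpenAI-style parameter names, assumed to pass
# through to the underlying provider unchanged.
extra = {"temperature": 0.2, "max_tokens": 64}

# The call would then look like (requires a configured LLM instance):
# response = llm.chat("Hello!", **extra)
print(extra)
```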

Returns

| Output | Type | Description |
| --- | --- | --- |
| ChatCompletion | object | A chat completion object in the OpenAI format, plus metrics computed by LLMstudio. |
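For orientation, here is a rough sketch of what the returned object might contain, using a plain dict stand-in; the exact field and metric names are assumptions for illustration, not the library's documented schema.

```python
# Dict stand-in for the ChatCompletion-style return value.
# The "choices" shape follows the OpenAI chat format; the "metrics"
# key is an illustrative assumption about LLMstudio's computed metrics.
completion = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hi there!"}}
    ],
    "metrics": {"latency_s": 0.8},  # hypothetical metric name
}

# Reading the reply text in the OpenAI format:
reply = completion["choices"][0]["message"]["content"]
print(reply)
```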

Usage

Here’s how to use .chat() to make calls to your LLM.
1. Start by importing LLM.
from llmstudio import LLM
2. Set up an LLM from your desired provider.
llm = LLM('openai/gpt-4o')
3. Create your message. Your message can be a simple string or a message in the OpenAI format.
  • String format
  • OpenAI format
message = "Hello, how are you today?"
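The same message in the OpenAI format is a list of role/content dictionaries; the system prompt below is an illustrative addition, not required:

```python
# OpenAI-format message: a list of {"role", "content"} dictionaries.
message = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello, how are you today?"},
]
```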
4. Get your response. You can request either a non-stream or a stream response.
response = llm.chat(message)
Visualize your response.
print(response)
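If you asked for a stream response instead, the sketch below shows how chunks might be consumed; `fake_stream` is a stand-in for `llm.chat(message, is_stream=True)`, which is assumed to yield chunks incrementally rather than returning a single object.

```python
# Stand-in generator for llm.chat(message, is_stream=True); real chunks
# would come from the provider, this just mimics incremental delivery.
def fake_stream():
    yield "Hello"
    yield ", how can I help?"

reply = ""
for chunk in fake_stream():  # swap in: llm.chat(message, is_stream=True)
    reply += chunk
    print(chunk, end="", flush=True)
```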
You are done chatting with your LLMstudio LLM!