LLM
Chat
Make chat calls using your LLM.
Parameters
The `llm.chat` method accepts the following parameters.
Parameter | Type | Description |
---|---|---|
input | str | The input message to send to the chat model. |
is_stream | bool | Whether to stream the response. |
**kwargs | dict | Additional parameters to pass to the chat model. |
Refer to your provider-specific documentation for additional kwargs you can use.
Returns
Output | Type | Description |
---|---|---|
ChatCompletion | object | A chat completion object in the OpenAI format, plus metrics computed by LLMstudio. |
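Because the returned object follows the OpenAI chat-completion format, the reply text lives under choices → message → content. The dict below is a mock of that shape, for illustration only; the OpenAI-format fields (choices, message, content) are standard, while the metrics block and its key names are assumptions.

```python
# A dict mock of the response shape, for illustration only --
# the real return value is a ChatCompletion object from llm.chat().
completion = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "choices": [
        {"index": 0, "message": {"role": "assistant", "content": "Hello!"}}
    ],
    # Hypothetical metrics computed by LLMstudio (key names assumed)
    "metrics": {"input_tokens": 5, "output_tokens": 2, "latency_s": 0.41},
}

# Read the reply the same way you would with the OpenAI client
reply = completion["choices"][0]["message"]["content"]
print(reply)  # -> Hello!
```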
Usage
Here’s how to use `.chat()` to make calls to your LLM.
1. Start by importing LLM.
2. Set up an LLM from your desired provider.
3. Create your message. Your message can be a simple string or a message in the OpenAI format.
4. Get your response.
Visualize your response.
You are done chatting with your LLMstudio LLM!