## Parameters

The `llm.chat` method accepts the following parameters.

| Parameter | Type | Description |
|---|---|---|
| `input` | str | The input message to send to the chat model. |
| `is_stream` | bool | Whether to stream the response instead of returning it in a single completion. |
| `**kwargs` | dict | Additional parameters to pass to the chat model. |
Refer to your provider-specific documentation for additional kwargs you can use.
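As a minimal sketch of forwarding such kwargs, the snippet below builds a dict of extra parameters. The names `temperature` and `max_tokens` are assumptions for illustration; the valid set depends on your provider, and the commented call assumes an `llm` client is already constructed.

```python
# Illustrative provider kwargs; names are assumptions, check your
# provider's documentation for the ones it actually accepts.
extra = {"temperature": 0.7, "max_tokens": 256}

# Assuming an existing LLM client named `llm`, the kwargs can be
# forwarded directly to the chat call:
# response = llm.chat("Summarize this text.", is_stream=False, **extra)
```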
## Returns

| Output | Type | Description |
|---|---|---|
| `ChatCompletion` | object | A chat completion object in the OpenAI format, plus metrics computed by LLMstudio. |
## Usage

Here's how to use `.chat()` to make calls to your LLM.

Create your message. Your message can be a simple string or a message in the OpenAI format.

- String format
- OpenAI format
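The two message formats above can be sketched as follows. The message construction is runnable; the client setup and model id in the commented lines are assumptions for illustration, not taken from this page.

```python
# String format: pass the prompt as a plain string.
simple_input = "What is the capital of France?"

# OpenAI format: a list of role/content message dicts.
openai_input = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# Either form can then be passed to .chat(); the client construction
# below is an assumption, shown only to place the call in context:
# llm = LLM(provider="openai", model="gpt-4o")
# response = llm.chat(openai_input)   # or llm.chat(simple_input)
```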