## Supported models
- `gpt-4o`
- `gpt-4-turbo`
- `gpt-4`
- `gpt-3.5-turbo`
- `gpt-3.5-turbo-instruct`
## Parameters

An OpenAI LLM interface accepts the following parameters:

| Parameter | Type | Description |
|---|---|---|
| `api_key` | `str` | API key used to authenticate with OpenAI. |
| `temperature` | `float` | Sampling temperature; higher values make the output more random, lower values more deterministic. |
| `top_p` | `float` | Nucleus-sampling cutoff; only tokens within the top-p probability mass are considered. |
| `max_tokens` | `int` | Maximum number of tokens in the model's output. |
| `frequency_penalty` | `float` | Penalizes tokens in proportion to how often they have already appeared, reducing repetition. |
| `presence_penalty` | `float` | Penalizes tokens that have appeared at all, encouraging the model to introduce new content. |
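As a minimal sketch of how these parameters fit together, the snippet below collects them into a configuration dict. The names mirror the table above; the values are illustrative placeholders, not the library's actual defaults.

```python
# Illustrative configuration using the parameters from the table above.
# Values are example placeholders, not library defaults.
params = {
    "api_key": "sk-...",        # str: API key for authentication (placeholder)
    "temperature": 0.7,         # float: sampling temperature
    "top_p": 1.0,               # float: nucleus-sampling cutoff
    "max_tokens": 256,          # int: cap on output length
    "frequency_penalty": 0.0,   # float: discourage frequent repetition
    "presence_penalty": 0.0,    # float: discourage reusing seen tokens
}
```

A dict like this can then be unpacked into whatever client constructor or completion call the interface exposes.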
## Usage

Here is how you set up an interface to interact with your OpenAI models, either with a `.env` file or without one.
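The two setups can be sketched as follows. This assumes the conventional `OPENAI_API_KEY` environment variable; the `python-dotenv` loader shown in the comments is an assumption, since the docs' own loading mechanism is not shown here.

```python
import os

# w/ .env: a .env file containing OPENAI_API_KEY=sk-... can be loaded
# with python-dotenv (an assumption; uncomment if the package is installed):
#     from dotenv import load_dotenv
#     load_dotenv()

# w/o .env: export the key in your shell, or set it in code.
# A placeholder is set here only so the sketch runs end to end.
os.environ.setdefault("OPENAI_API_KEY", "sk-placeholder")

api_key = os.environ["OPENAI_API_KEY"]  # read by most OpenAI clients
```

Either way, the key ends up in `OPENAI_API_KEY`, which OpenAI client libraries typically pick up automatically when no explicit `api_key` argument is passed.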