# Azure
Interact with your Azure models using LLM.
## Parameters
An Azure LLM interface can have the following parameters:
| Parameter | Type | Description |
|---|---|---|
| `temperature` | float | The sampling temperature for the model. |
| `max_tokens` | int | The maximum number of tokens to generate. |
| `top_p` | float | The top-p (nucleus sampling) parameter for the model. |
| `frequency_penalty` | float | The frequency penalty for the model. |
| `presence_penalty` | float | The presence penalty for the model. |
## Usage
Here is how you set up an interface to interact with your Azure models.

Create a `config.yaml` file in the same directory as your code.
- src
  - PythonCode.py
  - PyNotebook.ipynb
  - config.yaml
Define your Azure OpenAI provider and models inside the `config.yaml` file, setting `max_tokens`, `input_tokens`, and the other parameters to 0.
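The exact `config.yaml` schema depends on the version of the library you are using, so treat the snippet below as a minimal sketch: every key (`azure_openai`, `api_key`, `resource_name`, `api_version`, and the model fields) is an assumption, not the confirmed schema.

```yaml
# Hypothetical config.yaml sketch -- the key names below are
# assumptions about the schema, not confirmed by the library.
azure_openai:
  api_key: "YOUR_AZURE_OPENAI_API_KEY"  # key from the Azure portal
  resource_name: "your-resource-name"   # Azure OpenAI resource name
  api_version: "2024-02-01"             # Azure OpenAI REST API version
  models:
    - name: "gpt-4o"        # your Azure deployment name
      max_tokens: 0         # set to 0, per the note above
      input_tokens: 0       # set to 0, per the note above
      temperature: 0        # set to 0, per the note above
```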
Create your llm instance.
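As a sketch of what instantiation could look like, assuming the library exposes an `LLM` class that picks up `config.yaml` from the working directory (the import path, class name, and arguments are all assumptions):

```python
# Hypothetical sketch -- import path, class name, and method name are
# assumptions about the library's API, not confirmed.
from llm import LLM

# Assumed to read config.yaml from the current directory.
llm = LLM(provider="azure_openai", model="gpt-4o")

# Assumed generation call.
response = llm.generate("Say hello from Azure OpenAI.")
print(response)
```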
Optional: You can add your parameters as follows:
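Continuing the hypothetical sketch above, the parameters from the table could be passed as keyword arguments (again, the constructor signature is an assumption):

```python
# Hypothetical sketch -- keyword names mirror the parameters table;
# the constructor signature itself is an assumption.
llm = LLM(
    provider="azure_openai",
    model="gpt-4o",
    temperature=0.7,        # sampling temperature
    max_tokens=512,         # maximum number of tokens to generate
    top_p=0.9,              # nucleus-sampling cutoff
    frequency_penalty=0.0,  # penalize frequently repeated tokens
    presence_penalty=0.0,   # penalize tokens already present
)
```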