Interact with your Ollama models using the `LLM` interface.
An Ollama LLM interface can have the following parameters:
| Parameter | Type | Description |
|---|---|---|
| temperature | float | Sampling temperature; higher values make the output more random, lower values more deterministic. |
| top_p | float | Nucleus sampling: only tokens within the top-p cumulative probability mass are considered. |
| num_predict | int | Maximum number of tokens to generate in the response. |
| top_k | int | Restricts sampling to the k most likely tokens at each step. |
Here is how you set up an interface to interact with your Ollama models.
Create a `config.yaml` file in the same directory as your code.
Define your Ollama provider and models inside the `config.yaml` file.
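The exact schema of `config.yaml` depends on the llmstudio version you have installed, so treat the snippet below as an illustrative sketch: the `providers` layout, the field names, and the `llama3` model entry are assumptions you should adapt to the default config shipped with your installation.

```yaml
# Illustrative sketch only - field names and the "llama3" entry are
# placeholders; compare against the default config.yaml of your
# llmstudio release.
providers:
  ollama:
    id: ollama
    name: Ollama
    chat: true
    models:
      llama3:
        mode: chat
        max_tokens: 8192
        input_token_cost: 0
        output_token_cost: 0
```

Any model you have pulled locally with `ollama pull` can be listed here under the name Ollama reports for it.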
Create your `llm` instance:
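The sketch below assumes the `provider/model` string format and an import path used by recent llmstudio releases; both may differ in your version, and `llama3` is just the placeholder model name from the config example above.

```python
# Minimal sketch; the import path varies between llmstudio releases
# (older versions expose LLM directly, e.g. `from llmstudio import LLM`).
from llmstudio.providers import LLM

# "ollama/llama3" assumes the provider id and model name defined
# in your config.yaml.
llm = LLM("ollama/llama3")
```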
Optional: you can pass your parameters as follows:
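For example, the sampling parameters from the table above could be supplied as keyword arguments when the instance is created; whether your llmstudio version accepts them at construction time or per request is worth verifying against its documentation, and the values shown are examples rather than defaults.

```python
from llmstudio.providers import LLM

# Assumed keyword-argument form; values are illustrative, not defaults.
llm = LLM(
    "ollama/llama3",
    temperature=0.7,   # sampling temperature
    top_p=0.9,         # nucleus sampling probability mass
    num_predict=256,   # maximum number of tokens to generate
    top_k=40,          # sample only from the 40 most likely tokens
)
```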
Learn how to send messages and receive responses next!
Learn how to build a tool-calling agent using llmstudio.