# Ollama
Interact with your Ollama models using LLM.
## Parameters
An Ollama LLM interface can have the following parameters:
| Parameter | Type | Description |
|---|---|---|
| `temperature` | float | Sampling temperature; higher values make the output more varied, lower values make it more deterministic. |
| `top_p` | float | Nucleus (top-p) sampling threshold; tokens are sampled only from the smallest set whose cumulative probability reaches this value. |
| `num_predict` | int | Maximum number of tokens to generate. |
| `top_k` | int | Restricts sampling to the k most likely tokens. |
## Usage
Here is how you set up an interface to interact with your Ollama models.
### 1. Create a `config.yaml`

Create it in the same directory your code is in:
```
- src
  - yourPythonCode.py
  - yourPyNotebook.py
  - config.yaml
```
### 2. Define your Ollama provider and models inside the `config.yaml` file
If you are not sure about any of these parameters, you can just leave them at 0.
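The exact schema depends on your library's configuration format; the snippet below is only a minimal sketch, and the `provider`, `models`, and parameter keys shown are illustrative assumptions:

```yaml
# config.yaml -- illustrative sketch; the key names are assumptions, not a confirmed schema
provider: ollama
models:
  - name: llama3        # any model you have pulled locally, e.g. via `ollama pull llama3`
    temperature: 0      # leave parameters at 0 if you are unsure
    top_p: 0
    num_predict: 0
    top_k: 0
```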
### 3. Create your `llm` instance
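The import path and class name below are assumptions for illustration; substitute the actual interface exposed by your library:

```python
# Illustrative sketch -- `llm.LLM` is a hypothetical import path, not a confirmed API.
from llm import LLM

# Create an instance backed by the Ollama provider defined in config.yaml (step 1)
llm = LLM(provider="ollama", model="llama3")
```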
### 4. Optional: Add your parameters

You can add the parameters from the table above as follows:
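As above, the constructor signature is an assumption; the point is only that the parameters from the table map onto keyword arguments (or onto the corresponding keys in `config.yaml`):

```python
# Illustrative sketch -- parameter names mirror the table above; the signature is an assumption.
llm = LLM(
    provider="ollama",
    model="llama3",
    temperature=0.7,   # sampling temperature
    top_p=0.9,         # nucleus sampling threshold
    num_predict=256,   # maximum number of tokens to generate
    top_k=40,          # sample only from the 40 most likely tokens
)
```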
You are done setting up your Ollama LLM!