Supported models
- gemini-1.5-flash
- gemini-1.5-pro
- gemini-1.0-pro
Parameters
A VertexAI LLM interface can have the following parameters:

| Parameter | Type | Description |
|---|---|---|
| api_key | str | The API key used to authenticate requests. |
| temperature | float | Sampling temperature; higher values produce more varied output. |
| top_p | float | Nucleus (top-p) sampling threshold. |
| max_tokens | int | The maximum number of tokens in the model's output. |
| frequency_penalty | float | Penalty applied to tokens in proportion to how often they have already appeared. |
| presence_penalty | float | Penalty applied to tokens that have already appeared at least once. |
Usage
Here is how you set up an interface to interact with your VertexAI models, either with a `.env` file or without one; a sketch of each variant follows.
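Both snippets are a minimal sketch rather than the package's exact API: the `VertexAILLM` class name, its import path, the `model` keyword, and the `VERTEXAI_API_KEY` environment variable are assumptions, so adapt them to the interface you are actually using.

w/ .env

```python
# Sketch: load the API key from a .env file with python-dotenv.
# Assumes a .env file containing: VERTEXAI_API_KEY=<your-key>
import os

from dotenv import load_dotenv

from your_package import VertexAILLM  # hypothetical import path

load_dotenv()  # reads .env from the current working directory

llm = VertexAILLM(
    model="gemini-1.5-flash",
    api_key=os.environ["VERTEXAI_API_KEY"],
    temperature=0.7,
    top_p=0.9,
    max_tokens=1024,
)
```

w/o .env

```python
# Sketch: pass the API key directly instead of reading it from a .env file.
from your_package import VertexAILLM  # hypothetical import path

llm = VertexAILLM(
    model="gemini-1.5-pro",
    api_key="your-api-key",  # avoid hard-coding real keys in source control
    temperature=0.2,
    max_tokens=512,
    frequency_penalty=0.0,
    presence_penalty=0.0,
)
```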