LLM API
Configuration
After completing the installation, follow these steps to configure the LLM API, using the OpenAI API as an example. This process is similar for other LLM APIs.
Steps
Initialize Configuration:
Execute strataai --init-config to generate ~/.strataai/config2.yaml. Edit this file with your configuration to avoid sharing your API key by accident.
Edit Configuration:
Update ~/.strataai/config2.yaml according to the example below:
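A minimal sketch of an OpenAI configuration, assuming the file nests provider settings under a top-level llm key with api_type, model, base_url, and api_key fields; these field names are assumptions, not confirmed Strata AI settings:

```yaml
# Hypothetical schema: the field names below are assumptions, not confirmed settings.
llm:
  api_type: "openai"                      # assumed provider selector
  model: "gpt-4o"                         # or another model your account can access
  base_url: "https://api.openai.com/v1"
  api_key: "YOUR_API_KEY"                 # keep this file out of version control
```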
Note: Configuration priority is ~/.strataai/config2.yaml > config/config2.yaml.
Additional Configurations for OpenAI o1 Series:
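OpenAI o1 models restrict some request parameters: they accept only the default temperature and, at launch, did not support system messages or streaming. The sketch below assumes the config exposes switches for these restrictions; the extra field names are assumptions, not confirmed Strata AI settings:

```yaml
# Hypothetical sketch for the o1 series; the extra fields are assumptions.
llm:
  api_type: "openai"
  model: "o1-mini"
  base_url: "https://api.openai.com/v1"
  api_key: "YOUR_API_KEY"
  temperature: 1              # o1 models accept only the default temperature
  stream: false               # assumed switch; o1 did not support streaming at launch
  use_system_prompt: false    # assumed switch; o1 does not accept system messages
```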
With these steps, your setup is complete. To get started with Strata AI, check out the Quickstart guide or our Tutorials.
Additional Model Configurations
Strata AI supports a range of LLM models. Configure your model API keys as needed:
Anthropic / Claude API:
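A hedged sketch of a Claude configuration under the same assumed llm schema; the api_type value and field names are assumptions:

```yaml
# Hypothetical example; field names are assumptions about the config schema.
llm:
  api_type: "anthropic"
  model: "claude-3-5-sonnet-20241022"
  base_url: "https://api.anthropic.com"
  api_key: "YOUR_ANTHROPIC_API_KEY"
```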
Azure OpenAI API:
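A hedged sketch for Azure OpenAI, which is addressed by resource endpoint, deployment name, and API version rather than a single model name; the field names and api_version value are assumptions:

```yaml
# Hypothetical example; Azure deployments use an endpoint, a deployment name,
# and an API version. Field names are assumptions.
llm:
  api_type: "azure"
  base_url: "https://YOUR_RESOURCE_NAME.openai.azure.com"
  api_key: "YOUR_AZURE_API_KEY"
  api_version: "2024-06-01"
  model: "YOUR_DEPLOYMENT_NAME"
```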
Google Gemini:
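A hedged sketch for Google Gemini under the same assumed schema; the api_type value and field names are assumptions:

```yaml
# Hypothetical example; field names are assumptions.
llm:
  api_type: "gemini"
  model: "gemini-1.5-pro"
  api_key: "YOUR_GOOGLE_API_KEY"
```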
Baidu QianFan API:
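A hedged sketch for Baidu QianFan, which authenticates with an access key and secret key pair rather than a single API key; the field names are assumptions:

```yaml
# Hypothetical example; QianFan uses an access key / secret key pair.
# Field names are assumptions.
llm:
  api_type: "qianfan"
  model: "ERNIE-4.0-8K"
  access_key: "YOUR_QIANFAN_ACCESS_KEY"
  secret_key: "YOUR_QIANFAN_SECRET_KEY"
```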
Amazon Bedrock API:
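A hedged sketch for Amazon Bedrock, which uses AWS credentials and a region together with a Bedrock model ID; the field names are assumptions:

```yaml
# Hypothetical example; Bedrock uses AWS credentials and a region, not a single API key.
# Field names are assumptions.
llm:
  api_type: "bedrock"
  model: "anthropic.claude-3-5-sonnet-20240620-v1:0"
  access_key: "YOUR_AWS_ACCESS_KEY_ID"
  secret_key: "YOUR_AWS_SECRET_ACCESS_KEY"
  region_name: "us-east-1"
```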