LiteLLM Proxy
Learn how Flowise integrates with LiteLLM Proxy
How to use LiteLLM Proxy with Flowise

LiteLLM Proxy exposes many LLM deployments behind a single OpenAI-compatible endpoint, so Flowise can talk to it as if it were OpenAI while the proxy handles routing and load balancing.
Step 1: Define your LLM models in the LiteLLM config.yaml file. The example below registers three Azure deployments under the single model name gpt-4, so the proxy load-balances requests across them:
model_list:
  - model_name: gpt-4
    litellm_params:
      model: azure/chatgpt-v-2
      api_base: https://openai-gpt-4-test-v-1.openai.azure.com/
      api_version: "2023-05-15"
      api_key:
  - model_name: gpt-4
    litellm_params:
      model: azure/gpt-4
      api_key:
      api_base: https://openai-gpt-4-test-v-2.openai.azure.com/
  - model_name: gpt-4
    litellm_params:
      model: azure/gpt-4
      api_key:
      api_base: https://openai-gpt-4-test-v-2.openai.azure.com/

Step 2: Start the LiteLLM Proxy
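Assuming the configuration above is saved as config.yaml, the proxy can be started with the litellm CLI (it listens on port 4000 by default):

```shell
litellm --config config.yaml
```

Once running, the proxy serves an OpenAI-compatible API at http://localhost:4000.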
Step 3: Use the LiteLLM Proxy in Flowise. In Flowise, add a ChatOpenAI node and set its Base Path to your LiteLLM Proxy URL (for example http://localhost:4000); the proxy then routes each request to the matching model defined in config.yaml.
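Under the hood, Flowise's ChatOpenAI node sends standard OpenAI chat-completion requests to the proxy. A minimal sketch of such a request, assuming the proxy runs locally on the default port (the URL and helper name are illustrative, not part of Flowise):

```python
import json
import urllib.request

# Hypothetical local proxy URL; LiteLLM listens on port 4000 by default.
PROXY_URL = "http://localhost:4000/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Build the OpenAI-style payload that an OpenAI-compatible client sends."""
    return {
        "model": model,  # must match a model_name from config.yaml
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("gpt-4", "Hello from Flowise!")

# Sending the request requires a running proxy, so it is shown commented out:
# req = urllib.request.Request(
#     PROXY_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Because the proxy speaks the OpenAI API, any OpenAI-compatible client (including Flowise's ChatOpenAI node) works unchanged once its base URL points at the proxy.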