Support for Anthropic models as custom LLMs
# support
r
Hello, I’m wondering if VAPI natively supports Anthropic models (e.g., Claude 3.5 Sonnet) when connected as a custom LLM. So far, I’ve been using OpenAI models, and they have worked well. However, I’d like to switch to an Anthropic model on the backend. The official VAPI documentation only mentions OpenAI models, so I’d appreciate your clarification. Your response will help determine whether I can use your service for my business case. Thank you! #custom-llm
a
Hey Ruslan, thank you for creating the ticket! A member of our team will reach out to you shortly.
v
You can continue this conversation by mentioning me in the message.
@Ruslan Hello! I'm glad to hear that you've had a positive experience using OpenAI models with VAPI.

Regarding your query about Anthropic models, such as Claude 3.5 Sonnet: VAPI is designed to be flexible and can integrate with various custom Large Language Models (LLMs). To connect a custom LLM like an Anthropic model, you would typically use the "Custom LLM" option in the Vapi Dashboard. Since you've already been using OpenAI models, the process should be similar. You would need to configure the integration with the necessary API endpoints and authentication details provided by Anthropic.

If the VAPI documentation does not explicitly mention Anthropic models, it's possible that the integration might require some additional steps or considerations. I would recommend reaching out to the VAPI support team for detailed guidance on integrating an Anthropic model with VAPI.

Would you like me to assist you with anything else regarding the setup or any other questions you might have?
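For reference, attaching a custom LLM is usually done in the assistant's model configuration. This is only a sketch: the exact field names (`provider: "custom-llm"`, `url`) should be checked against the current Vapi docs, and the URL and model id below are placeholders.

```json
{
  "model": {
    "provider": "custom-llm",
    "url": "https://your-server.example.com/chat/completions",
    "model": "claude-3-5-sonnet-20241022"
  }
}
```

With this in place, Vapi sends OpenAI-style chat-completion requests to your server, which is where the Anthropic call would happen.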
r
I think this one - https://vapi.notion.site/Custom-LLM-Server-Guide-14d008c8393f807f86dbf61dcb0767ee?pvs=4 - answers my question. As long as I return responses following that format, VAPI will be able to process them.
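To illustrate the point in that guide, here is a minimal sketch of the translation layer such a server would need: accept an OpenAI-style chat request from Vapi, reshape it for Anthropic's Messages API, and wrap Anthropic's reply back into an OpenAI chat-completion envelope. Helper names, the model id, and the non-streaming shape are assumptions for illustration; a production server for Vapi would typically stream SSE chunks in the same format instead.

```python
# Sketch only: pure conversion helpers between OpenAI-style and
# Anthropic-style payloads. Function names and the model id are
# illustrative assumptions, not Vapi- or Anthropic-mandated names.
import time
import uuid


def openai_to_anthropic(body: dict) -> dict:
    """Convert an OpenAI chat request into an Anthropic Messages payload.

    Anthropic takes the system prompt as a top-level field rather than
    as a message with role "system".
    """
    system = " ".join(
        m["content"] for m in body["messages"] if m["role"] == "system"
    )
    messages = [m for m in body["messages"] if m["role"] != "system"]
    return {
        "model": "claude-3-5-sonnet-20241022",  # assumed model id
        "system": system,
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }


def anthropic_to_openai(resp: dict, model: str) -> dict:
    """Wrap Anthropic's response text in an OpenAI chat-completion envelope,
    which is the shape Vapi knows how to process."""
    text = "".join(
        block["text"] for block in resp["content"] if block["type"] == "text"
    )
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }
        ],
    }
```

The actual HTTP endpoint would sit in front of these helpers: receive the request from Vapi, call Anthropic with the converted payload, and return the converted response.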