Batyr
05/22/2025, 10:25 AM

Vapi Support Bot
05/22/2025, 10:27 AM
The error `pipeline-error-openai-llm-failed` and the warning "tools is not supported in this model" suggest that your current model (e.g., gpt-4o) does not support function calling/tools.
To resolve this, update your assistant to use a model that supports tools/function calling, such as gpt-4-1106-preview. You can find the list of supported models in the [OpenAI documentation](https://platform.openai.com/docs/guides/function-calling#models-supporting-function-calling).
Example PATCH request body:

```json
{
  "model": {
    "provider": "openai",
    "model": "gpt-4-1106-preview"
  }
}
```

This should prevent the error and allow tool use in your calls.
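A minimal sketch of applying this update programmatically, assuming the Vapi REST endpoint is `PATCH https://api.vapi.ai/assistant/{id}` with a bearer API key (the placeholder key and assistant ID below are hypothetical — substitute your own, and check the Vapi API reference for the exact path):

```python
import json
import urllib.request

VAPI_API_KEY = "YOUR_VAPI_API_KEY"    # placeholder -- use your real key
ASSISTANT_ID = "YOUR_ASSISTANT_ID"    # placeholder -- use your assistant's ID


def build_patch_request(assistant_id: str, model_name: str) -> urllib.request.Request:
    """Build (but do not send) the PATCH request that switches the
    assistant to a tool-capable OpenAI model."""
    payload = {"model": {"provider": "openai", "model": model_name}}
    return urllib.request.Request(
        url=f"https://api.vapi.ai/assistant/{assistant_id}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {VAPI_API_KEY}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )


req = build_patch_request(ASSISTANT_ID, "gpt-4-1106-preview")
print(req.get_method(), req.full_url)
# Send with: urllib.request.urlopen(req)
```

The request is built but not sent here, so you can inspect the URL and body before firing it at your live assistant.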
Sources:
- [Call ended reasons (Vapi documentation)](https://docs.vapi.ai/calls/call-ended-reason)
- [Custom LLM Tool Calling Integration (Vapi documentation)](https://docs.vapi.ai/customization/tool-calling-integration)

Batyr
05/22/2025, 10:37 AM

Andrei
05/22/2025, 1:23 PM

Vapi
05/22/2025, 2:21 PM

Raj
05/22/2025, 2:46 PM

Vapi
05/23/2025, 2:22 AM

Batyr
05/23/2025, 5:35 PM

Vapi Support Bot
05/23/2025, 5:37 PM
If `pipeline-error-openai-llm-failed` occurs again, check your assistant's model configuration and ensure the model supports all required features. For persistent or unexplained errors, contact [Vapi support](https://docs.vapi.ai/support) with your call IDs for direct assistance.
Sources:
- [Call ended reasons (Vapi documentation)](https://docs.vapi.ai/calls/call-ended-reason)
- [Custom LLM Tool Calling Integration (Vapi documentation)](https://docs.vapi.ai/customization/tool-calling-integration)
- [Vapi Support](https://docs.vapi.ai/support)

Vapi
05/24/2025, 1:36 AM