slickVik
03/25/2025, 1:57 PM
pipeline-error-custom-llm-llm-failed error!

Vapi Support Bot
03/25/2025, 1:59 PM
Based on the call ID you provided, I need a bit more information to better assist you.
Could you please share:
1. What specific action were you performing when this error occurred? Were you using any particular function or API call related to Vapi?
2. What was the expected response, and what was the actual response you received?
3. Can you provide details about your custom LLM setup? This includes:
- Which custom LLM are you using?
- How is it configured in your Vapi pipeline?
- Are you using any custom tools with your LLM?
This information will help diagnose the issue, since the error indicates a failure in a custom LLM within your AI processing pipeline.
While waiting for your response, I can tell you that this type of error is often caused by:
- Token limits or incorrect token calculations
- API key configuration issues
- Formatting problems with tool use and tool result blocks
- Validation errors in tool definitions
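For the tool-formatting and validation issues above, a quick structural check on your endpoint's response can catch problems before they surface as a pipeline error. This is a hypothetical sketch (the `validate_completion` helper and field names are illustrative, following the common OpenAI-style chat-completion format that custom-LLM integrations typically expect); adapt it to your actual setup:

```python
# Hypothetical sketch: sanity-check an OpenAI-style chat-completion
# payload before your custom LLM endpoint returns it. Field names
# follow the OpenAI chat format; your pipeline may differ.

def validate_completion(payload: dict) -> list[str]:
    """Return a list of structural problems found in the response."""
    problems = []
    choices = payload.get("choices")
    if not choices:
        problems.append("missing or empty 'choices'")
        return problems
    message = choices[0].get("message", {})
    content = message.get("content")
    tool_calls = message.get("tool_calls")
    if content is None and not tool_calls:
        problems.append("choice has neither 'content' nor 'tool_calls'")
    for call in tool_calls or []:
        fn = call.get("function", {})
        if not fn.get("name"):
            problems.append("tool call missing function name")
        if "arguments" not in fn:
            problems.append("tool call missing 'arguments' JSON string")
    return problems

# Usage: an empty list means the response looks structurally sound.
ok = {"choices": [{"message": {"content": "Hello!"}}]}
bad = {"choices": [{"message": {"tool_calls": [{"function": {}}]}}]}
print(validate_completion(ok))   # []
print(validate_completion(bad))
```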
You can also check the [Vapi dashboard](https://dashboard.vapi.ai) under logs > calls to find more details about this specific call.
Sources:
- [General Debugging Steps for Vapi Pipelines](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1321375442644762625)
- [Custom LLM Configuration and Integration Checks](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1323288623629471866)
- [Finding Call ID in Vapi Dashboard](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1327481613004701779)

Kings Big
03/26/2025, 2:33 PM