pipeline-error-groq-llm-failed
# support
n
Hi, I'm repeatedly getting the above error during many calls. Example call id 7235a27e-877b-4d12-bbae-092d8d35a6d4. Any idea why this happens, and how to prevent it?
v
Hi @nisalr, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly.
-----------------------------
**While I generate an answer, here are some sources that might be helpful**:
* [Call Ended Reason]()
* [Create Phone Call]()
* [List Calls]()
* [Get Call]()
-----------------------------
### Answer

The `pipeline-error-groq-llm-failed` error indicates a failure within the processing pipeline of your Vapi application, specifically in the Groq Large Language Model (LLM) integration. It can occur due to integration misconfiguration, issues at the LLM provider's end, exceeded quotas, or other unforeseen errors during call processing [(1)]().

To prevent such errors:
- Ensure your Groq LLM integration configuration is correct and up to date.
- Stay within the quota limits set by your LLM provider.
- Check for known issues or maintenance windows with your LLM provider that might affect service availability.

If you're using your own provider key, also verify that it is correctly configured and has not exceeded any usage limits. Similar errors, such as `pipeline-error-first-message-failed`, have been attributed to running out of credits when using your own key or using the wrong voice ID [(2)](), which may parallel the issues you're facing with the Groq LLM.

For more detailed troubleshooting, consult the Groq service provider or engage the Vapi support team directly, as the exact cause can vary with the specifics of your implementation and the service status of the involved technologies.
Got it. Tagging @User for additional help.
n
Is this because of rate limiting on Groq? If so, how do I add my own API key for Groq models on VAPI? It's not listed under "Provider Keys"
s
Let me check.
Basically, we expected Groq to send the response within 750 ms, but it didn't, so the request timed out and this error occurred:

```
šŸ”µ 03:27:22:650 LLM Sequence Timeout (have #0 to go). Cost: 0 (+0). Prompt: 0 (+0). Completion: 0 (+0)
šŸŸ” 03:27:22:651 Model request failed (groq, llama3-70b-8192): pipeline-error-groq-llm-failed
šŸ”µ 03:27:22:652 Model request failed
šŸŸ” 03:27:22:653 CallUuid 2a3cac4e-c0ca-4806-bc9a-083b2ad95bb9 Websocket Error: pipeline-error-groq-llm-failed
šŸŸ” 03:27:22:654 CallUuid 2a3cac4e-c0ca-4806-bc9a-083b2ad95bb9 Phone Call Error: pipeline-error-groq-llm-failed. Closing...
šŸ”µ 03:27:22:655 CallUuid 2a3cac4e-c0ca-4806-bc9a-083b2ad95bb9 Waiting for Logs...
šŸŸ” 03:27:22:655 CallUuid 2a3cac4e-c0ca-4806-bc9a-083b2ad95bb9 Websocket Error: pipeline-error-groq-llm-failed
šŸŸ” 03:27:22:656 Phone Call Manager Error: pipeline-error-groq-llm-failed
```
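For callers hitting this outside of Vapi's managed pipeline, a common client-side mitigation is to retry timed-out requests with exponential backoff before failing the call. A minimal sketch (the helper name and parameters are illustrative, not part of Vapi's or Groq's API); the 0.75 s per-try budget mirrors the 750 ms sequence timeout in the logs above:

```python
import time

def call_with_retry(fn, attempts=3, per_try_timeout=0.75, backoff=0.5):
    """Retry a callable that may time out.

    `fn` is invoked as fn(timeout=per_try_timeout); the 0.75 s default
    mirrors the 750 ms LLM sequence timeout seen in the logs.
    """
    last_err = None
    for attempt in range(attempts):
        try:
            return fn(timeout=per_try_timeout)
        except TimeoutError as err:
            last_err = err
            # Exponential backoff between tries: 0.5 s, 1 s, 2 s, ...
            time.sleep(backoff * (2 ** attempt))
    raise last_err
```

If every attempt times out, the last `TimeoutError` is re-raised, which is the point at which falling back to another provider (as the users below do by switching to GPT) makes sense.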
Is it happening all the time?
n
Yes, it's happening frequently. We switched back to GPT temporarily because of this.
Any idea when it will be fixed?
@Sahil
a
Hey @nisalr & @Sahil I'm getting the same thing. Switching back to GPT temporarily too
s
Will let you know when it is fixed @nisalr and @araSh