pipeline-error-openrouter-llm-failed
# support
e
Call ID: b667bc10-cbe2-4d18-8e58-b5f7100e5934. The request shows up on the OpenRouter dashboard, but I'm not sure why Vapi is throwing this error.
@Ethan Tan I'm here to help you with the issue you're experiencing. To better assist you, could you please provide me with more details about the error message that VAPI is displaying? This will help me understand the nature of the problem and guide you towards a resolution.
You can continue this conversation by mentioning me in the message.
e
Still would like help, thanks.
s
```
šŸ”µ 11:43:14:928 Completion Timeout openrouter google/gemini-pro-1.5
šŸ”µ 11:43:14:930 LLM Sequence Timeout (have #0 to go). Cost: 0 (+0). Prompt: 0 (+0). Completion: 0 (+0)
🟔 11:43:14:930 Timeout while receiving sequence for model provider: openrouter
🟔 11:43:14:934 Model request failed (openrouter, google/gemini-pro-1.5): pipeline-error-openrouter-llm-failed
šŸ”µ 11:43:14:935 Model request failed
🟔 11:43:14:935 Web Call Error: pipeline-error-openrouter-llm-failed. Closing...
```
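The log shows a transient completion timeout from the provider rather than a malformed request, which matches the request appearing on the OpenRouter dashboard. A common client-side mitigation for this class of error is retrying with exponential backoff. Below is a minimal sketch; the `with_retries` helper and the `flaky` stub are hypothetical illustrations (the stub stands in for the real call, which would be an HTTP POST to OpenRouter's OpenAI-compatible chat completions endpoint), not part of Vapi or OpenRouter.

```python
import time

def with_retries(call, attempts=3, backoff=0.5):
    """Retry `call` on timeout errors, doubling the wait between tries."""
    for i in range(attempts):
        try:
            return call()
        except TimeoutError:
            if i == attempts - 1:
                raise  # exhausted all attempts; surface the error
            time.sleep(backoff * (2 ** i))

# Demo: a stub that times out twice, then succeeds on the third try.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("completion timeout")
    return "ok"

print(with_retries(flaky, backoff=0.1))  # succeeds on the third attempt
```

Note that retries only help with intermittent provider timeouts like this one; if every attempt times out, the underlying model or provider issue still needs to be resolved.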
e
Working now.