Felix WIlliam
05/22/2025, 3:03 PM
Vapi Support Bot
05/22/2025, 3:05 PM
`pipeline-error-openai-llm-failed` means there was a failure in the OpenAI LLM processing pipeline, often caused by an invalid API key, model unavailability, or a provider-side problem ([see documentation](https://docs.vapi.ai/calls/call-ended-reason)).
To help you further, could you confirm whether you have recently changed your OpenAI API key or model settings, or made any updates to your assistant configuration?
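One quick way to rule out the first cause (an invalid or revoked key) is to call OpenAI's models endpoint directly with the same key your assistant uses. A minimal sketch: the endpoint and `Authorization` header are standard OpenAI API; the environment-variable name and helper functions are illustrative assumptions, not part of Vapi.

```python
import os
import urllib.error
import urllib.request

# Standard OpenAI API endpoint; a GET here only lists models,
# so it is a cheap way to test whether a key authenticates.
OPENAI_MODELS_URL = "https://api.openai.com/v1/models"


def build_key_check_request(api_key: str) -> urllib.request.Request:
    """Build a GET request for the models endpoint using the given key."""
    return urllib.request.Request(
        OPENAI_MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )


def check_openai_key(api_key: str) -> bool:
    """Return True if the key authenticates, False on HTTP 401."""
    try:
        with urllib.request.urlopen(build_key_check_request(api_key)) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 401:  # invalid or revoked key
            return False
        raise  # other errors (rate limit, outage) are not a key problem


if __name__ == "__main__":
    # OPENAI_API_KEY is an assumed env var holding the key to test.
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        print("key valid:", check_openai_key(key))
```

If this check passes but calls still fail with the same error, the key is likely fine and the model name or a provider-side issue is the better suspect.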
Sources:
- [Call Ended Reasons – Vapi Documentation](https://docs.vapi.ai/calls/call-ended-reason)
- [Assistant Hooks – Vapi Documentation](https://docs.vapi.ai/assistants/assistant-hooks)
Kyle
05/22/2025, 3:12 PM
Felix WIlliam
05/22/2025, 3:15 PM
Mo
05/22/2025, 4:18 PM
Mo
05/22/2025, 4:18 PM
Felix WIlliam
05/22/2025, 5:58 PM
Kyle
05/23/2025, 2:42 AM
Felix WIlliam
05/23/2025, 9:16 AM
Vapi Support Bot
05/23/2025, 9:17 AM
Kyle
05/24/2025, 2:25 AM