Maddy
05/17/2024, 9:28 PM

Vapi Support
05/17/2024, 9:28 PM
The error `pipeline-error-custom-llm-llm-failed` typically indicates a failure in the custom Large Language Model (LLM) integration in your Vapi setup. It can have several causes: configuration issues, connectivity problems with the LLM provider, or changes on the provider's side that affect the integration even if nothing was changed on yours.
To resolve this issue, you might need to verify several aspects:
1. **Model Configuration**: Ensure that the model's URL and any required authentication tokens or keys are correctly configured (a quick endpoint probe is sketched just after this list).
2. **Provider Status**: Check if the LLM provider is experiencing any outages or updates that might affect their service.
3. **Network Issues**: Confirm that there are no network issues that could be preventing your Vapi instance from communicating with the LLM provider.
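As a quick way to cover points 1 and 3 outside of Vapi, the sketch below calls the configured endpoint directly. It assumes the custom-llm integration points at an OpenAI-compatible `/chat/completions` route; the URL, model name, and `CUSTOM_LLM_KEY` environment variable are placeholders, not values from this thread.

```typescript
// Minimal sketch: probe a custom LLM endpoint roughly the way an
// OpenAI-compatible /chat/completions client would. The URL, model
// name, and key are placeholders -- substitute the values from your
// assistant's model configuration. Requires Node 18+ for global fetch.
const CUSTOM_LLM_URL = "https://your-llm.example.com/v1/chat/completions"; // placeholder
const CUSTOM_LLM_KEY = process.env.CUSTOM_LLM_KEY ?? "";

async function probeCustomLlm(): Promise<void> {
  const res = await fetch(CUSTOM_LLM_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${CUSTOM_LLM_KEY}`,
    },
    body: JSON.stringify({
      model: "your-model-name", // placeholder: whatever model id your server exposes
      messages: [{ role: "user", content: "ping" }],
    }),
  });

  // A non-2xx status here (401/403 auth, 404 wrong path, 5xx provider outage)
  // is the kind of upstream failure that tends to surface in Vapi as
  // pipeline-error-custom-llm-llm-failed.
  console.log("status:", res.status);
  console.log("body:", await res.text());
}

probeCustomLlm().catch((err) => {
  // Network-level failures (DNS, TLS, timeouts) surface here.
  console.error("request failed:", err);
});
```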
Since this seems to be an urgent and potentially complex issue, reviewing the detailed logs and configuration directly would be the fastest path. If the problem persists, contacting Vapi support with the specific error details and the call IDs you provided should expedite the resolution. Unfortunately, the documents provided do not contain specific troubleshooting steps for this error (1).
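Once the endpoint itself responds, it can also help to pull the failing call's record and check what was logged for it before escalating. This is a minimal sketch assuming a `GET https://api.vapi.ai/call/{id}` endpoint with a Bearer API key; the `endedReason` field and the `VAPI_API_KEY` environment variable are assumptions, so inspect the raw JSON if your payload differs.

```typescript
// Sketch: fetch a call record by ID so the error details can be shared
// with support. Endpoint and field names are assumptions; adjust to match
// your account's API reference. Requires Node 18+ for global fetch.
const VAPI_API_KEY = process.env.VAPI_API_KEY ?? "";

async function getCall(callId: string): Promise<void> {
  const res = await fetch(`https://api.vapi.ai/call/${callId}`, {
    headers: { Authorization: `Bearer ${VAPI_API_KEY}` },
  });
  if (!res.ok) {
    throw new Error(`Vapi API returned ${res.status}`);
  }
  const call = await res.json();
  // endedReason is an assumed field name; fall back to reading the full record.
  console.log("endedReason:", call.endedReason);
  console.log("full record:", JSON.stringify(call, null, 2));
}

// Usage: pass one of the failing call IDs mentioned in the thread.
getCall("YOUR_CALL_ID").catch(console.error);
```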

Sahil
05/17/2024, 9:31 PM

Sahil
05/17/2024, 9:31 PM

Maddy
05/17/2024, 9:33 PM

Maddy
05/17/2024, 9:36 PM

Sahil
05/17/2024, 9:40 PM

Maddy
05/17/2024, 9:40 PM

Sahil
05/17/2024, 9:41 PM

Maddy
05/17/2024, 9:43 PM

Maddy
05/17/2024, 9:44 PM

Sahil
05/17/2024, 10:07 PM

Maddy
05/18/2024, 1:35 AM