Harter
01/14/2025, 6:32 PM

Vapi
01/14/2025, 6:33 PM

Vapi
01/14/2025, 6:33 PM

Vapi
01/14/2025, 6:33 PM

Harter
01/14/2025, 6:38 PM

Harter
01/14/2025, 6:38 PM

Vapi
01/14/2025, 6:38 PM
To set openrouter as a provider, we need to ensure that the PATCH request is correctly formatted with all the necessary parameters.
Could you please confirm whether you included the Authorization header with your Bearer token and the correct payload specifying openrouter as the provider in your PATCH request? If you've already done this and are still encountering the error, it might be an issue on the server side or with the openrouter provider integration.
If you have the exact payload you used for the PATCH request, please share it so I can assist you further. If not, I can guide you on how to structure your PATCH request correctly.
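For reference, below is a minimal sketch of the kind of PATCH request described above. The https://api.vapi.ai/assistant/{id} endpoint path, the assistant ID, and the example model identifier are assumptions for illustration and are not confirmed in this thread.

```typescript
// Minimal sketch of a PATCH request setting openrouter as the model provider.
// Assumptions (not confirmed in this thread): the /assistant/{id} endpoint path,
// the placeholder assistant ID, and the example model identifier.
const VAPI_API_KEY = process.env.VAPI_API_KEY!; // Bearer token for the Authorization header
const ASSISTANT_ID = "your-assistant-id";       // hypothetical placeholder

async function setOpenRouterProvider(): Promise<void> {
  const response = await fetch(`https://api.vapi.ai/assistant/${ASSISTANT_ID}`, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${VAPI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: {
        provider: "openrouter",
        model: "meta-llama/llama-3.1-70b-instruct", // example identifier, swap in your model
      },
    }),
  });

  if (!response.ok) {
    // Surface the error body; this is where a malformed payload or bad token shows up.
    throw new Error(`PATCH failed: ${response.status} ${await response.text()}`);
  }
  console.log("Assistant updated:", await response.json());
}

setOpenRouterProvider().catch(console.error);
```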
Harter
01/14/2025, 6:40 PM

Vapi
01/14/2025, 6:40 PM
The pipeline-error-openrouter-llm-failed error suggests that the issue may be specific to the openrouter model you're trying to use, or a compatibility issue between the model and the VAPI platform.
Since the error is specific to the openrouter provider, I recommend the following steps:
1. Double-check the model identifier for openrouter to ensure it's correctly specified (a quick way to verify this is sketched after this message).
2. Review any provider-specific configuration or limitations that might be causing the error.
3. Check the VAPI documentation or contact support to see if there are any known issues with the openrouter provider.
If you've already verified the model identifier and there are no configuration issues, it might be necessary to reach out to VAPI support for further assistance. They can provide more detailed insights into the error and help resolve any compatibility issues.
Would you like me to assist you with contacting support or provide you with guidance on how to reach out to them?
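For step 1 above, below is a minimal sketch of how the configured model could be retrieved and checked against OpenRouter's published model list. The GET /assistant/{id} path and the assistant ID placeholder are assumptions for illustration, not confirmed in this thread.

```typescript
// Minimal sketch: fetch the current assistant config and print the model provider
// and identifier so they can be checked against OpenRouter's model list.
// Assumptions (not confirmed in this thread): the GET /assistant/{id} path and the placeholder ID.
const VAPI_API_KEY = process.env.VAPI_API_KEY!;
const ASSISTANT_ID = "your-assistant-id"; // hypothetical placeholder

async function checkModelConfig(): Promise<void> {
  const response = await fetch(`https://api.vapi.ai/assistant/${ASSISTANT_ID}`, {
    headers: { Authorization: `Bearer ${VAPI_API_KEY}` },
  });
  if (!response.ok) {
    throw new Error(`GET failed: ${response.status} ${await response.text()}`);
  }
  const assistant = await response.json();
  // A misspelled identifier (OpenRouter expects the exact "vendor/model-name" form)
  // typically surfaces later as pipeline-error-openrouter-llm-failed.
  console.log("provider:", assistant.model?.provider);
  console.log("model:", assistant.model?.model);
}

checkModelConfig().catch(console.error);
```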
Harter
01/14/2025, 6:43 PM

Vapi
01/14/2025, 6:43 PM