422 Unprocessable Entity on Custom LLM
# support
d
1. Cloned this repo: https://github.com/VapiAI/advanced-concepts-custom-llm and set up ngrok tunneling.
2. Pointed the Vapi console to the ngrok HTTP tunnel endpoint after configuring the OpenAI keys.
Now receiving HTTP 422 on the above API:
INFO: "POST /chat/completions HTTP/1.1" 422 Unprocessable Entity
v
You can continue this conversation by mentioning me in the message.
@DankKnight It seems like you're encountering an HTTP 422 error, which typically means the server understands the content type and syntax of the request, but was unable to process the contained instructions. This can be caused by invalid parameters or missing fields in the request body. To assist you better, I would need to see the request you're sending to the `/chat/completions` endpoint. Could you please provide the payload of the request? That will help me identify any missing or incorrect parameters causing the issue. In the meantime, please ensure that: 1. Your ngrok tunnel is running and the endpoint is correctly set in the Vapi console. 2. The OpenAI API keys are correctly configured and have the necessary permissions. 3. The request body sent to the `/chat/completions` endpoint is formatted as the API expects. Once you provide the request payload, I can help you troubleshoot further. If you're unsure about the payload, please ask support for more info on this.
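To see which field is tripping validation, one option is to send a minimal OpenAI-style payload directly to the local server before involving Vapi. This is a hedged sketch: the URL and port are assumptions (uvicorn's default is 8000; your ngrok URL will differ), and a FastAPI 422 response body names the exact fields that failed validation.

```python
import json
import urllib.error
import urllib.request

# Minimal OpenAI-style chat-completions payload (field names per the
# OpenAI API; the custom server may require more or fewer fields).
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "hello"}],
}

req = urllib.request.Request(
    "http://localhost:8000/chat/completions",  # assumed local address
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(resp.status, resp.read().decode())
except urllib.error.HTTPError as e:
    # On a 422, FastAPI returns a JSON body listing each failing field.
    print(e.code, e.read().decode())
except urllib.error.URLError as e:
    print("server not reachable:", e.reason)
```

Comparing the 422 detail from this call against the payload Vapi actually sends usually points straight at the missing or extra field.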
d
@User could you help with this when you get time? Thanks!
Found the issue: I think `tools` should be optional in https://github.com/VapiAI/advanced-concepts-custom-llm. Could you patch the repo (app/types/vapi.py)? Thanks!
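The suspected fix could be sketched as follows. This is a hypothetical reconstruction, not the repo's actual model: the class and field names are assumptions, and app/types/vapi.py may look different. The point is that in a Pydantic model backing a FastAPI route, a field without a default is required, so any payload omitting `tools` triggers a 422; giving it an `Optional` type with a `None` default makes the field truly optional.

```python
from typing import Any, Optional

from pydantic import BaseModel


class ChatCompletionRequest(BaseModel):
    """Hypothetical sketch of the request model in app/types/vapi.py."""

    model: str
    messages: list[dict[str, Any]]
    # With Optional[...] = None, Pydantic no longer rejects payloads
    # that omit "tools", so FastAPI stops returning 422 for them.
    tools: Optional[list[dict[str, Any]]] = None


# A payload without "tools" now validates instead of raising a 422.
req = ChatCompletionRequest(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hi"}],
)
print(req.tools)  # None
```

Vapi only includes `tools` when the assistant actually has tools configured, which would explain why a model that requires the field breaks on plain assistants.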
s
Hey @DankKnight, can you please create an issue on GitHub and list the steps to reproduce it? That would be really helpful.