Expected response structure from custom LLM
# support
j
What is Vapi's expected response structure from a custom LLM? The documentation doesn't give any information on this. I tried returning the raw completions response from ChatGPT, but it didn't work.
v
@Javokhir I'm sorry to hear you're having trouble with the response structure for your custom LLM. When integrating with Vapi, your custom LLM should return JSON that matches the schema Vapi expects to parse: typically a status code, a message or data field containing the result, and any other required metadata. Since the documentation you're referring to doesn't spell out the exact structure, please reach out to the support team for the precise specification so you can format your LLM's responses correctly.
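For context, a common approach (and what makes "raw completions response from ChatGPT" almost work) is to return OpenAI-compatible chat-completion chunks framed as server-sent events. The field names below follow the public OpenAI streaming schema; treat this as a sketch of that shape, not an official Vapi specification — confirm the details against the guide linked below.

```python
import json
import time


def completion_chunk(text, model="custom-model", finished=False):
    """Build one OpenAI-style streaming chunk (the shape an
    OpenAI-compatible client is able to parse)."""
    return {
        "id": "chatcmpl-1",  # any unique id works for illustration
        "object": "chat.completion.chunk",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            # the final chunk carries an empty delta and a finish_reason
            "delta": {} if finished else {"content": text},
            "finish_reason": "stop" if finished else None,
        }],
    }


def sse_stream(texts):
    """Frame chunks as server-sent events, terminated by [DONE]."""
    for t in texts:
        yield f"data: {json.dumps(completion_chunk(t))}\n\n"
    yield f"data: {json.dumps(completion_chunk('', finished=True))}\n\n"
    yield "data: [DONE]\n\n"
```

Each yielded string is one SSE event; your HTTP handler would write them to the response body with `Content-Type: text/event-stream`.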
You can continue this conversation by mentioning me in the message.
s
@Javokhir checkout this guide on using custom llm it will help you with request payload https://www.notion.so/vapi/Custom-LLM-Server-Guide-14d008c8393f807f86dbf61dcb0767ee
Do let me know if you require further help!!
s
I am facing the same issue
@Shubham Bajaj what is the expected output that we need to return?
Do you need to send back as chunks?
s
@slickVik go through this guide, it will help you: https://www.notion.so/vapi/Custom-LLM-Server-Guide-14d008c8393f807f86dbf61dcb0767ee In case you hit blockers after going through this, do let me know!!
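On the chunks question: the guide describes a streaming (chunked) response, but the non-streaming response shape is easier to show end to end. The sketch below is a stdlib-only echo server, assuming an OpenAI-compatible `/chat/completions` request/response schema (the route name and payload fields are assumptions to verify against the guide); a real server would call its LLM where the comment indicates and would usually stream chunks instead.

```python
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer


class CompletionsHandler(BaseHTTPRequestHandler):
    """Minimal OpenAI-compatible chat-completions endpoint (echo model)."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # Echo the last user message back; a real server would
        # call its LLM here (streaming chunks in the usual case).
        last = payload.get("messages", [{}])[-1].get("content", "")
        body = json.dumps({
            "id": "chatcmpl-1",
            "object": "chat.completion",
            "created": int(time.time()),
            "model": payload.get("model", "custom-model"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant",
                            "content": f"You said: {last}"},
                "finish_reason": "stop",
            }],
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging


def serve(port=8000):
    """Run the sketch server on localhost (blocks forever)."""
    HTTPServer(("127.0.0.1", port), CompletionsHandler).serve_forever()
```

You would point Vapi's custom LLM URL at this endpoint; the request body it receives carries the conversation in a `messages` array, mirroring the OpenAI schema.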
j
Thank you so much, this is really helpful and solved my problem, appreciated!