Expected response structure from custom LLM
# support
Javokhir
What is Vapi's expected response structure from a custom LLM? The documentation doesn't give any information on this. I tried returning the raw completions response from ChatGPT, but it didn't work.
@Javokhir The expected response structure for a custom LLM integrating with Vapi should be in a format Vapi can parse. While the documentation you're referring to doesn't provide specifics, a common approach is to return JSON with the fields Vapi expects. For instance, if Vapi expects fields like `status`, `message`, or `data`, your custom LLM response should include them. Here's a hypothetical example of what the structure might look like:
```json
{
  "status": "success",
  "message": "Response generated successfully.",
  "data": {
    "responseText": "Your custom LLM's response goes here.",
    "additionalInfo": "Any other relevant information."
  }
}
```
If you're returning a raw completion response from ChatGPT or another LLM, you'll likely need to wrap it in a structure like the one above, making sure the `data` field contains the information Vapi is designed to handle. Since the documentation doesn't clarify the exact structure, it would be best to reach out to Vapi support for the specifications needed to format your custom LLM's response correctly.
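To illustrate the wrapping idea, here is a minimal sketch of a server endpoint that forwards the conversation to OpenAI and repackages the raw completion into the hypothetical envelope above. The `/chat/completions` route, the model name, and all field names are assumptions for illustration only, not a format confirmed by Vapi's documentation:

```python
# Hypothetical sketch: wrap a raw OpenAI chat completion in the
# illustrative status/message/data structure shown above. The route,
# model, and field names are assumptions, not confirmed Vapi specs.
from flask import Flask, jsonify, request
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.post("/chat/completions")
def custom_llm():
    body = request.get_json()
    # Forward the incoming messages to the upstream model.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=body.get("messages", []),
    )
    # Repackage the raw completion text in the hypothetical envelope.
    return jsonify({
        "status": "success",
        "message": "Response generated successfully.",
        "data": {
            "responseText": completion.choices[0].message.content,
            "additionalInfo": completion.model,
        },
    })

if __name__ == "__main__":
    app.run(port=8000)
```

Whatever the exact schema turns out to be, the pattern is the same: terminate the raw LLM call inside your own endpoint and emit only the shape the caller expects.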
You can continue this conversation by mentioning me in the message.
Got it. Tagging @User for additional help.