response format for custom llm server
# support
DaBomb
03/25/2024, 10:08 PM
Hello! I've got a working server endpoint that correctly receives messages from Vapi, passes them to my custom LLM, and generates a response. However, I don't know what format to use when sending the response back to Vapi. Can I get some help with this?
nikhil
03/26/2024, 2:18 AM
Yes, check the example implementation here:
https://github.com/VapiAI/server-side-example-serverless-vercel/blob/master/api/custom-llm/openai-sse.ts
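The linked example streams the reply back in the OpenAI chat-completions format over Server-Sent Events (SSE). As a rough sketch of that shape (the field names follow the OpenAI `chat.completion.chunk` schema; the helper name and handler wiring here are illustrative, not part of the linked example):

```typescript
// Sketch of the OpenAI-style SSE chunk format the linked example emits.
// Each chunk is a "chat.completion.chunk" JSON object on a "data: " line,
// and the stream is terminated with "data: [DONE]".

interface ChunkDelta {
  role?: string;
  content?: string;
}

// Illustrative helper: serialize one streaming chunk as an SSE event.
function formatSseChunk(
  id: string,
  model: string,
  delta: ChunkDelta,
  finishReason: string | null = null,
): string {
  const chunk = {
    id,
    object: "chat.completion.chunk",
    created: Math.floor(Date.now() / 1000),
    model,
    choices: [{ index: 0, delta, finish_reason: finishReason }],
  };
  return `data: ${JSON.stringify(chunk)}\n\n`;
}

// End-of-stream sentinel expected by SSE clients of this format.
const DONE_EVENT = "data: [DONE]\n\n";

// Example: stream two tokens, then close the stream.
const events = [
  formatSseChunk("chatcmpl-demo", "my-custom-llm", { role: "assistant", content: "Hello" }),
  formatSseChunk("chatcmpl-demo", "my-custom-llm", { content: " there" }),
  formatSseChunk("chatcmpl-demo", "my-custom-llm", {}, "stop"),
  DONE_EVENT,
];
console.log(events.join(""));
```

In an actual HTTP handler you would also set `Content-Type: text/event-stream` on the response and write each event as your LLM produces tokens; see the linked Vercel example for the full wiring.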