Response format for custom LLM server
# support
Hello! I've got a working server endpoint that correctly receives messages from Vapi, passes them to my custom LLM, and generates a response. However, I don't know what format to use when sending the response back to Vapi. Can I get some help with this?
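For reference, here's a minimal sketch of what my handler does right now (function and field names are illustrative, and `call_my_llm` is a stand-in for my actual model call; I'm assuming the incoming `messages` list is the OpenAI-style chat shape):

```python
def call_my_llm(messages: list[dict]) -> str:
    """Stand-in for my custom LLM call (illustrative)."""
    return "Hello from my custom LLM"

def handle_vapi_request(body: dict) -> dict:
    """Receives the chat payload from Vapi and generates a reply.

    The return value below is a placeholder -- this is exactly the
    part where I don't know what structure Vapi expects back.
    """
    messages = body.get("messages", [])
    reply_text = call_my_llm(messages)
    # ??? what should this response object look like for Vapi?
    return {"content": reply_text}  # placeholder shape, not confirmed
```

So essentially: what should that returned object look like (and does it need to be streamed)?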