Hi, I implemented a custom LLM backend using LangGraph.
I'm supplying content back in the response for Vapi to consume
I can see in my logs that the content is filled with some sort of response,
but Vapi doesn't seem to pick it up, so it's just silence after the first intro message.
Can someone help me? Literally, this is my last blocker before getting an end-to-end solution in place!
Thanks
Call ID 25de6020-2b3d-4278-96ca-6fa32606e434
Kings_big💫
05/18/2025, 8:11 AM
Make sure you're returning properly structured JSON (with toolCallId and result) for tool calls, or streaming SSE responses with the right headers and format (Content-Type: text/event-stream, each message prefixed with data:); otherwise Vapi will stay silent after the intro.
Your Vapi setup points to a custom LLM endpoint, so to avoid silence after the intro, your /chat/completions route must stream responses using Server-Sent Events (SSE) with the proper headers and a data:-prefixed format ending in [DONE].
https://docs.vapi.ai/customization/custom-llm/using-your-server
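For anyone else hitting this, here's a minimal sketch of what such an SSE endpoint can look like. FastAPI is an assumption (any framework that can stream text/event-stream works), and fake_token_stream is a hypothetical placeholder for the real LangGraph invocation; the chunk shape follows the OpenAI chat.completion.chunk streaming format that custom-LLM endpoints mimic.
```python
# Minimal sketch of an SSE /chat/completions endpoint for a Vapi custom LLM.
# Assumptions: FastAPI as the framework; fake_token_stream stands in for
# your actual LangGraph backend.
import json
import time
import uuid

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()


def fake_token_stream(messages):
    # Placeholder: replace with your LangGraph graph streaming text chunks.
    for token in ["Hello", " from", " the", " custom", " LLM."]:
        yield token


@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()

    def event_stream():
        completion_id = f"chatcmpl-{uuid.uuid4()}"
        for token in fake_token_stream(body.get("messages", [])):
            chunk = {
                "id": completion_id,
                "object": "chat.completion.chunk",
                "created": int(time.time()),
                "model": body.get("model", "custom-llm"),
                "choices": [
                    {"index": 0, "delta": {"content": token}, "finish_reason": None}
                ],
            }
            # Each SSE message is prefixed with "data: " and followed by a blank line.
            yield f"data: {json.dumps(chunk)}\n\n"
        # Final sentinel so Vapi knows the stream is complete.
        yield "data: [DONE]\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```
The two details that matter most are the data: prefix with a blank line after each event and the final data: [DONE] sentinel; without those, Vapi can't parse the stream and stays silent.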
MK
05/18/2025, 9:09 AM
ok, that worked!!!!
Please update your docs and the YouTube video to explicitly state that for a custom LLM you need to use SSE endpoints.