techlead_05823
04/29/2024, 9:54 AM

Vapi Support
04/29/2024, 9:54 AM
Your custom LLM server needs to return its responses as a stream, i.e. with stream:true. This is because Vapi is designed to handle streaming data for real-time voice AI interactions, which requires streaming support from the LLM server to work correctly. If your server is not configured for streaming, that would explain why Vapi cannot process its responses.
Vapi expects streaming responses from custom LLM servers to enable real-time processing and low-latency voice interactions, so adapting your server to support streaming responses is necessary for a successful integration.
While step-by-step guidance on enabling streaming on your server is beyond the scope of the documentation, the general advice is to check whether the server-side technologies you're using support streaming. Most web frameworks and server stacks offer streaming capabilities you can leverage to meet Vapi's requirement.
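For illustration, here is a minimal sketch of such an endpoint, assuming Vapi is pointed at an OpenAI-compatible /chat/completions route that streams server-sent events; the route path, chunk fields, model name, and placeholder reply are assumptions for the example, not taken from the Vapi docs:

```python
# Minimal sketch of a streaming chat-completions endpoint (FastAPI).
# Assumes an OpenAI-compatible wire format; all names here are illustrative.
import json
import time

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()

def sse_chunks(text: str):
    """Yield the reply word by word as OpenAI-style SSE chunks."""
    for word in text.split():
        chunk = {
            "id": "chatcmpl-demo",        # illustrative id
            "object": "chat.completion.chunk",
            "created": int(time.time()),
            "model": "my-custom-llm",     # illustrative model name
            "choices": [{
                "index": 0,
                "delta": {"content": word + " "},
                "finish_reason": None,
            }],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    yield "data: [DONE]\n\n"  # end-of-stream marker

@app.post("/chat/completions")
async def chat_completions(request: Request):
    payload = await request.json()  # incoming messages, stream flag, etc.
    reply = "Hello from the custom LLM server."  # placeholder instead of a real model call
    return StreamingResponse(sse_chunks(reply), media_type="text/event-stream")
```

Run it with, for example, uvicorn server:app and point Vapi's custom LLM URL at the route; the key point is the text/event-stream response emitted chunk by chunk rather than a single JSON body.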
techlead_05823
04/29/2024, 9:59 AM

Vapi Support
04/29/2024, 10:07 AM