# VAPI - Custom LLM - Streaming
Category: support
**techlead_05823:**
Hello, when using a custom LLM server, is it required to have streaming enabled? I see that Vapi is sending `stream: true` to my custom LLM server, and due to specific requirements I'm not able to support streaming right now. My server returns an OpenAI-compatible response, but Vapi is not able to read it (speak it). Can you please advise on this?
**Vapi AI assistant:**
Hi @techlead_05823, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly.
### Answer
For your custom LLM server to work with Vapi, streaming support is required when Vapi sends requests with `stream: true`. Vapi is designed around real-time voice interactions and expects streaming responses from custom LLM servers for low-latency processing; if your server returns a single non-streaming response, that would explain why Vapi cannot read the replies [(1)] [(2)].

Modifying your server to enable streaming is beyond the scope of the documentation, but the general advice is to check whether the web framework or server technology you are using supports streamed (chunked/SSE) responses. Most modern frameworks do, and you can use that capability to meet Vapi's requirement.
**techlead_05823:**
I see, thanks for the clarification. Just to confirm: in the Vapi UI, when choosing Groq and Llama 3, I noticed that function calls are not being made. Should I take it that you don't support function calling with this combination (Groq + Llama 3) yet?
**Vapi AI assistant:**
Glad I could be helpful. Feel free to create a new thread with any new questions you may have.