Phone call latency is 4x what it says at the Assistan...
# support
l
I went for the lowest-latency options possible for the model, transcriber, and voice. Twilio adds 700ms regardless. Total latency was 1275ms. Yet when I test the phone calls, it takes 5-6s for the agent to respond. That includes when I talk over it and it should presumably stop speaking and listen to me. I don't see much way to reduce the latency. Perhaps there is something in the workflow to help me out. I do suspect the reason is that the phone calls take place in **Brazil** and the servers are in the USA.
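For what it's worth, here is a quick tally of the gap between the dashboard estimate and what callers actually experience, using only the figures quoted above (the 5-6s observed range is from my own tests; the dashboard estimate already includes Twilio's ~700ms):

```python
# Latency budget vs. observed, using the figures quoted in the message above.
dashboard_estimate_ms = 1275        # estimate shown for model + transcriber + voice + Twilio
observed_range_ms = (5000, 6000)    # what the caller actually experiences on real calls

for observed in observed_range_ms:
    unaccounted = observed - dashboard_estimate_ms
    ratio = observed / dashboard_estimate_ms
    print(f"observed {observed}ms -> {unaccounted}ms unaccounted ({ratio:.1f}x the estimate)")
```

Even the low end of the observed range leaves well over 3 seconds unexplained by the configured pipeline, which is why cross-region round trips (Brazil to US servers) seem like a plausible suspect.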
m
I have the same complaint. For example, on this call: 9c662e0f-fbc6-4033-a683-8edf91255656

It takes 4s for the AI to respond to the caller. It seems to be because it's pulling the KB right after the caller speaks; the LLM then takes 3s to do I-don't-know-what, since the KB is supposed to provide the answer if there is one. Then the AI asks where the caller is from, just as instructed in the prompt, and then it takes another 10s to look through things. This back and forth makes no sense. Why pull the KB as soon as the user speaks instead of just going by its prompt, which is to gather the caller's location and THEN look up their info? I do see the KB logs in Trieve showing data getting fetched for every word that's spoken; seems like a settings issue, maybe?

Beyond that, you'll see that after the caller says Los Angeles the KB is getting pulled, OK, but then you see this:

```
17:23:10:287 [LOG] Model request started (attempt #1, claude-3-7-sonnet-20250219, anthropic)
17:23:16:962 [CHECKPOINT] Model sent start token
```

6 seconds for the model to send its start token? Why? So the question is: how do we improve latency?
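To be precise about that gap, here is how I'm reading the two timestamps from the log above (assuming the log format is `HH:MM:SS:mmm`, which is how Vapi's call logs appear to render times):

```python
from datetime import datetime

# Timestamps copied verbatim from the call log quoted above.
request_started = "17:23:10:287"   # [LOG] Model request started
first_token     = "17:23:16:962"   # [CHECKPOINT] Model sent start token

# Assumed log timestamp format: hours:minutes:seconds:milliseconds.
fmt = "%H:%M:%S:%f"
delta = datetime.strptime(first_token, fmt) - datetime.strptime(request_started, fmt)
print(f"time to first token: {delta.total_seconds():.3f}s")
```

That works out to roughly 6.7s just waiting for the model's first token, before any transcription, KB lookup, or voice synthesis is even counted.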
@Shubham Bajaj @Mason
m
Hey man, I'll get with Shubham and we'll take a look at the call logs in depth when he gets online. I'll make sure to keep you in the loop on the cause & fix.
l
@Mason Please do so. Current latency is 5-6s; we have to get this down to 3s.
m
@Mason Since I think the issue is related, here's another one for you: https://discord.com/channels/1211482211119796234/1349430102563487796 sorry
k
Hey guys, after taking a deeper dive into these calls, it appears the LLM provider took much longer than usual to respond. Please give it another test today and let me know if you're still experiencing high latency.