Prevent users from interrupting the assistant.
# support
l
I am currently using a VAPI assistant via a custom LLM integration. The problem I am facing is the following:
1) A user interacts with the assistant. VAPI makes a POST request to my custom LLM endpoint, and my endpoint starts processing the request.
2) The user interrupts the assistant before the first POST request is done. VAPI makes a new POST request to my custom LLM endpoint. This time the user waits until the LLM is done and an answer is sent back.
Is there a way I can prevent the second POST from happening until the first POST is completed? I can see an `interruptionsEnabled` field exists on the assistant. I tried to set it, but nothing changes. https://github.com/VapiAI/example-server-python-flask/blob/64fb07c880bec2844a26361eaad6361fb5decdce/app/types/vapi.py#L61
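As a workaround on the server side, overlapping POSTs for the same call can be serialized in the custom LLM endpoint itself. The sketch below is a minimal, hypothetical per-call lock registry (the function names and the idea of keying on a call ID are my assumptions, not part of VAPI's API); a Flask handler could acquire the lock before processing and release it when done, so a second request either waits or is rejected while the first is in flight.

```python
import threading

# Hypothetical guard: one lock per call ID, so that overlapping
# completion requests for the same call are serialized.
_call_locks: dict[str, threading.Lock] = {}
_registry_lock = threading.Lock()  # protects the registry itself


def acquire_call_lock(call_id: str, block: bool = True) -> bool:
    """Acquire the lock for this call. Returns True on success.

    With block=False, returns False immediately if another request
    for the same call is still being processed.
    """
    with _registry_lock:
        lock = _call_locks.setdefault(call_id, threading.Lock())
    return lock.acquire(blocking=block)


def release_call_lock(call_id: str) -> None:
    """Release the lock for this call (call after processing finishes)."""
    with _registry_lock:
        lock = _call_locks.get(call_id)
    if lock is not None:
        lock.release()
```

In the endpoint, `acquire_call_lock(call_id, block=False)` returning False would signal that a first request is still running, and the handler could choose to wait, drop, or cancel accordingly. This only controls the server's behavior; it does not stop VAPI from issuing the second POST.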
k
Hey, I was out of office for a couple of days, so couldn't reply to you earlier. Checking if this is resolved for you?
j
It seems like the new way of doing this is with the `stopSpeakingPlan`? I don't see an `interruptionsEnabled` setting in the docs. https://docs.vapi.ai/api-reference/calls/get#response.body.assistant.stopSpeakingPlan.numWords
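Based on the linked API reference, a sketch of what such an assistant update might look like: raising `numWords` (the one field confirmed by the URL above; any other tuning fields are assumptions against the same API reference) should require the user to speak more words before the assistant stops talking, which suppresses brief interjections.

```python
# Hypothetical payload fragment for an assistant update: require the
# user to speak at least 5 words before the assistant stops speaking.
# Only stopSpeakingPlan.numWords is confirmed by the linked docs.
assistant_update = {
    "stopSpeakingPlan": {
        "numWords": 5,
    },
}
```

This shapes when the assistant yields to barge-in; whether it also prevents the second POST to the custom LLM endpoint would need to be verified against VAPI's behavior.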
k
This ticket has been marked as solved, and as a result, you will no longer receive any further responses. Kindly create a new support ticket to escalate your inquiry. 