Tool result return not being spoken to the caller.
# support
l
AI asks if you are still there after triggering a tool instead of returning the result. Async is not enabled.
v
Hi there! I understand you're having issues with tool results not being spoken to the caller and the AI asking "Are you still there?" after triggering a tool. Let me explain what's happening 🔍

**Why tool results aren't being spoken:** When a synchronous tool call is made and no result is returned, the server treats this as a failure. The server sends a `tool_call_result` message with "No result returned" to the LLM, which doesn't provide verbal feedback to the user about the tool call outcome. [Source](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1342336047589359618)

**Why the AI asks "Are you still there?":** This message is triggered when the tool call is still pending and there's been no new input. With synchronous tool calls, the AI expects to receive a response before proceeding. If no result is returned, it assumes a failure, which prompts additional user interaction. [Source](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1341585041624666122)

**Possible solutions:**
1. Ensure your tool responses are correctly formatted as a single string without breaks.
2. Check that you're returning a properly structured JSON response:
```json
{
  "results": [
    {
      "toolCallId": "your_tool_call_id_here",
      "result": "Your actual result here."
    }
  ]
}
```
3. Verify that your tool is actually returning results and not failing silently.

Let me know if you need more specific help with your implementation!

Sources:
- [Discord Thread about Tool Call Behavior](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1342336047589359618)
- [Discord Thread about AI Prompting](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1341585041624666122)
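To make point 2 concrete, here is a minimal sketch of a helper that builds the response body a tool webhook should return. This is an illustration only; the function name and the example `toolCallId`/result strings are placeholders, not from the thread.

```python
import json

def build_tool_result(tool_call_id: str, result: str) -> str:
    """Build the JSON body a tool webhook returns so the assistant
    can speak the result to the caller.

    The result must be a plain string; returning nothing (or an
    empty body) is treated by the server as a failed tool call.
    """
    payload = {
        "results": [
            {
                "toolCallId": tool_call_id,
                "result": result,
            }
        ]
    }
    return json.dumps(payload)

# Example: respond to a tool call with a booking confirmation.
body = build_tool_result("call_abc123",
                         "Great, your appointment has been booked.")
```

Returning this body with a 200 status (and `Content-Type: application/json`) is the shape the bot's example above describes.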
n
Hello Support, @User. I have a similar issue. I have a synchronous tool specified. The assistant does not use the response from the tool directly, as it is supposed to. Here are the messages from the Call Log:

```json
{
  "role": "tool_calls",
  "time": 1741903870655,
  "message": "",
  "toolCalls": [
    {
      "id": "call_rItd5hs0FXHxe8iugYBKLKGa",
      "type": "function",
      "function": {
        "name": "FindPartnerfromInboundCall",
        "arguments": "{}"
      }
    }
  ],
  "secondsFromStart": 16.559
}
{
  "role": "bot",
  "time": 1741903870966,
  "source": "",
  "endTime": 1741903874706,
  "message": "Ich prüfe kurz die Daten Okay, ich habe sie gefunden.",
  "duration": 3260,
  "secondsFromStart": 17.64
}
{
  "name": "FindPartnerfromInboundCall",
  "role": "tool_call_result",
  "time": 1741903874204,
  "result": "Nikolai Raitsev",
  "toolCallId": "call_rItd5hs0FXHxe8iugYBKLKGa",
  "secondsFromStart": 20.108
}
{
  "role": "user",
  "time": 1741903880685.999,
  "endTime": 1741903880876,
  "message": "Und",
  "duration": 190.0009765625,
  "secondsFromStart": 27.359999
}
{
  "role": "bot",
  "time": 1741903884406,
  "source": "",
  "endTime": 1741903886226,
  "message": "Hallo Nikolai, wie kann ich Ihnen helfen?",
  "duration": 1820,
  "secondsFromStart": 31.08
}
```

(The German bot messages translate to "I'm just checking the data. Okay, I've found it." and "Hello Nikolai, how can I help you?"; the user's "Und" means "And".)

Even after the assistant receives the result and says "Okay, I found it" (the message specified on the tool), it just stays silent. Only once I ask it "what's up" or say anything does it process as expected and use the result from the tool. It is annoying. It was working as expected before.
l
Yes, exactly. I found this happening with tool calls taking longer than 7 seconds to return a result. I have now put a delay message on the tool so that it's a better experience. It's still not as good as it was, though.
I have checked all these things. My JSON response structure is correct. The AI isn't giving the response back to the caller. It seems to ignore it until the user specifically asks for the result.
k
Hello Lydo, please remove your request-complete message and allow the LLM to generate its own response with the information.
l
Hi Mason, I don't have a request-complete message configured. @User
```json
{
  "results": [
    {
      "tool_call_id": "call_TdX28cKEH5HGL0gHPWI4VhPZ",
      "result": "Great, your appointment has been booked. "
    }
  ]
}
```
Transcript:
AI: Perfect, Leiden. I'll now book your appointment for today at 5:30 PM, Brisbane time. Please hold on a moment while I finalize this.
AI: 1 moment while I check availability for that time slot.
AI: Are you still there?
User: Yes.
AI: Could you clarify what you mean by yes? Are you confirming something specific?
User: Did you book the appointment?
AI: 1 moment while I check availability for that time slot.
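One thing worth noting about the payload above: it uses snake_case `tool_call_id`, while the support bot's earlier example uses camelCase `toolCallId`. Assuming the camelCase key from that example is the expected one, a small sanity check like the following (a hypothetical helper, not part of any Vapi SDK) would flag the mismatch:

```python
def validate_tool_response(payload: dict) -> list[str]:
    """Return a list of problems found in a tool-result payload.

    Assumes the camelCase 'toolCallId' key shown in the support
    bot's example; an empty list means the payload looks OK.
    """
    problems = []
    results = payload.get("results")
    if not isinstance(results, list) or not results:
        problems.append("missing non-empty 'results' list")
        return problems
    for i, entry in enumerate(results):
        if "toolCallId" not in entry:
            problems.append(f"results[{i}]: missing 'toolCallId'")
        if not isinstance(entry.get("result"), str) or not entry.get("result"):
            problems.append(f"results[{i}]: 'result' must be a non-empty string")
    return problems

# Check the payload posted above (snake_case key gets flagged):
issues = validate_tool_response({
    "results": [
        {"tool_call_id": "call_TdX28cKEH5HGL0gHPWI4VhPZ",
         "result": "Great, your appointment has been booked. "}
    ]
})
```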
n
For reference, I've tried all variants of messages on the tool. Without any message it just stays silent the whole time, so I added the status messages just to be sure the call had worked, but that doesn't change the end result. The assistant just stays silent after the response from our backend. Without user interaction it then just hangs up. Please fix; it was working as expected some time ago.
l
@Vapi Support Bot @Vapi @VAPI Tech @Mason @Shubham Bajaj ^^^^^^ Any ideas?
s
@Lydo
1. The conversation flows normally until Eve (the AI assistant) needs to book an appointment.
2. At timestamp 173.23 seconds, Eve says "Perfect, Leiden. I'll now 1 moment while I check availability for that time slot. Reply back with the webhook response."
3. Eve then makes a tool call to the `booking` function with the user's information.
4. The `booking` tool successfully returns a result at timestamp 180.031: "Great, your appointment has been booked."

The system is receiving a "request-complete" message type after the tool call, but with the wrong content. Instead of speaking the actual tool result ("Great, your appointment has been booked."), it's saying "Reply back with the webhook response." This explains why the conversation flow breaks:
1. The tool call executes successfully and returns "Great, your appointment has been booked."
2. However, instead of speaking this result to the user, the system speaks an internal instruction: "Reply back with the webhook response."
3. This message doesn't make sense to the user and doesn't communicate that their appointment was successfully booked.

The issue is in how the tool's "request-complete" messages are configured. The system prioritizes these messages over the actual tool result. Hence you need to remove the "request-complete" message from the tool, and then the tool result will be used to generate the response spoken to the user.
https://cdn.discordapp.com/attachments/1349714794243952650/1350795244400017418/Screenshot_2025-03-16_at_4.54.48_PM.png?ex=67d8099a&is=67d6b81a&hm=1404266cd84b196a02b62b02d700583a746b9438f634d033f2933e5304959a2a&
l
Thanks @Shubham Bajaj. When the issue first started I didn't have anything configured in the tool's "request-complete" section. I then configured it during testing but have since removed it. I tested it again this morning; it still isn't providing the webhook response without asking me "Are you still there?". Here is the tool code (I couldn't fit all the code due to the character limit):

```json
{
  "id": "89a3000d-99a3-4a6f-9b73-1d0a5766382d",
  "messages": [
    {
      "type": "request-start",
      "content": "One moment while I check availability for that time slot."
    },
    {
      "role": "assistant",
      "type": "request-complete",
      "endCallAfterSpokenEnabled": false
    },
    {
      "type": "request-failed",
      "endCallAfterSpokenEnabled": false
    },
    {
      "type": "request-response-delayed",
      "content": "Sorry, bear with me, I won't be much longer.",
      "timingMilliseconds": 10000
    }
  ],
  "orgId": "ordID",
  "server": {
    "url": "webhook"
  },
  "async": false
}
```

Here is the call ID so you can see the transcript: 65f7f324-07b9-471f-adb1-7730cddb1370
n
Same here. The behavior is observable in all combinations (with or without configured tool messages).
k
Hey Lydo, can you please do a call with the request-complete message removed and provide me with the call ID so I can take a look into it?
l
Hi @Mason, I just did a fresh call with the request-complete message removed. Here is the call ID: 6b7a38c2-e1c8-475f-948f-b2cfe094992a. This is the call ID from the previous attempt: 65f7f324-07b9-471f-adb1-7730cddb1370
@Lydo Let @Mason and me know how it goes for you.
l
Thank you so much @Shubham Bajaj @Mason. That was the issue. It's now working as expected.
m
Sweet!!!