Devon Humes
05/20/2025, 2:32 PM

Vapi Support Bot
05/20/2025, 2:34 PM
Add a "function" message with the tool's name and result to your LLM's messages array after each tool call. For example:
```js
const functionMessage = {
  role: "function",
  name: toolCallInfo.name,
  content: JSON.stringify(result)
};
// Add this to your messages before the next LLM call
```
This way, your LLM receives the tool response as context for future replies. You need to implement this step in your integration; Vapi does not do it automatically.
[See the official guide for details.](https://docs.vapi.ai/customization/tool-calling-integration#how-it-works)
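To make the loop concrete, here is a minimal, self-contained sketch of the append step described above. The `toolCallInfo` and `result` objects are illustrative stand-ins for whatever your own tool handler returns, not Vapi API objects; only the message shape follows the snippet above.

```javascript
// Sketch of appending a "function" message after a tool call.
// `toolCallInfo` and `result` are hypothetical stand-ins for your
// tool handler's output; only the message shape follows the guide.
const messages = [
  { role: "user", content: "Create a ticket: my car is not working." }
];

const toolCallInfo = { name: "Ticket_Creator" }; // illustrative tool name
const result = { id: "SNOW-84", error: null };   // illustrative tool output

// Append the tool response so the LLM sees it on the next completion call
messages.push({
  role: "function",
  name: toolCallInfo.name,
  content: JSON.stringify(result)
});

console.log(messages.length);  // message count after the append
console.log(messages[1].name); // tool name carried on the message
```

On the next request to your LLM, send the whole `messages` array; the serialized tool result is then in context for the model's reply.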
Source:
- [Vapi tool calling integration guide](https://docs.vapi.ai/customization/tool-calling-integration#how-it-works)

Devon Humes
05/20/2025, 4:01 PM

Kings_big
05/20/2025, 8:15 PM

Kings_big
05/20/2025, 8:15 PM

Devon Humes
05/20/2025, 8:27 PM

Devon Humes
05/20/2025, 8:29 PM
7701049e-7426-4867-82b6-132103a5764b
Aditya
05/20/2025, 11:07 PM

Devon Humes
05/21/2025, 2:37 AM

Devon Humes
05/21/2025, 2:38 AM

Aditya
05/21/2025, 11:52 AM

Devon Humes
05/21/2025, 12:38 PM
```json
[
  {
    "name": "Ticket_Creator",
    "toolCallId": "call_SCoUOCdQ1CFHoSaWoledhepI",
    "result": [
      {
        "Updated on": "2025-05-21T12:34:39.745Z",
        "Description": "The car is currently not working. The engine light is on, and there is a significant amount of oil leaking under the car. Immediate attention is needed to diagnose and repair the issue.",
        "Created on": "2025-05-21T12:34:39.745Z",
        "Updated by": "elroy@elementum.com",
        "Title": "Car not working with engine light on and oil leak",
        "ID": "SNOW-84",
        "Created by": "elroy@elementum.com",
        "id": "SNOW-84",
        "App Version": 1
      }
    ],
    "error": null
  }
]
```
Vapi still says "No results returned."
Here is the call ID
25cd0a59-7619-46fe-849b-66de2241d72f
Devon Humes
05/21/2025, 1:40 PM
Using `responses` instead of `results` works as expected.

Aditya
05/22/2025, 12:23 AM