Assistants fail to trigger tools
# support
m
I have had this problem for 2 days now. Even old assistants that I have not touched in months fail to run tools, even though I have not changed anything in them. To rule out account problems, I ran a test on 2 separate accounts. I created a new assistant and gave it this prompt: "What you need to do is whenever the user says "I want test" to run the "test" function". That's all. I should mention that the "test" tool has been tested in the tools panel and works perfectly fine on its own. However, as you can see in the attached picture, my assistant fails to trigger the tool even with a straightforward prompt like this. It's the same on 2 separate accounts... Please help, because I can't find any explanation. Thanks in advance. https://cdn.discordapp.com/attachments/1367035803397521528/1367035803758493737/Screenshot_2025-04-30_at_10.01.12.png?ex=68131ecd&is=6811cd4d&hm=aa8f6266345050b74df31beacd6c0e5d139c7fabcaa562d7bab3be01ff2147c2&
v
Try using a more explicit and structured prompt format to trigger your tool:

```
When the user says exactly "I want test", you must use the test function.

IMPORTANT: Do not respond conversationally when you hear this trigger phrase.
Instead, immediately call the test function without asking for confirmation.

Example:
User: "I want test"
Action: Call test function immediately
```

Make sure your tool is properly added to the model's `tools` array in your assistant configuration.

Sources:
- [Discord discussion about explicit tool calls](https://discord.com/channels/1211482211119796234/1333245023319494707)
- [Discord discussion about clear tool triggers](https://discord.com/channels/1211482211119796234/1339689865440268328)
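For illustration, here is a minimal sketch of what that configuration might look like. The field names follow the OpenAI-style tools schema and are assumptions based on this thread, not the exact Vapi schema:

```python
# Hypothetical assistant configuration showing the tool listed in the
# model's tools array (field names are assumptions, not the exact schema).
assistant_config = {
    "name": "tool-trigger-test",
    "model": {
        "provider": "openai",
        "model": "gpt-4o",
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "test",  # must match the name used in the prompt
                    "description": "Run this whenever the user says 'I want test'.",
                },
            }
        ],
    },
}

# Quick sanity check that the tool is actually present in the array:
tool_names = [t["function"]["name"] for t in assistant_config["model"]["tools"]]
print(tool_names)  # if "test" is missing here, the model can never call it
```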
m
I just copied the prompt you gave me and ran another test; the result is as you see in the attached picture. I made sure the tool is added to the assistant's tools array. https://cdn.discordapp.com/attachments/1367035803397521528/1367037592822087680/Screenshot_2025-04-30_at_10.18.05.png?ex=68132078&is=6811cef8&hm=d2c4f9a599d987ee724c5e96aae9ea9591ed71cc3182572389120ad206da80e1&
e
Update the Assistant's prompt to: "Call the 'test' function immediately when the user says 'I want test'." Verify the tool's server URL is accessible and responds correctly. Check the call logs in the Vapi Dashboard for transcription errors.
m
Again, no response. The URL is OK; as I said, I tested the tool separately and it works. The assistant just doesn't want to trigger it.
e
The Vapi Assistant still isn't triggering the "test" tool because the LLM misinterprets "I want test" despite the updated prompt. Since the tool's server URL is confirmed working, the issue lies in the Assistant's logic. In the Vapi Dashboard, check call logs to confirm the transcribed text matches "I want test" exactly. If transcription is correct but no tool call is logged, the LLM isn't recognizing the trigger. Update the Assistant's model to OpenAI's GPT-4o via the Dashboard or API (PATCH /assistant/{id} with model.provider: "openai", model.model: "gpt-4o"). Retest.
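That PATCH call could be sketched as follows. The `api.vapi.ai` base URL and the exact body shape are assumptions based on this thread, the placeholder ID and key must be replaced, and the request itself is left commented out to avoid a live call:

```python
# Sketch of the PATCH /assistant/{id} body described above, assuming the
# endpoint accepts a partial model object (field names from this thread).
import json

assistant_id = "YOUR_ASSISTANT_ID"  # placeholder, not a real ID
patch_body = {
    "model": {
        "provider": "openai",
        "model": "gpt-4o",
    }
}

# With the requests library, one would then (hypothetically) send:
# import requests
# requests.patch(
#     f"https://api.vapi.ai/assistant/{assistant_id}",
#     headers={"Authorization": "Bearer YOUR_API_KEY"},
#     json=patch_body,
# )

print(json.dumps(patch_body))
```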
m
The model is already GPT-4o. There are no call logs to check because, again, you can clearly see I am using the CHAT feature, not speaking to the agent through a call.
There is no transcription to check.
e
Can I see the prompt you provided to the assistant?
m
I already tested 3 separate prompts, 2 of which were given to me here by support to try:
1. What you need to do is whenever the user says "I want test" to run the "test" function
2. When the user says exactly "I want test", you must use the test function. IMPORTANT: Do not respond conversationally when you hear this trigger phrase. Instead, immediately call the test function without asking for confirmation. Example: User: "I want test" Action: Call test function immediately
3. Call the 'test' function immediately when the user says 'I want test'.
All of them failed.
e
Update the Assistant's prompt to: "If the user types 'I want test', call the 'test' function immediately without responding." In the Vapi Dashboard, confirm the "test" tool is enabled under the Assistant's settings and its name matches exactly. Retest in the chat.
Also, can you send a screenshot of the tool functions?
@Matei is it going through?
m
I just tried it, same result.
This is the tool.
But as you can see, I think it's a system problem, because if the prompting was the problem, then my old assistants would still work.
I don't know what else to do besides having a Vapi engineer check this out, because as you can see, it's not on my side.
e
This is where the problem is.
Add a server URL in the tool settings. In the Vapi Dashboard, edit the "test" tool, add a `server` field with a valid URL, and ensure the server responds to POST requests. Example:

```json
"server": {
  "url": "https://your-server.com/test",
  "method": "POST"
}
```
The tool’s description says, "this needs to be run whenever the user says 'I need test'," but you’re typing "I want test" in the chat. While the Assistant’s prompt aligns with "I want test," the tool’s description might confuse the LLM (GPT-4o) about when to trigger the tool. Fix: Update the tool’s description to match the exact trigger phrase: "Run this whenever the user says 'I want test'."
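To make that mismatch concrete, here is a hypothetical sketch of a tool definition whose description repeats the exact trigger phrase from the prompt. An OpenAI-style function schema is assumed; the actual Vapi tool editor fields may differ:

```python
# Hypothetical tool definition: the description's trigger phrase should
# match the phrase the Assistant's prompt uses, so the LLM sees one
# consistent trigger ("I want test") rather than two conflicting ones.

def make_test_tool(trigger_phrase: str) -> dict:
    """Build a tool definition whose description uses the given trigger."""
    return {
        "type": "function",
        "function": {
            "name": "test",  # must match the name referenced in the prompt
            "description": f"Run this whenever the user says '{trigger_phrase}'.",
            "parameters": {"type": "object", "properties": {}},
        },
    }

tool = make_test_tool("I want test")
print(tool["function"]["description"])
```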
m
Nope, still not working.
AGAIN, the problem is bigger than this assistant and tool. ALL my assistants behave like this. If you think logically, since my old assistants stopped triggering tools without me changing anything in them, the problem must not be on my side!
e
I understand you're facing an issue with your Vapi Assistant: it's not triggering the "test" tool as expected, even though you've confirmed the tool works independently. This must be frustrating, especially since it's impacting both new and existing Assistants across multiple accounts. I'm confident I can help you fix this, and I'd like to offer my assistance under a contractual arrangement. Here's what I propose:

1. Diagnose and Fix: I'll review your tool configuration (server URL, description, parameters) and troubleshoot why the Assistant isn't triggering it properly.
2. Test and Verify: We'll test the Assistant together in the chat feature to ensure it works seamlessly.
3. Support if Needed: If the issue persists due to a platform bug, I'll help you escalate it to Vapi support for a resolution.

In return, I'd like to work with you to complete your project under a simple contract. This way, you get a fully functional Assistant, and I'm compensated for my time and expertise. It's a win-win! If you're up for it, let me know, and we can hash out the details. I'm ready to get this sorted for you quickly!
m
It's OK, I'll keep trying to see what I can do myself, and if the need arises I'll contact you.
e
That's completely understandable! I respect that you want to tackle it yourself first. If you're troubleshooting, one quick tip that might save you some time: double-check the server URL in the tool configuration. It's a common spot where things can go wrong. Sometimes it's not set up correctly or isn't responding to POST requests, and that can cause issues without any obvious error messages. If you hit a wall or just need a second opinion later, don't hesitate to reach out. I'm here to help, whether it's a quick fix or something more involved; we can even discuss a contract if you need deeper support. Good luck, and I hope you get it sorted soon!
m
I found something. Apparently all tools work on audio calls, but don't work while testing the assistants in chat...
So chat messaging is the only place where it's a problem.
e
Yes, which is why I want to help and work with you to get this done
m
It's fine, probably just a bug. I am happy to see that the actual assistants work via phone call.
e
Most likely not a bug. It's probably the whole prompt you gave the assistant.
v
Hey, really sorry about your experience. If possible, could you record a Loom video showing this error/inconsistent behavior in the dashboard, with the steps to reproduce it? That way I can share it with my team and we'll take a look; we'll also try to reproduce it ourselves so we can fix it for you. Also, please share your assistant ID.
e
@Matei is it working now?
v
This will be fixed by end of this week.