Hi, I'm using a custom LLM.
# support
p
The TTS never vocalised my last message and the call timed out. Can you please investigate? This is the call ID: c762d5b6-5094-4e95-8f13-71bdaa936510
This has happened again: the graph produces a message, Vapi receives it, and it doesn't get vocalised?
This is the error that occurred in the call: call.in-progress.error-vapifault-custom-llm-llm-failed
v
```
19:07:32:650 No Tool Calls Matched User Provided. Model Provided:
[
  {
    "id": "call_rEtyQ8BfOfLwW3D2HYl5L3Cg",
    "type": "function",
    "function": {
      "name": "create_client",
      "arguments": "{\"first_name\":\"Sarah\",\"last_name\":\"Test\",\"phone_number\":\"0871486360\",\"gender\":\"FEMALE\",\"title\":\"Ms\",\"date_of_birth\":\"1997-12-07\"}"
    }
  }
]

19:07:34:513 Model called a tool that does not exist
{
  "id": "call_m9oqwVJmGFMylEftvy6TqJnc",
  "type": "function",
  "function": {
    "name": "transfer_to_booking_agent_new",
    "arguments": "{}"
  }
}
```

It's failing because your model is calling tools that don't exist. I'd suggest checking which tools your custom LLM has associated with it, and which are configured on the Vapi assistant, and removing any tools that aren't actually set up on both sides.
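One way to guard against this on the custom-LLM side is to drop any tool call whose name isn't configured on the assistant before the response is returned to Vapi. A minimal Python sketch, assuming an OpenAI-compatible tool-call shape (the tool names below are taken from the logs above; the helper itself is hypothetical, not part of any Vapi SDK):

```python
# Tools actually configured on the Vapi assistant (names taken from the logs;
# adjust to match your own assistant configuration).
ASSISTANT_TOOLS = {"create_client", "transfer_to_booking_agent_new"}

def filter_unknown_tool_calls(tool_calls):
    """Split model tool calls into (valid, unknown) and return only the valid ones,
    so Vapi never receives a call to a tool it cannot match."""
    valid, unknown = [], []
    for call in tool_calls or []:
        name = call.get("function", {}).get("name")
        (valid if name in ASSISTANT_TOOLS else unknown).append(call)
    if unknown:
        # Log the mismatch locally instead of forwarding it to Vapi.
        dropped = [c.get("function", {}).get("name") for c in unknown]
        print("Unknown tool calls dropped:", dropped)
    return valid
```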
p
This tool does exist?
It is a custom LLM -> it generated a message. The issue is between sending the message from the graph to Vapi.
v
I completely understand that it's a custom LLM and that the issue is on your side of the integration. Checking in to see whether you've been able to resolve this.
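For reference, when the provider is a custom LLM, the endpoint is generally expected to stream back OpenAI-compatible chat-completion chunks; if the final message from the graph isn't delivered to Vapi in that shape, the TTS has nothing to vocalise. A minimal Python sketch of that streaming format (the exact shape Vapi expects is an assumption here based on the error class above, and the model name is hypothetical):

```python
import json
import time

def sse_chunks(text, model="custom-graph-llm"):
    """Yield OpenAI-style chat.completion.chunk SSE lines for `text`,
    followed by a stop chunk and the [DONE] sentinel."""
    base = {
        "id": "chatcmpl-local",
        "object": "chat.completion.chunk",
        "created": int(time.time()),
        "model": model,
    }
    # Stream the message word by word as delta content.
    for piece in text.split(" "):
        chunk = dict(base, choices=[{
            "index": 0,
            "delta": {"content": piece + " "},
            "finish_reason": None,
        }])
        yield f"data: {json.dumps(chunk)}\n\n"
    # Signal the end of the message.
    done = dict(base, choices=[{"index": 0, "delta": {}, "finish_reason": "stop"}])
    yield f"data: {json.dumps(done)}\n\n"
    yield "data: [DONE]\n\n"
```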