Custom LLM endCall tool call
# support
p
Hi, I'm using the custom LLM integration and it's working great. I'm trying to invoke the endCall tool at the end of the conversation. I'm streaming back the result which looks like
```js
`data: ${JSON.stringify({
  id: ...,
  object: "chat.completion.chunk",
  created: Math.floor(Date.now() / 1000),
  model: modelName,
  choices: [
    {
      index: 0,
      delta: {
        tool_calls: [
          {
            index: 0,
            id: toolCallId,
            type: "function",
            function: { name: "endCall", arguments: "{}" },
          },
        ],
      },
    },
  ],
})}\n\n`
```
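For context, an OpenAI-style tool-call stream is usually terminated by a chunk with `finish_reason: "tool_calls"` followed by a `data: [DONE]` sentinel. Below is a minimal sketch of the full frame sequence; `sseChunk`, `buildEndCallStream`, and the placeholder ids are illustrative names, not anything from Vapi's docs:

```javascript
// Sketch: build the full SSE frame sequence for an endCall tool call.
// Only the JSON payload shape follows the chat.completion.chunk format;
// function names and the "chatcmpl-example" id are placeholders.
function sseChunk(payload) {
  return `data: ${JSON.stringify(payload)}\n\n`;
}

function buildEndCallStream(modelName, toolCallId) {
  const created = Math.floor(Date.now() / 1000);
  const frames = [];

  // 1. The tool-call delta (same shape as the snippet above).
  frames.push(sseChunk({
    id: "chatcmpl-example",
    object: "chat.completion.chunk",
    created,
    model: modelName,
    choices: [{
      index: 0,
      delta: {
        tool_calls: [{
          index: 0,
          id: toolCallId,
          type: "function",
          function: { name: "endCall", arguments: "{}" },
        }],
      },
    }],
  }));

  // 2. A closing chunk signalling the model is done and called tools.
  frames.push(sseChunk({
    id: "chatcmpl-example",
    object: "chat.completion.chunk",
    created,
    model: modelName,
    choices: [{ index: 0, delta: {}, finish_reason: "tool_calls" }],
  }));

  // 3. The stream terminator.
  frames.push("data: [DONE]\n\n");
  return frames;
}
```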
This seems to be correctly recorded on the VAPI side since I see
Model called tool: endCall()
in the call log. However, the call continues. Any help would be appreciated! Call ID: ce04b053-b250-4ebc-a202-53acd21a9c50 https://cdn.discordapp.com/attachments/1373619775124672594/1373619775346966700/Screenshot_2025-05-18_at_4.13.05_AM.png?ex=682b129b&is=6829c11b&hm=b6ca04c1bd06faf640f636ef5034cc9445189e77e75db386b22c2d7923580a60&
Btw, I'm aware that I can use the call control API. However, doing so sometimes results in hanging up before the spoken response is complete. I tried to do a sleep but that wasn't as seamless, so I'd like to use the stream response + tool call approach.
s
hey @Phil can you help me understand what issue you're facing?
p
I think I figured it out. I needed to enable the endCall tool for the assistant.
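For anyone who lands here with the same symptom: streaming the tool call isn't enough on its own; the assistant also has to have the endCall tool enabled in its configuration. A rough sketch of what that might look like in the assistant's model config (field names are my reading of the Vapi assistant API, and the URL/model values are placeholders, so double-check against the docs):

```json
{
  "model": {
    "provider": "custom-llm",
    "url": "https://your-server.example.com/chat/completions",
    "model": "your-model-name",
    "tools": [
      { "type": "endCall" }
    ]
  }
}
```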
a
Sounds good. Closing this ticket.