getting empty response with endCall
# support
v
For Vapi-built tools like `endCall`, they might be filtered from the chat completions endpoint, resulting in an empty response. Try including these tools in your custom LLM tools:
```javascript
// Include endCall in your custom LLM tools
const tools = [
  {
    type: "endCall"
  }
];

// When the model requests them, send a "done" stream back to Vapi with the function name
```
This approach should help resolve the empty webhook response issue. Source: [Discord discussion about endCall filtering](https://discord.com/channels/1211482211119796234/1339871158975336498)
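As a rough illustration of the "send a done stream back" step above, here is a minimal sketch of the chunks a custom LLM server might stream back to Vapi when the model requests `endCall`. It assumes an OpenAI-style chat-completions SSE stream; the exact payload shape Vapi expects may differ, and `buildToolCallChunks` is a hypothetical helper, not a Vapi API.

```javascript
// Sketch: build the SSE chunks a custom LLM server could stream back
// when the model requests a Vapi-built tool such as endCall.
// Assumes OpenAI-style chat-completions streaming (an assumption, not
// confirmed Vapi behavior) -- a tool_calls delta, a finishing chunk,
// then the [DONE] sentinel.
function buildToolCallChunks(functionName) {
  const toolCallChunk = {
    choices: [{
      delta: {
        tool_calls: [{
          index: 0,
          function: { name: functionName, arguments: "{}" },
        }],
      },
      finish_reason: null,
    }],
  };
  // Final content chunk marks the stream as finished ("done").
  const doneChunk = {
    choices: [{ delta: {}, finish_reason: "tool_calls" }],
  };
  return [
    `data: ${JSON.stringify(toolCallChunk)}\n\n`,
    `data: ${JSON.stringify(doneChunk)}\n\n`,
    "data: [DONE]\n\n",
  ];
}

// Example: chunks for an endCall request, written to the HTTP response
// in a real server instead of logged.
for (const chunk of buildToolCallChunks("endCall")) {
  console.log(chunk);
}
```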
m
Is there something I can add in my prompt for this?
k
Could you please add a description to your support ticket.
m
You can close this ticket. We discovered the chat API has been disabled, which completely crippled us.
It's pretty frustrating when things get completely turned off/disabled when we're trying to launch something to production.
k
Hello @micek, the chat feature was in beta testing and was not yet ready for production, so we have temporarily halted our chat API. For your information, the chat API will be available again in three weeks.