Mahimai Raja
04/13/2025, 5:29 AM

Vapi Support Bot
04/13/2025, 5:31 AM
You can trigger the `transferCall` tool from your custom LLM in Vapi, including implementing warm transfers!
Here's how you can do this:
### 1. Add the Default Tools to Your Model's Tools Array
First, add the `transferCall` tool to your model's `tools` array in your configuration:
```json
{
  "model": {
    "provider": "openai",
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "system",
        "content": "You are an assistant. When the user asks to be transferred, use the transferCall function."
      }
    ],
    "tools": [
      {
        "type": "transferCall",
        "destinations": [
          {
            "type": "number",
            "number": "+1234567890"
          }
        ]
      }
    ]
  }
}
```
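For reference, the same step-1 config can be built in code and sent when creating an assistant. The sketch below assumes Vapi's REST endpoint `https://api.vapi.ai/assistant` with bearer-token auth; treat the endpoint and auth details as assumptions and check the current API docs:

```javascript
// Build the step-1 model config programmatically.
function buildModelConfig(transferNumber) {
  return {
    model: {
      provider: "openai",
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "system",
          content:
            "You are an assistant. When the user asks to be transferred, use the transferCall function.",
        },
      ],
      tools: [
        {
          type: "transferCall",
          destinations: [{ type: "number", number: transferNumber }],
        },
      ],
    },
  };
}

// Example usage (assumed endpoint; requires a real API key):
// await fetch("https://api.vapi.ai/assistant", {
//   method: "POST",
//   headers: {
//     Authorization: `Bearer ${process.env.VAPI_API_KEY}`,
//     "Content-Type": "application/json",
//   },
//   body: JSON.stringify(buildModelConfig("+1234567890")),
// });
```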
### 2. Handle the Tool Call in Your Custom LLM
When your custom LLM detects the need for a call transfer, you need to send a function call payload back to Vapi. Here's how to handle it:
```javascript
// Inside the loop that streams chunks from your underlying model:
if (functionName === "transferCall" && payload.destination) {
  const functionCallPayload = {
    function_call: {
      name: "transferCall",
      arguments: {
        destination: payload.destination,
      },
    },
  };
  // Emit the function call as an SSE data event so Vapi executes the transfer
  res.write(`data: ${JSON.stringify(functionCallPayload)}\n\n`);
  // Skip further processing for this chunk
  continue;
}
```
### 3. Implementing Warm Transfer
To implement a warm transfer (which provides context to the recipient), add a `transferPlan` object to your configuration. Here's an example of a warm transfer with a message:
```json
{
  "type": "transferCall",
  "destinations": [
    {
      "type": "number",
      "number": "+1234567890",
      "transferPlan": {
        "mode": "warm-transfer-with-message",
        "message": "Hi, we have customer {{firstName}} calling about {{issueType}}. Can you assist them?"
      }
    }
  ],
  "function": {
    "name": "transferCall"
  }
}
```
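Vapi substitutes values into the `{{firstName}}` and `{{issueType}}` placeholders at call time from your call's variables. The toy renderer below only illustrates that substitution behavior and is not Vapi's implementation:

```javascript
// Illustration only: shows what {{...}} placeholder substitution produces.
function renderTemplate(template, vars) {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? String(vars[key]) : match
  );
}

const message =
  "Hi, we have customer {{firstName}} calling about {{issueType}}. Can you assist them?";
console.log(renderTemplate(message, { firstName: "Ada", issueType: "billing" }));
// → Hi, we have customer Ada calling about billing. Can you assist them?
```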
Mahimai Raja
04/13/2025, 6:14 AM

Kyle Brunker
04/13/2025, 6:22 AM