Trigger function call inside custom LLM
# support
Darkhan
How do I call the transferCall function from my custom LLM? Can I trigger it from the response of a chat completion?
Vapi AI Assistant
Hi @Darkhan, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly.
-----------------------------
**While I generate an answer, here are some sources that might be helpful**:
* [Connecting Your Custom LLM to Vapi: A Comprehensive Guide]()
* [Function Calling]()
* [Custom LLM]()
* [Call Ended Reason]()
-----------------------------
### Answer

Yes, you can trigger a function call from the response of a chat completion in your custom LLM. Vapi supports function calls within custom LLMs similarly to how it works with OpenAI's API. To set this up, follow these steps:

1. **Set Up Your Server**: Ensure your server endpoint is compatible with the OpenAI client and can handle function calls.
2. **Configure the Assistant**: Define the function within your assistant configuration.
3. **Stream the Response**: Ensure your server can stream the responses back to Vapi.

Here is an example of how to define a custom function for your assistant:
```json
{
  "name": "My Assistant",
  "model": {
    "provider": "custom-llm",
    "url": "<YOUR OPENAI COMPATIBLE ENDPOINT BASE URL>",
    "model": "my-cool-model",
    "messages": [
      {
        "role": "system",
        "content": "You are an assistant."
      }
    ],
    "functions": [
      {
        "name": "sendEmail",
        "description": "Used to send an email to a client.",
        "parameters": {
          "type": "object",
          "properties": {
            "email": {
              "type": "string",
              "description": "The email address of the client."
            },
            "subject": {
              "type": "string",
              "description": "The subject of the email."
            },
            "body": {
              "type": "string",
              "description": "The body of the email."
            }
          }
        }
      }
    ],
    "temperature": 0.7
  }
}
```

Make sure your server handles the function call properly and returns the results formatted according to Vapi's expected structure [(1)](), [(2)](), [(3)](), [(4)]().
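For the streaming side, here is a minimal sketch (not an official Vapi example) of an OpenAI-compatible `/chat/completions` endpoint that streams back a function call such as `transferCall`. It assumes a FastAPI server and the OpenAI streaming-chunk format that the `custom-llm` provider consumes; the `destination` argument is illustrative and should match whatever the tool in your assistant expects.

```python
# Sketch: OpenAI-compatible streaming endpoint that returns a function call.
# Assumptions: FastAPI + SSE, OpenAI chat.completion.chunk format, and an
# illustrative "destination" argument for transferCall.
import json
import time

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()


def sse(chunk: dict) -> str:
    # Serialize one chunk as a server-sent event line.
    return f"data: {json.dumps(chunk)}\n\n"


@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()  # model, messages, functions/tools, etc. from Vapi

    async def stream():
        base = {
            "id": "chatcmpl-transfer-demo",
            "object": "chat.completion.chunk",
            "created": int(time.time()),
            "model": body.get("model", "my-cool-model"),
        }
        # Emit a function call instead of plain text; Vapi handles it like an
        # OpenAI function call and runs the matching tool.
        yield sse({
            **base,
            "choices": [{
                "index": 0,
                "delta": {
                    "function_call": {
                        "name": "transferCall",
                        "arguments": json.dumps({"destination": "+15551234567"}),
                    }
                },
                "finish_reason": None,
            }],
        })
        # Terminate the stream.
        yield sse({
            **base,
            "choices": [{"index": 0, "delta": {}, "finish_reason": "function_call"}],
        })
        yield "data: [DONE]\n\n"

    return StreamingResponse(stream(), media_type="text/event-stream")
```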
Got it. Tagging @User for additional help.
Darkhan
I want to call the transferCall function from my custom LLM. I'm already sending the chat completion response, but how do I call the transferCall function from the custom LLM?
@User
Sahil
You don't need to add any other function; you can just follow the call forwarding guide as usual. We will automatically add the call forwarding tool to your assistant.
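For reference, the call forwarding guide boils down to adding a forwarding number to the assistant, after which Vapi injects the transfer tool for you. Below is a hedged sketch; the `PATCH /assistant/{id}` endpoint and the `forwardingPhoneNumber` field are assumptions based on the guide, so double-check the current docs before using them.

```python
# Hedged sketch: enable Vapi's built-in call forwarding on an existing assistant.
# The endpoint and field name are assumptions from the call forwarding guide;
# the assistant ID and phone number are placeholders.
import os

import requests

ASSISTANT_ID = "YOUR_ASSISTANT_ID"  # placeholder

resp = requests.patch(
    f"https://api.vapi.ai/assistant/{ASSISTANT_ID}",
    headers={
        "Authorization": f"Bearer {os.environ['VAPI_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={"forwardingPhoneNumber": "+15551234567"},  # number calls are forwarded to
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```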
Darkhan
@Sahil I have a custom LLM working with Vapi and I want to use the transferCall function. I am returning a chat completion response.
Sahil
Can you check the metadata part? You will see some additional configurations related to callForwarding.
You just need to pass it to your custom LLM.
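In other words, Vapi's request to your custom LLM already carries the forwarding configuration (the auto-added tool plus related metadata). One common pattern, sketched below under the assumption that it arrives as OpenAI-style `functions`/`tools` definitions, is to pass those definitions through to your upstream model unchanged and relay its streamed chunks back, so the model itself can decide to call `transferCall`.

```python
# Sketch: pass the function/tool definitions Vapi sends (including the
# auto-added call-forwarding tool) through to an upstream OpenAI model and
# relay the streamed chunks back to Vapi untouched.
import os

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
from openai import OpenAI

app = FastAPI()
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()

    # Forward whatever tool/function definitions Vapi included in the request.
    kwargs = {}
    if body.get("functions"):
        kwargs["functions"] = body["functions"]
    if body.get("tools"):
        kwargs["tools"] = body["tools"]

    upstream = client.chat.completions.create(
        model="gpt-4o",  # assumption: any model that supports function calling
        messages=body["messages"],
        stream=True,
        **kwargs,
    )

    def relay():
        for chunk in upstream:
            # Each chunk is already in OpenAI chat.completion.chunk format.
            yield f"data: {chunk.model_dump_json()}\n\n"
        yield "data: [DONE]\n\n"

    return StreamingResponse(relay(), media_type="text/event-stream")
```

With this in place, when the model emits a `transferCall` function call, it flows back to Vapi in the stream and Vapi can execute the transfer.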