Custom LLM tool calling
# support
c
@Vapi @User I need help understanding why tool calling doesn’t work when using a custom LLM. It seems that the VAPI platform stops sending tool call data when the custom LLM endpoint is enabled. Can you explain why this happens or provide guidance?
v
@cclarke411 Certainly! When you enable a custom LLM (Large Language Model) on the VAPI platform, it's possible that the integration with the tool calling feature might not be configured correctly to work with the custom model. This could be due to a variety of reasons, such as:
1. The custom LLM might not be set up to handle external tool calls.
2. There could be a misconfiguration in the way the custom LLM is integrated with the tool calling feature.
3. The tool calling feature might require additional settings or parameters that are not set by default when using a custom LLM.
To resolve this issue, you should check the following:
- Ensure that the custom LLM is properly configured to make external calls to tools.
- Verify that any necessary permissions or API keys are correctly set up to allow the custom LLM to interact with other tools.
- Review the integration settings between the custom LLM and the tool calling feature to ensure they are compatible.
If you've checked these aspects and the issue persists, it would be best to reach out to VAPI support for more detailed assistance. They can provide specific guidance based on your setup and help troubleshoot the problem. Would you like me to assist you with anything else regarding your custom LLM setup?
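For reference, a minimal sketch of an assistant configured this way, assuming the current Vapi API shape (a custom-llm provider, a url pointing at your OpenAI-compatible endpoint, and function tools declared on the model). The endpoint URL, tool name, and exact field names are illustrative, so verify them against the Vapi API reference:

```python
# Hedged sketch: create a Vapi assistant that uses a custom LLM endpoint
# and declares one function tool. The tool schema follows the OpenAI-style
# "function" format; field names should be checked against the Vapi docs.
import os
import requests

VAPI_API_KEY = os.environ["VAPI_API_KEY"]

assistant = {
    "name": "Custom LLM with tools",
    "model": {
        "provider": "custom-llm",
        # Your OpenAI-compatible chat completions base URL (illustrative).
        "url": "https://your-server.example.com/openai/v1",
        "model": "your-model-name",
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool for illustration
                    "description": "Look up the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    },
}

resp = requests.post(
    "https://api.vapi.ai/assistant",
    headers={"Authorization": f"Bearer {VAPI_API_KEY}"},
    json=assistant,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["id"])
```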
c
Is there an example of a custom LLM that is configured to make external calls to tools? I didn't see any in the GitHub repo. How is this done?
s
Hey @cclarke411, apologies, somehow your ticket got missed. Tool calling works the same way as with the built-in providers; you just need to ensure the OpenAI messages format is maintained on your side.
Regarding the example, allow me some time to create and share one.
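Until an official example lands, here is a minimal sketch of what keeping the OpenAI messages format looks like on the custom LLM side, assuming Vapi calls your server as a streaming, OpenAI-compatible chat completions endpoint. The route, tool name, and arguments are illustrative, not the official Vapi example:

```python
# Hedged sketch: an OpenAI-compatible /chat/completions endpoint that streams
# a tool call back in the standard chat.completion.chunk format over SSE.
import json
import time

from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()


def sse(chunk: dict) -> str:
    """Format one chunk as a server-sent event line."""
    return f"data: {json.dumps(chunk)}\n\n"


@app.post("/openai/v1/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()  # contains model, messages, tools, stream, ...

    def stream():
        base = {
            "id": "chatcmpl-custom-1",
            "object": "chat.completion.chunk",
            "created": int(time.time()),
            "model": body.get("model", "custom"),
        }
        # 1) Announce the tool call (id, type, name) in the first delta.
        yield sse({**base, "choices": [{"index": 0, "delta": {
            "role": "assistant",
            "tool_calls": [{"index": 0, "id": "call_1", "type": "function",
                            "function": {"name": "get_weather", "arguments": ""}}],
        }, "finish_reason": None}]})
        # 2) Stream the JSON arguments (these may be split across chunks).
        yield sse({**base, "choices": [{"index": 0, "delta": {
            "tool_calls": [{"index": 0,
                            "function": {"arguments": "{\"city\": \"Paris\"}"}}],
        }, "finish_reason": None}]})
        # 3) Close the turn with finish_reason "tool_calls", then [DONE].
        yield sse({**base, "choices": [{"index": 0, "delta": {},
                                        "finish_reason": "tool_calls"}]})
        yield "data: [DONE]\n\n"

    return StreamingResponse(stream(), media_type="text/event-stream")
```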
m
@Shubham Bajaj can you please share a function call example?
s
@mukul Do you want to use tool calling within your LLM and/or use tools with a custom LLM model natively?
m
@Shubham Bajaj This works when having a conversation with Vapi through the dashboard (Talk with Assistant), but not when the assistant is assigned to a phone number and the tool call result is returned in the same way.
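One way to narrow this down is to log what the custom LLM endpoint actually receives in the dashboard flow versus the phone flow. A small helper, assuming the incoming requests follow the OpenAI messages format, that reports whether the tool result has been attached to the conversation yet:

```python
# Hedged debugging sketch: inspect an incoming chat completions request body
# and report whether a tool result (role "tool") is already present.
def describe_request(body: dict) -> str:
    messages = body.get("messages", [])
    tool_msgs = [m for m in messages if m.get("role") == "tool"]
    if tool_msgs:
        # Follow-up turn: the tool has run and its result is attached,
        # keyed by the tool_call_id of the earlier assistant tool call.
        last = tool_msgs[-1]
        return f"tool result for {last.get('tool_call_id')}: {last.get('content')}"
    return "initial turn: no tool result attached yet"
```

Comparing this output between the two flows shows whether the tool result ever reaches the custom LLM on phone calls.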
s
Hey @mukul To help track down this issue, could you share the following in a new #1211483291191083018 ticket:
- The call ID
- When exactly this happened (the timestamp)
- What response you expected to get
- What response you actually got instead
This would really help me figure out what went wrong!