@cclarke411
Certainly! When you enable a custom LLM (Large Language Model) on the VAPI platform, the tool calling feature may not be wired up correctly to the custom model. Common causes include:
1. The custom LLM might not be set up to handle external tool calls.
2. There could be a misconfiguration in the way the custom LLM is integrated with the tool calling feature.
3. The tool calling feature might require additional settings or parameters that are not set by default when using a custom LLM.
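On the first point, Vapi-style platforms generally expect a custom LLM endpoint to speak the OpenAI chat-completions format, including the `tool_calls` field on the assistant message. As a minimal sketch (the function, tool name, and IDs below are illustrative assumptions, not VAPI's documented API), a response announcing a tool call typically looks like:

```python
import json

# Sketch of an OpenAI-compatible chat-completion response that announces
# a tool call. The tool name and arguments are hypothetical; the field
# names follow the OpenAI chat-completions schema.
def make_tool_call_response(tool_name: str, arguments: dict) -> dict:
    return {
        "id": "chatcmpl-example",       # placeholder response id
        "object": "chat.completion",
        "choices": [
            {
                "index": 0,
                "message": {
                    "role": "assistant",
                    "content": None,    # no text content when calling a tool
                    "tool_calls": [
                        {
                            "id": "call_1",    # placeholder call id
                            "type": "function",
                            "function": {
                                "name": tool_name,
                                # arguments must be a JSON-encoded string,
                                # not a nested object
                                "arguments": json.dumps(arguments),
                            },
                        }
                    ],
                },
                "finish_reason": "tool_calls",
            }
        ],
    }

resp = make_tool_call_response("get_weather", {"city": "Berlin"})
print(resp["choices"][0]["finish_reason"])  # tool_calls
```

If your custom model returns tool calls in some other shape (or as plain text), the platform's tool calling feature has nothing to act on, which matches the symptom described above.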
To resolve this issue, you should check the following:
- Ensure that the custom LLM is properly configured to make external calls to tools.
- Verify that any necessary permissions or API keys are correctly set up to allow the custom LLM to interact with other tools.
- Review the integration settings between the custom LLM and the tool calling feature to ensure they are compatible.
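One quick way to verify these checks is to call your custom endpoint directly, outside the platform, with a `tools` array in the request and confirm the reply contains `tool_calls`. The model name, tool definition, and URL below are placeholders for your own setup:

```python
import json
import urllib.request

# Build an OpenAI-style chat-completions request that advertises one tool.
# The model name and tool definition are illustrative placeholders.
def build_tool_probe(model: str, tool_name: str) -> dict:
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": "What's the weather in Berlin?"}
        ],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": tool_name,
                    "description": "Example tool the model may call",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

payload = build_tool_probe("my-custom-model", "get_weather")

# Hypothetical endpoint; substitute your custom LLM's URL and auth headers.
req = urllib.request.Request(
    "https://example.com/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# resp = urllib.request.urlopen(req)  # uncomment against a live endpoint
print(payload["tools"][0]["function"]["name"])  # get_weather
```

If this direct probe never yields a `tool_calls` response, the problem is in the custom model or its serving layer rather than in the platform integration.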
If you've checked all of these and the issue persists, reach out to VAPI support for more detailed assistance. They can provide guidance specific to your setup and help troubleshoot the problem. Would you like help with anything else regarding your custom LLM setup?