@Youssef_Gep
It sounds like there might be an issue with the integration or processing between your custom LLM and Vapi. Common causes include a misconfiguration in the integration setup, a processing error within Vapi, or Vapi misinterpreting the custom LLM's output format.
To better assist you, I would need more specific details about how you've configured your custom LLM with Vapi and the steps you're taking that lead to this output discrepancy. If you could provide more context or any error logs, that would be very helpful in diagnosing the problem.
In the meantime, please ensure that:
1. Your custom LLM endpoint is reachable by Vapi and connected according to the custom-LLM guide in the documentation.
2. The request and response formats match what Vapi expects (typically an OpenAI-compatible chat completions schema).
3. The responses are UTF-8 encoded, so the custom LLM's output isn't garbled or misinterpreted downstream.
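As a quick sanity check on points 2 and 3 above, here is a minimal sketch (not Vapi's official code) of how one streamed response chunk is usually shaped when serving an OpenAI-compatible chat completions endpoint; the model name and chunk id are placeholders:

```python
import json

def sse_chunk(content, model="my-custom-llm", finish_reason=None):
    """Wrap one piece of output as an OpenAI-style streaming chunk.

    A custom-LLM server pointed at by Vapi typically streams
    Server-Sent Events: each line is `data: <json>\n\n`, ending
    with `data: [DONE]`. Shape shown here is the common
    chat.completion.chunk schema; verify against the docs.
    """
    payload = {
        "id": "chatcmpl-demo",  # placeholder id
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{
            "index": 0,
            "delta": {"content": content} if content else {},
            "finish_reason": finish_reason,
        }],
    }
    # ensure_ascii=False keeps non-ASCII text readable and avoids
    # \uXXXX escapes that can look garbled if double-decoded.
    return f"data: {json.dumps(payload, ensure_ascii=False)}\n\n"

if __name__ == "__main__":
    print(sse_chunk("Hello"), end="")
    print(sse_chunk(None, finish_reason="stop"), end="")
    print("data: [DONE]\n\n", end="")
```

If your server's chunks deviate from this shape (or aren't UTF-8), that alone can produce the kind of output discrepancy you're seeing.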
If the issue persists, review the custom-LLM integration guide again or reach out to support with your configuration details and logs for more targeted assistance.