custom llm
# support
i
@User I'm facing an issue where I'm trying to use a custom LLM. It isn't throwing any errors, but the response I emit never comes back. Can you help me with the response format or anything else I might be doing wrong? Thanks in advance
v
@Ikya Of course, I'd be happy to help you with your custom LLM issue. To assist you effectively, I'll need a bit more information about how you're implementing the LLM and the response format you're expecting. Could you please provide the following details?
1. How are you integrating the LLM into your system? Are you using any specific libraries or frameworks?
2. What is the expected response format you're looking for? Is it plain text, SSML, or some other format?
3. Can you share a snippet of the code or the steps you're following to emit the response?
Once I have this information, I'll be able to guide you more accurately.
You can continue this conversation by mentioning me in the message.
i
@User My requirement is to send custom responses using my own LLM. For testing purposes I ran a file and updated the corresponding ngrok link in the Custom LLM URL. When I tested the assistant, my application received the callbacks and sent responses, but the Vapi assistant isn't reading those responses out, and I'm not seeing any errors in the logs. Attaching my code snippet below. Can you provide a working code snippet that can be used to send responses from a custom LLM? https://cdn.discordapp.com/attachments/1320832565308948532/1320979257610076332/test3.py?ex=676b914d&is=676a3fcd&hm=30467f15559fb216646c39815c3353344c907dfb9d0e347083d0a8fe5dad080d&
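For reference, here is a minimal sketch of the kind of endpoint Vapi's Custom LLM URL is typically pointed at: an OpenAI-compatible `/chat/completions` route that streams Server-Sent Events in the `chat.completion.chunk` format and ends with `data: [DONE]`. It uses Flask, and a canned reply stands in for a real model, so treat it as an illustration of the expected response shape rather than a verified drop-in replacement for the attached test3.py:

```python
# Minimal sketch, assuming the Custom LLM URL points at an
# OpenAI-compatible endpoint: Vapi POSTs to {url}/chat/completions
# and expects SSE chunks in the chat.completion.chunk format,
# terminated by "data: [DONE]".
import json
import time

from flask import Flask, Response, request

app = Flask(__name__)

@app.route("/chat/completions", methods=["POST"])
def chat_completions():
    body = request.get_json(force=True)
    # In a real server you would call your own LLM with
    # body["messages"]; a canned reply stands in for it here.
    reply = "Hello! This response comes from my custom LLM."

    def stream():
        # Emit the reply word by word as OpenAI-style streaming chunks.
        for word in reply.split():
            chunk = {
                "id": "chatcmpl-custom-1",
                "object": "chat.completion.chunk",
                "created": int(time.time()),
                "model": body.get("model", "custom-llm"),
                "choices": [
                    {
                        "index": 0,
                        "delta": {"content": word + " "},
                        "finish_reason": None,
                    }
                ],
            }
            yield f"data: {json.dumps(chunk)}\n\n"
        # Final chunk carries finish_reason, then the [DONE] sentinel.
        final = {
            "id": "chatcmpl-custom-1",
            "object": "chat.completion.chunk",
            "created": int(time.time()),
            "model": body.get("model", "custom-llm"),
            "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
        }
        yield f"data: {json.dumps(final)}\n\n"
        yield "data: [DONE]\n\n"

    return Response(stream(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(port=5000)
```

If the assistant stays silent with no errors in the logs, the usual culprits are a missing `data: ` prefix on each SSE line, a content type other than `text/event-stream`, or response bodies that don't match the `chat.completion.chunk` shape above.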
j
Hey @Ikya, I'm having the same issue. Were you able to solve it? Thanks
s
Hey @Ikya To help track down this issue, could you share:
- The call ID
- When exactly this happened (the timestamp)
- What response you expected to get
- What response you actually got instead
This would really help me figure out what went wrong!