Custom LLM with Dialpad function
# support
a
Is there a way to support the dialpad function with a custom LLM?
v
@asdf Absolutely! The dialpad function can be supported with a custom language model (LLM) by training the model to understand and process voice commands related to dialing numbers. This would involve recognizing spoken digits and commands, and then translating them into actions that trigger the dialKeypad function to enter the appropriate digits. If you need assistance with integrating or training a custom LLM for this purpose, please let me know, and I'd be happy to help guide you through the process.
You can continue this conversation by mentioning me in the message.
s
yeah, it works the same way the other tool calls work. Try it once and let me know @asdf
a
@User Not sure if it works as intended. This is what I am doing. The custom LLM works, but do I need to add my own custom function to do the dialpad?

{
  model: {
    provider: "custom-llm",
    model: "gpt-4-0125-preview",
    url: `${process.env.NEXT_PUBLIC_SERVER_BASE_URL}/api/custom-llm`,
    temperature: 0.4,
    messages: [
      {
        content: initialPrompt,
        role: "system",
      },
    ],
    maxTokens: 250,
  },
  language: "en-US",
  dialKeypadFunctionEnabled: true,
  ...
}
What happens is it just says "1" instead of dialing 1 even when I pass in the parameter dialKeypadFunctionEnabled.
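For context, the /api/custom-llm URL referenced in the config above needs to behave like an OpenAI-compatible chat-completions endpoint, since that is what Vapi's custom-llm provider calls. Below is a minimal sketch of such a Next.js route handler that simply proxies to OpenAI and streams the response back unchanged; the upstream provider, headers, and error handling are illustrative assumptions, not details from this thread.

// app/api/custom-llm/route.ts (sketch only)
export async function POST(req: Request) {
  // Vapi sends an OpenAI-style chat-completions payload to this URL.
  const body = await req.json();

  // Forward it to OpenAI (or any OpenAI-compatible backend) with streaming enabled.
  const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ ...body, stream: true }),
  });

  // Pass the SSE stream straight back so Vapi can consume it as it arrives.
  return new Response(upstream.body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
    },
  });
}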
s
Simply use a tool of type dtmf, invoke it, and let Vapi handle it from there.
@asdf you can ping me once you're online and I will connect with you and help you out. Also, you can add the tool as:
{
  ...assistantConfig,
  tools: [{ type: "dtmf" }]
}
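Putting the two snippets together, the original config might end up looking roughly like this. It is only a sketch: it assumes the dtmf tool replaces the dialKeypadFunctionEnabled flag and that spreading it into the top level of the assistant config, as shown above, is the intended placement.

const initialPrompt = "You are a phone assistant..."; // placeholder for the real system prompt

const assistantConfig = {
  model: {
    provider: "custom-llm",
    model: "gpt-4-0125-preview",
    url: `${process.env.NEXT_PUBLIC_SERVER_BASE_URL}/api/custom-llm`,
    temperature: 0.4,
    messages: [{ content: initialPrompt, role: "system" }],
    maxTokens: 250,
  },
  language: "en-US",
};

// Add the DTMF tool; when the model invokes it, Vapi presses the keypad
// digits instead of the model just saying "1" out loud.
const assistant = {
  ...assistantConfig,
  tools: [{ type: "dtmf" }],
};

The idea is that keypad presses become a tool call that Vapi executes, rather than text the model reads back.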
Because of no response, closing this ticket.