Function call issue with custom LLM option
# support
l
Can someone please help me with this issue? The custom LLM currently works, but the function call is not being triggered.
Also, are concurrent calls limited to 3?
s
Does your Custom LLM support function calls? Yes, you can make three concurrent calls.
l
Yes, it does support function calls, but it doesn't work with Vapi for some reason.
s
Can you show me your function configuration settings?
l
s
Did you specify in your prompt to use your function when event X occurs? Also, which LLM model are you using right now?
l
Yes, GPT-4 Turbo, but through the custom LLM option.
I'm currently facing two issues. First, the custom LLM's function calling fails: it responds correctly, but it somehow keeps failing to trigger function calls. Second, I'm unable to add an API key for my function call to my own AWS-based API, which requires the header 'x-api-key'.
s
Try triggering a very basic function, like a getName function with no parameters, on a new assistant. Also, put your server URL in the Advanced section to monitor whether the call is being made or not.
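For example, a bare-bones no-parameter tool could look like the sketch below. getName is just the hypothetical test function suggested above, and I'm assuming Vapi accepts the standard OpenAI function-calling schema here:

```python
# Hypothetical minimal test tool with no parameters, written as a Python dict
# in the standard OpenAI function-calling schema. If even this never fires,
# the problem is upstream of your function definition.
get_name_tool = {
    "type": "function",
    "function": {
        "name": "getName",
        "description": "Return the caller's name. Exists only to verify that function calls trigger.",
        "parameters": {"type": "object", "properties": {}, "required": []},
    },
}
```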
l
The function call isn't being made; I monitor it in my AWS CloudWatch logs.
s
That's strange. Can you please share your call_id? Someone from the admin team will check it out.
l
ffea6426-c66a-4da7-86a8-d5ee0d7dd11a
s
@User Could you please look into this issue?
l
@Sahil
j
Let me confirm custom LLMs support function calls on our end
Yes, it looks like we do support function call parsing from the custom LLM output.
Are you streaming back the function calls to us as they come in from OpenAI?
s
```json
{
  "model": "gpt-4-0125-preview",
  "temperature": 0.7,
  "presence_penalty": 0,
  "frequency_penalty": 0,
  "top_p": 0.9,
  "stream": false,
  "messages": [
    {
      "role": "system",
      "content": "You are ChatGPT, a large language model trained by OpenAI.\nCarefully heed the user's instructions.\nRespond using Markdown"
    },
    {
      "role": "user",
      "content": "1+1"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_calculation_result",
        "description": "Calculate a math expression. For example, "2 + 2" or "2 * 2". The expression must be a valid JavaScript math expression.",
        "parameters": {
          "type": "object",
          "properties": {
            "expression": {
              "type": "string",
              "description": "A valid JavaScript math expression for the calculation."
            }
          },
          "required": [
            "keyword"
          ]
        }
      }
    }
  ],
  "tool_choice": "auto"
}
```
@jordan
They are following the exact same output format as OpenAI.
l
Currently the custom LLM works; only the function call part is not working. It is triggered by the AI but fails. I monitor my AWS CloudWatch logs, and there's no call hitting my API endpoint for the custom functions.
s
My intuition tells me that function calls should be made from the custom LLM part using tools, instead of Vapi's custom function part.
n
yes, you can actually return function calls to us in custom LLM
we call custom LLM through OpenAI SDK just by changing the base URL
so from our POV it's no different than calling OpenAI
function call etc. will work out of the box
the challenge is for you to figure out how to SSE back the function call
for that, write a script that uses the OpenAI SDK to call your custom LLM
with stream enabled
if you're able to successfully return the function call to your script, it'll work with us
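Rough sketch of that test script, assuming the Python OpenAI SDK (v1.x); the base_url is a placeholder for your custom LLM endpoint, and the tool mirrors the get_calculation_result one above:

```python
# Sketch: call the custom LLM through the OpenAI SDK with streaming enabled
# and check whether tool-call deltas come back over SSE.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-custom-llm.example.com/v1",  # placeholder endpoint
    api_key="unused-for-custom-llm",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_calculation_result",
        "description": "Calculate a math expression, e.g. '2 + 2'.",
        "parameters": {
            "type": "object",
            "properties": {
                "expression": {
                    "type": "string",
                    "description": "A valid JavaScript math expression.",
                },
            },
            "required": ["expression"],
        },
    },
}]

stream = client.chat.completions.create(
    model="gpt-4-0125-preview",
    messages=[{"role": "user", "content": "What is 17 * 23?"}],
    tools=tools,
    tool_choice="auto",
    stream=True,
)

# Accumulate the streamed tool-call fragments. If a name and arguments show up
# here, the same stream should work when Vapi hits the endpoint.
name, args = "", ""
for chunk in stream:
    if not chunk.choices:
        continue
    delta = chunk.choices[0].delta
    for tc in delta.tool_calls or []:
        if tc.function and tc.function.name:
            name += tc.function.name
        if tc.function and tc.function.arguments:
            args += tc.function.arguments

print("tool call:", name, args)
```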
l
@nikhil I PMed you with details about the situation. I don't think what you described is possible?
s
I'm also not able to perform function calling. It's clearly a Vapi server issue, as testing the endpoint with Postman is successful.
l
@nikhil any updates on the migration to tools & tool_choice?
n
hey, these should all be fixed now, pls lmk if you still have any issues
l
still not working @nikhil
n
come to office hours pls
l
can the office hours be earlier since it's 2am in my tz?
n
yes we need to run a poll and support more hours, sorry about that
cc @0xpranjal.eth
m
Running into a similar issue with 3.5 … the function calls are not happening… Please help
u
Hey @Maddy, let me check
l
Same as @Maddy
u
Hey guys, mind giving me your call IDs? @l.tcham90_00549_28060 @Maddy
It should've been working fine from our end; let us investigate it.
m
300a6ec9-1248-4cb4-872d-e256618c14ba
l
Call ID?
s
@0xpranjal.eth I'm having the same issue as well. The custom function isn't being called for me. The AI responds: "AI: Just a Give me a moment. I apologize, but it seems that I'm unable to retrieve information about..." Call ID: 3f9715c0-d854-465f-ba68-b73022363493
u
Yeah, the only fix right now seems to be using GPT-4 instead of 3.5
s
@0xpranjal.eth that's what I'm using. Any other suggestions, or do we just have to wait on the Vapi team?
m
@0xpranjal.eth tried switching to GPT-4 but it didn't help ... still no calls to the function
'id': 'e0ebd656-e41d-4db7-b29e-73987a500641'
s
@Maddy Same for me
u
@sandya5454 Maddy's issue seems to be related to the prompt. Could you help me with the call ID so that we can be certain?
s
@0xpranjal.eth 599c1d9a-51f7-46b3-b8c3-8cd81080904b
u
@sandya5454 Checking!
l
FYI, I had a similar problem, but I fixed it by using the deprecated API params `function_call` / `functions` instead of `tools` in ChatGPT.
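If I understand the workaround, the request switches from `tools`/`tool_choice` to the legacy params, roughly like this sketch (Python OpenAI SDK; model and prompt are placeholders):

```python
# Workaround sketch: same kind of request, but with the deprecated `functions` /
# `function_call` parameters instead of `tools` / `tool_choice`.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4-0125-preview",
    messages=[{"role": "user", "content": "What is 17 * 23?"}],
    functions=[{
        "name": "get_calculation_result",
        "description": "Calculate a math expression, e.g. '2 + 2'.",
        "parameters": {
            "type": "object",
            "properties": {
                "expression": {"type": "string"},
            },
            "required": ["expression"],
        },
    }],
    function_call="auto",
    stream=True,
)

# With the legacy params, the streamed deltas carry `function_call` instead of
# `tool_calls`, so check that field when reading the chunks.
for chunk in response:
    if chunk.choices and chunk.choices[0].delta.function_call:
        print(chunk.choices[0].delta.function_call)
```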
s
Thanks for letting me know @Lacoste
@0xpranjal.eth Any luck? Tried again, same issue. I'm not seeing any attempts to make a function call