Strict parameter doesn't work
# support
n
I see that y'all added the `strict` parameter to the function definition, but it still seems like something is not right, since I'm still seeing incorrect tool calls (invalid JSON) being passed around. If I change the model to `custom-llm` (proxying to OpenAI), it does work normally, so I assume the parameter is not being passed down to the model correctly when using the OpenAI models. Another thing I noticed is that parallel tool calls still happen when `strict` is enabled, but that should not happen, since structured output will not work in that case. From the OpenAI docs:
> Structured Outputs is not compatible with parallel function calls. When a parallel function call is generated, it may not match supplied schemas. Set parallel_tool_calls: false to disable parallel function calling. [ref](https://openai.com/index/introducing-structured-outputs-in-the-api/)

So I assume you should set `parallel_tool_calls` to false when `strict` is enabled.
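A minimal sketch of what I'd expect the upstream request to look like when `strict` is enabled (the tool and prompt here are just for illustration):

```js
const openai = require('openai');

const client = new openai.OpenAI();

client.chat.completions
  .create({
    model: 'gpt-4o',
    // Structured Outputs is not compatible with parallel function calls,
    // so disable them whenever any tool uses strict mode.
    parallel_tool_calls: false,
    tools: [
      {
        type: 'function',
        function: {
          name: 'SaveUserInformation', // hypothetical tool, for illustration
          strict: true,
          parameters: {
            type: 'object',
            properties: {
              firstName: { type: 'string' },
            },
            // Strict mode requires every property to be listed here.
            required: ['firstName'],
            additionalProperties: false,
          },
        },
      },
    ],
    messages: [{ role: 'user', content: 'My first name is Ada.' }],
  })
  .then((res) => console.log(res.choices[0].message));
```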
v
@Nomadiction Thank you for bringing this to our attention. It seems there might be an issue with how the `strict` parameter is being handled, especially in relation to `parallel_tool_calls`. To ensure that structured output works correctly with `strict` mode, it is indeed necessary to set `parallel_tool_calls` to false, as per the OpenAI documentation. As for the incorrect tool calls resulting in invalid JSON, this should not be happening, and it's important that we address it promptly. It's possible that a misconfiguration or a bug is causing the `strict` parameter not to be respected when proxying to OpenAI models.

To resolve these issues, I would recommend reaching out to our support team with specific examples of the incorrect tool calls you're experiencing. They will be able to investigate the problem in more detail and work on a fix. Please provide them with as much information as possible, including any error messages and the context in which the issue occurs. In the meantime, if you need to make any immediate changes to your setup, such as disabling parallel tool calls manually, please let me know, and I can assist you with that process.
You can continue this conversation by mentioning me in the message.
Got it. Tagging @User for additional help.
n
Call ID: `c2f3d259-4a7d-4072-a9dc-5874a5c0ddd2`
Ping @Shubham Bajaj
@User can I get some eyes here?
s
@Nomadiction Thanks for pinging me, please allow me some time to dig into this and find out what can be done.
@Nomadiction It will take me some more time to share a proper answer with you.
n
Any news @Shubham Bajaj ?
s
@Nomadiction If someone sets `strict` to true for the tool-calling schema, we only ensure the input JSON schema is passed through correctly. However, if a user accidentally triggers parallel tool calling through prompting, the tool input parameters that are sent should, by that logic, still be subject to `strict`.
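To illustrate what I mean by "passed through correctly" (this is a sketch only, not our actual implementation): whatever maps the stored tool definition onto the OpenAI request has to carry the `strict` flag through, roughly like this:

```js
// Illustrative sketch, not Vapi's actual code: mapping a stored tool
// definition onto the shape OpenAI's chat.completions API expects.
function toOpenAITool(tool) {
  return {
    type: 'function',
    function: {
      name: tool.function.name,
      description: tool.function.description,
      parameters: tool.function.parameters,
      // If this field is dropped during mapping, OpenAI silently treats
      // the tool as non-strict and skips schema validation, which would
      // match the behavior reported here.
      strict: tool.function.strict === true,
    },
  };
}
```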
@Nomadiction Will you be available for Discord office hours today? I'd be happy to resolve it over a live call.
n
For sure
We just spoke on the stage and agreed to:
1. Avoid using nested properties, as it might confuse the LLM (see the sketch below).
2. @Shubham Bajaj to follow up with Vapi's team to ensure that the parameters are passed correctly to OpenAI/Azure (especially the `strict` parameter).
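On point 1, a hypothetical before/after of the flattening we discussed:

```js
// Hypothetical example of the flattening discussed on the call.
// Nested object parameters can confuse the model:
const nestedParameters = {
  type: 'object',
  properties: {
    user: {
      type: 'object',
      properties: {
        firstName: { type: 'string' },
        lastName: { type: 'string' },
      },
      required: ['firstName', 'lastName'],
      additionalProperties: false,
    },
  },
  required: ['user'],
  additionalProperties: false,
};

// The same information flattened into top-level keys:
const flatParameters = {
  type: 'object',
  properties: {
    firstName: { type: 'string' },
    lastName: { type: 'string' },
  },
  required: ['firstName', 'lastName'],
  additionalProperties: false,
};
```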
Hey @Shubham Bajaj, I've run into a scenario that makes me pretty confident that you're not passing the `strict` parameter down to OpenAI. I created this tool:
```json
{
  "function": {
    "name": "SaveUserInformation",
    "strict": true,
    "parameters": {
      "type": "object",
      "properties": {
        "firstName": {
          "type": "string"
        },
        "lastName": {
          "type": "string"
        }
      },
      "required": [
        "firstName"
      ],
      "additionalProperties": false
    }
  }
}
```
> To use Structured Outputs, all fields or function parameters must be specified as required.

https://platform.openai.com/docs/guides/structured-outputs?context=ex1#all-fields-must-be-required
I can use this tool without any issues (except that even after setting lastName as required, I can still get the LLM to not pass it).
But if I call OpenAI directly with this tool, I get a validation error because of the rule quoted above.
Example:

```js
const openai = require('openai');

const client = new openai.OpenAI();

const result = client.chat.completions.create({
  model: 'gpt-4o',
  tools: [
    {
      type: 'function',
      function: {
        name: 'SaveUserInformation',
        strict: true,
        parameters: {
          type: 'object',
          properties: {
            firstName: {
              type: 'string',
            },
            lastName: {
              type: 'string',
            },
          },
          required: ['firstName'],
          additionalProperties: false,
        },
      },
    },
  ],
  messages: [{ role: 'user', content: 'Hello, world!' }],
});

result
  .then((res) => {
    console.log(res);
  })
  .catch((err) => {
    console.log(err);
  });
```
And the output is:
```
node -r dotenv/config fix.js
BadRequestError: 400 Invalid schema for function 'SaveUserInformation': In context=(), 'required' is required to be supplied and to be an array including every key in properties. Missing 'lastName'.
    at APIError.generate (node_modules/.pnpm/openai@4.77.3_zod@3.23.8/node_modules/openai/error.js:45:20)
    at OpenAI.makeStatusError (node_modules/.pnpm/openai@4.77.3_zod@3.23.8/node_modules/openai/core.js:293:33)
    at OpenAI.makeRequest (node_modules/.pnpm/openai@4.77.3_zod@3.23.8/node_modules/openai/core.js:337:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  status: 400,
  request_id: 'req_3ade456d5b96234816315e7c3049e1f7',
  error: {
    message: "Invalid schema for function 'SaveUserInformation': In context=(), 'required' is required to be supplied and to be an array including every key in properties. Missing 'lastName'.",
    type: 'invalid_request_error',
    param: 'tools[0].function.parameters',
    code: 'invalid_function_parameters'
  },
  code: 'invalid_function_parameters',
  param: 'tools[0].function.parameters',
  type: 'invalid_request_error'
}
```
So, if Vapi were passing the `strict` parameter down to OpenAI, the API should be erroring out in this case (because of the rule that all fields must be required in the schema). Can you share this example with the development team to double-check what's going on here?
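For reference, my reading of the docs is that a strict-compliant version of this schema would list every key in `required` and express optionality with a null type (a sketch, not verified through Vapi):

```js
// Strict-mode-compliant schema per the OpenAI docs quoted above:
// every property appears in `required`; optional fields are nullable.
const strictParameters = {
  type: 'object',
  properties: {
    firstName: { type: 'string' },
    // "Optional" is expressed as a nullable type rather than by
    // omitting the key from `required`.
    lastName: { type: ['string', 'null'] },
  },
  required: ['firstName', 'lastName'],
  additionalProperties: false,
};
```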
Ping @User
s
@Nomadiction It is on our radar and we are reviewing it internally. I understand it's taking time, but sometimes things that look easy on the surface turn out to be difficult. Given our other parallel work, we need some time for the final investigation and resolution.
n
np, I'm just following up to make sure this is being reviewed. Thanks for the reply
s
@Nomadiction We are verifying on our end and have created a ticket for this issue. While things may look simple on the surface, it could take some time since it involves real work. It's not particularly difficult, but bear with us. This is definitely model-related and may be on our side.
n
Gotcha! Eagerly waiting for this 🤩
s
@Nomadiction Will update you once ticket is resolved.
n
@Shubham Bajaj following up on this, any news?
s
@Nomadiction The ticket has not been picked up yet, and that's on us. You can expect a resolution by the end of this week.
n
thank you
s
@Nomadiction Just an update: the issue has not been picked up yet because of other parallel tasks, but it will be resolved soon.
n
Gotcha. Hoping to get this fixed this week 🙏
@Shubham Bajaj any news here?
s
@Nomadiction I apologize for the delay in providing you with a report and updates. I have just followed up with the team regarding this matter and will let you know once I hear back from them.
n
No news @Shubham Bajaj ?
Ping @Shubham Bajaj
@User
s
@Nomadiction Apologies for the delay, I have pinged the team again about this. Pushing this as a P1 now.
n
Do you have an ETA @User?
p
We'll let you know regarding the final ETA in some time.