pipeline-error-azure-openai-llm-failed
# support
p
Having issues where assistants that have a function are failing. I have tested my setup through Vapi to Pipedream and the API call works, but the following error is observed each time. This has occurred since 04/05.
Error: pipeline-error-azure-openai-llm-failed
Example calls:
Call ID 1: 2bcf85f1-85a6-4924-a75d-aade33acfbc5
Call ID 2: 1c80ee9f-a14b-4cc6-b2b0-3ef91ae26e64
s
Yeah, I noticed this issue as well. It doesn't occur that often, but I will talk to Nikhil about it, and if I find a fix I will share it with you.
So, it is a bug on the Vapi side. It will be fixed soon. // @pakachan
n
```
🔵 14:11:42:865 Couldn't Complete Completion Request. Error: {
  "message": "Invalid schema for function 'PlaceTakeoutOrder': In context=('properties', 'items'), array schema missing items",
  "type": "invalid_request_error",
  "param": null,
  "code": null
}
```
You can try hitting the OpenAI endpoint directly to test.
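For context on that error: in JSON Schema (which OpenAI and Azure OpenAI use for function/tool parameters), any property declared with `"type": "array"` must also include an `"items"` keyword describing the element type; omitting it produces exactly the "array schema missing items" rejection seen in the log. Below is a minimal, hedged sketch of the likely problem and fix. The `PlaceTakeoutOrder` property names here are assumptions for illustration only, since the actual schema from the failing assistant isn't shown in this thread.

```python
# Hypothetical parameter schema for a 'PlaceTakeoutOrder' function.
# (Property names are illustrative; the real schema isn't in the thread.)

# Broken: "items" property is an array but has no element schema,
# which triggers "array schema missing items".
broken_schema = {
    "type": "object",
    "properties": {
        "items": {"type": "array"},  # missing nested "items" keyword
    },
}

# Fixed: every array declares the type of its elements.
fixed_schema = {
    "type": "object",
    "properties": {
        "items": {
            "type": "array",
            "items": {"type": "string"},  # element schema is required
        },
    },
}

def arrays_have_items(schema: dict) -> bool:
    """Recursively verify that every array in the schema declares 'items'."""
    if schema.get("type") == "array" and "items" not in schema:
        return False
    for sub in schema.get("properties", {}).values():
        if isinstance(sub, dict) and not arrays_have_items(sub):
            return False
    nested = schema.get("items")
    if isinstance(nested, dict) and not arrays_have_items(nested):
        return False
    return True

print(arrays_have_items(broken_schema))  # False
print(arrays_have_items(fixed_schema))   # True
```

Running a check like this over your function definitions before updating the assistant should catch the schema problem without waiting for a live call to fail.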
a
Hi, I'm having the same issue. Here's the call ID if it helps: ad08e9d1-863f-46b4-9679-dbe3e1bd3ca6