Google multimodal model v2v doesn't work
# support
b
The call gets dropped as soon as I start talking. Call ID: 72f5bea4-f336-44c0-a1e0-bb3631e7e811. Pls help 🙏🏻
k
Hi! Looks like you're using custom credentials with Google's realtime models. Based on the code and the call logs, I'm guessing your permissions aren't set up right. We'll add more descriptive errors in the future. Sorry!
\*To clarify: Google's experimental and realtime models are deployed differently from regular Gemini models, and iirc require specific setup
b
Hey, I ran Gemini v2v on Daily Bots with the same API key and it worked fine
If there's anything I can do from my side, pls lmk
k
Can you try removing the custom credential and giving it a shot?
b
Tried to create a new assistant with that model
I think it crashes when I try to call a function
Here's an example of a crash
k
Hey, this is happening due to the same issue Ryan mentioned previously. It's not tool-specific; we just can't reach your LLM.
b
But my API key works with Daily Bots tho
And it does kinda work on Vapi, at least it's saying hello and stuff
The function call isn't working
k
Can you send me the call_id?
b
this one
k
??
b
It crashes when I ask it a question
k
Can you remove your Google API key and give it a shot?
b
e8ced81f-e6cb-4901-b609-153661bf01b3
same problem
k
It's because of your tool schema. Can you remove all your tools and give it a shot? Btw, this is the error message: `[VertexAI.ClientError]: got status: 400 Bad Request. {"error":{"code":400,"message":"Unable to submit request because required fields ['question'] are not defined in the schema properties.","status":"INVALID_ARGUMENT"}}`
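That error means the tool's parameter schema lists `question` under `required` without declaring it under `properties`. I can't see your exact tool definition, but roughly, the parameters object needs to look something like this (the description text here is just a placeholder, not your actual tool):

```json
{
  "type": "object",
  "properties": {
    "question": {
      "type": "string",
      "description": "The question the caller asked."
    }
  },
  "required": ["question"]
}
```

Every field named in `required` also has to be declared as a key under `properties`; once they match, the 400 from Vertex should go away.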