I'm using a custom LLM with Mistral and I'm gettin...
# support
h
Essentially it's outputting the function, but Vapi doesn't seem to trigger a function call afterwards. Call id: 7597455c-3b33-4bf1-be2d-a9f02c065615
@User
I looked at the stream of the Mistral function call, and it's likely because it doesn't exactly match OpenAI's format, i.e. it essentially doesn't stream function calls the same way.
How does Vapi expect function call streaming to look, so that I can use a Mistral model with function calls?
k
Looking into it
h
Thank you brother
Fixed. Created a formatting operator specifically for function calls to match OpenAI's format.
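The actual fix isn't shown in the thread, but a formatting shim like the one described might look roughly like this: a minimal sketch that takes a complete (non-streamed) function call, such as Mistral tends to emit, and re-emits it as OpenAI-style streaming delta chunks. The function name `to_openai_tool_call_chunks` and the chunking scheme are assumptions for illustration, not Vapi's implementation:

```python
import json

def to_openai_tool_call_chunks(name, arguments, call_id, chunk_size=20):
    """Hypothetical shim: turn one complete function call into
    OpenAI-style streaming delta chunks.

    OpenAI streams tool calls as deltas: the first delta carries the
    call id, type, and function name; subsequent deltas stream the JSON
    arguments string in fragments; a final chunk sets
    finish_reason="tool_calls" so the client knows to execute the call.
    """
    args_json = json.dumps(arguments)

    # First delta: id, type, and name, with an empty arguments string.
    chunks = [{
        "choices": [{"delta": {"tool_calls": [{
            "index": 0,
            "id": call_id,
            "type": "function",
            "function": {"name": name, "arguments": ""},
        }]}, "finish_reason": None}]
    }]

    # Middle deltas: the arguments JSON, split into fragments.
    for i in range(0, len(args_json), chunk_size):
        chunks.append({
            "choices": [{"delta": {"tool_calls": [{
                "index": 0,
                "function": {"arguments": args_json[i:i + chunk_size]},
            }]}, "finish_reason": None}]
        })

    # Final chunk: empty delta, finish_reason signals a tool call.
    chunks.append({"choices": [{"delta": {}, "finish_reason": "tool_calls"}]})
    return chunks
```

A client that consumes this stream would concatenate the `arguments` fragments and parse the result as JSON once it sees `finish_reason == "tool_calls"`.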
k
Glad it's fixed.