Changing LLM per node in Workflows not working
# support
k
Every time I try to create a new node with a different LLM behind it, I keep getting this error message. Has anyone figured out a solid solution? https://cdn.discordapp.com/attachments/1380407564730110004/1380407564927504455/Screenshot_2025-06-06_133742.png?ex=6843c43a&is=684272ba&hm=055cb2165a4b2e31b76bcd21c28ad56ab5ac3e48a16213fea33e79a29a4051a6&
v
Currently, Vapi Workflows only support `"openai"` and `"anthropic"` as model providers for each node. If you try to use a different provider, you'll get the "Save Failed" error you described. To resolve this, ensure every node's `model.provider` is set to either `"openai"` or `"anthropic"`. If you need support for other providers, you may want to submit a feature request or follow the [Vapi changelog](https://docs.vapi.ai/changelog) for updates.

Sources:
- [Vapi Workflows Documentation](https://docs.vapi.ai/workflows)
- [Vapi Changelog May 10, 2025](https://docs.vapi.ai/changelog/2025/5/10)
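As a quick sanity check, you could validate a workflow payload before saving it. This is a minimal sketch, not Vapi's actual validation: the node/workflow dictionary shape and field names here are assumptions for illustration, based only on the `model.provider` constraint described above.

```python
# Hypothetical pre-save check: Vapi Workflows currently accept only
# "openai" and "anthropic" as per-node model providers.
# The workflow/node structure below is assumed for illustration.

SUPPORTED_PROVIDERS = {"openai", "anthropic"}

def unsupported_nodes(workflow: dict) -> list[str]:
    """Return names of nodes whose model.provider is not supported."""
    bad = []
    for node in workflow.get("nodes", []):
        provider = node.get("model", {}).get("provider")
        if provider not in SUPPORTED_PROVIDERS:
            bad.append(node.get("name", "<unnamed>"))
    return bad

# Example payload: one supported node, one unsupported node.
workflow = {
    "nodes": [
        {"name": "greet", "model": {"provider": "openai"}},
        {"name": "route", "model": {"provider": "google"}},
    ]
}
print(unsupported_nodes(workflow))  # flags the node using "google"
```

Running a check like this before hitting save would tell you which node is triggering the "Save Failed" error.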
k
Yes, I am using OpenAI, and if I open a new workflow and try to change the LLM for even one node to a different OpenAI model, I get the exact same error.
k
Try clearing session data before switching, verifying the SSE setup, checking model-specific configs, and testing with a simple workflow.