Nomadiction
10/01/2024, 10:40 PM
openai provider. I switched to custom-llm targeting a local API that is basically a proxy to OpenAI, and it worked correctly.
Is this a bug on Vapi's side?

Vapi
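For context, switching an assistant from the openai provider to a custom-llm endpoint is done in the assistant's model config. The sketch below is an assumption about the shape of that config (the `url` value, port, and model name are placeholders, not taken from this thread); Vapi's custom-llm provider expects an OpenAI-compatible base URL:

```json
{
  "model": {
    "provider": "custom-llm",
    "url": "http://localhost:8000/v1",
    "model": "gpt-4o"
  }
}
```

With this in place, Vapi sends chat-completion requests to the local proxy instead of directly to OpenAI, which is the setup described above that behaved correctly.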
10/01/2024, 10:41 PM
openai provider. This is certainly not the expected behavior, as each call should be treated as a separate instance unless explicitly designed to maintain context.
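To illustrate the point about separate instances (this is an illustration of the stateless chat-completions model, not Vapi's actual code): each request contains the full context the model will see, so if context appears to persist across calls, something on the caller's side must be resending it.

```python
# Each chat-completions request is stateless: the model only sees the
# messages included in that request. Prior turns appear only if the
# caller explicitly passes them along.

def build_request(user_text, history=None):
    """Build a chat-completions payload; `history` is the caller's
    explicit choice to carry context between calls."""
    messages = [{"role": "system", "content": "You are a voice assistant."}]
    messages += history or []
    messages.append({"role": "user", "content": user_text})
    return {"model": "gpt-4o", "messages": messages}

# Two independent calls share nothing unless history is passed:
first = build_request("What's the weather?")
second = build_request("And tomorrow?")  # no history -> no leakage
followup = build_request("And tomorrow?", history=first["messages"][1:])
```

If two separate calls behave as if they share context, the payloads being sent are the first thing to inspect.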
To address this issue, I would recommend reviewing the configuration settings for the openai provider to ensure that session management is handled correctly. It's possible that a bug or misconfiguration is causing this behavior.
If you've already checked the configurations and the issue persists, it would be best to report this to our support team with the specific details of the calls and the unexpected behavior you're observing. They will be able to investigate further and provide a resolution.
Would you like me to assist you in reporting this issue to our support team?

Vapi
10/01/2024, 10:41 PM

Vapi
10/01/2024, 10:43 PM

Nomadiction
10/01/2024, 10:43 PM

Shubham Bajaj
10/04/2024, 2:16 PM