Using Fine-Tuned GPTs in VAPI
# support
d
I've been trying to use my own fine-tuned model with VAPI, and I keep getting an error right after the assistant plays the cached first-line greeting. It's not picking up the OpenAI API key. I had already added it to the custom LLM provider, so I tried adding it again to make sure it was there, and that's when this error started. I created the assistant and then patched it; that's how I got this far. metadataSend is off... but when I try to save those API credentials again in the LLM provider area, it acts as though OAuth credentials are required and will not save my OpenAI API key. As a result, every assistant that uses my fine-tuned model fails because authorization never happens.
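For context, the create-then-patch setup described above might look roughly like the following request body. This is a sketch under assumptions: the field names (`provider: "custom-llm"`, `url`, `metadataSendMode`) are modeled on common OpenAI-compatible custom-LLM provider configs and are not confirmed against the actual VAPI schema, and the fine-tune model id is a hypothetical placeholder.

```python
import json

# Hypothetical PATCH body for the assistant described above.
# Field names are assumptions, not confirmed VAPI API fields.
assistant_patch = {
    "model": {
        "provider": "custom-llm",  # assumption: custom LLM provider identifier
        # OpenAI-compatible chat completions endpoint for the fine-tuned model:
        "url": "https://api.openai.com/v1/chat/completions",
        # hypothetical fine-tune id in OpenAI's "ft:" naming convention:
        "model": "ft:gpt-3.5-turbo-0125:my-org::example",
        "metadataSendMode": "off",  # assumption: maps to "metadataSend is off"
    },
}

body = json.dumps(assistant_patch)
print(body)
```

If authorization fails with a payload like this, the key to check is the credential stored for the provider, not the assistant body itself, which matches the symptom reported here.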
v
@Digi Fidelis I'm sorry to hear you're experiencing issues with setting up your fine-tuned model with VAPI. It sounds like there might be a problem with the API key configuration or a mismatch in the expected credentials. To assist you better, could you please confirm if you have the correct OpenAI API key ready to be set up? Also, it would be helpful to know if you are receiving a specific error message that could give us more insight into the issue you're facing. Once I have this information, I can guide you through the steps to resolve the issue or direct you to the appropriate support channel for more specialized assistance.
You can continue this conversation by mentioning me in the message.
The error suggests it expects OAuth credentials.
Can you help?
Is this in the wrong place?
I'm still trying to fix this issue with the api key not saving on custom LLMs.
v
@Digi Fidelis It seems like there might be a configuration issue with the API key for your custom LLM. To address this, let's try the following steps:
1. Double-check that you have the correct OpenAI API key for your fine-tuned model.
2. Ensure that you are entering the API key in the correct format and in the appropriate field, without any leading or trailing spaces.
3. Verify that the API key has the necessary permissions and has not expired or been revoked.
If you've confirmed these details and the issue persists, it might be necessary to look into the system's requirements for OAuth credentials. Since you mentioned that the error suggests it expects OAuth credentials, it's possible that the system has been updated to require these for additional security or functionality. If you're unsure about the OAuth credentials or how to proceed, please reach out to support for more detailed guidance on this matter. They will be able to provide you with specific instructions or help troubleshoot the issue further. Would you like me to guide you on how to contact support or provide any additional assistance?
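Steps 1 and 2 above can be captured in a small local helper. This is a minimal sketch: it only trims whitespace and checks the usual `sk-` prefix of OpenAI keys; it does not verify the key against the API, and `sanity_check_key` is a hypothetical helper name, not part of any SDK.

```python
def sanity_check_key(key: str) -> str:
    """Trim whitespace and do a basic shape check on an OpenAI-style key.

    Local sanity check only: it catches pasted spaces and obviously wrong
    values, but does not confirm the key is valid or has permissions.
    """
    key = key.strip()  # step 2: remove leading/trailing spaces
    if not key.startswith("sk-"):
        raise ValueError("does not look like an OpenAI API key")
    return key

print(sanity_check_key("  sk-abc123  "))  # prints "sk-abc123"
```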
d
How do I get more support for this? The credentials are OpenAI credentials because it is a fine-tuned model. I have my OpenAI API key in the correct spot for OpenAI, and I also added it on the custom LLM previously. It never worked. I tried to update it recently after adding it the first time, and it gave me an error saying a plan was needed or something.
The error message was: "Authentication plan must be an object".
So naturally it sounds like the JavaScript is looking for OAuth credentials, even though entering them is supposed to be optional. Not all custom LLMs use OAuth authentication.
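The error message suggests the backend rejects any credential whose authentication plan is present but not a JSON object. The sketch below mimics that validation locally so the symptom is easy to reproduce; the field names (`authenticationPlan`, `apiKey`) are assumptions inferred from the error text, not a confirmed VAPI schema.

```python
def check_credential(payload: dict) -> list[str]:
    """Return validation errors, mimicking the reported server message."""
    errors = []
    plan = payload.get("authenticationPlan")
    # Present-but-not-an-object is what appears to trigger the error:
    if plan is not None and not isinstance(plan, dict):
        errors.append("Authentication plan must be an object")
    return errors

# A bare string (e.g. an empty form field serialized as "") reproduces
# the message the user saw, while omitting the plan entirely passes:
bad = {"provider": "custom-llm", "apiKey": "sk-...", "authenticationPlan": ""}
good = {"provider": "custom-llm", "apiKey": "sk-..."}  # no OAuth, so no plan

print(check_credential(bad))   # ['Authentication plan must be an object']
print(check_credential(good))  # []
```

This would be consistent with a UI bug in which an empty OAuth form serializes to an empty string instead of being omitted, which matches the fix mentioned in the next reply.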
s
@Digi Fidelis Sorry for the late response. Can you please try again? The UI bug has been fixed.
Closing this ticket due to no response.