Custom LLM not working
# support
m
Just added my custom LLM, a fine-tuned OpenAI model. It doesn't accept my custom LLM API key (my OpenAI API key), and the call gets ended immediately. #1293690685781839933
v
@Matrick I'm sorry to hear that you're experiencing issues with your custom LLM. To assist you better, could you please confirm if you've followed these steps correctly?
1. Set the custom LLM URL to https://api.openai.com/v1.
2. Assigned the custom LLM key to the OpenAI key.
3. Updated the model to your fine-tuned model.
If you've already done this and are still facing issues, it might be a problem with the API key or the model configuration. Please double-check that the API key is correct and has the necessary permissions. If everything seems in order, I would recommend reaching out to support for more detailed assistance on this matter.
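The three settings above can be sanity-checked outside Vapi by building the same request Vapi would send. A minimal sketch (the `build_request` helper and the sample model ID are illustrative, not part of Vapi):

```python
import json

OPENAI_BASE = "https://api.openai.com/v1"  # the custom LLM URL from step 1


def build_request(model: str, api_key: str):
    """Build the URL, headers, and JSON payload for an OpenAI
    chat-completion call, mirroring what a custom-LLM proxy forwards.

    Hypothetical helper for checking the three settings in isolation:
    base URL, API key, and fine-tuned model ID.
    """
    url = f"{OPENAI_BASE}/chat/completions"
    headers = {
        # Stray whitespace in a pasted key is a common cause of 401s.
        "Authorization": f"Bearer {api_key.strip()}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }
    return url, headers, json.dumps(payload)
```

Sending this with any HTTP client against the real endpoint should return a completion; a 401 points at the key, a 404 at the model ID.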
You can continue this conversation by mentioning me in the message.
m
I've followed all these steps; how do I reach out to support?
s
@Matrick could you record a Loom video of the error response you're getting, and can you also share the call ID so I can take a look?
Closing this ticket due to no response.
j
I have the same problem. I set the custom URL to https://api.openai.com/v1. I assigned the OpenAI key in Vapi under Providers Keys >> OpenAI, and I set the model to my fine-tuned model, something like this: ft:gpt-4o-mini-2024-07-18:personal::BGraXAE7:ckpt-step-60. When I make a test call, I can hear the first sentence of my assistant, but after my first question I get the error: An error occurred in this call: call.in-progress.error-vapifault-custom-llm-401-unauthorized
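The 401 here means OpenAI rejected the key that was forwarded on the request, so a useful step is to test the same key and model ID directly against OpenAI's models endpoint. A minimal sketch, assuming the model ID from the message above and an `OPENAI_API_KEY` environment variable (the `looks_like_fine_tune` pattern check is an illustrative helper, not an official format validator):

```python
import os
import re
import urllib.request

MODEL_ID = "ft:gpt-4o-mini-2024-07-18:personal::BGraXAE7:ckpt-step-60"

# Fine-tuned model IDs generally look like
# ft:<base>:<org>:<suffix>:<job-id>[:ckpt-step-N]; the suffix may be empty.
FT_ID = re.compile(r"^ft:[\w.-]+:[\w-]*:[\w-]*:\w+(:ckpt-step-\d+)?$")


def looks_like_fine_tune(model_id: str) -> bool:
    """Rough shape check on a fine-tuned model ID before making a call."""
    return bool(FT_ID.match(model_id))


def check_model_access(model_id: str) -> None:
    """Fetch the model with the raw key; raises HTTPError 401 if the key
    is rejected, 404 if the key cannot see this model ID."""
    req = urllib.request.Request(
        f"https://api.openai.com/v1/models/{model_id}",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    )
    urllib.request.urlopen(req)
```

If `check_model_access` succeeds with the raw key but the Vapi call still 401s, the key stored in the provider settings is likely not the one being tested (e.g. a different project key, or pasted with extra characters).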
k
Hey Javi, please create a new support ticket for your issue, as this ticket has been marked SOLVED and no longer falls under SLA. You won't receive any further responses here.