Fireworks.ai not working as Custom LLM
# support
e
Has anyone managed to use Fireworks.ai as the Custom LLM, please? I entered the following, but it doesn't work:
Custom LLM URL: https://api.fireworks.ai/inference/v1
Model: accounts/fireworks/models/llama-v3-70b-instruct
I also tried the Fireworks API key in the Custom LLM Provider key field, both as `APIKEY` and as `Bearer APIKEY`.
v
Hi @Ethan Tan, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly.
**While I generate an answer, here are some sources that might be helpful**:
* [Introduction]()
* [Custom LLM]()
* [Frequently Asked Questions]()
* [Connecting Your Custom LLM to Vapi: A Comprehensive Guide]()
### Answer
Sorry, I wasn't able to find any information about your question.
s
Currently, to use Fireworks.ai you need to configure your own server: Vapi sends the request (along with the token) to your custom server, your server forwards it to Fireworks, and then it responds back with the information in the format we support. For a detailed explanation, please see this documentation: https://docs.vapi.ai/custom-llm-guide
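In case it helps, here is a minimal sketch of that kind of proxy server, assuming Vapi forwards OpenAI-style chat completion requests to a `/chat/completions` endpoint on your server and that `FIREWORKS_API_KEY` is set in the environment (the endpoint path, model override, and variable names are illustrative, not the official implementation):

```python
# Minimal sketch of a custom LLM proxy between Vapi and Fireworks.ai.
# Assumptions (not from the official docs): Vapi posts an OpenAI-style
# chat completion body to /chat/completions, and FIREWORKS_API_KEY is set.
import os

import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse, StreamingResponse

app = FastAPI()

FIREWORKS_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
FIREWORKS_MODEL = "accounts/fireworks/models/llama-v3-70b-instruct"


@app.post("/chat/completions")
async def chat_completions(request: Request):
    body = await request.json()
    body["model"] = FIREWORKS_MODEL  # force the Fireworks model name
    headers = {
        "Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}",
        "Content-Type": "application/json",
    }

    if body.get("stream"):
        # Relay the SSE stream unchanged so tokens arrive as they are generated.
        async def relay():
            async with httpx.AsyncClient(timeout=None) as client:
                async with client.stream(
                    "POST", FIREWORKS_URL, json=body, headers=headers
                ) as upstream:
                    async for chunk in upstream.aiter_bytes():
                        yield chunk

        return StreamingResponse(relay(), media_type="text/event-stream")

    # Non-streaming: forward the request and return Fireworks' JSON response as-is.
    async with httpx.AsyncClient(timeout=None) as client:
        upstream = await client.post(FIREWORKS_URL, json=body, headers=headers)
        return JSONResponse(upstream.json(), status_code=upstream.status_code)
```

You would then point the Custom LLM URL at wherever this server is hosted, rather than at the Fireworks API directly.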
e
Ah I see, ok, thank you. Any plans to integrate Fireworks as an option soon?
s
You can submit a feature request at https://roadmap.vapi.ai/feature-requests
v
Glad I could be helpful. Feel free to create a new thread with any new questions you may have.