Securing a custom-llm endpoint with X_VAPI_SECRET
# support
m
How can I utilise endpoint security for my custom-llm endpoint? Is there a way to use the X_VAPI_SECRET header?
v
@mercury Absolutely, endpoint security is an important aspect of integrating with APIs. For your custom LLM endpoint, you can indeed use the X_VAPI_SECRET header to secure your API requests. This header carries a secret key that you define, which is used to authenticate requests to your endpoint. Here's how you can set it up:
1. Generate a strong secret key that will be used exclusively for this purpose.
2. Configure your custom LLM endpoint to require this secret key in the header of all incoming requests.
3. When making API requests to your custom LLM endpoint, include the X_VAPI_SECRET header with the secret key as its value.
If your custom LLM is hosted with one of our providers, you just need to add your provider keys and specify the use of the X_VAPI_SECRET header in your API requests. If it's hosted elsewhere, you can still use the Custom LLM provider option and specify the URL to your model in your API request, including the X_VAPI_SECRET header for security. Would you like to proceed with setting up the X_VAPI_SECRET header, or do you need assistance with something else?
You can continue this conversation by mentioning me in the message.
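Editor's note: a minimal sketch of step 2 above (requiring the secret on incoming requests) is shown below. It assumes the custom LLM is served by a FastAPI app exposing an OpenAI-style /chat/completions route and that the secret arrives in an x-vapi-secret request header; the route, header name, and environment variable are illustrative assumptions, not confirmed Vapi specifics.

```python
# Sketch only: verify the shared secret on every request to the custom LLM endpoint.
import os
import secrets

from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

# The expected secret is read from the environment rather than hard-coded.
EXPECTED_SECRET = os.environ["CUSTOM_LLM_SECRET"]  # hypothetical variable name


@app.post("/chat/completions")
async def chat_completions(
    payload: dict,
    x_vapi_secret: str | None = Header(default=None),  # maps to the "x-vapi-secret" header
):
    # Reject requests that are missing the header or carry the wrong value.
    # secrets.compare_digest avoids timing side channels in the comparison.
    if x_vapi_secret is None or not secrets.compare_digest(x_vapi_secret, EXPECTED_SECRET):
        raise HTTPException(status_code=401, detail="invalid or missing X_VAPI_SECRET")
    # ... forward `payload` to your model and return an OpenAI-style response ...
    return {"choices": [{"message": {"role": "assistant", "content": "ok"}}]}
```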
s
By restricting access to your custom LLM endpoint using the key.
v
Got it. Tagging @User for additional help.
m
Where is the setting for this one?
s
You need to write the code for it.
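For example, a quick local test of that check might look like the following sketch; the URL and secret value are placeholders for your own deployment.

```python
# Sketch only: exercise the endpoint with and without the secret header.
import requests

url = "http://localhost:8000/chat/completions"  # hypothetical local endpoint
body = {"model": "my-model", "messages": [{"role": "user", "content": "hi"}]}

# Without the header the endpoint should answer 401.
print(requests.post(url, json=body).status_code)

# With the correct secret it should be accepted.
print(requests.post(url, json=body, headers={"x-vapi-secret": "my-strong-secret"}).status_code)
```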
m
It's not coming through with the request; I see "no-custom-llm-key-provided".
s
Can you please record a Loom video and send it to me?
m
Added the custom LLM key, thank you Sahul.