LearningAI
04/22/2024, 8:28 AM

Vapi Support
04/22/2024, 8:28 AM

Vapi Support
04/22/2024, 8:29 AM
1. […] /credential endpoint and specify your custom model in your API requests (1).
2. Using Your Server: For LLMs hosted on your own server, you can create an endpoint compatible with the OpenAI Client. Your server can then interact with your knowledge base and vector store as needed. To implement this, post your server's API key and URL via the /credential endpoint and specify these in your assistant creation with the custom-llm model provider (see the sketch just after this list).
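As a rough illustration of those two calls, here is a minimal Python sketch. The /credential and /assistant endpoints and the custom-llm provider name come from the message above; the base URL and the exact request-body field names (apiKey, model.url, and so on) are assumptions here, so verify them against the Custom LLMs documentation before relying on this.

```python
import requests

VAPI_API_KEY = "your-vapi-api-key"  # placeholder
headers = {"Authorization": f"Bearer {VAPI_API_KEY}"}

# Step 1: register your server's API key with Vapi via the /credential endpoint.
credential = requests.post(
    "https://api.vapi.ai/credential",        # assumed base URL
    headers=headers,
    json={
        "provider": "custom-llm",             # assumed field names
        "apiKey": "your-own-server-api-key",  # the key Vapi should present to your server
    },
).json()

# Step 2: create an assistant that uses the custom-llm model provider and
# points at your OpenAI-compatible endpoint.
assistant = requests.post(
    "https://api.vapi.ai/assistant",
    headers=headers,
    json={
        "name": "Custom LLM Assistant",
        "model": {
            "provider": "custom-llm",
            "url": "https://your-server.example.com/openai/v1",  # your server's URL
            "model": "my-custom-model",
        },
    },
).json()

print("Created assistant:", assistant.get("id"))
```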
This setup allows Vapi to provide transcripts to your specified endpoint, which can then utilize your knowledge base for responses.
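On the server side, the endpoint Vapi talks to just needs to accept OpenAI-style chat-completion requests and return OpenAI-style responses. Below is a minimal Flask sketch under that assumption, with a hypothetical retrieve_context helper standing in for your own knowledge-base / vector-store lookup.

```python
import time
from flask import Flask, jsonify, request

app = Flask(__name__)

def retrieve_context(messages):
    """Hypothetical placeholder: query your knowledge base / vector store with the transcript."""
    return "Relevant facts pulled from your knowledge base."

@app.post("/openai/v1/chat/completions")
def chat_completions():
    body = request.get_json()
    messages = body.get("messages", [])
    context = retrieve_context(messages)

    # Build the reply however you like (call your own LLM, inject the retrieved context, etc.).
    reply = f"Answering with context: {context}"

    # Respond in the shape the OpenAI client (and therefore Vapi) expects.
    return jsonify({
        "id": "chatcmpl-custom-1",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": body.get("model", "my-custom-model"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply},
            "finish_reason": "stop",
        }],
    })

if __name__ == "__main__":
    app.run(port=8000)
```

Note that, depending on how the assistant is configured, Vapi may request streamed responses (stream: true), in which case the endpoint would need to return the OpenAI streaming (server-sent events) format rather than a single JSON body.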
For more detailed instructions on setting up your custom LLM with Vapi, refer to the Custom LLMs documentation.

Sahil
04/22/2024, 8:32 AM

LearningAI
04/22/2024, 4:17 PM

AiBizBox
04/23/2024, 2:37 PM

Sahil
04/23/2024, 3:42 PM

Sahil
04/23/2024, 3:43 PM

AiBizBox
04/23/2024, 7:12 PM

Sahil
04/23/2024, 7:14 PM

AiBizBox
04/23/2024, 7:16 PM

Sahil
04/23/2024, 7:17 PM

AiBizBox
04/23/2024, 7:18 PM

Sahil
04/23/2024, 7:19 PM

AiBizBox
04/23/2024, 7:19 PM