Knowledge base not referenced properly + hallucination
# support
s
Agent is not effectively utilising the knowledge base files. I do not understand how I can improve the prompt to direct the agent to use the files (these pages do not help https://docs.vapi.ai/customization/knowledgebase, https://docs.vapi.ai/prompting-guide). The agent also hallucinates a lot. I want it to only give answers from the knowledge base. Call id: 52ee429a-278d-4db9-914c-b061579a1ade
@User @Sahil
@Samesh I'm sorry to hear that you're experiencing issues with the agent not effectively using the Knowledge Base. To ensure the agent utilizes the Knowledge Base more effectively and reduces hallucinations, you can try the following steps:
1. **Explicitly Reference the Knowledge Base**: In your prompts, explicitly mention that the response should be based on the Knowledge Base. For example, "According to our Knowledge Base, can you tell me...?"
2. **Use Specific Keywords**: Include specific keywords or phrases from the Knowledge Base in your prompts to guide the agent towards the relevant information.
3. **Limit the Scope**: Narrow down the scope of the question to a specific topic that is covered in the Knowledge Base to make it easier for the agent to find and use the correct information.
4. **Feedback Loop**: Provide feedback to the agent when it does not use the Knowledge Base correctly. This can help in adjusting its responses in future interactions.
5. **Update the Knowledge Base**: Regularly update the Knowledge Base with accurate and relevant information to ensure the agent has the latest data to draw from.
6. **Restrict Responses**: Configure the agent to restrict its responses to only what is available in the Knowledge Base, if such a setting is available (see the sketch after this message for one way to do this via the system prompt).
If you've already tried these steps and are still facing issues, it might be beneficial to reach out to our support team for a more in-depth look into your specific case. Would you like me to assist you with contacting support, or is there anything else I can help you with right now?
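A minimal sketch of what step 6 could look like, assuming the assistant is created through Vapi's REST API with a strict system prompt. The endpoint and field names below follow the general shape of an assistant config but should be treated as illustrative assumptions, not a verified request body:

```python
# Sketch: create a Vapi assistant whose system prompt restricts answers to the KB.
# The endpoint and payload fields are assumptions for illustration only.
import os
import requests

SYSTEM_PROMPT = """You are a support agent.
Answer ONLY with information found in the attached knowledge base documents.
If the knowledge base does not contain the answer, say
"I don't have that information" instead of guessing."""

payload = {
    "name": "kb-restricted-agent",           # hypothetical assistant name
    "model": {
        "provider": "openai",
        "model": "gpt-4o",
        "temperature": 0.2,                    # lower temperature tends to reduce hallucination
        "messages": [{"role": "system", "content": SYSTEM_PROMPT}],
    },
}

resp = requests.post(
    "https://api.vapi.ai/assistant",
    headers={"Authorization": f"Bearer {os.environ['VAPI_API_KEY']}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```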
You can continue this conversation by mentioning me in the message.
Got it. Tagging @User for additional help.
s
@Samesh can you check the call id? The call was ended by the customer before the conversation even started.
s
Sorry. This is the right call id a2601f3f-b4c8-4722-9898-f9b3d0746a0d
And the issue is the last message from the agent (check transcript). It is just hallucinating and making up the response even though nothing like that was mentioned in my knowledge base anywhere
@Shubham Bajaj
s
If you check the message history, the model generated the response based on KB content.
s
How can I tell from the message history in the call logs whether the generated response was based on KB content? I don't think it was. I just want to understand how you are able to make that distinction.
s
Actually, we don't expose it.
We add the context from the KB via similarity search (based on the user's message), plus the original response,
which causes some confusion.
I would suggest using Pinecone or another vector DB
and querying it via tool calling.
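A minimal sketch of the retrieval side of that suggestion: embed the caller's latest message, run a similarity search against a Pinecone index, and return the top chunks as the tool result. The index name, embedding model, and metadata field are placeholders:

```python
# Sketch: retrieval helper for a hypothetical "search_knowledge_base" tool.
# Assumes an existing Pinecone index whose vectors carry a "text" metadata field.
import os

from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
index = pc.Index("support-kb")  # hypothetical index name


def search_knowledge_base(query: str, top_k: int = 3) -> str:
    """Embed the user's message and return the most similar KB chunks."""
    embedding = openai_client.embeddings.create(
        model="text-embedding-3-small",
        input=query,
    ).data[0].embedding

    results = index.query(vector=embedding, top_k=top_k, include_metadata=True)
    chunks = [match.metadata["text"] for match in results.matches]
    return "\n\n".join(chunks)


if __name__ == "__main__":
    print(search_knowledge_base("What is your refund policy?"))
```

The idea is that the assistant would call a function like `search_knowledge_base` (wired up however Vapi exposes tool/function calls to your server), so the retrieved chunks land in the conversation explicitly as tool output rather than being injected behind the scenes.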
s
Ah okay
So would this work with just tool calling, or would you suggest using a complete custom LLM? (and then retrieve from the vector DB on my own server)
(Basically setup the entire RAG on my own server)
Because I don't think using tool calling will effectively add the retrieved KB content to the context?
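For the full custom-RAG route, one common pattern is to host an OpenAI-compatible /chat/completions endpoint that does the retrieval itself before forwarding to the model, so the KB context is always injected into the prompt rather than depending on a tool call. A minimal non-streaming sketch, assuming Vapi's custom LLM option sends OpenAI-style chat-completion requests and reusing the hypothetical search_knowledge_base helper above (streaming, auth, and error handling omitted):

```python
# Sketch: OpenAI-compatible endpoint that injects retrieved KB context
# before forwarding the request to the real model. Non-streaming only.
from flask import Flask, request, jsonify
from openai import OpenAI

from kb_search import search_knowledge_base  # hypothetical module with the sketch above

app = Flask(__name__)
client = OpenAI()


@app.post("/chat/completions")
def chat_completions():
    body = request.get_json()
    messages = body["messages"]

    # Retrieve KB chunks for the latest user message and prepend them as an
    # extra system message so the model answers from that context.
    last_user = next(
        (m["content"] for m in reversed(messages) if m["role"] == "user"), ""
    )
    context = search_knowledge_base(last_user)
    grounded = [
        {"role": "system", "content": f"Answer only from this context:\n{context}"}
    ] + messages

    completion = client.chat.completions.create(
        model=body.get("model", "gpt-4o"),
        messages=grounded,
        temperature=body.get("temperature", 0.2),
    )
    return jsonify(completion.model_dump())


if __name__ == "__main__":
    app.run(port=8000)
```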
s
we will have a custom KB feature pretty soon
if you can wait, you can use it
otherwise you can go with the RAG flow
s
Right. So just doing custom RAG using the "custom LLM" feature seems best for now for my use case?
I don't see any other possible APIs on VAPI
s
yup.
will be available pretty soon.
not right now
s
Cool
Appreciate the quick help Sahil
I will check this out
But just to understand, what exactly is the difference between the upcoming "custom KB" feature and the existing KB feature? Is it in how you do retrieval?
Or are you basically going to store the KB content in a vector DB?
s
custom kb basically will allow you to bring your own kb
just like custom llm
s
Ah okay, that would be super cool