VAPI Blocks - Knowledge Files and Global Prompt?
# support
c
Hello! Really loving VAPI Blocks; however, to make it perfect we would need the ability to utilise a Global Prompt of some sort - to teach the LLM how to handle certain situations that may not be "on flow", for example if a user says "Wait, is this a human or AI?" Also, we would love this to have more knowledge of our customers - how do we provide files/knowledge to the VAPI agents?
v
@Chris Laupama Hello! I'm delighted to hear that you're enjoying VAPI Blocks!

For handling off-flow situations like when a user asks if they're interacting with a human or AI, you can use the Building Blocks of Effective Prompts to create a Global Prompt. This would involve setting up a prompt that can be triggered regardless of the current flow, to provide a consistent response to such questions.

As for providing more knowledge about your customers to the VAPI agents, you can integrate your customer data into the VAPI system. This can be done by using the Explicit Tool Integration feature, which allows you to feed external data into the system for the AI to use. You would need to format your customer data files in a way that's compatible with VAPI and then integrate them into the system.

If you need detailed instructions on how to set up a Global Prompt or integrate customer data, please let me know, and I can guide you through the process or direct you to the appropriate resources.
You can continue this conversation by mentioning me in the message.
Got it. Tagging @User for additional help.
c
@Vapi Can you please provide an example of the Global Prompt? Also, when it comes to knowledge, that sounds like a slow solution. You mean make an external tool to obtain text and then provide that output to the agent? Would I do that immediately upon the workflow starting? Can you explain how that would work?
v
@Chris Laupama Certainly! Here's an example of a Global Prompt that you could use to handle unexpected user queries like "Wait, is this a human or AI?":
```
I am an advanced AI designed to assist you. Whether you need help with a specific task or have general questions, I'm here to provide support. Just so you know, I'm not a human, but I'm programmed to be as helpful and informative as possible. If at any point you need to speak with a human, please let me know, and I can assist you with that.
```
This prompt sets the context for the AI's role and capabilities, and it can be triggered whenever a user expresses confusion about the nature of the agent they're interacting with.

Regarding the integration of customer knowledge, the process typically involves:
1. Preparing your customer data in a structured format that the VAPI system can understand.
2. Using an API endpoint or a webhook to make an external call to your database or CRM system at the start of the workflow or when specific information is needed.
3. The VAPI system receiving the data and using it to inform the agent's responses, ensuring they are personalized and relevant to the customer's history or profile.

This process does not necessarily slow down the interaction, as the data retrieval can be optimized for speed, and the system can cache frequently accessed information to reduce latency. To implement this, you would:
- Set up an API endpoint or webhook that the VAPI system can call.
- Configure the VAPI workflow to make this call at the appropriate time, such as at the beginning of the interaction or when the agent needs specific customer information.
- Ensure that the retrieved data is formatted in a way that the VAPI agent can use effectively in the conversation.

If you need further assistance with setting up the Global Prompt or integrating customer knowledge into the VAPI system, please let me know, and I can provide more detailed guidance or direct you to the right resources.
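As a rough illustration of that flow (this is a hypothetical sketch, not a Vapi API - the endpoint, field names, and prompt wording are all made up), you could fetch the customer record once at the start of the workflow and fold it into the system prompt the agent uses:

```python
# Hypothetical sketch: fetch customer data once at workflow start,
# then combine it with a Global Prompt into one system prompt.
# The fetch function is a stand-in for a real HTTP call to your CRM.

GLOBAL_PROMPT = (
    "I am an advanced AI designed to assist you. If asked whether I am a "
    "human, I explain that I am an AI and offer to connect you to a person."
)

def fetch_customer_record(customer_id: str) -> dict:
    """Stand-in for a CRM/webhook call; swap in a real request in practice."""
    # e.g. requests.get(f"https://crm.example.com/customers/{customer_id}")
    return {"name": "John Doe", "accountStatus": "Premium"}

def build_system_prompt(customer_id: str) -> str:
    record = fetch_customer_record(customer_id)
    context_lines = [f"- {key}: {value}" for key, value in record.items()]
    return (
        GLOBAL_PROMPT
        + "\n\nCustomer context (internal, do not read aloud):\n"
        + "\n".join(context_lines)
    )

prompt = build_system_prompt("cust_123")
```

Because the fetch happens once up front, the per-turn latency cost is zero; caching the record keeps repeat calls cheap.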
c
@Vapi Are VAPI Blocks conversations provided to each conversation step? Or is each step not aware of the previous conversation? I'm trying to figure out if I can provide the knowledge as context at the start without having to keep supplying it.
v
@Chris Laupama In VAPI Blocks, each conversation step can be aware of the previous conversation context. This means that you can provide the necessary knowledge or context at the start of the conversation, and it will be carried through to each subsequent step. This is typically managed through context management features within the VAPI system, which maintain state across the conversation.

Here's how it works:
1. At the beginning of the conversation, you can make an external call to retrieve customer knowledge and provide it as context.
2. The VAPI system stores this context and makes it available to each step in the conversation.
3. As the conversation progresses, the context is updated with new information from the user's inputs or system processes.
4. Each block in the conversation flow can access this context to provide relevant and personalized responses based on the user's history or profile.

By setting up the conversation this way, you won't need to repeatedly supply the same knowledge at each step. The initial context provided will persist and be accessible throughout the conversation, allowing for a seamless and efficient interaction. If you need more information on how to set up context management in VAPI Blocks or have any other questions, feel free to ask!
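The state-carrying idea can be sketched in a few lines (a toy model, not Vapi's actual implementation - the function names and state shape are assumptions): context is attached to a shared state object once, and every step reads from that state instead of being handed the context again.

```python
# Toy model of context persisting across conversation steps:
# context is stored once at conversation start, and each step
# reads it from shared state rather than receiving it again.

def start_conversation(customer_context: dict) -> dict:
    """Create the shared state once, at the start of the workflow."""
    return {"context": customer_context, "history": []}

def run_step(state: dict, user_input: str) -> str:
    """Each step sees the same context without it being re-supplied."""
    name = state["context"].get("name", "there")
    state["history"].append(user_input)
    return f"Thanks, {name} - noted: {user_input}"

state = start_conversation({"name": "John"})
reply1 = run_step(state, "I want to upgrade")
reply2 = run_step(state, "To the Premium plan")
```

Both replies can personalize using the name supplied only once, and the growing history shows state accumulating across steps.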
c
@Vapi if I supply a conversation step with context via data from an API call is there a way to ensure that the LLM won’t just say it out loud and be aware that it’s just context? Like using XML tags?
v
@Chris Laupama Yes, when you supply context to a conversation step from an API call, you can format the context in a way that the Language Learning Model (LLM) understands it's meant for internal use and not for verbalization. This is typically done by structuring the context data in a way that separates it from the conversational prompts. Here's how you can manage it:

1. **Metadata Tags**: Use metadata tags or a specific schema to denote information that is context-only. This metadata won't be part of the spoken response but will inform the LLM's understanding and responses.
2. **Structured Context**: Organize the context in a structured format, such as JSON, where you can clearly define which parts are for context and which are for user-facing dialogue.
3. **Instructions**: Include explicit instructions in the API call's response or in the context that indicate certain data is not to be spoken aloud.
4. **Separate Fields**: Use separate fields for context and dialogue within your data structure. For example, have a "context" field that the LLM knows is for background information and a "response" field for the information to be communicated to the user.

Here's a simplified example of how you might structure the data:
```json
{
  "context": {
    "customerData": {
      "name": "John Doe",
      "accountStatus": "Premium",
      // Other internal context data...
    }
  },
  "response": {
    "prompt": "How can I assist you today?"
  }
}
```
In this example, the LLM would understand that the information within the "context" object is for internal use to inform the conversation, while the "response" object contains the data that can be spoken to the user. By clearly delineating context from user-facing content, you can ensure that the LLM uses the provided data appropriately to enhance the conversation without reciting sensitive or internal information. If you need further clarification on how to implement this in VAPI Blocks, please let me know!
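One way to put that separation into practice (a sketch under assumptions - the `<context>` tag wording and the chat-message structure are illustrative, not a documented Vapi format) is to inject the "context" object as a system message with an explicit do-not-verbalize instruction, while only the "response" prompt reaches the spoken channel:

```python
import json

# Sketch: the "context" object becomes an internal system message,
# wrapped in tags and an instruction telling the model not to read it
# aloud; only the "response" prompt is intended for speech.

payload = json.loads("""{
  "context": {"customerData": {"name": "John Doe", "accountStatus": "Premium"}},
  "response": {"prompt": "How can I assist you today?"}
}""")

def to_messages(payload: dict) -> list:
    system = (
        "<context>Internal background only - never read this aloud:\n"
        + json.dumps(payload["context"])
        + "\n</context>"
    )
    return [
        {"role": "system", "content": system},
        {"role": "assistant", "content": payload["response"]["prompt"]},
    ]

messages = to_messages(payload)
```

The tag-plus-instruction pattern works because most LLMs follow system-level directives about which material is background; keeping the spoken prompt in a separate field makes the boundary unambiguous.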
c
@Vapi I’m not using the API, but I want to provide this context via the visual builder, how would I do that?
s
@Chris Laupama Blocks is currently in beta stage, so you can expect some features to not work as intended and to encounter bugs. It is not recommended for production use. For now, you can use Assistant (Prompting) instead.
c
Is there somewhere I can provide feedback?
I understand it's in Beta, I'm just getting my head around the functionality and how it may work. The suggestions I have feel like some small improvements that may be beneficial.
s
@Chris Laupama We are currently doing a complete rework of blocks, and new features will be released soon. Because of this, I suggest an alternative to save your time and effort. For sharing feedback, Discord tickets are the best method, and I will ensure your feedback is passed to the team.
c
Thank you so much for responding to me @Shubham Bajaj - I really appreciate it