I want to build a journal calling agent which can ...
# support
I want guidance on how to build a simple voice agent. Please help.
Hey @saketh_1201 Thank you for creating the ticket! A member of our team will reach out to you shortly.
@saketh_1201 Absolutely, I'd be glad to help you with that! Building a simple voice agent involves several key steps:

1. **Define the Purpose**: Determine what you want your voice agent to do. Will it answer questions, control smart home devices, provide customer support, etc.?
2. **Choose the Technology**: Decide on the technology or platform you'll use. This could be a voice API like VAPI, or a development framework like Google's Dialogflow or Amazon's Alexa Skills Kit.
3. **Design the Interaction**: Sketch out how you want the interactions to flow. This includes designing the conversation paths, prompts, and expected user responses.
4. **Develop the Voice Agent**: Write the code to handle the voice input and output. This will involve setting up intents, utterances, and possibly entities to capture the meaning of user queries.
5. **Integrate with a Voice API**: If you're using a service like VAPI, you'll need to integrate their API to handle the voice recognition and synthesis.
6. **Test and Iterate**: Test your voice agent with real users, gather feedback, and make improvements.
7. **Deploy**: Once you're satisfied with the performance, deploy your voice agent to the desired platform or device.

If you need more specific guidance on any of these steps or if you have any particular questions, feel free to ask!
The voice agent should call the user every day at a specific time and talk to the user in a friendly manner. The user's inputs should be stored somewhere and used in later conversations for a hyper-personalised experience.
@saketh_1201 Developing a voice agent that engages users in daily conversations and retains memory of prior interactions is an excellent initiative. By utilizing cron jobs, you can schedule your assistant to make daily calls at a designated time. However, this must be accomplished externally through programming, as it is not a native feature available to us. It is essential to incorporate warm and engaging script dialogues in your prompt.
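To make the external scheduling concrete, here is a minimal sketch of a daily trigger in plain Python. The `place_call` callable is a hypothetical helper standing in for whatever actually starts the outbound call (e.g. a request to your telephony/VAPI endpoint); in production you would more likely use a cron job or a task scheduler, as noted above.

```python
import datetime
import time
from typing import Optional

def seconds_until(hour: int, minute: int,
                  now: Optional[datetime.datetime] = None) -> float:
    """Seconds from `now` until the next daily occurrence of hour:minute."""
    now = now or datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # time already passed today -> tomorrow
    return (target - now).total_seconds()

def run_daily(hour: int, minute: int, place_call) -> None:
    """Sleep until hour:minute each day, then trigger the outbound call."""
    while True:
        time.sleep(seconds_until(hour, minute))
        place_call()  # hypothetical: starts the call via your voice API
```

The equivalent crontab entry for a 9:00 AM daily call would simply be `0 9 * * * python place_call.py`.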
Indeed, you can enable your voice agent to recall previous call details by integrating historical context into the conversation. Below are the steps to implement this:

1. Prior to each call, retrieve the customer's previous call history from your database.
2. Format this historical information in a manner that is comprehensible to the AI.
3. Include this context in your API calls to the language model.

__Here is an example of how to structure your API request:__
```json
{
  "model": {
    "provider": "openai",
    "model": "gpt-4",
    "messages": [
      {
        "role": "system",
        "content": "You are a voice assistant managing a call with a customer. Utilize the following previous call history to deliver more personalized and contextual responses."
      },
      {
        "role": "system",
        "content": "Previous call history: [Include formatted call logs, outcomes, and significant details from past interactions with this customer]"
      }
    ]
  }
}
```
This methodology enables your voice agent to maintain continuity across multiple conversations and offer a more personalized service by referencing past interactions and issues discussed with the customer.
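The three steps above can be sketched in Python as a small helper that assembles the request body shown earlier from stored history. This is only a sketch: how you fetch the history and where you send the payload depend on your database and the VAPI endpoint you use, so check their docs before relying on any names here.

```python
from typing import List

def build_model_payload(history: List[str]) -> dict:
    """Assemble the model config with prior call history injected as context."""
    history_text = "\n".join(f"- {item}" for item in history)
    return {
        "model": {
            "provider": "openai",
            "model": "gpt-4",
            "messages": [
                {
                    "role": "system",
                    "content": (
                        "You are a voice assistant managing a call with a customer. "
                        "Utilize the following previous call history to deliver more "
                        "personalized and contextual responses."
                    ),
                },
                {
                    "role": "system",
                    "content": f"Previous call history:\n{history_text}",
                },
            ],
        }
    }
```

You would build this payload right before each scheduled call, after pulling that customer's rows from your database.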
@saketh_1201 Do let me know your thoughts on this.
I made this flow in make.com where context from Google Sheets (the summary of the previous conversation) can be fed into the outbound caller. In the system prompt I am also trying to inject variables dynamically. After the call ends, the flow gets the transcript from the end-of-call report, which is summarised by ChatGPT and then stored in Google Sheets. Will this work? https://cdn.discordapp.com/attachments/1334393915071135825/1334811192010739773/image.png?ex=679de34b&is=679c91cb&hm=c9f0dee52e59c5a8a3b0433775bb052d34c028a9804f18b8b0cdc2cf193029dc& https://cdn.discordapp.com/attachments/1334393915071135825/1334811192362799175/image.png?ex=679de34b&is=679c91cb&hm=0d5cdeca561dedaa86aff5c91612df66d3e50993227005a34355dad0590fc9c7&
Make sure the summaries are descriptive so they can easily be used by the LLM.
Yes, the summaries are made by ChatGPT itself after VAPI emits the end-of-call report (transcript), and are then added to the Google Sheet.
Great, then it should work smoothly.