Make assistant speak after 5 seconds, in Custom...
# support
m
Hi, I am trying to build a solution using a custom LLM. I see that I can set the first message using the GUI in the dashboard. After the first message I want my assistant to wait for 10 seconds, and if the user is inactive, I want the LLM to speak or prompt the user to speak. Can I do this with a custom LLM in Python - FastAPI - webhook?
v
Hi there! 👋 I wasn't able to find specific code examples for implementing the 10-second delay with a prompt for inactive users, but I can definitely help with whether this is possible and the general approach! Yes, you can implement this functionality with a Custom LLM using Python, FastAPI, and webhooks. Here's how you can approach it:

## General Architecture

According to [Vapi documentation](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1330803502615625738), you can configure your Custom LLM to receive call events via webhooks, process these events, and trigger appropriate responses. This is perfect for your use case where you need to track inactivity.

## Setting Up Your Webhook Endpoint

You can create a FastAPI endpoint to handle incoming call data. Here's a simplified example of how you might set up a webhook endpoint:
```python
from fastapi import FastAPI, Form
from fastapi.responses import PlainTextResponse

app = FastAPI()

@app.post("/twilio/inbound_call", response_class=PlainTextResponse)
async def receive_call_data(
    AccountSid: str = Form(...),
    # Other parameters...
):
    # Process the call data
    # ...
    # Forward to Vapi
    # ...
    return ""
```
## Implementing the Delay Logic

For the 10-second delay and inactivity check, you would need to implement a timer or delay function in your Custom LLM's backend logic. According to [this source](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1333783762579755020), the general approach would be:

1. Play the initial message to the user
2. Start a timer (10 seconds in your case)
3. Listen for a response
4. If no response is detected within the specified time, send the follow-up message

A sketch of this timer loop follows at the end of this message.

## Configuring Idle Messages

While you mentioned you don't want to use idle messages specifically, the configuration approach is similar. In Vapi, you can configure message plans with parameters like:

```python
from typing import List

from pydantic import BaseModel

class MessagePlan(BaseModel):
    idleMessages: List[str] = ["Are you still there?"]
    idleMessageMaxSpokenCount: float = 3
    idleTimeoutSeconds: float = 10
```

This gives you an idea of how timing parameters are structured in Vapi.

## Monitoring Inactivity

According to [this source](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1342410769232302122), configuring your assistant to send server messages to your webhook allows you to track inactivity using metadata like duration and timestamps, which would be useful for your implementation.

Would you like me to help with any specific part of this implementation or provide more details on any of these components?

Sources:
- [Discord discussion on Custom LLM with webhooks](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1330803502615625738)
- [Discord discussion on implementing delays](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1333783762579755020)
- [Discord discussion on monitoring inactivity](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1342410769232302122)
- [Discord discussion on idle message configuration](https://discord.com/channels/1211482211119796234/1211483291191083018/threads/1344633027925377064)
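Here is a minimal sketch of that timer loop (steps 1-4 above), assuming an asyncio-based backend. `send_prompt()` is a hypothetical helper standing in for however you deliver the follow-up message, for example a POST to the call's control URL:

```python
import asyncio

# Hypothetical helper -- replace with however you deliver the follow-up
# message (e.g. a POST to the call's control URL).
async def send_prompt(call_id: str, text: str) -> None:
    ...

async def wait_for_user_or_prompt(call_id: str, user_spoke: asyncio.Event) -> None:
    """Wait up to 10 seconds for user activity; if none, prompt them to speak."""
    try:
        # user_spoke should be set by your webhook handler when a user
        # transcript/message arrives for this call.
        await asyncio.wait_for(user_spoke.wait(), timeout=10)
    except asyncio.TimeoutError:
        await send_prompt(call_id, "Are you still there?")
```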
s
You can do it either through idle messages or through live call control, but not through the custom LLM itself: Vapi sends the request to your custom LLM, and you cannot push a response on your own. You can only send a response back to us when there is a request that originated it.
p
Idle messages are working with a custom LLM 🙂
So if the initial message is spoken and the custom LLM doesn't get any response, the idle message is spoken
accca80e-11ac-418d-b82a-718501f04021
But can we do custom idle messages? The UI doesn't seem to have such a feature, just a choice list.
s
You can use the API to set custom idle messages as well.
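For example, a sketch along these lines should work. The field names under `messagePlan` are taken from the MessagePlan example earlier in this thread; verify them against the current API reference before relying on them:

```python
import requests

VAPI_API_KEY = "YOUR_VAPI_API_KEY"   # your private API key (placeholder)
ASSISTANT_ID = "YOUR_ASSISTANT_ID"   # placeholder

resp = requests.patch(
    f"https://api.vapi.ai/assistant/{ASSISTANT_ID}",
    headers={"Authorization": f"Bearer {VAPI_API_KEY}"},
    json={
        "messagePlan": {
            "idleMessages": ["Are you still there?"],
            "idleTimeoutSeconds": 10,
            "idleMessageMaxSpokenCount": 3,
        }
    },
)
resp.raise_for_status()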
m
@User Does inbound call support live call control?
The intended use case is to add a user message to the conversation - https://docs.vapi.ai/calls/call-features
k
Yes, you can obtain the specific control URL for each live call by hitting the /call endpoint, which returns the control URL
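A rough sketch of fetching it, assuming the control URL is exposed on the call object under `monitor.controlUrl` (worth confirming against the call object in the API reference):

```python
import requests

VAPI_API_KEY = "YOUR_VAPI_API_KEY"  # placeholder
CALL_ID = "YOUR_CALL_ID"            # e.g. from the webhook payload of an inbound call

call = requests.get(
    f"https://api.vapi.ai/call/{CALL_ID}",
    headers={"Authorization": f"Bearer {VAPI_API_KEY}"},
).json()

# Assumed field path; verify in the API reference.
control_url = call["monitor"]["controlUrl"]
print(control_url)
```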
m
https://docs.vapi.ai/api-reference/calls/create This is for outbound calling only, right? @Kings_big💫
@User
k
For both inbound and outbound calls
m
Perfect! Thanks @Kings_big💫
One more quick question @Kings_big💫 https://docs.vapi.ai/calls/call-features 2. Add Message to Conversation
```bash
curl -X POST 'https://aws-us-west-2-production1-phone-call-websocket.vapi.ai/7420f27a-30fd-4f49-a995-5549ae7cc00d/control' \
  -H 'content-type: application/json' \
  --data-raw '{
    "type": "add-message",
    "message": {
      "role": "system",
      "content": "New message added to conversation"
    },
    "triggerResponseEnabled": true
  }'
```
Can I do the same to add a message for `user`, to simulate?
k
Yeah, you can simulate a user message during a live call by using the add-message control with the role set to user.
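A Python sketch of the same request as the curl example above, just with the role swapped to `user` (the call ID in the control URL is a placeholder, and the message content is only illustrative):

```python
import requests

# control_url obtained from the call object, as discussed above
control_url = "https://aws-us-west-2-production1-phone-call-websocket.vapi.ai/<call-id>/control"

requests.post(
    control_url,
    headers={"content-type": "application/json"},
    json={
        "type": "add-message",
        "message": {
            "role": "user",                     # simulate the user speaking
            "content": "I'd like to hear more about pricing.",
        },
        "triggerResponseEnabled": True,         # let the assistant respond to it
    },
)
```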
m
Okay!
s
checking if this is resolved/solved for you?
k
@Mahimai Raja
m
Yes it is resolved
s
Marking this ticket as Solved ✅