VR
06/11/2024, 7:02 AM
typescript
import express, { Request, Response } from "express";
import { v4 as uuidv4 } from "uuid";
import moment from "moment";
import OpenAI from "openai";

const app = express();

app.post("/chat/completions", async (req: Request, res: Response) => {
  req.on("close", () => {
    res.end();
  });
  Speak(res, uuidv4(), "Hello, how are you doing today? I'm doing fantastic - it's such a beautiful day outside");
  res.end();
});
Speak() is straightforward:
typescript
function Speak(res: Response, id: string, content: string, end = false) {
  // Build an OpenAI-style streaming chunk
  const resp: OpenAI.ChatCompletionChunk = {
    id,
    object: "chat.completion.chunk",
    created: moment().unix(),
    model: "gpt-4o",
    system_fingerprint: undefined,
    choices: [
      { index: 0, delta: { role: "assistant", content }, logprobs: null, finish_reason: end ? "stop" : null },
    ],
  };
  // Emit the chunk as a server-sent event, then the [DONE] terminator
  res.write(`data: ${JSON.stringify(resp)}\n\n`);
  res.write("data: [DONE]\n\n");
}
Problem
Vapi can ping my endpoint multiple times for a single user response. For example, "Yeah, let's do it" will ping once for "Yeah" and once for "Yeah, let's do it". This hits my endpoint twice, which leads to the agent speaking both responses one after another.
Expected Behavior
The agent should cut off the stale request's speech and only speak the response from the latest request.
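Roughly the kind of handling I have in mind, as a sketch only (the activeResponse variable and the setTimeout stand-in for real work are illustrative, not part of my code above):
typescript
// Sketch: keep a reference to the response that is currently streaming,
// and end it as soon as a newer request arrives. activeResponse is
// illustrative only and not part of the code above.
let activeResponse: Response | null = null;

app.post("/chat/completions", async (req: Request, res: Response) => {
  // A newer request supersedes whatever is still streaming, so cut it off.
  if (activeResponse && !activeResponse.writableEnded) {
    activeResponse.end();
  }
  activeResponse = res;

  // Stand-in for real async work (e.g. generating the reply).
  await new Promise((resolve) => setTimeout(resolve, 500));

  // If this request was itself superseded while waiting, stay silent.
  if (res.writableEnded) return;

  Speak(res, uuidv4(), "Hello, how are you doing today?");
  res.end();
});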
Data
Call ID: b79b1b8c-cb49-4241-92bd-f22822b77a6d
Vapi
06/11/2024, 7:03 AM
Vapi
06/11/2024, 7:04 AM
…Speak() function.
5. If it matches, proceed with the Speak() function.
Here's how you could modify your endpoint to include this logic:
typescript
let latestRequestId: string = "";

app.post("/chat/completions", async (req: Request, res: Response) => {
  const requestId = uuidv4();
  latestRequestId = requestId; // Update the latest request ID

  req.on("close", () => {
    res.end();
  });

  // Check if the current request ID is the latest before speaking
  if (requestId === latestRequestId) {
    Speak(res, requestId, "Hello, how are you doing today? I'm doing fantastic - it's such a beautiful day outside");
  }
  res.end();
});
This should ensure that only the response to the latest request is spoken by the agent. If you encounter any further issues or need more assistance, please let me know!
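One thing worth noting about the snippet above: the requestId === latestRequestId check only helps if something asynchronous happens between updating latestRequestId and calling Speak(); checked immediately after the assignment it will always pass. A minimal sketch of re-checking after that async step, where generateReply() is a hypothetical placeholder for whatever actually produces the reply (e.g. an upstream LLM call):
typescript
// generateReply() is a hypothetical stand-in for the real async work.
async function generateReply(req: Request): Promise<string> {
  return "Hello, how are you doing today?";
}

let latestRequestId = "";

app.post("/chat/completions", async (req: Request, res: Response) => {
  const requestId = uuidv4();
  latestRequestId = requestId;

  const reply = await generateReply(req);

  // Re-check after the await: a newer request may have arrived in the meantime.
  if (requestId !== latestRequestId) {
    res.end(); // stale request, say nothing
    return;
  }

  Speak(res, requestId, reply, true);
  res.end();
});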
Vapi
06/11/2024, 7:04 AM
Vapi
06/11/2024, 7:05 AM
Sahil
06/11/2024, 8:15 AM
VR
06/11/2024, 7:44 PM
Sahil
06/12/2024, 8:28 AM
VR
06/13/2024, 12:45 AM
Sahil
06/13/2024, 6:38 AM