Metadata not working
# support
e
I created an example assistant:

```json
{
  "id": "12aff13c-16b8-4d2c-b12d-77a7f34e4c4b",
  "orgId": "c6897ea3-33f5-4ab4-a579-fb4f7ac3847f",
  "name": "metadata",
  "voice": { "voiceId": "248be419-c632-4f23-adf1-5324ed7dbf1d", "provider": "cartesia" },
  "createdAt": "2024-08-29T13:21:58.593Z",
  "updatedAt": "2024-08-29T16:15:43.824Z",
  "model": {
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "system",
        "content": "This is a blank template with minimal defaults, you can change the model, temperature, and messages."
      }
    ],
    "provider": "openai",
    "temperature": 0.7
  },
  "clientMessages": ["metadata"],
  "serverMessages": [],
  "metadata": { "test": "hello" },
  "isServerUrlSecretSet": false
}
```

On any call I place with this assistant, no metadata shows up, for example call ID b6716729-0a4b-4ad8-b60e-f37065d5f83d. How can I get metadata into the call?
v
@eric11 It seems like you've already set up an assistant with metadata included in its configuration. However, if the metadata isn't showing up in the calls you're making, there might be an issue with how the metadata is being passed or retrieved during the call process.

To ensure that metadata is included in your calls, pass the metadata object along with your call request. The metadata should be structured as key-value pairs and included in the body of your API request when you initiate the call. Here's a general example of how you might include metadata in your call request:
```json
{
  "metadata": {
    "key1": "value1",
    "key2": "value2"
    // ... additional metadata key-value pairs
  }
  // ... other necessary call parameters
}
```
Make sure that when you're making the call, you're including the metadata in the correct format and that your system is configured to capture and display this metadata. If you're still having trouble, please provide more details about how you're making the call, and I can assist you further. If there's a specific function you need to call to include metadata, please let me know, and I'll help you with that.
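For illustration, a complete request in that shape might look like the sketch below. This is only a sketch: the `POST https://api.vapi.ai/call` endpoint and the assumption that a top-level `metadata` field is accepted at call creation are not confirmed in this thread (the follow-up below suggests it may not behave as expected), and the API key is a placeholder.

```typescript
// Sketch only: endpoint and accepted fields are assumptions based on the
// snippet above, not confirmed behaviour.
const response = await fetch("https://api.vapi.ai/call", {
  method: "POST",
  headers: {
    Authorization: "Bearer <YOUR_VAPI_API_KEY>", // placeholder token
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    assistantId: "12aff13c-16b8-4d2c-b12d-77a7f34e4c4b", // the assistant above
    metadata: {
      key1: "value1",
      key2: "value2",
    },
    // ... other call parameters (customer, phone number, etc.)
  }),
});
const call = await response.json();
console.log(call.id, call.metadata);
```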
You can continue this conversation by mentioning me in the message.
e
The format you provided does not work. See this call ID: 64edaa7e-946a-41e4-aad3-615fcdb9fe51, and the assistant:

```json
{
  "id": "12aff13c-16b8-4d2c-b12d-77a7f34e4c4b",
  "orgId": "c6897ea3-33f5-4ab4-a579-fb4f7ac3847f",
  "name": "metadata",
  "voice": { "voiceId": "248be419-c632-4f23-adf1-5324ed7dbf1d", "provider": "cartesia" },
  "createdAt": "2024-08-29T13:21:58.593Z",
  "updatedAt": "2024-08-29T16:24:40.727Z",
  "model": {
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "system",
        "content": "This is a blank template with minimal defaults, you can change the model, temperature, and messages."
      }
    ],
    "provider": "openai",
    "temperature": 0.7
  },
  "clientMessages": ["metadata"],
  "serverMessages": ["end-of-call-report"],
  "metadata": { "test": "key1: value1" },
  "isServerUrlSecretSet": false
}
```
@User
s
@eric11 please check https://docs.vapi.ai/server-url/events#retrieving-assistants and pass the metadata as an assistant override.
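For the web snippet / Web SDK, that would mean passing the metadata in the assistant overrides when starting the call. A minimal sketch, assuming the @vapi-ai/web package and that start() accepts (assistantId, assistantOverrides); the public key and the extra metadata field are placeholders:

```typescript
import Vapi from "@vapi-ai/web";

// Sketch: assumes the Web SDK's start() takes (assistantId, assistantOverrides).
const vapi = new Vapi("<YOUR_PUBLIC_KEY>"); // placeholder public key

const assistantOverrides = {
  metadata: {
    test: "hello",     // same key used in the assistant config above
    source: "web",     // hypothetical extra field
  },
};

vapi.start("12aff13c-16b8-4d2c-b12d-77a7f34e4c4b", assistantOverrides);
```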
e
I now managed to update the call with metadata. For example, I sent this as a PATCH request when the call starts:

```json
[
  {
    "ca": null,
    "qs": [],
    "url": "https://api.vapi.ai/call/1becfae3-a22f-4c21-97b6-845f1629c259",
    "data": "{\n \"metadata\": {\n \"callid\": \"1becfae3-a22f-4c21-97b6-845f1629c259\"\n }\n}",
    "gzip": true,
    "method": "patch",
    "headers": [
      { "name": "Authorization", "value": "Bearer xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" },
      { "name": "Content-Type", "value": "application/json" }
    ],
    "timeout": null,
    "useMtls": false,
    "authPass": null,
    "authUser": null,
    "bodyType": "raw",
    "contentType": "application/json",
    "serializeUrl": false,
    "shareCookies": false,
    "parseResponse": false,
    "followRedirect": true,
    "useQuerystring": false,
    "followAllRedirects": false,
    "rejectUnauthorized": true
  }
]
```

In the logs the metadata now shows up, so that part is working. However, with the web snippet SDK the metadata is still not being sent?
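For reference, the PATCH exported above boils down to a single request; a sketch of the equivalent fetch call (the call ID comes from the export, the bearer token is a placeholder):

```typescript
// Sketch of the PATCH request above; the bearer token is a placeholder.
const callId = "1becfae3-a22f-4c21-97b6-845f1629c259";

await fetch(`https://api.vapi.ai/call/${callId}`, {
  method: "PATCH",
  headers: {
    Authorization: "Bearer <YOUR_VAPI_API_KEY>",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    metadata: { callid: callId },
  }),
});
```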
s
Can you share how you're sending the metadata using the web snippet?