Passing metadata values into an assistant's respons...
# support
h
I wanted to create a dedicated thread for this. I'm looking to pass metadata into each assistant response in order to track the system prompt used to generate that response (I need this for evals and production).
v
Thanks for your question! While Vapi supports assistant-level metadata and dynamic variables, I couldn't find documentation for automatically including the system prompt or metadata in each custom LLM response. Could you clarify: do you want the system prompt returned in every API response from your custom LLM endpoint, or are you looking for a way to log it separately for each interaction?

Sources:
- [Vapi Custom LLM Tool Calling Integration](https://docs.vapi.ai/customization/tool-calling-integration)
- [Vapi Assistant Personalization](https://docs.vapi.ai/assistants/personalization)
- [Vapi Assistant API Reference](https://docs.vapi.ai/api-reference/assistants/update)
- [Vapi Call Analysis](https://docs.vapi.ai/assistants/call-analysis)
h
No, I will log it in my own DB. I just want to be able to pass a UUID so I can retrieve the system prompt used for that message.
a
When initiating the call, just pass the metadata property, which takes key-value pairs.
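For example, a call-creation payload with a metadata key-value pair might look like this (a minimal sketch; the `prompt_id` key, the assistant ID placeholder, and the surrounding payload shape are illustrative, not a complete API request):

```python
import json
import uuid

# Hypothetical UUID identifying the system-prompt version stored in your own DB.
prompt_id = str(uuid.uuid4())

# Sketch of a call-creation payload (POST to the Vapi /call endpoint);
# the metadata property accepts arbitrary key-value pairs.
payload = {
    "assistantId": "YOUR_ASSISTANT_ID",  # placeholder
    "metadata": {"prompt_id": prompt_id},
}

print(json.dumps(payload, indent=2))
```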
h
Yes, but that would be call metadata. Is it possible to do that per message,
i.e. per assistant response?
a
You can do both: if you want it at the call level, pass it in the /call request, and if you want it at the assistant level, pass it when creating the assistant.
h
🤣 OK Sahil, clearly we're struggling to understand each other. I don't want it per call or per assistant; I want it per response/chat completion. It's fine, I've done it within my code outside of Vapi 🙂
a
Hugo, if you add it at the call level, then the same data will be provided during each chat completion under the call object.
But glad you were able to figure it out.
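The behavior described above, with call-level metadata echoed under the call object in each chat-completion request, could be read back in a custom LLM endpoint roughly like this (the surrounding request shape and field values are assumptions; only the `call.metadata` nesting follows from the thread):

```python
# Sketch of extracting call-level metadata from a chat-completion request
# body sent to a custom LLM endpoint. Everything except the call.metadata
# nesting is illustrative.
request_body = {
    "model": "gpt-4o",                          # illustrative
    "messages": [{"role": "user", "content": "Hello"}],
    "call": {
        "id": "call_123",                       # illustrative
        "metadata": {"prompt_id": "3f2a..."},   # set at call creation
    },
}

# Defensive lookup so a missing call object or metadata yields None.
prompt_id = request_body.get("call", {}).get("metadata", {}).get("prompt_id")
print(prompt_id)  # the UUID used to look up the system prompt in your own DB
```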