Replacing the System Prompt During a Call
# support
j
I need a checkbox for replacing the system prompt with responses that meet certain criteria (e.g., those that contain the string "LLM PROMPT"): instead of speaking such a response, add it to the context and generate another response.
v
Hi @jims, I'm an AI assistant for Vapi. I'll analyze available sources and get back to you shortly.

-----------------------------

**While I generate an answer, here are some sources that might be helpful:**

* [Web Calling]()
* [Outbound Calling]()
* [Inbound Calling]()
* [Dashboard]()

-----------------------------
### Answer

Currently, Vapi does not support dynamically altering the system prompt during a call based on the content of responses or other criteria, such as the presence of specific strings. System prompts are set prior to the call as part of the assistant's initial configuration [(1)]().

For custom behaviors like modifying the system prompt dynamically based on the conversation, you would typically need to handle this logic outside of Vapi's standard configuration, for example by managing the conversation flow within your own application and interfacing with Vapi for voice capabilities.

If you need real-time decision-making in response handling, consider custom server logic in which your application assesses and directs the flow of the conversation. Your server would make decisions based on the assistant's responses and then push updates or new prompts as needed. This kind of functionality requires custom development beyond the standard Vapi features described in the documentation.

For more detailed guidance on setting up and configuring assistants, see the existing documentation on creating and managing assistants [(2)]().
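As a rough illustration of the server-side logic described above, here is a minimal sketch (not part of Vapi's API; the function name, message shape, and trigger string are assumptions for illustration). It checks each generated response for the trigger string with a case-sensitive match; on a hit it silently folds the response into the context as a new system message and signals the caller to regenerate, otherwise it appends the response as a normal assistant turn to be spoken:

```python
# Hypothetical server-side handler: decide whether a response should be
# spoken, or absorbed into the context with a regeneration requested.

TRIGGER = "LLM PROMPT"  # case-sensitive, per the original request


def handle_response(messages: list[dict], response: str) -> tuple[list[dict], bool]:
    """Return (updated message context, whether to regenerate).

    `messages` is an OpenAI-style list of {"role": ..., "content": ...}
    dicts; the exact shape is an assumption for this sketch.
    """
    if TRIGGER in response:  # Python's `in` is case-sensitive
        # Suppress speech: add the response to the context as a system
        # message and ask the caller to generate another response.
        return messages + [{"role": "system", "content": response}], True
    # Normal case: append as an assistant turn and speak it.
    return messages + [{"role": "assistant", "content": response}], False
```

For example, `handle_response([], "LLM PROMPT: be terse")` returns a context ending in a system message plus `True` (regenerate), while a lowercase `"llm prompt"` response is treated as ordinary speech because the match is case-sensitive.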
Got it. Tagging @User for additional help.
j
Case sensitive, please.
s
Can you add this request here https://roadmap.vapi.ai/feature-requests ?