Add a suffix to all user messages before sending t...
# support
m
We noticed that adding a specific suffix to the user's message before sending to the LLM improves the quality of the generated text a lot. For example "think step by step", or "respond in this format ..." I want this to be in the user's message not in the system prompt. Is there a way to achieve this?
v
Yes, you can definitely add a specific suffix to the user's message before sending it to the LLM in Vapi! Here's how you can do it:

1. **Using the `.send()` Method**: You can modify the user's message directly in your code by appending the desired suffix before sending it. For example, if you want to add "think step by step" as a suffix, you can do it like this:

```javascript
vapi.send({
  type: "add-message",
  message: {
    role: "user",
    content: "This is the user's message [think step by step]"
  }
});
```

In this example, replace `[think step by step]` with whatever suffix you want to add.

2. **Controlling Responses**: If you want to add the suffix without triggering an immediate response from the LLM, you can set the `triggerResponseEnabled` flag to false:

```javascript
vapi.send({
  type: "add-message",
  message: {
    role: "user",
    content: "This is the user's message [think step by step]"
  },
  triggerResponseEnabled: false
});
```

This way, you can customize the user's message effectively before it reaches the LLM. If you have any more questions or need further assistance, feel free to ask! 😊

Sources:
- [Modifying User Messages in Vapi to Include a Specific Suffix](https://docs.vapi.ai/customization/custom-llm/fine-tuned-openai-models)
- [Vapi SDK Documentation](https://docs.vapi.ai/sdk/web)
m
@Vapi Support Bot I don't want to use a Custom Model
v
I understand you don't want to use a custom LLM. Could you clarify if you're looking for a way to add a suffix to user messages using Vapi's standard features, without involving any custom LLM setup? This will help me provide a more accurate solution for your needs.

Sources:
- [Modifying User Messages in Vapi to Include a Specific Suffix](https://docs.vapi.ai/customization/custom-llm/fine-tuned-openai-models)
- [Vapi SDK Documentation](https://docs.vapi.ai/sdk/web)
m
@Vapi Support Bot Yes I want to add a suffix to all user messages without using a Custom Model
v
Great! You can definitely add a suffix to all user messages in Vapi without using a custom model. Here's how you can do it:

1. **Using the `.send()` Method**: You can modify the user's message directly in your code by appending the desired suffix before sending it. For example, if you want to add "think step by step" as a suffix, you can do it like this:

```javascript
vapi.send({
  type: "add-message",
  message: {
    role: "user",
    content: "This is the user's message [think step by step]"
  }
});
```

Just replace `[think step by step]` with whatever suffix you want to add.

2. **Controlling Responses**: If you want to add the suffix without triggering an immediate response from the LLM, you can set the `triggerResponseEnabled` flag to false:

```javascript
vapi.send({
  type: "add-message",
  message: {
    role: "user",
    content: "This is the user's message [think step by step]"
  },
  triggerResponseEnabled: false
});
```

This way, you can effectively customize the user's message before it reaches the LLM. If you have any more questions or need further assistance, feel free to ask! 😊

Sources:
- [Modifying User Messages in Vapi to Include a Specific Suffix](https://docs.vapi.ai/customization/custom-llm/fine-tuned-openai-models)
- [Vapi SDK Documentation](https://docs.vapi.ai/sdk/web)
m
Hey Mohab, `.send()` won't work for what you're describing; it's more for physical interactions, like a button press or something.
m
Hey Mason, thanks! Yeah, I figured. Do you know if adding a suffix/prefix to user messages is doable without a custom model?
m
I don't believe there is, unfortunately. @Shubham Bajaj can confirm.
What I can offer to help if there isn't: because of the way transformers process tokens, putting what you'd want as the suffix/prefix at the end of your system prompt makes a noticeable difference, especially if your prompt is longer. I'm certain that with more work on your prompt, you can get the outcome you're looking for from adding that to each message sent.
--
If you want some help with prompting I'm more than happy to help you out
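The trick described above can be sketched as a small helper. Note this is a hypothetical function for illustration, not a Vapi API: the idea is simply to place the instruction you would have appended to every user message at the end of the system prompt instead.

```javascript
// Hypothetical helper (not part of Vapi's SDK): keep the per-message
// instruction at the END of the system prompt, where it tends to stay
// salient, rather than appending it to each user message.
function buildMessages(systemPrompt, instruction, userText) {
  return [
    // The instruction goes last in the system prompt on purpose.
    { role: "system", content: `${systemPrompt}\n\n${instruction}` },
    { role: "user", content: userText },
  ];
}

const msgs = buildMessages(
  "You are a helpful support agent.",
  "Think step by step before answering.",
  "Why is my invoice wrong?"
);
```

As the thread notes later, this placement can still decay over long conversations, since the system prompt moves further from the most recent turns.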
m
Thanks! We've done lots of work already on the system prompt, and we were able to get that behavior working in early messages of the conversation, but as the conversation grows longer, the effect of the system prompt keeps decaying until it completely vanishes. The only way we got this to work was to add a suffix using the custom model. But this adds a lot of latency, so we'd like to achieve it without the custom model.
m
if you give me the specific prefixes you want and the context, I can make you a dataset for that too, just lmk
m
I got the idea from your other answers. Will do a similar trick, thanks!
k
Hey @Mohab checking if this is resolved for you?
m
yes, thank you.
k
Marking this ticket as Solved ✅
n
@Mohab may I ask what your solution was to this?
m
Using a custom model was the best we could do. It adds some latency, but it allows you to alter the `messages` object before sending it to the LLM.
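For anyone landing on this thread: the custom-model route boils down to a proxy that rewrites the `messages` array before forwarding the request to the real LLM. A minimal sketch of just the rewrite step follows; the helper name and suffix text are illustrative, not part of Vapi's API.

```javascript
// Illustrative sketch of the rewrite a custom-LLM proxy would perform:
// append a fixed suffix to the most recent user message before the
// request is forwarded to the actual LLM. (Names are examples only.)
function appendSuffixToLastUserMessage(messages, suffix) {
  const out = messages.map((m) => ({ ...m })); // avoid mutating the input
  for (let i = out.length - 1; i >= 0; i--) {
    if (out[i].role === "user") {
      out[i].content = `${out[i].content} ${suffix}`;
      break; // only the latest user turn gets the suffix
    }
  }
  return out;
}

const rewritten = appendSuffixToLastUserMessage(
  [
    { role: "system", content: "You are a support agent." },
    { role: "user", content: "Summarize my issue." },
  ],
  "[think step by step]"
);
```

The extra latency Mohab mentions comes from the added network hop through the proxy, not from this rewrite itself, which is trivial.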