Token Limit
# support
m
I’m using GPT-4 Turbo, and according to ChatGPT itself it recommends between 1500-2500 tokens for good conversational AI. Does Vapi allow an increase from the 1000-token max limit?
v
Hi @mirage502, I'm an AI assistant for Vapi. While a team member reviews this question, I'll look through relevant sources and see if I can provide suggestions. Please tag @Vapi Support with any follow-up questions.
-----------------------------
**While I generate an answer, here are some sources that might be helpful**:
* [Billing Examples]()
* [Estimating Costs]()
* [List Assistants]()
* [Create Phone Call]()
-----------------------------
### Answer
Sorry, I wasn't able to find any information about your question. You can get more help at [Contact Us](support@vapi.ai).
Got it. Tagging @User for additional help.
s
No, currently we do not permit increasing the token count beyond 1000. Moreover, it's preferable to use fewer than 1000 tokens, as our focus is on Voice AI rather than conversational AI.
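For reference, the cap described above would apply to the assistant's token limit setting. Here is a minimal sketch of clamping a requested value client-side before building an assistant config; the `model.maxTokens` field name is an assumption drawn from Vapi's create-assistant API, not an official example:

```python
# Sketch: build an assistant payload while respecting Vapi's 1000-token cap.
# The "model"/"maxTokens" field names are assumptions, not confirmed by this thread.

VAPI_MAX_TOKENS = 1000  # cap stated by support above


def build_assistant_payload(system_prompt: str, requested_max_tokens: int) -> dict:
    """Clamp the requested token limit to Vapi's cap before sending."""
    return {
        "model": {
            "provider": "openai",
            "model": "gpt-4-turbo",
            "maxTokens": min(requested_max_tokens, VAPI_MAX_TOKENS),
            "messages": [{"role": "system", "content": system_prompt}],
        }
    }
```

So a request for 2500 tokens would be silently reduced to 1000 rather than rejected by the API.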
m
As I understand it, token limits of 1500-2500 also allow a bigger context window, which is what I’m more interested in. For example: if a call is initiated and the caller says “Hello, my name is Jeff,” and later in the conversation the chatbot asks for the name, it doesn’t “remember” it. But with a higher token limit, the context window is bigger for remembering. That’s why I was asking. Thanks for the reply.
s
If you have the user details, you can put them directly into the system prompt. It will work.
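When the caller is known before the call starts, the suggestion above can be done with simple templating: prepend the known details to the system prompt so the model never has to "remember" them from the transcript. A hypothetical sketch (the prompt wording and helper name are illustrative, not part of Vapi's API):

```python
def personalize_system_prompt(base_prompt: str, user_details: dict) -> str:
    """Prepend known caller details so the model has them from turn one."""
    details = "\n".join(f"- {key}: {value}" for key, value in user_details.items())
    return f"{base_prompt}\n\nKnown caller details:\n{details}"


# Example: inject the caller's name before creating the call.
prompt = personalize_system_prompt(
    "You are a friendly dispatch assistant.",
    {"name": "Jeff", "callback_number": "+1-555-0100"},
)
```

The resulting string would then go into the assistant's system message when the call is created.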
m
I don’t. These calls are from first-time callers. Information is collected, then passed through an API to dispatching software. I was just looking to make the conversation seem more fluid/human.
s
You should then work on the prompting part; I just checked in my environment, and it was able to remember my name.