system prompt character limit
# support
s
Hi! I saw that the system prompt limit depends on the model used, but I couldn't find any information about the specific limits for each model. Where can I find this information?
v
@s3e Hello! The prompt limit, i.e. the maximum number of tokens, can indeed vary depending on the model being used. For VAPI, the default maximum number of tokens the assistant is allowed to generate in each turn of the conversation is 250. Specific limits for individual models are typically listed in each model provider's documentation. If you can't find the details there, please reach out to support for more comprehensive information on this topic.
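For reference, here is a minimal sketch of how you might raise that per-turn output cap on an assistant. The endpoint path and the `model.maxTokens` field are assumptions to double-check against the current Vapi API docs, and the assistant ID is a placeholder.

```ts
// Minimal sketch: raising the per-turn output token cap on a Vapi assistant.
// Assumes a PATCH to https://api.vapi.ai/assistant/:id and a model.maxTokens
// field -- verify both against the current Vapi documentation.

const VAPI_API_KEY = process.env.VAPI_API_KEY!; // your private API key
const ASSISTANT_ID = "your-assistant-id";       // hypothetical placeholder

async function setMaxTokens(maxTokens: number): Promise<void> {
  const res = await fetch(`https://api.vapi.ai/assistant/${ASSISTANT_ID}`, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${VAPI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: {
        provider: "openai",
        model: "gpt-4o",
        maxTokens, // per-turn generation cap; defaults to 250 if not set
      },
    }),
  });
  if (!res.ok) {
    throw new Error(`Vapi API error: ${res.status} ${await res.text()}`);
  }
}

setMaxTokens(500).catch(console.error);
```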
s
So does Vapi always apply the limits of the models' providers?
s
Hey @s3e, as of now we are not imposing any limits, but from our past observations it's best to keep the prompt token length under 2,500 tokens for best performance.
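For what it's worth, a quick way to check whether a system prompt stays under that ~2,500-token guideline is to run it through a tokenizer. The sketch below uses the js-tiktoken package as one possible option; swap in whichever tokenizer matches your model.

```ts
// Minimal sketch: counting tokens in a system prompt to stay within the
// suggested ~2,500-token budget. js-tiktoken is one tokenizer option; use
// whatever matches the model your assistant runs on.

import { encodingForModel } from "js-tiktoken";

function countTokens(prompt: string): number {
  const enc = encodingForModel("gpt-4"); // pick the encoding for your model
  return enc.encode(prompt).length;
}

const systemPrompt = "You are a helpful voice assistant for ...";
const tokens = countTokens(systemPrompt);

if (tokens > 2500) {
  console.warn(`System prompt is ${tokens} tokens; consider trimming it.`);
} else {
  console.log(`System prompt is ${tokens} tokens; within the suggested budget.`);
}
```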