Message data too large. Max size is 4096
# support
m
v
@Maclean It seems like you're encountering a limitation related to the message size rather than the token limit of the model itself. The error message indicates that the system you're using has a maximum message size of 4096 characters. This limit is separate from the GPT model's token limit and is likely a constraint set by the underlying API or communication protocol you're using.

To resolve this issue, you can consider the following options:

1. **Split the Message**: Break down your message into smaller chunks, each under 4096 characters, and send them sequentially. This way, you can convey the entire content without hitting the size limit.
2. **Compress the Message**: If applicable, you could compress the message data before sending it and decompress it on the receiving end. This might be more complex and is not always feasible, depending on the capabilities of the client and server.
3. **Check Configuration**: Review the configuration of the API or service you're using to see if there's a way to increase the message size limit. This might involve changing settings or upgrading to a different service tier that allows for larger messages.
4. **Contact Support**: If you're unable to find a solution within the documentation or configuration, reaching out to the support team of the service you're using might provide more insight or alternative solutions.

Remember to handle the message splitting or any other solution you choose gracefully, ensuring that the user experience remains smooth and the message context is preserved. If you need further assistance with implementing a specific solution, please provide more details, and I'll be happy to help!
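The splitting approach in option 1 can be sketched as follows. This is a minimal example, not tied to any specific API: `split_message` is a hypothetical helper name, and the commented-out `client.send` call is a placeholder for whatever send function your service actually exposes.

```python
def split_message(text: str, max_size: int = 4096) -> list[str]:
    """Split text into chunks no larger than max_size characters,
    preferring to break at newlines so context stays readable."""
    chunks = []
    while len(text) > max_size:
        # Look for the last newline within the limit to avoid
        # cutting a line in half; fall back to a hard split.
        cut = text.rfind("\n", 0, max_size)
        if cut <= 0:
            cut = max_size
        chunks.append(text[:cut])
        text = text[cut:].lstrip("\n")
    if text:
        chunks.append(text)
    return chunks

# Hypothetical usage: send each chunk sequentially.
# for part in split_message(long_reply):
#     client.send(part)  # client.send stands in for your actual API call
```

Sending the chunks in order preserves the message content; if your protocol supports it, you may also want to mark the chunks (e.g. "part 1/3") so the receiver knows more is coming.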
You can continue this conversation by mentioning me in the message.
Got it. Tagging @User for additional help.
m
Also, my system prompt is only 130 tokens, so I don't think it's a token limit.
s
Can you send the latest call_id?
It is not related to your system prompt.
If you can increase the payload size, that will fix your issue.