prompt size limit on 1M context models?
# support
a
I am using GPT-4.1, which has a 1M context window, but when I try putting in a system prompt that's around 300k tokens, I get an error: "Payload too large. Hot tip, try reducing the size of your request body." What is the max size for the system prompt? Do I need to use files instead?
k
Your 300k-token system prompt exceeds the limit the endpoint actually enforces (likely 128k), so you need to reduce its size or switch to retrieval or file-based input instead.
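As a starting point, you can measure the prompt's token count locally before sending it and trim it to a budget you know the endpoint accepts. This is a minimal sketch, not an official fix: the encoding name, token budget, and `system_prompt.txt` file are all assumptions you'd replace with your own values.

```python
# Sketch: count tokens with tiktoken and trim an oversized system prompt.
# The o200k_base encoding and the 100k budget are assumptions, not documented limits.
import tiktoken

MAX_PROMPT_TOKENS = 100_000  # assumed budget; adjust to whatever the API actually accepts

enc = tiktoken.get_encoding("o200k_base")  # assumed encoding for GPT-4.1-family models

def trim_prompt(text: str, budget: int = MAX_PROMPT_TOKENS) -> str:
    """Return the prompt unchanged if it fits, otherwise keep only the first `budget` tokens."""
    tokens = enc.encode(text)
    if len(tokens) <= budget:
        return text
    return enc.decode(tokens[:budget])

system_prompt = open("system_prompt.txt").read()  # hypothetical file holding the large prompt
print(f"prompt is {len(enc.encode(system_prompt)):,} tokens")
safe_prompt = trim_prompt(system_prompt)
```

If trimming loses too much, the usual pattern is to chunk the material, index it, and retrieve only the relevant pieces per request instead of sending everything as one system prompt.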