Context length limit
LLMs evolve rapidly, so always check official sources for the latest models and their context limits.
As you may know, each chat model has a different context length limit:
GPT-3.5: 4,096 tokens
GPT-3.5-16K: 16,385 tokens
GPT-4: 8,192 tokens
GPT-4-32K: 32,768 tokens
Claude: 100,000 tokens
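If you are calling the OpenAI models, a quick way to check whether a prompt fits is to count its tokens with the tiktoken library before sending the request. The snippet below is a minimal sketch, not an official utility: the CONTEXT_LIMITS dictionary simply mirrors the list above (Claude is omitted because it uses a different tokenizer), and the model name aliases are assumptions about which API identifiers you use.

```python
import tiktoken

# Context limits mirroring the list above; these values go stale quickly,
# so confirm them against the provider's documentation.
CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 4_096,
    "gpt-3.5-turbo-16k": 16_385,
    "gpt-4": 8_192,
    "gpt-4-32k": 32_768,
}

def fits_in_context(prompt: str, model: str = "gpt-3.5-turbo") -> bool:
    """Return True if the prompt's token count is within the model's limit."""
    encoding = tiktoken.encoding_for_model(model)
    n_tokens = len(encoding.encode(prompt))
    return n_tokens <= CONTEXT_LIMITS[model]

print(fits_in_context("How long is this prompt?"))  # True for a short prompt
```

Note that a real chat request also spends a handful of tokens on message formatting and must leave room for the completion, so in practice you want to stay comfortably below the raw limit.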