
Context Window

noun
Foundational concepts · Using AI as a tool

The maximum amount of text an AI model can read and reference at once when generating a response. Think of it as the model's working memory — anything that fits inside the context window can inform the answer, but anything outside it is effectively invisible. Context windows are measured in tokens, which roughly correspond to words or word fragments.

Context windows matter for data reporters because they determine how much material you can hand an AI in a single prompt. If you want a chatbot to summarize a 200-page city budget, extract names from a stack of court filings, or compare multiple drafts of a bill, you need a context window large enough to hold all that text. When your documents exceed the limit, the model may silently drop earlier content — a recipe for incomplete analysis.
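Before pasting a big document into a chatbot, it helps to estimate whether it will fit. A common rule of thumb is roughly four characters of English per token (the exact count varies by model and tokenizer). The sketch below is an illustration of that estimate, not any particular model's tokenizer; the function names and the reserve amount are assumptions.

```python
# Rough check of whether a document fits a model's context window.
# Real tokenizers differ by model; ~4 characters per token is only
# a ballpark estimate for English text.

def estimate_tokens(text: str) -> int:
    """Approximate token count: about 4 characters per token."""
    return max(1, len(text) // 4)

def fits_context(text: str, window_tokens: int, reserve: int = 1000) -> bool:
    """Leave `reserve` tokens of room for the prompt and the model's reply."""
    return estimate_tokens(text) + reserve <= window_tokens

document = "word " * 150_000            # stand-in for a very long document
print(fits_context(document, 128_000))  # too big for a 128k-token window
```

If the check fails, the options are a model with a larger window, splitting the document into pieces, or retrieving only the relevant sections.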

Context windows have grown dramatically: early models like GPT-3.5 could handle about 4,000 tokens (roughly 3,000 words), while models released in 2025 routinely accept one million tokens or more — enough to hold a 750,000-word corpus. But bigger isn't always better. Stuffing a context window with irrelevant material can add noise and reduce answer quality, which is why techniques like retrieval-augmented generation (RAG) remain popular for searching large document collections.
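The idea behind RAG can be shown with a toy example: rather than stuffing every document into the prompt, score chunks against the question and pass along only the best matches. Real RAG systems use embedding-based similarity search; the keyword-overlap scorer below is a simplified stand-in, and the sample budget text is invented for illustration.

```python
# Toy illustration of the retrieval step in retrieval-augmented
# generation (RAG): pick the chunks most relevant to the question
# instead of sending the whole corpus to the model.

def top_chunks(chunks, question, k=2):
    """Return the k chunks sharing the most words with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    return scored[:k]

chunks = [
    "Section 1: The parks budget rises 4 percent to $12 million.",
    "Section 2: Police overtime spending fell last year.",
    "Section 3: The library system will close two branches.",
]
relevant = top_chunks(chunks, "How much is the parks budget?", k=1)
# Only `relevant` is placed in the prompt, keeping the context window
# small and focused on material that can actually inform the answer.
```

Keeping the prompt to the retrieved chunks is what lets RAG work with document collections far larger than any context window.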

"Strike up a conversation with a chatbot and you may run into a frustrating limitation: It can forget what you're discussing. This happens as earlier parts of the conversation fall out of the large language model's context window, which is the largest chunk of text it can consider when generating a response." (IEEE Spectrum)
"LLM context windows are growing, and fast. The initial version of OpenAI's GPT-3.5 had a context window of just 4,096 tokens. Today, GPT-4o has a context window of 128,000 tokens. That's a thirty-fold improvement in less than two years." (IEEE Spectrum)
"...particularly in terms of its ability to write code based on written instructions and the size of its 'context window,' which means users can now input entire books and ask Claude 2 questions based on their content." (TIME)
Entry by Ryan Serpico