LLM Fundamentals
What is Context Window?
AI Systems Instructor • Real Estate Technologist
A context window is the total amount of text an AI can process in a single conversation — its working memory. Bigger windows (Claude's 200K tokens, Gemini's 2M tokens) let you work with more information at once, but how you organize that information matters more than how much fits.
Understanding Context Window
A context window is the maximum amount of text an AI model can see and process at one time — both your input and its output combined. Think of it as the AI's working memory. Everything inside the window, the AI can reference. Everything outside it, the AI has forgotten. When you start a new conversation, the window is empty. As you exchange messages, it fills up. When it's full, the AI either stops responding, starts dropping earlier messages, or compresses the conversation.
The real estate analogy: your context window is the size of your desk. A small desk means you can only spread out a few documents at a time — maybe one listing sheet and a comp. A bigger desk lets you lay out the full CMA, all six comps, the seller's disclosure, HOA docs, and your notes. But here's the thing most people miss: having a bigger desk doesn't mean you read every paper on it equally well. You still focus on what's right in front of you. AI works the same way.
Current context window sizes: Claude offers approximately 200K tokens (~150,000 words), GPT-4 supports up to 128K tokens (~96,000 words), and Gemini leads with up to 2 million tokens (~1.5 million words). These numbers are impressive, but they create a false sense that bigger is always better. Research shows that AI models perform best on information at the beginning and end of the context window, with accuracy dropping for content buried in the middle — the so-called "lost in the middle" problem.
This is exactly why context engineering matters more than context window size. Instead of dumping everything into one conversation and hoping the AI sorts it out, you're better off being strategic: provide the right information at the right time using techniques like Context Cards and just-in-time context. A well-organized 10K-token prompt will outperform a sloppy 100K-token dump every time. The window is a constraint to work with, not just a limitation to overcome.
Key Concepts
Token-Based Measurement
Context windows are measured in tokens, not words. One token is roughly 3/4 of a word in English. A 200K-token window holds about 150,000 words — or roughly 500 pages of text. Both your input and the AI's output count against this limit, so a long conversation gradually fills the window from both sides.
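The 3/4-words-per-token rule translates into a quick budgeting check you can run before pasting a document. This is a minimal sketch using the rough ~4-characters-per-token heuristic; real tokenizers (Anthropic's token-counting API, OpenAI's tiktoken) give exact counts, and the 10% headroom figure is an illustrative choice, not a platform rule.

```python
def estimate_tokens(text: str) -> int:
    """Rough estimate: ~4 characters (about 3/4 of a word) per English token.
    A rule of thumb for quick budgeting, not an exact tokenizer."""
    return max(1, len(text) // 4)

def fits_in_window(text: str, window_tokens: int = 200_000) -> bool:
    """Check whether a document plausibly fits a context window,
    leaving ~10% headroom for the model's own response."""
    return estimate_tokens(text) <= int(window_tokens * 0.9)

report = "word " * 150_000           # ~150,000 words of filler text
print(estimate_tokens(report))       # ~187,500 tokens by this estimate
print(fits_in_window(report))        # False: too close to a 200K limit
```

The takeaway: a "150,000-word" document sits right at the edge of a 200K-token window once you account for the response, which is exactly when chunking or a tighter question pays off.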
The 'Lost in the Middle' Problem
AI models pay the most attention to information at the beginning and end of the context window, with reduced accuracy for content in the middle. This means organization matters: put your most critical instructions and context at the top and bottom, not buried in a wall of text. This is a known limitation across all major LLMs.
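The beginning-and-end advice can be sketched as a "sandwich" prompt builder: critical instructions first, bulk reference material in the middle, and the task (restating key constraints) last. The section labels below are illustrative, not a required format.

```python
def build_prompt(critical_instructions: str, bulk_context: str, task: str) -> str:
    """Place the highest-priority material at the start and end of the
    prompt, where models attend most reliably, and bulk reference
    material in the middle."""
    return "\n\n".join([
        f"INSTRUCTIONS:\n{critical_instructions}",    # beginning: high attention
        f"REFERENCE MATERIAL:\n{bulk_context}",       # middle: lowest attention
        f"TASK (restating key constraints):\n{task}", # end: high attention again
    ])

prompt = build_prompt(
    critical_instructions="Flag every HOA rule affecting dogs or deck construction.",
    bulk_context="<40 pages of HOA bylaws pasted here>",
    task="List each relevant rule with its section number. Remember: dogs and decks only.",
)
```

Restating the constraint at the end is deliberate: it puts the same critical detail in both high-attention positions.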
Context Engineering vs. Context Size
Having a massive context window doesn't guarantee better results. Context engineering — strategically selecting and organizing what goes into the window — consistently outperforms simply stuffing more text in. This is where techniques like Context Cards, just-in-time context, and the 5 Essentials framework make the difference between AI that merely works and AI that impresses.
Context Window for Real Estate
Here's how real estate professionals apply Context Window in practice:
Full-Document Analysis
Upload entire documents — HOA bylaws, inspection reports, lease agreements — for comprehensive AI analysis.
Drop a 40-page HOA document into Claude and ask it to extract every rule that would affect a buyer with two dogs and plans to build a deck. With a large context window, the AI can process the entire document in one pass rather than you breaking it into chunks. The key is giving a specific question so the AI knows what to look for, not just 'summarize this.' Specific questions plus full documents equals useful answers.
Multi-Property Comparisons
Compare multiple listings, CMAs, or market reports side by side within a single conversation.
Paste six comparable sales with full details — square footage, lot size, condition, days on market, sold price, concessions — into one prompt and ask Claude to build a CMA narrative explaining your recommended list price. The context window is large enough to hold all the data simultaneously, so the AI can draw connections between comps that you might reference in a listing presentation. Include your target property details at the top for strongest attention.
Long Conversation Continuity
Maintain context across extended back-and-forth conversations for complex projects.
You're building a 90-day marketing plan for a new listing. Over 20 messages, you discuss target buyer demographics, pricing strategy, staging notes, photography direction, ad copy, and social media scheduling. A large context window keeps all of this context alive so message 20 still reflects decisions made in message 3. But if the conversation gets very long, be aware of context rot — earlier details may lose influence. For critical details, restate them periodically or use a Context Card at the start of each session.
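The "restate critical details" habit can be made mechanical: keep a Context Card and a running list of key decisions, and prepend both to each new request. A minimal sketch of that habit, not a library API — the listing address, price, and buyer profile below are invented examples.

```python
# A hypothetical Context Card -- all listing details here are invented.
CONTEXT_CARD = (
    "Listing: 412 Maple Ave, 4bd/3ba, list $675,000\n"
    "Target buyer: relocating families, school-district priority\n"
    "Brand voice: warm, data-backed, no hype\n"
)

def with_restated_context(context_card: str, decisions: list[str], new_message: str) -> str:
    """Prepend the Context Card and key decisions to a message so a long
    conversation doesn't depend on the model recalling message #3."""
    decisions_block = "\n".join(f"- {d}" for d in decisions)
    return f"{context_card}\nDecisions so far:\n{decisions_block}\n\n{new_message}"

msg = with_restated_context(
    CONTEXT_CARD,
    ["Launch open house in week 2", "Lead with twilight photography"],
    "Draft the week-3 social posts.",
)
```

Each message costs a few hundred extra tokens this way, but the critical context always sits at the top of the window instead of fading in the middle of a 20-message thread.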
Transaction File Review
Process multiple transaction documents together to identify issues, missing items, or preparation needs.
Upload your purchase agreement, seller's disclosure, inspection report, and title commitment into one Claude conversation. Ask it to cross-reference the documents: does the disclosure mention the roof issue the inspector flagged? Are the property dimensions consistent? Are there any title exceptions that conflict with the buyer's intended use? The context window lets the AI hold all documents simultaneously, mimicking what a transaction coordinator does manually.
When to Use Context Window (and When Not To)
Use Context Window For:
- When your task requires the AI to reference multiple documents or large amounts of data simultaneously
- When you need conversation continuity over many back-and-forth messages for complex projects
- When cross-referencing information across different sources — comps, disclosures, inspection reports
- When building on previous context within a session, like refining a marketing plan or iterating on copy
Context Window Mistakes to Avoid:
- Don't treat the context window as unlimited storage — be selective about what you include for best results
- Don't assume the AI remembers everything equally — critical details buried in the middle of long inputs may be overlooked
- Don't rely on a previous conversation's context for a new task — start fresh with a clear Context Card instead
- Don't paste entire databases or spreadsheets when you only need a specific subset — more noise means less signal
Frequently Asked Questions
What is a context window in AI?
A context window is the maximum amount of text an AI can process in a single conversation — including both your messages and the AI's responses. Think of it as the AI's working memory. Claude's context window is about 200K tokens (roughly 150,000 words), GPT-4 supports 128K tokens, and Gemini offers up to 2 million tokens. Once the window fills up, older parts of the conversation may be dropped or compressed.
Does a bigger context window mean better AI?
Not necessarily. A bigger context window means the AI can hold more information at once, but research shows that AI accuracy drops for information in the middle of very long inputs — the 'lost in the middle' problem. A well-organized 5,000-word prompt often produces better results than a messy 50,000-word dump. How you organize information matters more than how much you can fit. That's why context engineering techniques like Context Cards matter.
What happens when I exceed the context window?
It depends on the platform. Most AI tools either stop you from sending more text, automatically drop the oldest messages in the conversation to make room, or compress earlier parts of the conversation into a summary. In Claude, very long conversations may trigger context compaction — where earlier messages get summarized to free up space. The practical takeaway: for important long-running projects, periodically restate critical context rather than assuming the AI still holds every detail from the beginning.
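The compaction idea can be sketched as a simple policy: when the conversation exceeds a token budget, keep the most recent messages verbatim and collapse everything older into a summary. This is an illustrative sketch only — real assistants use an LLM to write the summary, not the placeholder string below.

```python
def compact_history(messages: list[str], budget_tokens: int,
                    estimate=lambda t: max(1, len(t) // 4)) -> list[str]:
    """Collapse the oldest messages into a one-line summary placeholder
    when the conversation exceeds the token budget, keeping the most
    recent messages verbatim."""
    kept: list[str] = []
    # Walk from newest to oldest, keeping messages until the budget is spent.
    for m in reversed(messages):
        if sum(estimate(x) for x in kept) + estimate(m) > budget_tokens:
            break
        kept.append(m)
    kept.reverse()
    dropped = len(messages) - len(kept)
    if dropped:
        kept.insert(0, f"[Summary of {dropped} earlier messages]")
    return kept
```

Notice what the policy implies for you as a user: after compaction, the AI only sees the summary of your early messages, so any detail you need verbatim must be restated, not assumed.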
How do I make the most of the context window for real estate work?
Three strategies: First, lead with your most important context (instructions, constraints, brand voice) at the very top of your prompt — AI pays the most attention to the beginning. Second, use Context Cards to provide structured, consistent context rather than free-form text dumps. Third, for long conversations, periodically restate key decisions and parameters rather than relying on the AI's memory of message #3 in a 30-message thread. The 5 Essentials framework naturally organizes your context for maximum impact.
Pages That Link Here
Other glossary terms that reference Context Window.
AI Token
Learn what AI tokens are, how they affect cost and output limits, and why real estate agents need to understand tokens to use ChatGPT and Claude effectively.
Context Compaction
Learn what context compaction is and how the technique of fitting more relevant information into fewer tokens dramatically improves AI output quality for real estate professionals.
Context Rot
Learn what context rot is, why AI accuracy degrades in long conversations, and the context compaction strategies that fix it for real estate professionals.
Max Tokens
Learn what max tokens means in AI and how this parameter controls the length of AI responses—helping real estate agents get outputs that are the right size for every use case.
Transformer
Learn what a transformer is in AI—the neural network architecture behind ChatGPT, Claude, and Gemini that revolutionized how AI understands and generates human language.
Master These Concepts
Learn Context Window and other essential AI techniques in our workshop. Get hands-on practice applying AI to your real estate business.