LLM Fundamentals
What is Context Window?
A context window is the maximum number of tokens an AI model can consider in a single conversation. It includes both your input and the AI's output. When the window is exceeded, the AI "forgets" the earliest parts of the conversation.
Understanding Context Windows
Think of the context window as the AI's working memory. Everything in the conversation—your messages, the AI's responses, any documents you upload—must fit within this window. Once you exceed it, older information drops out.
This is different from human memory. A colleague doesn't forget your first sentence after ten minutes of conversation. But an AI model operates within strict token limits: every word counts toward a fixed budget.
For real estate agents, context window size determines whether you can upload a full contract for analysis, maintain consistent voice across a long content session, or ask follow-up questions without losing earlier instructions.
Context Window Sizes by Platform
ChatGPT (GPT-4, ~128K tokens): ~96,000 words or ~384 pages. Sufficient for most real estate documents and conversations.
Claude (~200K tokens): ~150,000 words or ~600 pages. Largest among major AI assistants. Best for long documents.
Gemini (~2M tokens): ~1.5 million words. Can process entire codebases or book-length documents. Enterprise use cases.
Quick Math: 1 token ≈ 0.75 words. So 100K tokens ≈ 75,000 words ≈ 300 pages. A typical real estate contract is 10-30 pages—easily within any modern context window.
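The quick math above can be turned into a small estimator. A minimal Python sketch using the same rule of thumb (the 0.75 words-per-token ratio and 250 words-per-page figure are approximations; real tokenizers vary by model):

```python
WORDS_PER_TOKEN = 0.75   # rough rule of thumb, not an exact count
WORDS_PER_PAGE = 250     # typical page of prose

def estimate_words(tokens: int) -> int:
    """Approximate how many words fit in a given token budget."""
    return int(tokens * WORDS_PER_TOKEN)

def estimate_pages(tokens: int) -> int:
    """Approximate how many pages fit in a given token budget."""
    return estimate_words(tokens) // WORDS_PER_PAGE

print(estimate_words(100_000))  # -> 75000
print(estimate_pages(100_000))  # -> 300
```

Running this on a 128K-token window gives roughly 96,000 words, matching the figure above.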
Why Context Window Matters for Real Estate
Document Analysis
Upload full contracts, inspection reports, or disclosures for AI to summarize and analyze.
Voice Consistency
Keep your Context Card and style guides loaded throughout a content creation session.
Iterative Refinement
Make multiple rounds of edits while AI remembers original instructions and context.
Complex Prompts
Include detailed instructions, examples, and constraints without hitting limits.
What is Context Rot?
Context Rot is the degradation of AI output quality as a conversation approaches or exceeds the context window. Even before hitting the hard limit, the AI may start to "forget" or deprioritize earlier instructions, producing inconsistent responses.
Signs of Context Rot:
- AI stops following your initial instructions
- Voice or tone becomes inconsistent
- AI "forgets" details from earlier in the conversation
- Responses become more generic
Prevention: Start new conversations for new tasks. Keep your most important instructions (Context Card, style guides) at the beginning. For long sessions, periodically re-state key requirements.
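The "keep key instructions at the beginning" advice can also be applied programmatically: when a conversation history grows too long, drop the oldest exchanges but keep the pinned instructions (your Context Card, style guide) at the front. A hypothetical sketch; the message format mirrors common chat APIs, but the names and sizes here are illustrative:

```python
def trim_history(messages, max_messages=20):
    """Keep the first message (pinned instructions) plus the most
    recent turns, dropping the middle of a long conversation."""
    if len(messages) <= max_messages:
        return messages
    pinned, rest = messages[0], messages[1:]
    # keep only the newest (max_messages - 1) turns after the pinned one
    return [pinned] + rest[-(max_messages - 1):]

history = [{"role": "system", "content": "Context Card: luxury listings, warm tone"}]
history += [{"role": "user", "content": f"edit round {i}"} for i in range(30)]

trimmed = trim_history(history, max_messages=5)
print(len(trimmed))        # -> 5
print(trimmed[0]["role"])  # -> system
```

This trades away the middle of the conversation, so it works best when the dropped turns were iterations rather than new facts.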
Frequently Asked Questions
How do I know if I've exceeded the context window?
Most AI interfaces will tell you explicitly. ChatGPT shows a warning when you approach the limit; Claude will indicate when it can't process all of the content. You'll also notice it when the AI starts "forgetting" earlier instructions or details.
Does a bigger context window mean better AI?
Not necessarily. A larger context window means the AI can hold more information, but it doesn't improve the quality of reasoning or writing. Claude pairs strong writing with a 200K-token window; GPT-4 has different strengths at 128K. Choose based on the task, not just context size.
Does the AI's response count toward the context window?
Yes. Both your inputs AND the AI's outputs consume context. If you ask for 10 long listing descriptions, both your prompt and all 10 responses eat into the limit. This is why long conversations eventually hit limits.
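Because both sides of the conversation consume the budget, you can track usage cumulatively. A rough sketch using the same ~0.75 words-per-token approximation from earlier (not a real tokenizer; actual counts depend on the model):

```python
def approx_tokens(text: str) -> int:
    """Rough token estimate: ~0.75 words per token => words / 0.75."""
    return round(len(text.split()) / 0.75)

used = 0
for turn in ["Write a listing description for 12 Oak Lane",  # your prompt
             "Charming three-bedroom home with ..."]:        # AI's reply
    used += approx_tokens(turn)   # BOTH directions count toward the window

remaining = 128_000 - used        # e.g. a 128K-token window
print(used, remaining)
```

Each additional round of prompts and responses shrinks `remaining`, which is why long conversations eventually hit the limit.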
What if my document is larger than the context window?
Break it into sections and process them sequentially. Or use AI tools specifically designed for large documents (like Claude's long-context features). You can also summarize sections and work with the summaries instead of the full text.
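The "break it into sections" approach can be sketched as simple word-based chunking with a little overlap, so context isn't lost at the seams. The chunk size and overlap below are illustrative choices, not prescribed values:

```python
def chunk_document(text: str, chunk_words: int = 2000, overlap: int = 100):
    """Split a long document into overlapping word-count chunks,
    each small enough to summarize in its own conversation."""
    words = text.split()
    chunks, start = [], 0
    while start < len(words):
        chunks.append(" ".join(words[start:start + chunk_words]))
        start += chunk_words - overlap   # step back slightly for overlap
    return chunks

doc = ("word " * 5000).strip()           # stand-in for a long contract
parts = chunk_document(doc)
print(len(parts))                        # -> 3
```

Each part can then be summarized in a fresh conversation, and the summaries combined in a final pass.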
Master AI Fundamentals
Learn to leverage context windows effectively and avoid context rot in our workshop.