AI Context Management Glossary
Standard terminology for AI-assisted development
As AI coding tools mature, the industry needs precise language for context optimization. This glossary defines emerging standard terms for managing AI attention, context scope, and focus discipline.
About This Glossary
This glossary documents terminology for AI context management: the practice of deliberately controlling what context AI tools receive to optimize accuracy, cost, and performance.
These terms address a gap in current AI discourse: we have words for what AI does (generation, inference, embeddings) but lack standard language for how humans should provide context.
Note: These terms are platform-agnostic and apply to tools like ChatGPT, Claude, Gemini, Copilot, Cursor, Aider, Cody, Codeium, and all transformer-based AI coding assistants. They describe mechanisms, not products.
Core Terminology
Context Aperture
noun • /ˈkän-tekst ˈa-pər-ˌchu̇r/
Definition:
The controlled scope of code, files, or information provided to an AI tool, analogous to a camera's aperture controlling depth of field. A narrow aperture (fewer files) produces sharp focus on specific context; a wide aperture (many files) dilutes attention across broader scope.
Technical Basis:
Transformer attention has quadratic complexity (O(n²)) in context length: doubling the context quadruples the computation. Independently, empirical studies observe attention-quality degradation beyond an optimal threshold (~10-15K tokens for most models).
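A minimal sketch of the scaling claim above. The function name is illustrative; it simply counts pairwise attention scores, which is where the quadratic term comes from:

```python
def attention_score_ops(n_tokens: int) -> int:
    """Number of pairwise attention scores for a context of n tokens.

    Each token attends to every token (including itself), so the score
    matrix has n * n entries: O(n^2) work per attention layer.
    """
    return n_tokens * n_tokens

# Doubling the context quadruples the attention computation:
assert attention_score_ops(24_000) == 4 * attention_score_ops(12_000)
```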
Related Metrics:
- f/15 aperture: 15 files, ~3,000 lines, ~12K tokens (optimal)
- f/30 aperture: 30 files, ~6,000 lines, ~24K tokens (acceptable)
- f/100+ aperture: 100+ files, 20K+ lines, 80K+ tokens (degraded)
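The tiers above can be expressed as a small classifier. This is a sketch using the glossary's own thresholds and a rough ~4 tokens-per-line heuristic (12K tokens / 3,000 lines); the function name is illustrative, not a real API:

```python
def classify_aperture(num_files: int, total_lines: int) -> str:
    """Classify a context by aperture tier, per the f-stop metrics above."""
    est_tokens = total_lines * 4  # rough heuristic: ~4 tokens per line of code
    if num_files <= 15 and est_tokens <= 12_000:
        return "optimal"      # f/15: sharp focus
    if num_files <= 30 and est_tokens <= 24_000:
        return "acceptable"   # f/30: usable but diluted
    return "degraded"         # f/100+: attention spread too thin
```

For example, `classify_aperture(176, 20_000)` lands in the degraded tier, matching the f/100+ row.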
Why it matters: Unlike a camera's aperture (a hardware property), context aperture is a discipline: a deliberate choice to optimize AI performance through constraint.
Attention Budget
noun • /ə-ˈten(t)-shən ˈbə-jət/
Definition:
The maximum amount of context (measured in tokens or files) an AI model can process while maintaining high-quality attention across all inputs. Exceeding the attention budget results in attention dilution.
Technical Basis:
Research on transformer attention shows degradation beyond ~10-15K tokens, even with 200K context windows. The budget represents effective capacity, not theoretical maximum.
Default Budgets (Recommended):
- Files: 15 maximum per context
- Lines: 3,000 maximum total
- Tokens: ~12,000 optimal range
- Areas: 2 concurrent focus areas
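The default budgets above can be bundled into a pre-flight check. A minimal sketch; the dataclass and function names are illustrative, and the limits are taken directly from this glossary:

```python
from dataclasses import dataclass


@dataclass
class AttentionBudget:
    """Recommended default limits from the glossary."""
    max_files: int = 15
    max_lines: int = 3_000
    max_tokens: int = 12_000
    max_areas: int = 2


def within_budget(files: int, lines: int, tokens: int, areas: int,
                  budget: AttentionBudget = AttentionBudget()) -> bool:
    """Return True only if every dimension stays inside the budget."""
    return (files <= budget.max_files
            and lines <= budget.max_lines
            and tokens <= budget.max_tokens
            and areas <= budget.max_areas)
```

Checking all four dimensions together matters because any single overrun (e.g. few files but 80K tokens) can trigger attention dilution.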
Industry adoption: "Attention budget" is already emerging in AI research papers and internal engineering discussions at major AI labs.
Focus Discipline
noun • /ˈfō-kəs ˈdi-sə-plən/
Definition:
The practice of deliberately constraining AI context to maximize precision, accuracy, and cost-efficiency. Encompasses techniques like context scoping, aperture control, and attention budget management.
Philosophical Basis:
Like "security discipline" or "code discipline," focus discipline is a professional practice - a learnable skill that separates effective AI users from ineffective ones.
Core Principles:
- Intentionality: Choose context deliberately, not reflexively
- Minimalism: Include only what's necessary
- Boundaries: Respect attention budget limits
- Verification: Measure context size before sending
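The Verification principle can be sketched as a guard that runs before any context is sent. This uses a crude ~4-characters-per-token estimate; real tooling should use the target model's tokenizer. Function names and the error message are illustrative:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English/code."""
    return len(text) // 4


def verify_context(files: dict[str, str], token_limit: int = 12_000) -> int:
    """Raise if the combined context exceeds the attention budget.

    `files` maps path -> file contents. Returns the estimated total
    so callers can log it.
    """
    total = sum(estimate_tokens(src) for src in files.values())
    if total > token_limit:
        raise ValueError(
            f"context is ~{total} tokens, over the {token_limit} budget")
    return total
```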
Future prediction: By 2026, "focus discipline" will appear in job descriptions for AI-assisted development roles.
Attention Dilution
noun • /ə-ˈten(t)-shən dī-ˈlü-shən/
Definition:
The measurable decrease in AI accuracy and precision when context exceeds the model's attention budget. Symptoms include generic answers, missed details, and hallucinations.
Technical Mechanism:
Attention scores are normalized across all tokens. With 240K tokens (176 files), each token receives ~0.0004% attention. With 12K tokens (15 files), each receives ~0.008% (a 20x improvement).
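The percentages above follow from simple division, assuming attention mass is spread uniformly across tokens (a simplification of softmax normalization):

```python
def uniform_attention_pct(n_tokens: int) -> float:
    """Average per-token share of attention, as a percentage."""
    return 100.0 / n_tokens

# 240K tokens -> ~0.0004% per token; 12K tokens -> ~0.008% per token
assert round(uniform_attention_pct(240_000), 4) == 0.0004
assert round(uniform_attention_pct(12_000), 3) == 0.008
```

The 20x improvement is just the context ratio: 240K / 12K = 20.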
Observable Symptoms:
- Generic responses: "You could add a function..." instead of specific guidance
- Missed context: AI overlooks relevant files or functions
- Hallucinations: Invents non-existent APIs or patterns
- Surface-level analysis: Skims code instead of deep understanding
Research gap: We currently lack a standard term for context-induced AI degradation; "attention dilution" fills that gap.
Context Scoping
noun/verb • /ˈkän-tekst ˈskō-piŋ/
Definition:
The deliberate process of selecting, filtering, and organizing code context before providing it to AI tools. Effective context scoping respects attention budgets and maintains appropriate context aperture.
Scoping Strategies:
- By directory: src/auth/** only
- By naming pattern: *Account*.test.js
- By role: Model files only, exclude controllers
- By imports: Files that import UserModel
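The directory and naming-pattern strategies above can be sketched as path filters using the standard-library `fnmatch` module. The paths, default prefix, and default pattern are illustrative:

```python
from fnmatch import fnmatch


def scope_by_directory(paths: list[str], prefix: str = "src/auth/") -> list[str]:
    """Keep only files under a directory prefix (the src/auth/** strategy)."""
    return [p for p in paths if p.startswith(prefix)]


def scope_by_pattern(paths: list[str],
                     pattern: str = "*Account*.test.js") -> list[str]:
    """Keep only files whose basename matches a glob pattern."""
    return [p for p in paths if fnmatch(p.rsplit("/", 1)[-1], pattern)]
```

Role- and import-based scoping need project knowledge (e.g. parsing import statements), so they are omitted from this sketch.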
Industry trend: "Context scoping" naturally extends familiar "variable scoping" concepts from programming.
Usage in Practice
Code Review Comment:
"This PR shows poor focus discipline - you sent 176 test files to the AI. Try narrowing your context aperture to the 15 Account test files. You'll get better suggestions and reduce attention dilution."
Technical Discussion:
"We're seeing attention dilution at scale. Team needs to practice better context scoping - stay within the f/15 aperture, respect the 12K token attention budget."
Documentation:
"When refactoring authentication, set context aperture to src/auth/** (14 files). This maintains focus discipline and avoids exceeding the attention budget."
References & Further Reading
- Vaswani et al. (2017), "Attention Is All You Need". Foundational paper introducing the transformer architecture and the scaled dot-product attention mechanism used by modern AI coding tools.
- Liu et al. (2023), "Lost in the Middle: How Language Models Use Long Contexts". Shows that model accuracy often drops for information placed in the middle of long prompts, motivating bounded, well-scoped context windows.
- ContextDigger Documentation, Core Concepts. Practical application of attention budgets and aperture control.
Help Shape Industry Language
These terms are emerging as AI-assisted development matures. Use them, share them, and help standardize the vocabulary.