# Lab: Checkout API Rate Limiting
Compare AI-assisted development without ContextDigger versus with governed context bundles.
You'll add rate limiting to a checkout API in a repo that has multiple versions of the checkout flow. The goal is to keep the AI focused on the canonical path while staying inside a Context Aperture of f/15 and an Attention Budget of 3,000 lines.
## Scenario
- Monorepo with multiple checkout flows: `checkout_v1`, `checkout_v2`, and experiments.
- Requirement: add request rate limiting to the production checkout endpoint.
- Risk: AI edits deprecated `checkout_v1` or experimental code instead of the live path.
## Flow 1 - Without ContextDigger (Ungoverned)
This is the typical "just ask the assistant" experience when you open the repo and prompt Claude, Cursor, Copilot, or ChatGPT directly.
### Steps
- Open the monorepo in your IDE and start an AI chat.
- Ask: "Add rate limiting to the checkout API."
- The AI scans many files (`checkout*`, services, models, tests) trying to infer which flow is live.
- It may mix `checkout_v1`, `checkout_v2`, and experiments in its internal context.
- You review a large, noisy diff and manually check that the right version was edited.
Conceptual sequence (ungoverned):
```mermaid
sequenceDiagram
participant Dev as Developer
participant AI as AI Tool
Dev->>Dev: Open repo in IDE
Dev->>AI: "Add rate limiting to the checkout API"
AI->>AI: Scan many files: checkout*, services, models, tests
AI->>Dev: "I see multiple checkout flows,<br/>which one is canonical?"
Dev->>AI: Explain architecture and correct version
AI->>AI: Infer target files from mixed context
AI-->>Dev: Propose large diff across several files
Dev->>Dev: Review diff to ensure checkout_v2<br/>not checkout_v1 was changed
```
Pain: Discovery happens inside the model's context window. There is no explicit Context Aperture or Attention Budget, so the AI may over-scan and still touch the wrong files.
## Flow 2 - With ContextDigger CLI (Governed Bundle)
Here, you use ContextDigger to discover the `checkout-api` area once and build a governed context bundle before involving AI.
### Steps
- Initialize ContextDigger in the repo:

  ```
  $ cd path/to/your/repo
  $ contextdigger init
  $ contextdigger list   # find "checkout-api"
  ```

- Generate a governed context bundle for `checkout-api`:

  ```
  $ contextdigger dig checkout-api
  Budget: 12/15 files, 2,340/3,000 lines
  Context file: .cdg/context/checkout-api.txt
  ```
- Now prompt your AI: "Read `.cdg/context/checkout-api.txt` and add request rate limiting to the main checkout endpoint, updating tests as needed while staying within this scope."
- The AI reads only the listed files (canonical `checkout_v2`, dependencies, tests) and proposes a focused diff.
Conceptual sequence (governed bundle):
```mermaid
sequenceDiagram
participant Dev as Developer
participant CDG as ContextDigger
participant FS as Filesystem
participant AI as AI Tool
Dev->>CDG: contextdigger init
CDG->>FS: Discover areas, write .cdg structure
Dev->>CDG: contextdigger list (see checkout-api)
Dev->>CDG: contextdigger dig checkout-api
CDG->>FS: Load area files, apply Aperture/Budget
CDG-->>FS: .cdg/context/checkout-api.txt
Dev->>AI: "Use this bundle to add rate limiting"
AI->>FS: Read only files in checkout-api bundle
AI-->>Dev: Focused diff + test updates<br/>within governed scope
```
Governance: Context Aperture is explicitly f/15 (15 files). Attention Budget is 3,000 lines. Discovery happens once in `.cdg/`, not inside the model.
Reviewability: The diff is small and scoped to the live checkout flow, making code review and testing straightforward.
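The Aperture/Budget rule can be sketched in a few lines. The function name and the greedy admission policy here are assumptions for illustration, not ContextDigger's actual implementation:

```python
APERTURE_FILES = 15   # Context Aperture: f/15
BUDGET_LINES = 3_000  # Attention Budget in lines


def build_bundle(candidates: list[tuple[str, int]]) -> list[tuple[str, int]]:
    """Greedily admit (path, line_count) pairs until either the
    file aperture or the line budget would be exceeded."""
    bundle: list[tuple[str, int]] = []
    used_lines = 0
    for path, lines in candidates:
        if len(bundle) >= APERTURE_FILES or used_lines + lines > BUDGET_LINES:
            break
        bundle.append((path, lines))
        used_lines += lines
    return bundle
```

Feeding it ranked candidates from the `checkout-api` area would yield a bundle like the `12/15 files, 2,340/3,000 lines` report shown above.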
## Flow 3 - With ContextDigger MCP (AI First)
In this future integration flow, you start from the AI tool. The AI talks to a local ContextDigger MCP server to obtain governed context on demand.
### High-Level Flow
- You open your AI tool in the project workspace and say: "Use ContextDigger to work on the `checkout-api` area and add rate limiting to the main endpoint, updating tests as needed."
- The AI calls the ContextDigger MCP server:
  - Lists areas and selects `checkout-api`.
  - Builds a governed bundle using the same Aperture/Budget rules.
- The AI uses the bundle to propose changes and an explanation, still within the governed slice.
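To make the MCP step concrete, here is a minimal sketch of the two tools named in the sequence below as plain Python functions. The area table, signatures, and return shape are hypothetical, not the real MCP server API:

```python
# Hypothetical area registry, as discovered during `contextdigger init`.
AREAS = {
    "checkout-api": [
        "checkout_v2/api.py",
        "checkout_v2/limits.py",
        "tests/test_checkout.py",
    ],
}


def list_areas() -> list[str]:
    """MCP tool sketch: return the area names available to the AI."""
    return sorted(AREAS)


def dig_area(name: str, aperture: int = 15) -> dict:
    """MCP tool sketch: return a governed bundle for one area,
    capped at the file aperture."""
    files = AREAS[name][:aperture]
    return {"area": name, "files": files, "aperture": f"f/{aperture}"}
```

The point of the shape is that the AI receives a closed file list as a resource, rather than permission to walk the repo.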
Conceptual sequence (AI first with MCP):
```mermaid
sequenceDiagram
participant Dev as Developer
participant AI as AI Tool
participant MCP as ContextDigger MCP
participant CDG as CDG Core
participant FS as Filesystem
Dev->>AI: "Use ContextDigger on checkout-api<br/>and add rate limiting"
AI->>MCP: list_areas(project_root)
MCP->>CDG: load_areas(.cdg/areas)
CDG-->>MCP: areas including checkout-api
AI->>MCP: dig_area("checkout-api")
MCP->>CDG: build_context_bundle("checkout-api")
CDG->>FS: Read governed slice, apply Aperture/Budget
CDG-->>MCP: governed bundle (files, budgets, bookmarks)
MCP-->>AI: bundle as MCP resource
AI-->>Dev: Diff + explanation<br/>based on governed context
```
Key difference: the AI never scrapes the repo directly. It always obtains context through governed MCP resources, so Context Aperture and Attention Budgets are enforced even when the workflow starts from the assistant side.
## What You Gain
### Sharper AI
AI sees only the governed slice (e.g. 12 files, ~2,300 lines) instead of 100+ files. Less noise, more accurate suggestions.
### Reviewable Diffs
Rate limiting changes land in the right endpoint and tests, producing a diff you can review in one sitting.
### Reusable Governance
The same checkout-api bundle can be reused for future tasks (bugfixes, docs, refactors) without redoing discovery.
## Next Steps
Run this lab in your own repo: define a checkout-api area, generate a governed bundle, and compare AI behavior with and without ContextDigger.