# context_editing_guide

Managing the context window, optimizing token usage, and summarization strategies for long conversations.
## Installer

Run this command in your terminal to install the skill:

```shell
git clone https://github.com/sigridjineth/interview-copilot-with-skills /tmp/interview-copilot-with-skills && \
  cp -r /tmp/interview-copilot-with-skills/skills/cdp_context_editing ~/.claude/skills/interview-copilot-with-skills/
```
## SKILL.md

```yaml
name: context_editing_guide
description: Managing context window, token optimization, summarization strategies for long conversations.
```
## Context Editing Skill

### When to Use
- Questions about managing long conversations
- Token cost concerns
- "Context window filling up"
- "Claude forgets earlier messages"
- "How to handle 20+ turn conversations"
- Summarization and compression strategies
### Key Feature: Context Editing

Context Editing lets you intelligently manage what stays in Claude's context window.
### Core Capabilities

- **Summarization**: Compress older turns while preserving meaning
- **Fact Extraction**: Pull out key facts to persistent storage
- **Selective Retention**: Keep important messages verbatim
- **Dynamic Management**: Adjust context based on conversation flow
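Fact extraction is the capability that survives any amount of summarization: facts pulled into a persistent store stay available even after the turns that stated them are compressed away. A minimal sketch, assuming a simple pattern-based extractor (`extract_facts` is a hypothetical helper; real systems would typically use a model call instead of a regex):

```python
import re

# Hypothetical fact extractor: pulls "my <key> is <value>" statements
# out of user messages into a persistent store that survives context
# editing. Illustrative only; a production system would use an LLM call.
def extract_facts(message, store):
    for key, value in re.findall(r"my (\w+) is ([\w@.\-]+)", message.lower()):
        store[key] = value
    return store

facts = {}
extract_facts("My name is Ada and my email is ada@example.com", facts)
# facts now holds {"name": "ada", "email": "ada@example.com"}
```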
### Recommended Pattern for Long Conversations

- Turns 1-5 (counting back from the most recent): keep verbatim (recent context)
- Turns 6-15: summarize (compressed context)
- Persistent store: extracted facts (always present)
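The tiered pattern above can be sketched as a context builder that assembles persistent facts, a summary of older turns, and the recent turns verbatim. This is a minimal illustration, not the skill's actual implementation; `summarize` is a placeholder for a real model-based summarization call, and turns older than the summarized window are assumed to have been folded into the fact store:

```python
RECENT_TURNS = 5    # last 5 turns: keep verbatim
SUMMARY_TURNS = 10  # turns 6-15 from the end: compress

def summarize(turns):
    # Placeholder: in practice, call a model to compress these turns.
    return {"role": "system",
            "content": "Summary of %d earlier turns." % len(turns)}

def build_context(history, facts):
    """Assemble context: facts + summary of older turns + recent turns."""
    recent = history[-RECENT_TURNS:]
    older = history[-(RECENT_TURNS + SUMMARY_TURNS):-RECENT_TURNS]
    context = []
    if facts:
        context.append({"role": "system",
                        "content": "Known facts: " + "; ".join(facts)})
    if older:
        context.append(summarize(older))
    context.extend(recent)
    return context
```

For a 20-turn history, the assembled context is just one facts message, one summary message, and the five most recent turns.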
### Token Savings
- Typical reduction: 60-70% for conversations over 20 turns
- Better user experience: Claude "remembers" key facts
- Cost savings scale with conversation length
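The 60-70% figure follows from simple arithmetic. A back-of-envelope estimate under illustrative assumptions (the per-turn, summary, and facts token counts below are made-up round numbers, not measurements):

```python
# Back-of-envelope savings estimate. All token counts are illustrative
# assumptions: ~300 tokens per turn, a ~400-token summary, and a
# ~200-token persistent facts block.
def estimate_savings(turns, tokens_per_turn=300, summary_tokens=400,
                     facts_tokens=200, recent=5):
    full = turns * tokens_per_turn                     # send everything
    edited = (recent * tokens_per_turn                 # verbatim recent turns
              + summary_tokens + facts_tokens)         # summary + facts
    return 1 - edited / full

# 20-turn conversation: full = 6000, edited = 1500 + 600 = 2100
# -> savings of 0.65, i.e. ~65%, inside the quoted 60-70% range
```

Because the edited context stays roughly constant while the full context grows linearly, the savings fraction increases with conversation length.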
### Response Guidelines
- Identify the problem: Long conversations? Token costs? Lost context?
- Explain the pattern: Verbatim + Summarized + Persistent
- Give specific numbers: 60-70% savings, turn thresholds
- Offer to dive deeper: Architecture details if they want
## Repository

sigridjineth/interview-copilot-with-skills/skills/cdp_context_editing

Author: sigridjineth