streaming-llm-responses

Implement real-time streaming UI patterns for AI chat applications. Use when adding response lifecycle handlers, progress indicators, client effects, or thread state synchronization. Covers onResponseStart/End, onEffect, ProgressUpdateEvent, and client tools. NOT for building basic chat without real-time feedback.
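To illustrate the lifecycle pattern this skill covers, here is a minimal sketch of streaming handlers. The handler names (onResponseStart, onResponseEnd, ProgressUpdateEvent) mirror the description above, but the exact interfaces and the `consumeStream` helper are assumptions for illustration, not the skill's actual API:

```typescript
// Hypothetical event and handler shapes -- names follow the skill
// description, but the precise fields are assumptions.
interface ProgressUpdateEvent {
  charsReceived: number;
}

interface ResponseHandlers {
  onResponseStart?: () => void;
  onProgress?: (event: ProgressUpdateEvent) => void;
  onResponseEnd?: (fullText: string) => void;
}

// Consume an async stream of text chunks, firing lifecycle callbacks
// so the UI can show a typing indicator and live progress.
async function consumeStream(
  chunks: AsyncIterable<string>,
  handlers: ResponseHandlers,
): Promise<string> {
  handlers.onResponseStart?.();
  let text = "";
  for await (const chunk of chunks) {
    text += chunk;
    handlers.onProgress?.({ charsReceived: text.length });
  }
  handlers.onResponseEnd?.(text);
  return text;
}

// Demo: a fake token stream standing in for a real LLM response.
async function* fakeStream(): AsyncGenerator<string> {
  for (const token of ["Hel", "lo ", "world"]) {
    yield token;
  }
}

consumeStream(fakeStream(), {
  onResponseStart: () => console.log("start"),
  onResponseEnd: (t) => console.log("end:", t),
});
```

The same callback structure extends naturally to effects and thread state sync: each lifecycle hook is a place to update UI state without waiting for the full response.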

$ Install

git clone https://github.com/Asmayaseen/hackathon-2 /tmp/hackathon-2 && cp -r /tmp/hackathon-2/.claude/skills/streaming-llm-responses ~/.claude/skills/streaming-llm-responses

// tip: Run this command in your terminal to install the skill