streaming-llm-responses
Implement real-time streaming UI patterns for AI chat applications. Use when adding response lifecycle handlers, progress indicators, client effects, or thread state synchronization. Covers onResponseStart/End, onEffect, ProgressUpdateEvent, and client tools. Not for building basic chat without real-time feedback.
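The lifecycle handlers named above can be sketched as a small driver that consumes a token stream and fires callbacks at each stage. This is a minimal illustration, not the skill's actual API: the handler names are taken from the description, but the `StreamHandlers` shape, the `ProgressUpdateEvent` fields, and the `streamResponse`/`fakeTokens` helpers are assumptions for the sketch.

```typescript
// Hypothetical event payload for per-chunk progress updates.
interface ProgressUpdateEvent {
  charsReceived: number; // total characters streamed so far
  text: string;          // accumulated response text
}

// Hypothetical lifecycle handler shape (names from the skill description).
interface StreamHandlers {
  onResponseStart?: () => void;
  onProgress?: (e: ProgressUpdateEvent) => void;
  onResponseEnd?: (finalText: string) => void;
}

// Drive the lifecycle: start -> progress per chunk -> end.
async function streamResponse(
  chunks: AsyncIterable<string>,
  handlers: StreamHandlers,
): Promise<string> {
  handlers.onResponseStart?.();
  let text = "";
  for await (const chunk of chunks) {
    text += chunk;
    handlers.onProgress?.({ charsReceived: text.length, text });
  }
  handlers.onResponseEnd?.(text);
  return text;
}

// Fake token source standing in for a real LLM stream.
async function* fakeTokens(): AsyncGenerator<string> {
  for (const t of ["Hel", "lo, ", "world!"]) yield t;
}

async function demo(): Promise<void> {
  const events: string[] = [];
  const final = await streamResponse(fakeTokens(), {
    onResponseStart: () => events.push("start"),
    onProgress: (e) => events.push(`progress:${e.charsReceived}`),
    onResponseEnd: () => events.push("end"),
  });
  console.log(events.join(" "), "|", final);
}
demo();
```

In a real UI, `onProgress` would update the rendered message in place, while `onResponseStart`/`onResponseEnd` toggle a typing indicator or spinner.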
Install
git clone https://github.com/Asmayaseen/hackathon-2 /tmp/hackathon-2 && cp -r /tmp/hackathon-2/.claude/skills/streaming-llm-responses ~/.claude/skills/hackathon-2/
Tip: Run this command in your terminal to install the skill.
Repository: Asmayaseen/hackathon-2/.claude/skills/streaming-llm-responses
Author: Asmayaseen
Stars: 0
Forks: 0
Updated: 6d ago
Added: 6d ago