Marketplace
transformers
Transformer architecture fundamentals. Covers the self-attention mechanism, multi-head attention, feed-forward networks, layer normalization, and residual connections. Essential concepts for understanding LLMs.
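As a taste of what the skill covers, here is a minimal sketch of scaled dot-product self-attention in NumPy. This is an illustrative example, not code from the skill itself; the function name and toy shapes are assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of value rows

# Toy example: 3 tokens, embedding dimension 4.
# Self-attention means Q, K, and V all come from the same sequence.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4): one output vector per token
```

Multi-head attention runs several such attention computations in parallel on learned projections of the input and concatenates the results.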
$ Install
git clone https://github.com/atrawog/bazzite-ai-plugins /tmp/bazzite-ai-plugins && cp -r /tmp/bazzite-ai-plugins/bazzite-ai-jupyter/skills/transformers ~/.claude/skills/bazzite-ai-plugins/
Tip: Run this command in your terminal to install the skill.
Repository

atrawog
Author
atrawog/bazzite-ai-plugins/bazzite-ai-jupyter/skills/transformers
0
Stars
0
Forks
Updated 1w ago
Added 1w ago