
transformers

Transformer architecture fundamentals. Covers the self-attention mechanism, multi-head attention, feed-forward networks, layer normalization, and residual connections. Essential concepts for understanding LLMs.
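The centerpiece of what this skill covers, scaled dot-product attention, can be sketched in a few lines of NumPy. This is a minimal illustration of the mechanism, not code from the skill itself:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_q, n_k) similarity scores
    # Row-wise softmax, shifted by the max for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of value rows

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))  # 4 query tokens, head dimension d_k = 8
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # each of the 4 queries gets an 8-dim output
```

Multi-head attention repeats this computation in parallel over several learned projections of Q, K, and V, then concatenates the results.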

$ Installer

git clone https://github.com/atrawog/bazzite-ai-plugins /tmp/bazzite-ai-plugins && cp -r /tmp/bazzite-ai-plugins/bazzite-ai-jupyter/skills/transformers ~/.claude/skills/transformers

// tip: Run this command in your terminal to install the skill into ~/.claude/skills