sitemap-robots

Automated sitemap generation for all locale URLs, robots.txt configuration, and llms.txt for AI crawler optimization. Use when setting up sitemap.xml, configuring crawling rules, or improving discoverability for search engines and AI systems.
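The skill's outputs are plain files: a locale-aware sitemap.xml, a robots.txt with crawl rules, and an llms.txt for AI crawlers. As a rough illustration of the sitemap and robots.txt pieces only (not the skill's actual implementation; the domain, locales, routes, and output paths below are made-up assumptions), a small TypeScript sketch could look like this:

// Hypothetical sketch: emit a locale-aware sitemap.xml and a matching robots.txt.
// The domain, locales, and routes are placeholder assumptions, not skill defaults.
import { mkdirSync, writeFileSync } from "node:fs";

const BASE = "https://example.com";        // assumed production domain
const LOCALES = ["en", "es", "fr"];        // assumed supported locales
const ROUTES = ["/", "/pricing", "/docs"]; // assumed localized routes

// One <url> entry per locale+route, with hreflang alternates so crawlers
// can associate the translated variants of the same page.
const urls = LOCALES.flatMap((locale) =>
  ROUTES.map((route) => {
    const alternates = LOCALES.map(
      (alt) =>
        `    <xhtml:link rel="alternate" hreflang="${alt}" href="${BASE}/${alt}${route}"/>`
    ).join("\n");
    return `  <url>\n    <loc>${BASE}/${locale}${route}</loc>\n${alternates}\n  </url>`;
  })
);

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
${urls.join("\n")}
</urlset>`;

mkdirSync("public", { recursive: true });
writeFileSync("public/sitemap.xml", sitemap);

// robots.txt: allow crawling and point crawlers at the sitemap.
writeFileSync(
  "public/robots.txt",
  `User-agent: *\nAllow: /\n\nSitemap: ${BASE}/sitemap.xml\n`
);

Running it with a TypeScript runner (for example, npx tsx on a hypothetical generate-sitemap.ts) would drop both files into public/ for the site to serve.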

$ Install

git clone https://github.com/majiayu000/claude-skill-registry /tmp/claude-skill-registry && cp -r /tmp/claude-skill-registry/skills/data/sitemap-robots ~/.claude/skills/sitemap-robots

// tip: Run this command in your terminal to install the skill