consult-llm

Use it when the user asks to "ask gemini", "ask codex", or "ask in browser"

allowed-tools: Read, Glob, Grep, mcp__consult-llm__consult_llm

Installer

Run this command in your terminal to install the skill:

git clone https://github.com/raine/consult-llm-mcp /tmp/consult-llm-mcp && cp -r /tmp/consult-llm-mcp/examples ~/.claude/skills/consult-llm-mcp



When consulting with external LLMs:

1. Gather Context First:

  • Use Glob/Grep to find relevant files
  • Read key files to understand their relevance
  • Select files directly related to the question

2. Determine Mode and Model:

  • Web mode: Use if the user says "ask in browser" or "consult in browser"
  • Codex mode: Use if the user says "ask codex" → use model "gpt-5.1-codex-max"
  • Gemini mode: Default for "ask gemini" → use model "gemini-2.5-pro"

3. Call the MCP Tool: Use mcp__consult-llm__consult_llm with the following parameters (example inputs are shown after this list):

  • For API mode (Gemini):

    • model: "gemini-2.5-pro"
    • prompt: Clear, neutral question without suggesting solutions
    • files: Array of relevant file paths
  • For API mode (Codex):

    • model: "gpt-5.1-codex-max"
    • prompt: Clear, neutral question without suggesting solutions
    • files: Array of relevant file paths
  • For web mode:

    • web_mode: true
    • prompt: Clear, neutral question without suggesting solutions
    • files: Array of relevant file paths
    • (model parameter is ignored in web mode)

4. Present Results:

  • API mode: Summarize key insights, recommendations, and considerations from the response
  • Web mode: Inform the user that the prompt was copied to the clipboard, then ask them to paste it into their browser-based LLM and share the response back
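
For illustration, here is roughly what the tool input might look like in each mode, shown as JSON. The parameter names and model names come from the steps above; the prompt text and file paths are hypothetical stand-ins.

Gemini (API mode):

  {
    "model": "gemini-2.5-pro",
    "prompt": "What are the trade-offs of the caching approach in these files?",
    "files": ["src/cache.ts", "src/api/client.ts"]
  }

Codex (API mode) uses the same shape with a different model:

  {
    "model": "gpt-5.1-codex-max",
    "prompt": "What are the trade-offs of the caching approach in these files?",
    "files": ["src/cache.ts", "src/api/client.ts"]
  }

Web mode sets web_mode instead of a model (the model parameter is ignored):

  {
    "web_mode": true,
    "prompt": "What are the trade-offs of the caching approach in these files?",
    "files": ["src/cache.ts", "src/api/client.ts"]
  }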

Critical Rules:

  • ALWAYS gather file context before consulting
  • Ask neutral, open-ended questions to avoid bias (see the example below)
  • Provide focused, relevant files (quality over quantity)
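
For example, a neutral prompt would be "How does the eviction logic in src/cache.ts handle concurrent writes?" (a hypothetical file), whereas "Shouldn't the eviction logic use an LRU instead?" presupposes an answer and biases the response.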