13. Third Party Tools (GitHub, Firecrawl)

This blog is part of the ADK Masterclass - Hands-On Series. Beyond built-in tools, ADK allows us to integrate specialized third-party services directly into our agents using the Model Context Protocol (MCP).

This ecosystem of tools enables our agents to interact with external platforms like GitHub for code management and Firecrawl for advanced web scraping.

View Code on GitHub

Table of Contents

  1. What is MCP?
  2. Tutorial
  3. Understanding MCP Architecture

1. What is MCP?

graph LR
    User[User Query] --> Agent[ADK Agent]
    Agent --> MCPToolset[MCP Toolset]
    MCPToolset --> |HTTP| GitHubMCP[GitHub MCP Server]
    MCPToolset --> |Stdio| FirecrawlMCP[Firecrawl MCP Server]
    GitHubMCP --> GitHub[GitHub API]
    FirecrawlMCP --> Firecrawl[Firecrawl API]
    GitHub --> |Data| GitHubMCP
    Firecrawl --> |Data| FirecrawlMCP
    GitHubMCP --> |Results| MCPToolset
    FirecrawlMCP --> |Results| MCPToolset
    MCPToolset --> Agent
    Agent --> Response[Response]
    Response --> User
    style Agent fill:#e3f2fd,stroke:#1565c0,stroke-width:2px
    style MCPToolset fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px
    style GitHubMCP fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px
    style FirecrawlMCP fill:#fff9c4,stroke:#fbc02d,stroke-width:2px

MCP (Model Context Protocol) is an open standard that lets agents connect to external services and tools through a single, well-defined protocol. Instead of writing custom integration code for every API, an ADK agent attaches an MCP toolset and automatically gains whatever tools the connected server exposes, whether that is GitHub, Firecrawl, or any other service with an MCP server.

Key benefits of MCP:

  • Standardized Protocol: One interface to connect to many services.
  • Growing Ecosystem: Hundreds of MCP servers available for various services.
  • Two Connection Types: HTTP for remote servers, Stdio for local processes.
  • Automatic Tool Discovery: Tools are dynamically discovered from MCP servers.
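
Under the hood, MCP messages are plain JSON-RPC. The following stdlib-only sketch shows what the automatic tool discovery above amounts to: start an MCP server as a subprocess, perform the `initialize` handshake, then request its tool list with `tools/list`. This is only an illustration of the protocol; MCPToolset does all of this for you, and the example server command assumes Node.js is installed.

```python
import json
import subprocess

def list_mcp_tools(command: str, args: list[str]) -> list[str]:
    """Speak raw JSON-RPC over stdio to an MCP server and return its tool names."""
    proc = subprocess.Popen(
        [command, *args], stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )

    def rpc(msg: dict) -> dict:
        # MCP stdio transport: one JSON-RPC message per line
        proc.stdin.write(json.dumps(msg) + "\n")
        proc.stdin.flush()
        return json.loads(proc.stdout.readline())

    # Handshake: initialize request, then the "initialized" notification
    rpc({"jsonrpc": "2.0", "id": 1, "method": "initialize",
         "params": {"protocolVersion": "2024-11-05",
                    "capabilities": {},
                    "clientInfo": {"name": "demo", "version": "0"}}})
    proc.stdin.write(json.dumps(
        {"jsonrpc": "2.0", "method": "notifications/initialized"}) + "\n")
    proc.stdin.flush()

    # Ask the server which tools it offers
    tools = rpc({"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}})
    proc.terminate()
    return [t["name"] for t in tools["result"]["tools"]]

# Example (requires Node.js): list_mcp_tools("npx", ["-y", "firecrawl-mcp"])
```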

2. Tutorial

In this tutorial, we will build two different agents: one that can interact with GitHub and another that can scrape web content using Firecrawl.

Prerequisites

  • Python 3.11 or later
  • Google ADK installed (see Getting Started for installation instructions)
  • Google API key from Google AI Studio
  • GitHub Personal Access Token (PAT) with appropriate permissions
  • Firecrawl API key from Firecrawl
  • Node.js and npx (for Firecrawl MCP server)
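
Before going further, you can sanity-check these prerequisites from Python. This is an optional helper, not part of the tutorial code; the variable names are the ones used by the two agents below:

```python
import os
import shutil

# Keys used by the GitHub and Firecrawl agents in this tutorial
REQUIRED_VARS = ["GOOGLE_API_KEY", "GITHUB_TOKEN", "FIRECRAWL_API_KEY"]

def missing_prereqs(env=os.environ):
    """Return a list of missing prerequisites: unset variables and, if absent, npx."""
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if shutil.which("npx") is None:  # npx ships with Node.js
        missing.append("npx (install Node.js)")
    return missing

print(missing_prereqs())  # an empty list means you are ready to go
```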

2.1. Building a GitHub Agent

This agent connects to the GitHub Copilot MCP server to help users get information from GitHub repositories.

Step 1: Get Your GitHub Token

To create a GitHub Personal Access Token (PAT):

  1. Go to GitHub Settings > Developer settings > Personal access tokens > Tokens (classic).
  2. Click Generate new token > Generate new token (classic).
  3. Give your token a descriptive name (e.g., "ADK MCP Agent").
  4. Select the following scopes:
    • repo (Full control of private repositories) - for accessing repositories
    • read:org (Read org and team membership) - optional, for organization access
  5. Click Generate token.
  6. Copy the token immediately - you won't be able to see it again!

Note: For read-only operations, you can use a token with minimal permissions. The agent uses read-only mode by default.

Step 2: Create the Project

Use the ADK CLI to create a new agent project:

adk create github_agent
cd github_agent

Set up our environment variables:

cp .env.example .env
# Edit .env and add your GOOGLE_API_KEY and GITHUB_TOKEN

Step 3: Define the Agent

Open agent.py and replace its content with the following code to configure the MCPToolset for GitHub:

import os
from google.adk.agents import Agent
from google.adk.tools.mcp_tool.mcp_session_manager import StreamableHTTPServerParams
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset

GITHUB_TOKEN = os.environ.get("GITHUB_TOKEN", "")

root_agent = Agent(
    model="gemini-2.5-flash",
    name="github_agent",
    instruction="Help users get information from GitHub repositories. You can search repos, get file contents, list issues, and more.",
    tools=[
        MCPToolset(
            connection_params=StreamableHTTPServerParams(
                url="https://api.githubcopilot.com/mcp/",
                headers={
                    "Authorization": f"Bearer {GITHUB_TOKEN}",
                    "X-MCP-Toolsets": "all",
                    "X-MCP-Readonly": "true"
                },
            ),
        )
    ],
)

Step 4: Run the Agent

Run our GitHub agent:

adk web

Try these example queries:

  • "List the top 5 trending Python repositories on GitHub."
  • "Get the README of the google/adk-python repository."
  • "What are the recent issues in the langchain-ai/langchain repo?"

2.2. Building a Firecrawl Agent

This agent uses the Firecrawl MCP server to turn any website into LLM-ready data.

Step 1: Get Your Firecrawl API Key

To create a Firecrawl API key:

  1. Go to Firecrawl and sign up for an account.
  2. Navigate to your Firecrawl Dashboard.
  3. Go to the API Keys section.
  4. Click Create API Key or copy your existing API key.
  5. Copy the API key - you'll need it for the .env file.

Note: Firecrawl offers a free tier with limited requests. Check their pricing page for details.
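
For context on what this key is used for: the MCP server is a thin wrapper around Firecrawl's REST API. A direct call might look roughly like the sketch below; the endpoint path and response shape are assumptions based on Firecrawl's public v1 API, and the agent never needs this code.

```python
import json
import os
import urllib.request

def scrape(url: str) -> str:
    """Sketch of a direct call to Firecrawl's v1 scrape endpoint (shape assumed)."""
    req = urllib.request.Request(
        "https://api.firecrawl.dev/v1/scrape",
        data=json.dumps({"url": url, "formats": ["markdown"]}).encode(),
        headers={
            "Authorization": f"Bearer {os.environ.get('FIRECRAWL_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        # Assumed response shape: {"success": true, "data": {"markdown": ...}}
        return json.load(resp)["data"]["markdown"]
```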

Step 2: Create the Project

Create a new directory for our Firecrawl agent:

cd ..
adk create firecrawl_agent
cd firecrawl_agent

Configure the environment:

cp .env.example .env
# Edit .env and add your GOOGLE_API_KEY and FIRECRAWL_API_KEY

Step 3: Define the Agent

Open agent.py and use the following code to set up the Firecrawl tool using npx:

import os
from google.adk.agents import Agent
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset
from google.adk.tools.mcp_tool.mcp_session_manager import StdioConnectionParams
from mcp import StdioServerParameters

FIRECRAWL_API_KEY = os.getenv("FIRECRAWL_API_KEY", "")

root_agent = Agent(
    model="gemini-2.5-flash",
    name="firecrawl_agent",
    description="A helpful assistant for scraping websites with Firecrawl",
    instruction="Help the user scrape and extract content from websites. Use the Firecrawl tool to get clean, LLM-ready text from any URL.",
    tools=[
        MCPToolset(
            connection_params=StdioConnectionParams(
                server_params=StdioServerParameters(
                    command="npx",
                    args=["-y", "firecrawl-mcp"],
                    env={"FIRECRAWL_API_KEY": FIRECRAWL_API_KEY}
                ),
                timeout=30,
            ),
        )
    ],
)

Step 4: Run the Agent

Start the Firecrawl agent:

adk web

Try these example queries:

  • "Scrape the content of https://example.com and summarize it."
  • "Extract the main article from https://news.ycombinator.com"
  • "Get all the links from https://google.github.io/adk-docs/"

3. Understanding MCP Architecture

MCP provides two connection types, each suited for different scenarios:

HTTP Connection (StreamableHTTPServerParams)

Used for remote MCP servers that expose an HTTP endpoint. This is ideal for:

  • Cloud-hosted MCP servers (like GitHub Copilot MCP)
  • Services that require authentication headers
  • Production deployments

import os

from google.adk.tools.mcp_tool.mcp_session_manager import StreamableHTTPServerParams
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset

TOKEN = os.environ.get("MCP_SERVER_TOKEN", "")  # placeholder for your service's token

MCPToolset(
    connection_params=StreamableHTTPServerParams(
        url="https://api.example.com/mcp/",
        headers={"Authorization": f"Bearer {TOKEN}"},
    ),
)

Stdio Connection (StdioConnectionParams)

Used for MCP servers that run as local processes. This is ideal for:

  • NPM-based MCP servers (like Firecrawl)
  • Local development and testing
  • Servers that don't have a hosted endpoint

import os

from google.adk.tools.mcp_tool.mcp_session_manager import StdioConnectionParams
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset
from mcp import StdioServerParameters

API_KEY = os.environ.get("API_KEY", "")  # placeholder for the server's API key

MCPToolset(
    connection_params=StdioConnectionParams(
        server_params=StdioServerParameters(
            command="npx",
            args=["-y", "mcp-server-name"],
            env={"API_KEY": API_KEY}
        ),
        timeout=30,
    ),
)
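
Nothing stops a single agent from mixing both connection types. As a sketch, here is one agent combining the two toolsets built in this tutorial (same servers, same environment variables as above):

```python
import os

from google.adk.agents import Agent
from google.adk.tools.mcp_tool.mcp_session_manager import (
    StdioConnectionParams,
    StreamableHTTPServerParams,
)
from google.adk.tools.mcp_tool.mcp_toolset import MCPToolset
from mcp import StdioServerParameters

root_agent = Agent(
    model="gemini-2.5-flash",
    name="research_agent",
    instruction="Answer questions using GitHub data and scraped web pages.",
    tools=[
        MCPToolset(  # remote server over HTTP
            connection_params=StreamableHTTPServerParams(
                url="https://api.githubcopilot.com/mcp/",
                headers={"Authorization": f"Bearer {os.environ.get('GITHUB_TOKEN', '')}"},
            ),
        ),
        MCPToolset(  # local server over stdio
            connection_params=StdioConnectionParams(
                server_params=StdioServerParameters(
                    command="npx",
                    args=["-y", "firecrawl-mcp"],
                    env={"FIRECRAWL_API_KEY": os.environ.get("FIRECRAWL_API_KEY", "")},
                ),
                timeout=30,
            ),
        ),
    ],
)
```

The model then picks between GitHub and Firecrawl tools per query, just like the multi-tool agents covered earlier in this series.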

Finding More MCP Servers

The MCP ecosystem is growing rapidly. You can find more servers at:

  • The official MCP servers repository: https://github.com/modelcontextprotocol/servers
  • The Model Context Protocol site: https://modelcontextprotocol.io

Next Steps

We have now completed the Tools section of the Masterclass! We've covered:

  • Built-in tools (Google Search, Code Executor, RAG, Vertex AI Search)
  • Custom Function Tools
  • OpenAPI Tools
  • Multi-Tool Agents
  • Third-Party MCP Integrations

In the next section, we will dive into Protocols & Communication to enable context sharing and multi-agent collaboration.
