# Connect Google Search Console to Claude Code with an MCP Server
A step-by-step guide to building a lightweight MCP server that lets Claude Code read your Google Search Console data directly. No more copy-pasting analytics into chat.
I used to pull Google Search Console data by running a Python script, copying the JSON output, and pasting it into Claude Code. It worked, but it was brittle. The script broke when tokens expired. Context windows filled with raw API dumps. And every time I wanted a different cut of the data, I was back in the terminal editing query parameters.
There is a better way. An MCP server turns GSC into a native tool that Claude can call on demand, with structured inputs and clean outputs. You ask for last week's non-branded keywords. You get a table. No paste. No context bloat.
Here is exactly how to build it.
## What You Need
- A Google Cloud project with the Search Console API enabled
- OAuth 2.0 credentials (client ID and secret)
- A refresh token with the webmasters.readonly scope
- Node.js installed locally
- Claude Code or Claude Desktop
If you have already connected GA4 or Google Calendar to Claude through an MCP, you already have the Cloud project. You just need to add the Search Console API and request a new refresh token with the right scope.
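If you prefer the command line over the Cloud console UI, enabling the API is one command. This is a sketch that assumes a standard gcloud CLI setup authenticated against your existing project; your-project-id is a placeholder:

```bash
# Assumes gcloud is installed and pointed at your existing Cloud project
gcloud services enable searchconsole.googleapis.com --project your-project-id
```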
## Quick Start Prompt
Once your MCP server is connected, copy and paste this into Claude Code:
Pull a Google Search Console performance report for [your-domain.com] over the last 28 days. Show me total clicks, impressions, CTR, and average position compared to the previous 28-day period. Then list the top 25 non-branded queries driving traffic, flag any pages ranking 8-20 with high impressions as near-page-1 opportunities, and highlight any queries with CTR under 1% that need title or meta description work.
Claude will call gsc_search_analytics with the right parameters, compare the two periods, and surface actionable opportunities. No script writing. No JSON parsing. Just a report.
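Under the hood, the request Claude assembles for that prompt looks roughly like this. The dates, domain, and brand filter are placeholders, and the wrapper is a simplified view of the tool call rather than the raw MCP protocol message:

```json
{
  "tool": "gsc_search_analytics",
  "arguments": {
    "siteUrl": "sc-domain:your-domain.com",
    "startDate": "2026-03-25",
    "endDate": "2026-04-21",
    "dimensions": ["query"],
    "dimensionFilter": "query notContains your-brand",
    "rowLimit": 25
  }
}
```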
## Step 1: Get a Refresh Token
The Search Console API uses a separate OAuth scope from Analytics or Calendar. You need webmasters.readonly.
The fastest way is a small Python script that walks you through the browser flow:
import google_auth_oauthlib.flow

# Read-only scope for Search Console data
SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']

# Opens a browser window for the OAuth consent flow, then prints the refresh token
flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(
    'client_secret.json', SCOPES)
credentials = flow.run_local_server(port=8080)
print(f"Refresh token: {credentials.refresh_token}")

Save that refresh token. You will not need to run this again. The MCP server exchanges it for an access token automatically.
## Step 2: The Server
Create a new directory and initialize a Node project:
mkdir gsc-mcp && cd gsc-mcp
npm init -y
npm install @modelcontextprotocol/sdk zod
npm install -D typescript @types/node

The dev dependencies are there so tsc can compile the server and resolve Node's process types. Create index.ts:
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
const CLIENT_ID = process.env.GOOGLE_CLIENT_ID!;
const CLIENT_SECRET = process.env.GOOGLE_CLIENT_SECRET!;
const REFRESH_TOKEN = process.env.GOOGLE_GSC_REFRESH_TOKEN!;
async function getAccessToken(): Promise<string> {
const params = new URLSearchParams({
client_id: CLIENT_ID,
client_secret: CLIENT_SECRET,
refresh_token: REFRESH_TOKEN,
grant_type: "refresh_token",
});
const res = await fetch("https://oauth2.googleapis.com/token", {
method: "POST",
headers: { "Content-Type": "application/x-www-form-urlencoded" },
body: params.toString(),
});
if (!res.ok) throw new Error("Token refresh failed");
const data = await res.json() as { access_token: string };
return data.access_token;
}
async function gscApi<T>(endpoint: string, body: unknown, token: string): Promise<T> {
const res = await fetch(
`https://searchconsole.googleapis.com/webmasters/v3${endpoint}`,
{
method: "POST",
headers: {
Authorization: `Bearer ${token}`,
"Content-Type": "application/json",
},
body: JSON.stringify(body),
},
);
if (!res.ok) throw new Error(`GSC API error: ${res.status}`);
return (await res.json()) as T;
}
const server = new McpServer({ name: "gsc-mcp", version: "1.0.0" });
server.registerTool(
"gsc_list_sites",
{
title: "List GSC Verified Sites",
description: "List all sites verified in Google Search Console.",
inputSchema: {},
},
async () => {
const token = await getAccessToken();
const res = await fetch(
"https://searchconsole.googleapis.com/webmasters/v3/sites",
{ headers: { Authorization: `Bearer ${token}` } },
);
const data = await res.json() as { siteEntry?: Array<{ siteUrl: string }> };
return {
content: [{ type: "text", text: JSON.stringify(data.siteEntry || []) }],
};
},
);
server.registerTool(
"gsc_search_analytics",
{
title: "GSC Search Analytics Query",
description: "Query clicks, impressions, CTR, and position. Supports filtering by query or page.",
inputSchema: {
siteUrl: z.string(),
startDate: z.string(),
endDate: z.string(),
dimensions: z.array(z.enum(["query", "page", "device", "country", "searchAppearance"])).optional(),
dimensionFilter: z.string().optional(),
rowLimit: z.number().min(1).max(25000).optional().default(25),
},
},
async ({ siteUrl, startDate, endDate, dimensions, dimensionFilter, rowLimit }) => {
const token = await getAccessToken();
const body: Record<string, unknown> = { startDate, endDate, rowLimit: rowLimit ?? 25 };
if (dimensions?.length) body.dimensions = dimensions;
if (dimensionFilter) {
const match = dimensionFilter.match(
/^(query|page|device|country|searchAppearance)\s+(contains|equals|notContains|notEquals)\s+(.+)$/i,
);
if (match) {
const [, dim, op, expr] = match;
const operatorMap: Record<string, string> = {
contains: "contains", equals: "equals",
notcontains: "notContains", notequals: "notEquals",
};
body.dimensionFilterGroups = [{
filters: [{
dimension: dim.toLowerCase(),
operator: operatorMap[op.toLowerCase()] || "contains",
expression: expr.trim(),
}],
}];
}
}
const data = await gscApi<{ rows?: Array<{
keys: string[]; clicks: number; impressions: number;
ctr: number; position: number;
}> }>(`/sites/${encodeURIComponent(siteUrl)}/searchAnalytics/query`, body, token);
const rows = (data.rows || []).map((r) => ({
keys: r.keys,
clicks: r.clicks,
impressions: r.impressions,
ctr: Math.round(r.ctr * 10000) / 100,
position: Math.round(r.position * 10) / 10,
}));
return {
content: [{ type: "text", text: JSON.stringify({ siteUrl, rows }) }],
};
},
);
server.registerTool(
"gsc_url_inspection",
{
title: "GSC URL Inspection",
description: "Inspect a specific URL for indexing status, last crawl time, and coverage state.",
inputSchema: {
siteUrl: z.string(),
inspectionUrl: z.string(),
},
},
async ({ siteUrl, inspectionUrl }) => {
const token = await getAccessToken();
const res = await fetch(
"https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
{
method: "POST",
headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
body: JSON.stringify({ inspectionUrl, siteUrl, languageCode: "en" }),
},
);
const data = await res.json();
return {
content: [{ type: "text", text: JSON.stringify(data) }],
};
},
);
async function main() {
const transport = new StdioServerTransport();
await server.connect(transport);
console.error("GSC MCP server running on stdio");
}
main();

Compile and run:
npx tsc index.ts --esModuleInterop --module NodeNext --moduleResolution NodeNext --target ES2022
node index.js

Or add a build script to package.json and run npm run build.
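If you go the build-script route, the sketch below is one way to set it up. The dist/ output directory is an assumption, chosen so the compiled file lands where the config in the next step expects it:

```json
{
  "scripts": {
    "build": "tsc index.ts --outDir dist --esModuleInterop --module NodeNext --moduleResolution NodeNext --target ES2022"
  }
}
```

Run npm run build and the compiled server ends up at dist/index.js.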
## Step 3: Wire It Into Claude Code
Add an entry under the mcpServers key in your .mcp.json (or Claude Desktop config):

{
  "mcpServers": {
    "gsc": {
      "type": "stdio",
      "command": "node",
      "args": ["/absolute/path/to/gsc-mcp/dist/index.js"],
      "env": {
        "GOOGLE_CLIENT_ID": "your-client-id.apps.googleusercontent.com",
        "GOOGLE_CLIENT_SECRET": "your-client-secret",
        "GOOGLE_GSC_REFRESH_TOKEN": "your-refresh-token"
      }
    }
  }
}

Use the absolute path, and make sure it points at wherever your compiled index.js actually lives. Relative paths break when Claude Code launches the server from different working directories.
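Before restarting Claude Code, confirm the compiled server starts on its own. The environment variable names below are the ones the server reads, and the startup message comes from the main() function above:

```bash
GOOGLE_CLIENT_ID="your-client-id.apps.googleusercontent.com" \
GOOGLE_CLIENT_SECRET="your-client-secret" \
GOOGLE_GSC_REFRESH_TOKEN="your-refresh-token" \
node /absolute/path/to/gsc-mcp/dist/index.js
# Expected on stderr: GSC MCP server running on stdio (Ctrl+C to stop)
```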
## Download the Skill
If you want Claude to run this workflow automatically - with the date math, opportunity flagging, and formatted report built in - you can add a skill file to your Claude Code workspace.
> Download the GSC Report Skill
Save it to .claude/skills/gsc-report/SKILL.md in your project. The skill has no hardcoded domains or credentials. It reads your property configs and handles the report logic so you can just type /gsc-report or ask for a GSC check by domain name.
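If you would rather write the skill yourself, a SKILL.md is just markdown with YAML frontmatter. The sketch below is illustrative of the structure, not the downloadable skill verbatim; name and description are the standard frontmatter keys, and the steps mirror the report logic described above:

```markdown
---
name: gsc-report
description: Pull a 28-day Google Search Console performance report for a domain and flag opportunities
---

When the user asks for a GSC report:

1. Call gsc_list_sites and match the requested domain to a verified property.
2. Call gsc_search_analytics for the last 28 days and for the 28 days before that.
3. Compare clicks, impressions, CTR, and average position across the two periods.
4. Flag pages ranking 8-20 with high impressions as near-page-1 opportunities.
5. Flag queries with CTR under 1% for title or meta description work.
```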
## What You Get
Once connected, your conversation changes. Instead of asking Claude to write you a Python script to pull GSC data, you ask for the data directly:
- "Show me the last 28 days of search performance for echalupa.com"
- "What non-branded queries drove impressions last week?"
- "Is my pricing page indexed?"
Claude calls the tool, receives structured JSON, and formats it into a readable report. Here is what that looks like with generic data:
## GSC Report - example.com - 2026-03-25 to 2026-04-21
### Summary
| Metric | Current | Previous | Change |
|--------|---------|----------|--------|
| Clicks | 4,320 | 3,680 | +17.4% |
| Impressions | 58,400 | 52,100 | +12.1% |
| CTR | 7.4% | 7.1% | +0.3pp |
| Avg Position | 12.8 | 14.3 | -1.5 (improved) |
### Top Non-Branded Queries
| Query | Clicks | Impressions | CTR | Position |
|-------|--------|-------------|-----|----------|
| mcp server tutorial | 312 | 2,840 | 11.0% | 4.2 |
| claude code seo automation | 198 | 1,650 | 12.0% | 5.1 |
| google search console api | 156 | 3,200 | 4.9% | 8.7 |
| structured data audit tool | 134 | 980 | 13.7% | 3.8 |
| nextjs sitemap indexing | 97 | 1,420 | 6.8% | 6.4 |
### Top Pages
| Page | Clicks | Impressions | CTR | Position |
|------|--------|-------------|-----|----------|
| /blog/mcp-servers-for-marketers | 890 | 8,200 | 10.9% | 3.5 |
| /tools/ngram-analyzer | 654 | 5,100 | 12.8% | 2.9 |
| /blog/google-ads-ngram-script | 432 | 6,400 | 6.8% | 7.2 |
| /blog/reddit-lead-gen-n8n | 310 | 4,800 | 6.5% | 6.8 |
| /tools/automation-roi | 287 | 2,100 | 13.7% | 2.4 |
### Opportunities
- [Near page 1] /blog/google-ads-ngram-script - pos 7.2, 6,400 imp, 6.8% CTR. Add a backlink from the MCP post or refresh the intro.
- [Low CTR] /blog/reddit-lead-gen-n8n - pos 6.8, 4,800 imp, 6.5% CTR. Rewrite title to include "n8n Reddit automation" and update the meta description.
- [Trending down] /blog/sales-stopped-believing-your-leads - clicks down 22% vs prev period. Check if the headline still matches search intent.
The token cost is lower than dumping raw API responses into context because the tool descriptions tell the model exactly what each parameter does.
## Why This Beats Scripts
I used to maintain a half-dozen Python snippets for GSC. Each one parsed arguments differently. Each one dumped different JSON shapes into the chat. Each one required me to manually handle token refresh when the OAuth session expired.
An MCP server solves all of that. The interface is standardized. The auth is automatic. And because Claude understands the tool schema, it can decide which tool to call and how to format the arguments without me writing intermediate glue code.
The upfront cost of building the server is maybe twenty minutes. The ongoing cost of using it is zero. If you are pulling GSC data more than once a week, this pays for itself immediately.