Build an MCP server
Turn your persona.json into an MCP server any AI client can query.
What MCP buys you here
The Model Context Protocol is a small open standard for letting AI clients call tools and read resources from external servers. Once your portfolio is behind an MCP server, any MCP-aware client — Claude Desktop, Antigravity, Gemini agents — can ask about you and get structured answers. Your portfolio stops being a static page and becomes an API.
Prerequisites
- Node 20 or later (`node --version` to check)
- Your `persona.json` from step 2
- An MCP-aware AI client (Claude Desktop, Antigravity, or any other MCP host)
Set up the server project
- Create a new folder for the server: `mkdir mcp-persona && cd mcp-persona`
- Initialize a Node project: `npm init -y`
- Install the SDK: `npm install @modelcontextprotocol/sdk`
- Copy your `persona.json` into this folder
- Set `"type": "module"` in `package.json` so the example below can use ES module syntax
Write the server
Create `index.js` with the following content. It registers four tools and wires them to a stdio transport, which is what most MCP clients speak by default.

```javascript
// index.js — MCP server exposing persona.json
// Verify against latest @modelcontextprotocol/sdk docs.
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";
import { readFileSync } from "node:fs";
import { fileURLToPath } from "node:url";
import { dirname, join } from "node:path";

const __dirname = dirname(fileURLToPath(import.meta.url));
const persona = JSON.parse(
  readFileSync(join(__dirname, "persona.json"), "utf8")
);

const server = new Server(
  { name: "persona-mcp", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

const tools = [
  { name: "get_skills", description: "Return the persona's skills.", inputSchema: { type: "object", properties: {} } },
  { name: "get_projects", description: "Return the persona's projects.", inputSchema: { type: "object", properties: {} } },
  { name: "get_experience", description: "Return the persona's experience.", inputSchema: { type: "object", properties: {} } },
  {
    name: "query_persona",
    description: "Answer a free-form question about the persona using keyword matching.",
    inputSchema: {
      type: "object",
      properties: { question: { type: "string" } },
      required: ["question"],
    },
  },
];

server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools }));

function asText(payload) {
  return { content: [{ type: "text", text: JSON.stringify(payload, null, 2) }] };
}

function queryPersona(question) {
  const q = String(question || "").toLowerCase();
  // Dumb v1: keep question words longer than 3 characters and match them
  // against each line of the pretty-printed JSON.
  const words = q.split(/\W+/).filter((w) => w.length > 3);
  const lines = JSON.stringify(persona, null, 2).split("\n");
  const hits = lines
    .filter((line) => words.some((w) => line.toLowerCase().includes(w)))
    .slice(0, 5);
  return {
    matched: hits.length,
    excerpts: hits.length ? hits : [JSON.stringify(persona).slice(0, 400)],
  };
}

server.setRequestHandler(CallToolRequestSchema, async (req) => {
  const { name, arguments: args } = req.params;
  switch (name) {
    case "get_skills": return asText(persona.skills ?? []);
    case "get_projects": return asText(persona.projects ?? []);
    case "get_experience": return asText(persona.experience ?? []);
    case "query_persona": return asText(queryPersona(args?.question));
    default:
      throw new Error(`Unknown tool: ${name}`);
  }
});

const transport = new StdioServerTransport();
await server.connect(transport);
console.error("persona-mcp ready on stdio");
```
The MCP SDK has evolved quickly. The structure above (`Server` + request handlers + `StdioServerTransport`) reflects the public API at the time of writing. If imports fail, check `node_modules/@modelcontextprotocol/sdk/dist` or the SDK README for the current export paths.
Run it locally

```shell
node index.js
```

You should see `persona-mcp ready on stdio` on stderr. The process waits for an MCP client to connect over stdin/stdout — it will not respond to direct typing.
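If you want to see raw traffic without a full client: the stdio transport speaks newline-delimited JSON-RPC 2.0, and the first message a client sends is `initialize`. This sketch builds that message by hand (the `protocolVersion` string and `clientInfo` values are illustrative placeholders, not authoritative):

```javascript
// Sketch of the JSON-RPC 2.0 `initialize` request an MCP client writes to the
// server's stdin: one JSON object per line (newline-delimited framing).
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05", // illustrative; clients negotiate the real value
    capabilities: {},
    clientInfo: { name: "smoke-test", version: "0.0.0" },
  },
};

const line = JSON.stringify(initialize) + "\n";
process.stdout.write(line);
```

Saved as `make-init.js` (a hypothetical helper name), piping it into the server with `node make-init.js | node index.js` should produce an `initialize` response on stdout, though the exact handshake fields depend on your SDK version.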
Register with your AI client
Claude Desktop (macOS)
Edit `~/Library/Application Support/Claude/claude_desktop_config.json` and add a server entry:

```json
{
  "mcpServers": {
    "persona": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-persona/index.js"]
    }
  }
}
```
Restart Claude Desktop. Your tools should appear in the tools menu.
Antigravity
Antigravity uses the same MCP server config shape. Open the MCP settings panel and add a stdio server pointing at `node /absolute/path/to/mcp-persona/index.js`. See the Antigravity docs for the exact config file location on your platform.
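Assuming Antigravity really does accept the same `mcpServers` shape (verify against its docs before copying), the entry would look like:

```json
{
  "mcpServers": {
    "persona": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-persona/index.js"]
    }
  }
}
```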
Test it
From your AI client, ask:
- "What are this person's top skills?"
- "Tell me about their projects."
- "Have they worked with Kubernetes?" — exercises `query_persona`
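To see why the Kubernetes question lands on `query_persona`, here is the keyword-matching idea pulled out as a standalone sketch you can run with plain `node`. The mini persona object is made up for illustration, not your real `persona.json`:

```javascript
// Standalone sketch of query_persona-style keyword matching.
// The persona data below is hypothetical, for illustration only.
const persona = {
  skills: ["Kubernetes", "Go", "TypeScript"],
  projects: [{ name: "autoscaler-tuning", summary: "Tuned Kubernetes autoscaling." }],
};

function queryPersona(question) {
  // Keep question words longer than 3 characters as crude keywords.
  const words = String(question || "").toLowerCase().split(/\W+/).filter((w) => w.length > 3);
  // Match them against each line of the pretty-printed JSON.
  const lines = JSON.stringify(persona, null, 2).split("\n");
  const hits = lines
    .filter((line) => words.some((w) => line.toLowerCase().includes(w)))
    .slice(0, 5);
  return { matched: hits.length, excerpts: hits };
}

console.log(queryPersona("Have they worked with Kubernetes?"));
// matched: 2 — the skills entry and the project summary both mention Kubernetes
```

The matcher never understands the question; it only intersects word lists, which is why the tutorial suggests swapping in an LLM once this path starts missing.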
The client should pick the right tool, call it, and answer using the JSON your server returned.
This works in any MCP client, not just the workshop's setup. The same `index.js` plugs into Claude Desktop, Antigravity, custom Gemini agents, and command-line MCP hosts without changes.
Where to take it next
- Deploy as a remote MCP server over HTTP/SSE so it's not tied to your laptop — see the MCP transport docs
- Add more tools: `get_contact`, `get_talks`, `get_writing`
- Replace the keyword matcher in `query_persona` with a Gemini call so it returns natural-language answers grounded in `persona.json`
- Add an MCP resource (not just tools) that exposes the raw JSON to clients that prefer to read it directly
Key takeaways

- MCP turns a JSON file into a callable API surface for any AI client.
- A useful server is small: a few tools, a stdio transport, no infrastructure.
- Tools are typed inputs and outputs — keep them narrow and well-named.
- Start dumb (keyword matching) and only add an LLM where the dumb path fails.
- One implementation, many clients: same server works in Claude Desktop, Antigravity, and custom agents.
Further reading

- Getting Started with Google MCP Servers — Google's official deeper dive.
- Deploy a Secure MCP Server on Cloud Run — host your server on the cloud with auth.