How I Connected a Custom MCP Server to ChatGPT’s Agent Builder Using a Placeholder API
A step-by-step guide to wrapping your React Router Node backend as an MCP server, deploying it, and securely integrating it into ChatGPT’s Agent Builder for real API interactions.
OpenAI just released Agent Builder last week, and honestly, it’s huge. I can already see thousands of possible use cases where end users could benefit from this AI feast.
I tinkered around a bit, spun everything up, and got a fresh taste of what it looks like when integrated into my own project.
For the demo, I’m using a simple placeholder API: https://jsonplaceholder.typicode.com/posts/${id}. You can plug in any API endpoint you want; this one is just to test the wiring. Eventually, I’m going to bundle this setup into my product, but first I wanted to try it out.
The frontend is built with React Router v7 (framework mode) and a custom Node entry, which I already use to integrate API endpoints.
The main libraries I’m using here are openai, @modelcontextprotocol/sdk, and zod for validation.
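If you don’t have them yet, the install looks something like this (assuming npm; swap in your package manager of choice):
npm install openai @modelcontextprotocol/sdk zod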
Inside your server folder (where your main app.ts file sits), create a new file called mcpAttach.ts.
You’ll first import everything and set up the basic structure.
⚠️ Important #1:
Instead of directly creating a route, wrap your MCP logic in a function so you can import and use it cleanly in your main server file later.
// server/mcpAttach.ts
import type { Express, Request, Response } from 'express';
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import { z } from 'zod';
import OpenAI from 'openai';

// Plain string map for outgoing request headers.
type MCPHeaders = Record<string, string>;

export function attachMCP(app: Express): void {
  const server = new McpServer({ name: 'demo', version: '1.0.0' });

  const PORT = process.env.PORT || '3000';
  const BASE = `http://127.0.0.1:${PORT}`;

  // Marker header plus an optional bearer token for calls back into our own API.
  const makeAuthHeaders = (): MCPHeaders => {
    const h: MCPHeaders = { 'x-mcp': '1' };
    if (process.env.MCP_BEARER)
      h.Authorization = `Bearer ${process.env.MCP_BEARER}`;
    return h;
  };

  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY! });

  app.get('/api/demo/post/:id', async (req: Request, res: Response) => {
    try {
      const id = Number(req.params.id);
      if (!Number.isInteger(id) || id < 1 || id > 100) {
        return res.status(400).json({ error: 'id must be an integer 1–100' });
      }

      // AbortSignal.timeout (Node 17.3+) aborts the upstream call after 8s.
      const r = await fetch(`https://jsonplaceholder.typicode.com/posts/${id}`, {
        signal: AbortSignal.timeout(8000),
      }).catch((e: unknown) => {
        const msg = e instanceof Error ? e.message : String(e);
        throw new Error(`upstream fetch failed: ${msg}`);
      });

      if (!r.ok) {
        return res.status(r.status).json({ error: `upstream ${r.status}` });
      }

      const post = await r.json();
      res.set('Cache-Control', 'public, max-age=60');
      return res.json({ post });
    } catch (err) {
      // eslint-disable-next-line no-console
      console.error('demo/post error:', err);
      return res.status(500).json({ error: 'internal error' });
    }
  });
}
This looks like a normal Express endpoint, except it’s wrapped inside the attachMCP() function.
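Then, in your main server file, you import attachMCP and call it on your Express app. Here’s a minimal sketch of that wiring — your real entry will have more going on (the React Router handler, other middleware), so treat the details as placeholders:
// server/app.ts — minimal sketch of wiring in attachMCP
import express from 'express';
import { attachMCP } from './mcpAttach';

const app = express();
app.use(express.json()); // MCP requests arrive as JSON POST bodies

attachMCP(app);

const PORT = Number(process.env.PORT || '3000');
app.listen(PORT, () => {
  console.log(`listening on ${PORT}`);
});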
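So far, McpServer and StreamableHTTPServerTransport are imported but idle; they’re what actually expose this data to Agent Builder. As a rough sketch of the shape that wiring takes inside attachMCP() — the get_post tool name, the /mcp path, and the single stateless transport are my choices here, and the SDK docs also show a per-request transport variant:
// Still inside attachMCP(), after the Express route — a minimal sketch.
server.tool(
  'get_post',
  'Fetch a placeholder post by id (1–100)',
  { id: z.number().int().min(1).max(100) },
  async ({ id }) => {
    // Call back into our own endpoint so validation and caching live in one place.
    const r = await fetch(`${BASE}/api/demo/post/${id}`, {
      headers: makeAuthHeaders(),
    });
    const body = await r.json();
    return { content: [{ type: 'text', text: JSON.stringify(body) }] };
  }
);

// Stateless transport: no session ids, each POST is self-contained.
const transport = new StreamableHTTPServerTransport({
  sessionIdGenerator: undefined,
});
// Fire-and-forget: connect resolves immediately for this transport.
server.connect(transport).catch(console.error);

app.post('/mcp', async (req: Request, res: Response) => {
  await transport.handleRequest(req, res, req.body);
});
Once deployed, that /mcp URL is what you’d point Agent Builder’s MCP connector at.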