# LangChain Integration (REST)

LangChain does not support MCP directly. Instead, call the SearchMCP REST `/v1/search` endpoint from a LangChain `DynamicTool`, authenticating with the `X-API-Key` header.
## Install

Add LangChain (and your chat model binding of choice):
```shell
pnpm add @langchain/core @langchain/openai
# or
npm install @langchain/core @langchain/openai
# or
yarn add @langchain/core @langchain/openai
```
Node 18+ includes `fetch` globally, so no extra HTTP client is required. You can also use Axios if you prefer.
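The examples below read credentials from environment variables (loaded with `dotenv/config` in the entry file). A minimal `.env` might look like this; the values shown are placeholders:

```shell
# .env — placeholder values, replace with your real keys
SEARCHMCP_API_KEY=your-searchmcp-key
OPENAI_API_KEY=your-openai-key
```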
## Create a Search tool (REST)

Wrap `POST https://api.searchmcp.io/v1/search` in a LangChain `DynamicTool`. Authenticate with the `X-API-Key` header.
```typescript
// tools/webSearch.ts
import { DynamicTool } from "@langchain/core/tools";

type SearchArgs = {
  query: string;
  numberOfResults?: number;
  location?: string;
  country?: string; // ISO-3166-1 alpha-2 (e.g., "US", "GB")
  dateRange?:
    | "ANYTIME"
    | "PAST_YEAR"
    | "PAST_MONTH"
    | "PAST_WEEK"
    | "PAST_24_HOURS"
    | "PAST_HOUR";
};

// Helper to safely parse JSON input from LangChain
function safeJson(input: string): unknown {
  try {
    return JSON.parse(input);
  } catch {
    return null;
  }
}

export function createWebSearchTool(apiKey: string) {
  return new DynamicTool({
    name: "web_search",
    description:
      "Search the web (Google). " +
      "Input JSON: { query, numberOfResults?, location?, country?, dateRange? }",
    func: async (input: string) => {
      const args = safeJson(input) as SearchArgs | null;
      if (!args || typeof args.query !== "string" || args.query.trim().length === 0) {
        throw new Error("web_search: 'query' (string) is required");
      }

      const body = {
        query: args.query,
        numberOfResults: args.numberOfResults ?? 5,
        location: args.location,
        country: args.country,
        dateRange: args.dateRange,
      };

      const res = await fetch("https://api.searchmcp.io/v1/search", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "X-API-Key": apiKey,
        },
        body: JSON.stringify(body),
      });

      if (!res.ok) {
        const text = await res.text().catch(() => "");
        throw new Error(`web_search: HTTP ${res.status} ${res.statusText}\n${text}`);
      }

      const json = await res.json();
      return JSON.stringify(json, null, 2);
    },
  });
}
```
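The tool above returns the raw response JSON, which can bloat prompts. Trimming it to the essentials before handing it to the model is often worthwhile. Here is a sketch, assuming the response carries a `results` array with `title`, `link`, and `snippet` fields — verify the actual shape against the REST API Overview:

```typescript
// tools/formatResults.ts
// Assumed response shape — check the REST API Overview for the real fields.
type SearchResult = { title?: string; link?: string; snippet?: string };

// Reduce a raw /v1/search response to a compact, LLM-friendly digest.
export function formatResults(raw: { results?: SearchResult[] }, max = 5): string {
  const hits = (raw.results ?? []).slice(0, max);
  if (hits.length === 0) return "No results.";
  return hits
    .map((r, i) => `${i + 1}. ${r.title ?? "(untitled)"}\n   ${r.link ?? ""}\n   ${r.snippet ?? ""}`)
    .join("\n");
}
```

Call `formatResults(json)` in place of `JSON.stringify(json, null, 2)` if you adopt this pattern.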
The request schema matches your Search endpoint (`query`, `numberOfResults`, `location`, `country`, `dateRange`). See the REST API Overview for details.
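For reference, here is a tool input string exercising every field of `SearchArgs`; the values are purely illustrative:

```typescript
// Example web_search input covering all SearchArgs fields (illustrative values).
const exampleInput = JSON.stringify({
  query: "latest iPhone announcement",
  numberOfResults: 5,
  location: "New York, NY",
  country: "US",          // ISO-3166-1 alpha-2
  dateRange: "PAST_WEEK", // one of the documented range values
});
```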
## Use in a chain/agent

Here's a minimal example with `@langchain/openai`. Replace it with your preferred model binding.

```typescript
// index.ts
import "dotenv/config";
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";
import { RunnableSequence } from "@langchain/core/runnables";
import { createWebSearchTool } from "./tools/webSearch";

const OPENAI_API_KEY = process.env.OPENAI_API_KEY!;
const SEARCHMCP_API_KEY = process.env.SEARCHMCP_API_KEY!;

// 1) Model
const llm = new ChatOpenAI({
  apiKey: OPENAI_API_KEY,
  model: "gpt-4o-mini",
});

// 2) Tool
const webSearch = createWebSearchTool(SEARCHMCP_API_KEY);

// 3) Simple pattern: call the tool when the prompt suggests a search
const chain = RunnableSequence.from([
  async (input: { question: string }) => {
    // naive heuristic
    if (/\bsearch\b|\bgoogle\b|\blook up\b/i.test(input.question)) {
      const toolResult = await webSearch.invoke(JSON.stringify({
        query: input.question.replace(/^(search|google)\s*/i, ""),
        numberOfResults: 5,
      }));
      return {
        question: input.question,
        context: new SystemMessage(`Search results (JSON):\n${toolResult}`),
      };
    }
    return { question: input.question, context: new SystemMessage("No external search performed.") };
  },
  async ({ question, context }: { question: string; context: SystemMessage }) => {
    return llm.invoke([
      new SystemMessage("You are a helpful assistant. Answer concisely."),
      context,
      new HumanMessage(question),
    ]);
  },
]);

const answer = await chain.invoke({ question: "search latest iPhone announcement" });
console.log(answer.content);
```
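The keyword heuristic in the chain above is easy to get subtly wrong, so it helps to factor it into small functions you can unit-test in isolation. These use the same regular expressions as the example:

```typescript
// Heuristics from the chain above, extracted for testing.
export function wantsSearch(question: string): boolean {
  return /\bsearch\b|\bgoogle\b|\blook up\b/i.test(question);
}

export function stripSearchPrefix(question: string): string {
  return question.replace(/^(search|google)\s*/i, "");
}
```

For production use, prefer letting the model decide via tool calling rather than keyword matching.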
## Next steps
- Review REST API Overview for request/response fields
- See MCP Overview if you later adopt MCP-capable clients
- Try AI SDK (Node.js) for MCP-native usage in non-LangChain apps