OpenAPI Specs to MCP Servers Instantly
Empower Your APIs with MCP Servers
Organize all your OpenAPI specs in one central dashboard
Runtime secret passing ensures safe MCP deployments
Generate MCP servers from OpenAPI specs in seconds
Use MCP servers with OpenAI, Anthropic, and LangChain SDKs
Extend generated MCP servers with custom handlers
MCP to Langchain Integration
| Integration Method | Best For | Key Benefits |
|---|---|---|
| MCP Server | Desktop AI assistants, no-code solutions | Easy setup; works with Claude Desktop, Cursor, etc. |
| SDK Integration | Custom development, multi-agent systems | Full programmatic control, framework compatibility, advanced workflows |
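For the MCP Server route, a desktop client usually needs only a JSON config entry. A hypothetical Claude Desktop entry (the server name and launch command below are placeholders, not values from this product) might look like:

```json
{
  "mcpServers": {
    "my-openapi-server": {
      "command": "npx",
      "args": ["-y", "my-generated-mcp-server"]
    }
  }
}
```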
Step 1: Configure MCP App
# Create MCP application
from mcp import ListToolsResult
import streamlit as st
import asyncio
from mcp_agent.app import MCPApp
from mcp_agent.agents.agent import Agent
from mcp_agent.workflows.llm.augmented_llm import RequestParams
from mcp_agent.workflows.llm.augmented_llm_openai import OpenAIAugmentedLLM

app = MCPApp(name="mcp_basic_agent")

# Note: await must run inside an async function
await app.initialize()
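Note that `await app.initialize()` can only run inside an async function. A minimal sketch of that driver pattern, using a stand-in coroutine in place of the real `app.initialize()`:

```python
import asyncio

# Stand-in coroutine in place of app.initialize() (illustration only)
async def initialize():
    return "initialized"

async def main():
    # All awaits in the steps below belong inside a function like this
    return await initialize()

status = asyncio.run(main())
print(status)
```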
Step 2: Create Agent with Server Config
finder_agent = Agent(
    name="finder",
    instruction="""You are an agent with access to the filesystem,
    as well as the ability to fetch URLs...""",
    server_names=["fetch", "filesystem"],
)
await finder_agent.initialize()
Step 3: Configure YAML Files
$schema: ../../schema/mcp-agent.config.schema.json
execution_engine: asyncio
logger:
  type: console
  level: debug
mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
Step 4: List and Format Tools
def format_list_tools_result(list_tools_result: ListToolsResult):
    res = ""
    for tool in list_tools_result.tools:
        res += f"- **{tool.name}**: {tool.description}\n\n"
    return res

tools = await finder_agent.list_tools()
tools_str = format_list_tools_result(tools)
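For illustration, here is what the formatter above produces, using stand-in objects in place of a live `ListToolsResult` (the tool names and descriptions below are hypothetical):

```python
from types import SimpleNamespace

def format_list_tools_result(list_tools_result):
    res = ""
    for tool in list_tools_result.tools:
        res += f"- **{tool.name}**: {tool.description}\n\n"
    return res

# Stand-in for a live ListToolsResult (hypothetical data, illustration only)
fake_result = SimpleNamespace(tools=[
    SimpleNamespace(name="fetch", description="Fetch a URL"),
    SimpleNamespace(name="read_file", description="Read a file from disk"),
])

tools_str = format_list_tools_result(fake_result)
print(tools_str)
```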
Step 1: Initialize Adapter
from reacter_openapitools import LangChainAdapter

adapter = LangChainAdapter(
    api_key="your_openapi_key_here",
    verbose=True
)
Step 2: Get Tools
# One line to get all your tools
tools = adapter.get_langchain_tools()
Step 3: Create Agent
from langchain_anthropic import ChatAnthropic
from langgraph.prebuilt import create_react_agent
model = ChatAnthropic(model_name="claude-3-sonnet-20240229")
agent_executor = create_react_agent(model, tools)
Step 4: Use Your Agent
from langchain_core.messages import HumanMessage

response = agent_executor.invoke({
    "messages": [HumanMessage(content="Can you generate an OTP for 78678634561?")]
})
print(response["messages"])
Cross Platform Integration
Connect your tools to any AI platform with just a few lines of code
- OpenAI: function calling compatible, single-line integration, all models supported
- Anthropic: tool use compatible, JSON format support, all Claude models
- LangChain: ready-to-use tools, chain integration, agent compatibility
OpenAPI SDK
Our SDK for integrating with any LLM platform
Easy Integration with AI Platforms
Connect your APIs directly to AI models with our simple SDK that works with OpenAI, Anthropic Claude, and LangChain.
Unified API
One SDK to connect all your tools to any LLM platform
Simple Setup
Just a few lines of code to integrate all your tools
Framework Compatibility
Works with OpenAI, Claude, and LangChain out of the box
from anthropic import Anthropic
from reacter_openapitools import AnthropicAdapter

# Initialize the Anthropic client
anthropic_client = Anthropic(api_key="your-anthropic-key")
adapter = AnthropicAdapter(
    folder_path="./openapitools",
    verbose=True
)

def login():
    # login logic to get auth_token
    adapter.add_environment_variable("token", auth_token)

# Create a chatbot
chatbot = adapter.create_anthropic_chatbot(
    anthropic_client=anthropic_client,
    llm_config={
        "model": "claude-3-7-sonnet-20250219",
        "temperature": 0.7,
        "max_tokens": 4096,
        "system": "You are a helpful assistant with access to various tools."
    },
    options={
        "tool_names": [{
            "name": "GenerateOtpTool",
            "version": "python-sdk"
        }]
    }
)

# Use the chatbot
response = chatbot["invoke"]("Can you please generate an OTP for 93467678911?")
print(response["text"])

# Reset the conversation
chatbot["reset_conversation"]()
Ready to build with OpenAPI?
Get started for free and see the power of OpenAPITools
Enterprise Solutions
Need to build tools for thousands of APIs? Manage your entire API ecosystem in seconds with our enterprise solutions.
Contact us at openapitools@gmail.com