Your AI assistant reads raw files and guesses at structure. DKE-Forge's MCP server lets it query the actual graph: ask “what calls this?” and get the real call chain, not a search result. Works with Claude Code, Cursor, VS Code, and any other MCP client.
Install:
pip install 'dke-forge[mcp]'
Claude Code — add to .mcp.json:
{
  "mcpServers": {
    "forge": {
      "command": "dke-forge-mcp",
      "env": {
        "FORGE_URL": "https://dke-forge.com",
        "FORGE_API_KEY": "fga_your_key"
      }
    }
  }
}
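Other clients accept the same shape. A sketch for Cursor, assuming it reads the standard `mcpServers` schema from a `.cursor/mcp.json` file (check your client's docs for the exact path):

```json
{
  "mcpServers": {
    "forge": {
      "command": "dke-forge-mcp",
      "env": {
        "FORGE_URL": "https://dke-forge.com",
        "FORGE_API_KEY": "fga_your_key"
      }
    }
  }
}
```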
Source on GitHub (MIT). See the full tool reference for all 27 tools.
The Model Context Protocol is an open standard for connecting AI assistants to external tools. An MCP server is a lightweight process that exposes tools over a standard protocol — AI clients discover them automatically and call them when relevant. No plugins, no custom integrations.
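Under the hood, discovery is just JSON-RPC 2.0. A minimal sketch of the exchange, following the MCP specification; the `who_calls` tool and its schema below are illustrative, not Forge's actual tool list:

```python
import json

# The client discovers a server's tools with a "tools/list" request.
discover = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# An abridged, illustrative reply: each tool advertises a name, a
# description, and a JSON Schema for its arguments. That schema is how
# clients decide when and how to call the tool -- no plugin code needed.
reply = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                # Hypothetical tool name, for illustration only.
                "name": "who_calls",
                "description": "List callers of a function in the code graph.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"symbol": {"type": "string"}},
                    "required": ["symbol"],
                },
            }
        ]
    },
}

print(json.dumps(discover))
print(reply["result"]["tools"][0]["name"])
```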
With the Forge MCP server installed, your AI answers “what calls this function?” or “what breaks if I change this?” by querying the code graph directly, instead of searching through files and guessing. Every answer is provenance-tagged — you can ask where a claim came from. Queries are deterministic — same graph, same answer.
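What a single query looks like on the wire, as a hedged sketch: the `tools/call` envelope is from the MCP specification, but the tool name, arguments, and provenance fields are hypothetical stand-ins, not Forge's actual API:

```python
# The request an AI client might send when you ask "what calls
# parse_config?". Tool name and arguments are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "who_calls",
        "arguments": {"symbol": "parse_config"},
    },
}

# An illustrative provenance-tagged answer: each caller carries the
# file and line it was extracted from, so a claim traces back to the
# graph rather than to a fuzzy text search.
answer = [
    {"caller": "load_app", "file": "app/boot.py", "line": 42},
    {"caller": "reload", "file": "app/admin.py", "line": 17},
]

for hit in answer:
    print(f'{hit["caller"]} ({hit["file"]}:{hit["line"]})')
```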
See the API Docs for the full tool reference, or read the source on GitHub.