The SuperDoc MCP server lets AI agents open, read, edit, and save .docx files. It exposes the same operations as the Document API through the Model Context Protocol (MCP), an open standard for connecting AI agents to external tools and data.
The MCP server is in alpha. Tools and output formats may change.
Setup
Install once. Your MCP client spawns the server automatically on each conversation.
Claude Code

```bash
claude mcp add superdoc -- npx @superdoc-dev/mcp
```

Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "superdoc": {
      "command": "npx",
      "args": ["@superdoc-dev/mcp"]
    }
  }
}
```

Cursor

Add to `~/.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "superdoc": {
      "command": "npx",
      "args": ["@superdoc-dev/mcp"]
    }
  }
}
```

Windsurf

Add to `~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "superdoc": {
      "command": "npx",
      "args": ["@superdoc-dev/mcp"]
    }
  }
}
```
Workflow
Every interaction follows the same pattern: open, read or edit, save, close.
`superdoc_open` → `superdoc_get_content` / `superdoc_search` → intent tools → `superdoc_save` → `superdoc_close`

- `superdoc_open` loads a .docx file and returns a `session_id`
- `superdoc_get_content` reads the current document; `superdoc_search` finds stable handles or addresses
- Intent tools use `session_id` plus an `action` to edit, format, create, comment, review tracked changes, or run batched mutations
- `superdoc_save` writes changes to disk
- `superdoc_close` releases the session
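Sketched as MCP tool calls, a minimal read-and-save session might look like the following. The file path and `session_id` value are illustrative, and the exact argument shapes may differ in the alpha; inspect each tool's input schema before relying on it:

```json
[
  { "name": "superdoc_open", "arguments": { "path": "report.docx" } },
  { "name": "superdoc_get_content", "arguments": { "session_id": "abc123", "action": "markdown" } },
  { "name": "superdoc_save", "arguments": { "session_id": "abc123" } },
  { "name": "superdoc_close", "arguments": { "session_id": "abc123" } }
]
```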
The MCP server exposes 12 tools in total:

- 3 lifecycle tools: `superdoc_open`, `superdoc_save`, `superdoc_close`
- 9 grouped intent tools generated from the SDK catalog

All tools except `superdoc_open` take the `session_id` returned by `superdoc_open`.
Lifecycle
| Tool | Input | Description |
|---|---|---|
| `superdoc_open` | `path` | Open a .docx file. Returns `session_id` and file path |
| `superdoc_save` | `session_id`, `out?` | Save to the original path, or to `out` if specified |
| `superdoc_close` | `session_id` | Close the session. Unsaved changes are lost |
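For example, to write the edited document to a new file instead of overwriting the original, pass `out` (the `session_id` value here is illustrative):

```json
{ "name": "superdoc_save", "arguments": { "session_id": "abc123", "out": "report-edited.docx" } }
```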
Intent tools

| Tool | Actions | Description |
|---|---|---|
| `superdoc_get_content` | `text`, `markdown`, `html`, `info` | Read document content in different formats |
| `superdoc_search` | `match` | Find text or nodes and return handles or addresses for later edits |
| `superdoc_edit` | `insert`, `replace`, `delete`, `undo`, `redo` | Perform text edits and history actions |
| `superdoc_format` | `inline`, `set_style`, `set_alignment`, `set_indentation`, `set_spacing` | Apply inline or paragraph formatting |
| `superdoc_create` | `paragraph`, `heading` | Create structural block elements |
| `superdoc_list` | `insert`, `create`, `detach`, `indent`, `outdent`, `set_level`, `set_type` | Create and manipulate lists |
| `superdoc_comment` | `create`, `update`, `delete`, `get`, `list` | Manage comment threads |
| `superdoc_track_changes` | `list`, `decide` | Review and resolve tracked changes |
| `superdoc_mutations` | `preview`, `apply` | Execute multi-step atomic edits as a batch |
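A hedged sketch of a multi-action call, using `superdoc_edit` with the `replace` action. The `target` and `text` field names are hypothetical, not taken from the actual input schema; verify the real shape with the Inspector before use:

```json
{
  "name": "superdoc_edit",
  "arguments": {
    "session_id": "abc123",
    "action": "replace",
    "target": "h:12",
    "text": "Revised wording"
  }
}
```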
Multi-action tools use an `action` argument to select the underlying operation. `superdoc_search` is a single-action tool and does not require `action`.
Tracked changes
Actions that support tracked edits use the underlying Document API's `changeMode: "tracked"` option. Review or resolve tracked edits with `superdoc_track_changes`.
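For instance, listing pending tracked changes needs only the session and the action (the `session_id` value is illustrative):

```json
{ "name": "superdoc_track_changes", "arguments": { "session_id": "abc123", "action": "list" } }
```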
How it works
The MCP server runs as a local subprocess, communicating over stdio. It manages document sessions in memory — each superdoc_open creates an Editor instance, and all subsequent operations run against that in-memory state until you superdoc_save.
```
AI Agent (Claude, Cursor, Windsurf)
  │ MCP protocol (stdio)
  ▼
@superdoc-dev/mcp
  │ Document API
  ▼
SuperDoc Editor (in-memory)
  │ export
  ▼
.docx file on disk
```
Your documents never leave your machine. The server runs locally, reads files from disk, and writes back to disk.
Debugging
Test the server directly with the MCP Inspector:
```bash
npx @modelcontextprotocol/inspector -- npx @superdoc-dev/mcp
```
This opens a browser UI where you can call each tool manually and inspect the raw JSON-RPC messages.
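Under the hood, each tool invocation is a standard MCP `tools/call` JSON-RPC request; the tool name and arguments below are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "superdoc_open",
    "arguments": { "path": "report.docx" }
  }
}
```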
- LLM Tools — build custom LLM integrations with the SDK
- CLI — edit documents from the terminal
- SDKs — typed Node.js and Python wrappers
- Document API — the in-browser API that defines the operation set
- AI Agents — headless mode for server-side AI workflows