# AI Integration
Dockflow provides first-class support for AI assistants through the llms.txt standard and a dedicated MCP server.
## llms.txt
The documentation is available in machine-readable format following the llms.txt standard:
| File | Description | URL |
|---|---|---|
| `llms.txt` | Concise index with links and descriptions for each page | `/llms.txt` |
| `llms-full.txt` | Complete documentation concatenated into a single file | `/llms-full.txt` |
These files are generated automatically at build time from the documentation sources.
You can point any AI assistant to https://dockflow.shawiizz.dev/llms.txt to give it context about Dockflow.
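Tools can also consume the index programmatically. As a minimal sketch, the parser below assumes the llms.txt convention of listing pages as `- [Title](url): description` link lines; the sample input and all names are illustrative, not taken from the actual Dockflow index:

```typescript
// Illustrative sketch of parsing link entries from an llms.txt index.
interface LlmsTxtEntry {
  title: string;
  url: string;
  description: string;
}

function parseLlmsTxtLinks(text: string): LlmsTxtEntry[] {
  const entries: LlmsTxtEntry[] = [];
  // Matches "- [Title](url): description"; the description is optional.
  const linkLine = /^-\s*\[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?$/;
  for (const line of text.split("\n")) {
    const m = line.trim().match(linkLine);
    if (m) {
      entries.push({ title: m[1], url: m[2], description: m[3] ?? "" });
    }
  }
  return entries;
}

// Hypothetical sample in the llms.txt link-list shape.
const sample = [
  "# Dockflow",
  "",
  "## Docs",
  "- [CLI Reference](https://dockflow.shawiizz.dev/cli.md): All CLI commands",
].join("\n");

const entries = parseLlmsTxtLinks(sample);
```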
## MCP Server
The `@dockflow-tools/mcp` package provides a Model Context Protocol server that exposes Dockflow documentation as tools for AI assistants.
### Installation

#### Claude Code
```shell
claude mcp add dockflow -- npx -y @dockflow-tools/mcp
```

### Available Tools
| Tool | Description |
|---|---|
| `list_pages` | List all available documentation pages with descriptions |
| `search_docs` | Search documentation for a specific topic or keyword |
| `get_page` | Get the full content of a specific documentation page |
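Under the hood, an MCP client invokes these tools with JSON-RPC `tools/call` messages. The sketch below builds such a request per the Model Context Protocol wire format; the helper name and the query value are illustrative:

```typescript
// Sketch: the JSON-RPC request an MCP client sends to invoke a tool.
// The message shape follows the MCP "tools/call" method.
function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// E.g. searching the docs for a topic via the search_docs tool.
const request = buildToolCall(1, "search_docs", {
  query: "multi-host deployment",
});
```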
### Example Usage
Once configured, you can ask your AI assistant questions like:
- “How do I configure multi-host deployment with Dockflow?”
- “What are the available CLI commands?”
- “Show me the accessories configuration reference”
The MCP server fetches documentation from the live site and caches it in memory for fast responses.
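To illustrate the caching strategy described above, here is a minimal in-memory TTL cache sketch; the class name, the TTL, and the sample values are assumptions for illustration, not the package's actual implementation:

```typescript
// Sketch of an in-memory cache with per-entry expiry, as a model of
// how fetched documentation pages could be kept for fast responses.
class MemoryCache<T> {
  private store = new Map<string, { value: T; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const hit = this.store.get(key);
    if (!hit || hit.expiresAt < Date.now()) {
      // Missing or expired: drop the stale entry and report a miss.
      this.store.delete(key);
      return undefined;
    }
    return hit.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

// Hypothetical usage: cache a fetched page for five minutes.
const cache = new MemoryCache<string>(5 * 60_000);
cache.set("/llms.txt", "# Dockflow ...");
```

A cache miss would trigger a fresh fetch from the live site before the page is stored again.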