AI Integration

Dockflow provides first-class support for AI assistants through the llms.txt standard and a dedicated MCP server.

llms.txt

The documentation is available in machine-readable format following the llms.txt standard:

| File | Description | URL |
| --- | --- | --- |
| `llms.txt` | Concise index with links and descriptions for each page | `/llms.txt` |
| `llms-full.txt` | Complete documentation concatenated into a single file | `/llms-full.txt` |

These files are generated automatically at build time from the documentation sources.

You can point any AI assistant to https://dockflow.shawiizz.dev/llms.txt to give it context about Dockflow.
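The llms.txt standard uses plain markdown: an H1 title, an optional blockquote summary, and sections of `- [Title](url): description` link lists. As a sketch of how a tool might consume the index, here is a small parser for that link-list shape (the sample entries below are illustrative, not taken from the real Dockflow index):

```typescript
interface LlmsTxtEntry {
  title: string;
  url: string;
  description: string;
}

// Extract "- [Title](url): description" lines from llms.txt content.
function parseLlmsTxt(content: string): LlmsTxtEntry[] {
  const entries: LlmsTxtEntry[] = [];
  const linkLine = /^-\s*\[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?$/;
  for (const line of content.split("\n")) {
    const m = line.trim().match(linkLine);
    if (m) {
      entries.push({ title: m[1], url: m[2], description: m[3] ?? "" });
    }
  }
  return entries;
}

// Hypothetical sample; the real file can be fetched from
// https://dockflow.shawiizz.dev/llms.txt
const sample = `# Dockflow

> Deployment tool documentation.

## Docs

- [Quickstart](https://dockflow.shawiizz.dev/quickstart.md): Get started fast
- [CLI Reference](https://dockflow.shawiizz.dev/cli.md): All commands
`;

const entries = parseLlmsTxt(sample);
console.log(entries.map((e) => e.title)); // ["Quickstart", "CLI Reference"]
```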

MCP Server

The `@dockflow-tools/mcp` package provides a Model Context Protocol (MCP) server that exposes Dockflow documentation as tools for AI assistants.

Installation

claude mcp add dockflow -- npx -y @dockflow-tools/mcp
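For MCP clients that are configured through a JSON file rather than a CLI (Claude Desktop, for example), the equivalent entry typically follows the common `mcpServers` shape shown below; check your client's documentation for the exact file location and key names:

```json
{
  "mcpServers": {
    "dockflow": {
      "command": "npx",
      "args": ["-y", "@dockflow-tools/mcp"]
    }
  }
}
```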

Available Tools

| Tool | Description |
| --- | --- |
| `list_pages` | List all available documentation pages with descriptions |
| `search_docs` | Search documentation for a specific topic or keyword |
| `get_page` | Get the full content of a specific documentation page |
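Under the hood, an MCP client invokes these tools with a JSON-RPC `tools/call` request, as defined by the MCP specification. A request to `search_docs` would look roughly like the following (the `query` argument name is an assumption; the server's tool schema is authoritative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "multi-host deployment" }
  }
}
```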

Example Usage

Once configured, you can ask your AI assistant questions like:

  • “How do I configure multi-host deployment with Dockflow?”
  • “What are the available CLI commands?”
  • “Show me the accessories configuration reference”

The MCP server fetches documentation from the live site and caches it in memory for fast responses.
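The caching pattern described above can be sketched as a small TTL cache: fetched pages live in a `Map` and expire after a fixed window, so repeated tool calls are served from memory. All names and the TTL value here are illustrative, not taken from the actual `@dockflow-tools/mcp` implementation:

```typescript
interface CacheEntry<T> {
  value: T;
  expiresAt: number; // epoch milliseconds
}

class TtlCache<T> {
  private store = new Map<string, CacheEntry<T>>();

  constructor(private ttlMs: number) {}

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // stale: evict and treat as a miss
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

// Fetch a documentation page through the cache;
// only hit the network on a miss.
async function getPageCached(
  cache: TtlCache<string>,
  url: string,
  fetchFn: (url: string) => Promise<string>,
): Promise<string> {
  const hit = cache.get(url);
  if (hit !== undefined) return hit;
  const body = await fetchFn(url);
  cache.set(url, body);
  return body;
}
```

With a design like this, a cold request pays one network round trip and every subsequent request within the TTL is answered from memory.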