# Customizing Output
By default, ContextMD generates a single, optimized Markdown file named `context.md` in your current working directory. You can customize the destination, size, and density of this output using the following CLI options.
## Specifying the Output Path

Use the `-o` or `--output` flag to define a custom filename or a specific file path for the generated context. This is useful when you are managing context files for multiple libraries or integrating the tool into a build pipeline.
```shell
# Save to a specific filename
contextmd https://docs.example.com --output example-docs.md

# Save to a specific directory
contextmd https://docs.example.com -o ./ai-context/project-name.md
```
## Controlling Context Depth

Documentation sites can be vast. To prevent the output file from becoming too large for your LLM's context window, or to speed up processing, use the `-l` or `--limit` flag to restrict the number of pages crawled.
The default limit is 100 pages.
```shell
# Limit output to the first 20 pages found
contextmd https://docs.example.com --limit 20
```
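The limit can be pictured as a cap on a breadth-first crawl queue. The sketch below is illustrative only, not ContextMD's actual crawler; the `crawl` function and in-memory link graph are assumptions:

```javascript
// Breadth-first crawl over an in-memory link graph, stopping once
// `limit` pages have been visited (mirrors the -l/--limit behavior).
function crawl(startUrl, linksOf, limit = 100) {
  const visited = new Set();
  const queue = [startUrl];
  while (queue.length > 0 && visited.size < limit) {
    const url = queue.shift();
    if (visited.has(url)) continue;
    visited.add(url);
    for (const next of linksOf(url) ?? []) {
      if (!visited.has(next)) queue.push(next);
    }
  }
  return [...visited];
}

// Toy site: the index links to three pages, but the limit allows two.
const site = {
  '/index': ['/a', '/b', '/c'],
  '/a': ['/index'],
};
console.log(crawl('/index', (u) => site[u], 2)); // stops after 2 pages
```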
## Output Structure
The generated file is structured specifically for Agentic AI consumption. It follows a predictable format that helps models distinguish between different pages while maintaining a high-density information flow.
### 1. File Header
Each file begins with a standardized metadata header:
```markdown
# Documentation Context
Generated by ContextMD from [URL] at [ISO-TIMESTAMP]
---
```
### 2. Page Sections
Every crawled page is processed, refined by AI to remove fluff, and appended to the file with a clear source attribution:
```markdown
## Source: [Page Title](https://docs.example.com/page-url)

[AI-Refined Markdown Content]

---
```
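The layout above reduces to a simple string-assembly step: one metadata header, then one attributed section per page. A sketch, where the `buildContext` helper and the `{ title, url, markdown }` page shape are assumptions for illustration:

```javascript
// Assemble the final context file: metadata header first, then one
// attributed section per crawled page, separated by horizontal rules.
function buildContext(sourceUrl, pages, now = new Date()) {
  const header = [
    '# Documentation Context',
    `Generated by ContextMD from ${sourceUrl} at ${now.toISOString()}`,
    '---',
  ].join('\n');
  const sections = pages.map(
    (p) => `## Source: [${p.title}](${p.url})\n\n${p.markdown}\n\n---`
  );
  return [header, ...sections].join('\n\n');
}

const doc = buildContext('https://docs.example.com', [
  { title: 'Install', url: 'https://docs.example.com/install', markdown: '...' },
]);
console.log(doc.split('\n')[0]); // "# Documentation Context"
```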
## AI Refinement and Content Cleaning
The output is automatically "cleaned" before it is written to the file. ContextMD performs the following transformations:
- Noise Removal: Elements like sidebars, navigation menus, footers, and scripts are stripped out using Cheerio to ensure the LLM only sees the core documentation.
- Agentic Optimization: The content is passed through `gpt-4o-mini` with a specialized system prompt. This process:
  - Strips conversational filler.
  - Prioritizes API signatures and technical constraints.
  - Fixes broken Markdown syntax from the raw scrape.
  - Maintains strict keyword density for better RAG (Retrieval-Augmented Generation) performance.
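ContextMD performs the noise-removal step with Cheerio selectors; the dependency-free sketch below substitutes a crude regex pass purely to illustrate the idea of stripping non-content elements before refinement. The `stripNoise` helper and the exact tag list are assumptions:

```javascript
// Remove common non-content elements from raw HTML. ContextMD does this
// with Cheerio; a regex stand-in is used here only to keep the sketch
// dependency-free (regexes are not a robust way to parse real HTML).
const NOISE_TAGS = ['nav', 'aside', 'footer', 'script', 'style'];

function stripNoise(html) {
  return NOISE_TAGS.reduce(
    (out, tag) =>
      out.replace(new RegExp(`<${tag}[\\s\\S]*?<\\/${tag}>`, 'gi'), ''),
    html
  );
}

const raw = '<nav>menu</nav><main><h1>API</h1></main><footer>links</footer>';
console.log(stripNoise(raw)); // "<main><h1>API</h1></main>"
```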
## Environment Configuration
While not a direct output flag, providing your API key correctly ensures the AI-powered refinement stage completes successfully. If the key is missing or the API call fails, ContextMD will fall back to raw Markdown output to ensure you still receive your documentation, though it will be less optimized for AI agents.
```shell
# Option 1: CLI Flag
contextmd https://docs.example.com -k your_api_key

# Option 2: Environment Variable (Recommended)
export OPENAI_API_KEY='your_api_key'
contextmd https://docs.example.com
```
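The key lookup and fallback can be sketched as follows. The `resolveApiKey` and `refine` helpers are hypothetical, as is the assumption that an explicit `-k` flag takes precedence over the environment variable; only the raw-Markdown fallback itself is documented behavior:

```javascript
// Pick the API key: assume an explicit -k flag wins over OPENAI_API_KEY.
// When neither is present, refinement is skipped entirely.
function resolveApiKey(cliKey, env = process.env) {
  return cliKey ?? env.OPENAI_API_KEY ?? null;
}

function refine(markdown, apiKey) {
  if (apiKey === null) {
    // Fallback path: no key, so return the unrefined scrape as-is.
    return { refined: false, markdown };
  }
  // ...call the model here; omitted in this sketch.
  return { refined: true, markdown };
}

console.log(refine('# Raw docs', resolveApiKey(undefined, {})).refined); // false
```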