How-to: keep docs aligned with every release¶
This guide shows you how to run the unified autopipeline on release day to ensure API documentation matches the latest contracts, passes quality gates, and generates a review manifest for operator approval. The full flow takes approximately 20 minutes.
Prerequisites¶
Before starting, ensure you have:
- Python 3.9 or later installed (run `python3 --version` to verify)
- Node.js 18 or later for MkDocs plugins (run `node --version` to verify)
- `pip install mkdocs-material mkdocs-macros-plugin` for site building
- Write access to the documentation repository
- About 20 minutes for the full pipeline run
Already have the pipeline running?
Skip to Step 3: review the manifest to check results from a previous run.
When to run this flow¶
Run the autopipeline when any of these conditions apply:
- A new API version ships to production
- A protocol contract file changes (OpenAPI, GraphQL schema, proto, AsyncAPI spec, WebSocket contract)
- The weekly automation schedule triggers (configured in `client_runtime.yml`)
- Documentation gaps exceed the SLA threshold (default: 10 high-priority gaps)
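The SLA condition can be checked mechanically before you trigger a full run. A minimal sketch, assuming the gap report structure used later in this guide (a `gaps` list whose entries carry a `priority` field); the `exceeds_sla` helper name is ours:

```python
import json

def exceeds_sla(report: dict, threshold: int = 10) -> bool:
    """Return True when high-priority gaps exceed the SLA threshold."""
    high = [g for g in report.get("gaps", []) if g.get("priority") == "high"]
    return len(high) > threshold

# Example against an in-memory report; on release day you would load
# reports/doc_gaps_report.json with json.load instead:
report = {"gaps": [{"title": "Auth API", "priority": "high"}]}
print(exceeds_sla(report))  # a single high-priority gap is within the default SLA
```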
Step 1: trigger the autopipeline¶
Run the unified pipeline from the repository root. This command executes gap detection, drift analysis, contract validation for all five protocols, KPI evaluation, and RAG metadata generation.
python3 scripts/run_autopipeline.py \
--docsops-root . \
--reports-dir reports \
--runtime-config reports/client_runtime.yml \
--mode veridoc \
--since 7
What each flag does:
| Flag | Value | Purpose |
|---|---|---|
| `--docsops-root` | `.` | Root directory of the docs-ops repository |
| `--reports-dir` | `reports` | Where the pipeline writes output artifacts |
| `--runtime-config` | `reports/client_runtime.yml` | Client-specific protocol and module settings |
| `--mode` | `veridoc` | Full pipeline mode with all quality gates |
| `--since` | `7` | Analyze changes from the last 7 days |
The pipeline runs seven stages. Expect 5-10 minutes depending on contract complexity.
Expected output (summary):
[autopipeline] Starting VeriDoc pipeline run
[autopipeline] Runtime config: reports/client_runtime.yml
[autopipeline] Protocols enabled: rest, graphql, grpc, asyncapi, websocket
[stage] multi_protocol_contract ... DONE (5 protocols, 0 failures)
[stage] kpi_wall ... DONE (quality_score: 100)
[stage] retrieval_evals ... DONE (124 modules indexed)
[autopipeline] Pipeline complete. Exit code: 0
An exit code of 0 confirms all stages passed. A non-zero code indicates stages that need attention.
Step 2: check protocol contract results¶
After the pipeline completes, verify which protocols passed contract validation:
python3 -c "
import json
r = json.load(open('reports/multi_protocol_contract_report.json'))
for p in r.get('protocols', []):
status = 'PASS' if p not in r.get('failed_protocols', []) else 'FAIL'
print(f' {p}: {status}')
print(f'Failed: {r.get(\"failed_protocols\", [])}')"
Expected output:
rest: PASS
graphql: PASS
grpc: PASS
asyncapi: PASS
websocket: PASS
Failed: []
If any protocol fails, fix the root cause before proceeding. See Troubleshooting for common fixes.
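If you automate this check in a release job, the same report can serve as a hard gate. A sketch assuming the report fields shown above (`protocols` and `failed_protocols`); `gate_contracts` is our name, not part of the pipeline:

```python
import sys

def gate_contracts(report: dict) -> None:
    """Exit non-zero when any protocol failed contract validation."""
    failed = report.get("failed_protocols", [])
    if failed:
        print(f"Contract validation failed for: {', '.join(failed)}")
        sys.exit(1)
    print("All protocol contracts passed")

# In CI you would json.load reports/multi_protocol_contract_report.json:
gate_contracts({"protocols": ["rest", "graphql"], "failed_protocols": []})
```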
Step 3: review the manifest¶
Open the review manifest to see all generated artifacts and their availability:
cat reports/REVIEW_MANIFEST.md
The manifest lists every artifact, stage status, and provides a reviewer checklist. Key sections:
| Section | What to check |
|---|---|
| Pipeline execution summary | Exit code, strictness mode, artifact counts |
| Stage availability | Which stages produced artifacts, which are missing |
| Reviewer checklist | Approval gates before publish |
You can also check the machine-readable manifest:
python3 -c "
import json
m = json.load(open('reports/review_manifest.json'))
print(f'Available: {m[\"available_artifacts\"]}')
print(f'Missing: {m[\"missing_artifacts\"]}')
print(f'Strictness: {m[\"strictness\"]}')"
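The machine-readable manifest also lets you gate review automatically: a run is only worth reviewing when nothing is missing. A small sketch, assuming the `missing_artifacts` key shown above (`manifest_ready` is our helper name):

```python
def manifest_ready(manifest: dict) -> bool:
    """A run is reviewable only when no expected artifact is missing."""
    return not manifest.get("missing_artifacts", [])

manifest = {"available_artifacts": ["kpi-wall.json"], "missing_artifacts": []}
print(manifest_ready(manifest))  # True
```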
Step 4: verify quality gates¶
Check that quality metrics meet your thresholds:
| Gate | Command | Expected result |
|---|---|---|
| Quality score | `python3 -c "import json; print(json.load(open('reports/kpi-wall.json'))['quality_score'])"` | 80 or higher |
| Contract validation | Check `multi_protocol_contract_report.json` | Zero failed protocols |
| Stage summary | Check `pipeline_stage_summary.json` | All required stages exist |
| Frontmatter | All docs have valid frontmatter | 100% metadata completeness |
If the quality score is below 80:

1. Check the gap report for high-priority items:

python3 -c "
import json
r = json.load(open('reports/doc_gaps_report.json'))
high = [g for g in r.get('gaps', []) if g.get('priority') == 'high']
print(f'{len(high)} high-priority gaps')
for g in high[:5]:
    print(f' - {g[\"title\"]}')"

2. Address each high-priority gap by creating or updating the relevant document.

3. Re-run the pipeline and verify the score improves.
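The first three gates in the table can also be checked together in one script. A sketch using report fields referenced elsewhere in this guide (`quality_score`, `failed_protocols`); the `required` stage list and the `gates_pass` helper are our assumptions:

```python
def gates_pass(kpi: dict, contracts: dict, stages: dict, required: list) -> list:
    """Return a list of failed gate descriptions (empty means all gates pass)."""
    failures = []
    if kpi.get("quality_score", 0) < 80:
        failures.append("quality score below 80")
    if contracts.get("failed_protocols"):
        failures.append("failed protocols: " + ", ".join(contracts["failed_protocols"]))
    missing = [s for s in required if s not in stages]
    if missing:
        failures.append("missing stages: " + ", ".join(missing))
    return failures

# Example with in-memory reports; on release day json.load the files in reports/:
print(gates_pass(
    {"quality_score": 100},
    {"failed_protocols": []},
    {"multi_protocol_contract": "DONE", "kpi_wall": "DONE"},
    ["multi_protocol_contract", "kpi_wall"],
))  # [] means every gate passed
```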
Step 5: build the demo site¶
Generate the browsable MkDocs documentation site:
python3 scripts/build_demo_site.py \
--output-root demo-showcase/veridoc \
--reports-dir reports \
--build
This command copies documentation pages, updates the `mkdocs.yml` navigation, and runs `mkdocs build` to produce the final HTML site in `demo-showcase/veridoc/site/`.
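Before moving on, you may want to confirm the build actually produced a site. A minimal sketch (`site_built` is our helper; the output path comes from the build command above):

```python
from pathlib import Path

def site_built(site_dir: str) -> bool:
    """True when the build produced an index page in the site directory."""
    return (Path(site_dir) / "index.html").is_file()

# After Step 5 you would check the real output directory:
# site_built("demo-showcase/veridoc/site")
```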
Step 6: approve and publish¶
After verifying all gates pass, complete the reviewer checklist from the manifest:
- [ ] Confirm stage summary has no missing required artifacts.
- [ ] Review protocol docs and test asset links.
- [ ] Review quality and retrieval reports before publish.
- [ ] Approve publish only if critical findings are resolved.
- [ ] Verify RAG retrieval index is current and complete.
- [ ] Confirm advanced retrieval features are enabled (hybrid search, HyDE, reranking, embedding cache).
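The last checklist item can be scripted if your runtime config exposes feature flags. A sketch with hypothetical key names; `retrieval`, `hybrid_search`, `hyde`, `reranking`, and `embedding_cache` are assumptions, not the actual `client_runtime.yml` schema:

```python
REQUIRED_FEATURES = ["hybrid_search", "hyde", "reranking", "embedding_cache"]

def missing_retrieval_features(config: dict) -> list:
    """Return the advanced retrieval features that are not enabled."""
    retrieval = config.get("retrieval", {})
    return [f for f in REQUIRED_FEATURES if not retrieval.get(f)]

config = {"retrieval": {"hybrid_search": True, "hyde": True,
                        "reranking": True, "embedding_cache": True}}
print(missing_retrieval_features(config))  # []
```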
Validation checklist¶
Before considering the release complete:
- [ ] All five protocol contracts validated (zero failures)
- [ ] Quality score at or above 80
- [ ] No high-priority documentation gaps remain
- [ ] Review manifest approved by operator
- [ ] MkDocs site builds without errors
- [ ] Knowledge graph and retrieval index are current
- [ ] Advanced retrieval features enabled (hybrid, HyDE, reranking, cache)
Common issues and solutions¶
Issue: gRPC stage fails¶
The protoc compiler is not installed.
Solution:
apt-get install -y protobuf-compiler
protoc --version
Then re-run the pipeline.
Issue: quality score below 80¶
Stale documents, missing frontmatter, or unresolved documentation gaps.
Solution:
- Run `python3 -c "import json; r=json.load(open('reports/doc_gaps_report.json')); print(len([g for g in r.get('gaps',[]) if g.get('priority')=='high']), 'high-priority gaps')"` to count gaps.
- Address each gap by creating or updating documents.
- Re-run the pipeline.
Issue: MkDocs build fails with theme error¶
Solution:
pip install mkdocs-material mkdocs-macros-plugin
Next steps¶
- Concept: pipeline-first documentation lifecycle to understand why this workflow matters
- Quality evidence and gate results for the latest KPI metrics
- Troubleshooting: common pipeline issues for detailed fix procedures