No. Automated scans cannot serve as legal compliance evidence for ADA or WCAG conformance. Scans flag only about 25% of issues, miss the majority of WCAG criteria that require human judgment, and produce no record of conformance. A clean scan report does not prove a website is accessible. Courts, plaintiffs’ attorneys, and federal agencies look for evidence produced through human evaluation, which means a manual accessibility audit and the documentation that comes with it.
| Question | Answer |
|---|---|
| Do scans prove WCAG conformance? | No. Scans detect roughly 25% of issues and cannot evaluate context, meaning, or user experience. |
| Do scan reports hold up in court? | No. They are not recognized as conformance documentation in ADA litigation. |
| What does count as evidence? | A manual accessibility audit report, an ACR (Accessibility Conformance Report), remediation records, and validation documentation. |
| Where do scans fit? | Ongoing monitoring between audits to catch regressions on known issue types. |

## Why Scans Fall Short as Evidence

An automated scan runs a set of programmatic checks against a page. It can flag missing alt attributes, empty form labels, low contrast ratios in some cases, and other code-level issues. What it cannot do is interpret whether an alt attribute is meaningful, whether a form label describes the right field, or whether a page reads correctly with a screen reader.

Most WCAG success criteria require human evaluation. A scanner cannot determine if a heading structure makes logical sense, if focus order matches reading order in a complex layout, or if an interactive component announces its state correctly to assistive technology. These are the issues that drive lawsuits.

A scan report shows what the scanner caught. It does not show what the scanner missed. That distinction matters in any legal context.
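The divide between code-level checks and human judgment can be sketched in a few lines. The `AltScanner` class below is a hypothetical, minimal illustration using only Python's standard library; real scanners run far more checks, but they share the same limit: the tool can detect that an alt attribute is absent, not whether a present one is meaningful.

```python
from html.parser import HTMLParser

class AltScanner(HTMLParser):
    """Flags <img> tags with no alt attribute at all -- the kind of
    purely code-level check an automated scan performs."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing_alt.append(attrs.get("src", "<no src>"))

html = '<img src="logo.png"><img src="chart.png" alt="chart">'
scanner = AltScanner()
scanner.feed(html)
print(scanner.missing_alt)  # → ['logo.png']
```

Note that `alt="chart"` passes this check even though a human auditor would likely flag it as uninformative for a data chart; that gap is exactly what a manual audit documents and a scan cannot.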
## What Courts and Plaintiffs Actually Look For

ADA website lawsuits are typically filed by plaintiffs’ attorneys who conduct their own evaluations using screen readers and keyboard navigation. They identify issues a scanner would never flag: confusing focus traps, unlabeled icon buttons, modals that break screen reader output, checkout flows that fail with a keyboard.

When a defendant points to a scan report as proof of accessibility, the response is direct: the scanner did not catch the issues being claimed. The report is not evidence of conformance. It’s evidence that someone ran a tool.

What carries weight is documentation of human evaluation. An accessibility audit conducted by qualified auditors produces a report that identifies issues against specific WCAG success criteria, with locations, severity, and recommended fixes. That report, paired with remediation records, demonstrates an actual effort to meet the standard.
## What Real Compliance Evidence Looks Like

Strong documentation typically includes a manual audit report, a remediation log showing fixes applied, validation confirming those fixes resolved the issues, and, where appropriate, an Accessibility Conformance Report (ACR). For SaaS companies and software vendors, the ACR produced from a VPAT serves as the formal conformance document delivered to procurement teams and, when needed, referenced in legal contexts.

For ADA defense, a recent audit report dated within a reasonable window, plus evidence that identified issues were addressed, is the kind of record that supports a defense or settlement position. A scan report sitting on a server does not.
## Where Scans Do Belong

Scans have a role. They are useful for ongoing monitoring between audits, catching code regressions on issue types they can detect, and giving developers fast feedback during builds. A team that pushes a release and gets an alert that 40 images lost their alt text has caught a real problem early.

That use case is operational, not legal. Scans support a maintenance routine. They do not produce conformance documentation, and they should never be presented as if they do.
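As a sketch of that operational role, the snippet below shows a hypothetical CI gate: it compares the current build's count of images missing alt text against a stored baseline and fails the build when the count grows. The file name, JSON layout, and function name are assumptions for illustration, not part of any particular scan product.

```python
import json

def alt_text_gate(baseline_path: str, current_missing: int) -> int:
    """Return 1 (a build-failing exit code) when the current build has
    more images missing alt text than the recorded baseline, else 0.
    Assumed baseline file layout: {"missing_alt": <int>}."""
    with open(baseline_path) as f:
        baseline = json.load(f)["missing_alt"]
    if current_missing > baseline:
        print(f"Accessibility regression: {current_missing - baseline} "
              "image(s) lost their alt text since the baseline.")
        return 1
    return 0

# Record a baseline after the last validated fix, then gate each build.
with open("alt_baseline.json", "w") as f:
    json.dump({"missing_alt": 0}, f)

print(alt_text_gate("alt_baseline.json", current_missing=40))  # → 1
```

A gate like this catches regressions early, which is the maintenance role the section describes; it records nothing about conformance.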
## How to Build a Defensible Record

The path is direct. Start with a manual accessibility audit against WCAG 2.1 AA or WCAG 2.2 AA. Work through remediation on the identified issues, prioritizing by user impact and risk. Validate the fixes. Keep the records. Update the audit after major product changes.

For software vendors, layer in an ACR so procurement teams have the document they need. For organizations under ADA Title II or facing lawsuit risk under Title III, the audit and remediation history is what demonstrates good faith and actual conformance work.
## Frequently Asked Questions

### Will a scan report help in an ADA lawsuit?

Not as conformance evidence. It may show a defendant was paying some attention to accessibility, but it will not refute claims about issues the scanner cannot detect. Plaintiffs’ attorneys evaluate sites manually, and that’s where most claimed issues come from.
### Can a scan be combined with an audit to lower costs?

Audits and scans are separate activities. An audit identifies WCAG conformance issues through human evaluation. A scan flags a narrow slice of code-level issues. They serve different purposes and produce different outputs. Combining them does not produce a hybrid document with legal weight.
### How often should a website be audited for compliance purposes?

An annual audit is a common cadence, with additional audits triggered by major redesigns, new features, or platform migrations. Between audits, scans and internal checks help catch regressions, but the audit is what anchors the conformance record.
### What if a vendor sells a scan as a compliance product?

Read the documentation carefully. A scan tool that markets itself as producing compliance evidence is overstating what the technology can do. No automated scan can determine WCAG conformance, and no scan report should be treated as a substitute for a manual audit.

Scans answer one question: what code-level issues can a tool detect right now? Legal compliance asks a different question entirely, and the answer requires human evaluation, documented thoroughly.
Contact Kris to discuss audit and conformance documentation.