Most AI accessibility tools sold today do not make a difference. They claim to automate WCAG conformance, generate fixes on the fly, or remove the need for human evaluation. None of that holds up when you look at actual audit data. The AI that does make a difference is narrower in scope and focused on speeding up work that skilled practitioners are already doing. That includes drafting remediation guidance, summarizing audit findings, generating first-pass VPAT content, and helping teams prioritize issues from a real audit report.
| Use Case | Does It Actually Help? |
|---|---|
| Auto-generating VPAT content from an audit report | Yes. Saves hours of manual writing when paired with real audit data. |
| Remediation guidance for identified issues | Yes. AI can draft code-level recommendations developers can review and apply. |
| Prioritizing issues by risk or user impact | Yes. AI can sort and group issues from an audit faster than spreadsheets. |
| Replacing a human auditor | No. AI cannot determine WCAG conformance. |
| Auto-fixing accessibility issues on a live site | No. This is the same false promise made by automated fix vendors years ago. |

The line between real AI and marketing AI
There is a wide gap between AI that helps practitioners work faster and AI that claims to replace them. The first category is real. The second is marketing.
Real AI in accessibility supports a workflow that already produces accurate results. It takes the output of a (manual) audit and helps teams move through remediation, documentation, and reporting more efficiently. It does not generate the audit. It does not decide whether a page conforms to WCAG 2.1 AA or WCAG 2.2 AA. A human auditor does that.
Marketing AI promises a button that makes a website accessible. That product does not exist. Automated scans only flag approximately 25% of issues, and adding an AI label to a scan does not change what it can detect.
Where AI actually helps right now
Once an audit report exists, AI can do real work. It can draft remediation steps for each issue, including code-level recommendations a developer can review. It can generate the narrative content for a VPAT based on findings. It can group issues by component, severity, or page so a team knows what to address first.
This is the difference between AI that sits on top of accurate data and AI that tries to invent the data itself. The audit is conducted by a person. The AI takes that report and reduces the time it takes to act on it.
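The grouping and prioritization step described above can be sketched in a few lines. This is an illustrative sketch, not any vendor's implementation: the finding fields (`component`, `severity`, `pages_affected`) and the severity ranking are assumptions about what a human auditor's export might contain.

```python
from collections import defaultdict

# Hypothetical findings from a completed manual audit.
# Field names are illustrative; real audit exports vary.
findings = [
    {"id": 1, "criterion": "1.1.1", "component": "nav", "severity": "critical", "pages_affected": 42},
    {"id": 2, "criterion": "2.4.7", "component": "button", "severity": "moderate", "pages_affected": 120},
    {"id": 3, "criterion": "1.4.3", "component": "nav", "severity": "critical", "pages_affected": 42},
    {"id": 4, "criterion": "4.1.2", "component": "modal", "severity": "serious", "pages_affected": 8},
]

SEVERITY_RANK = {"critical": 0, "serious": 1, "moderate": 2, "minor": 3}

def prioritize(findings):
    """Group findings by component, then order groups by worst severity
    and total pages affected, so one fix per component lands first."""
    groups = defaultdict(list)
    for f in findings:
        groups[f["component"]].append(f)
    return sorted(
        groups.items(),
        key=lambda kv: (
            min(SEVERITY_RANK[f["severity"]] for f in kv[1]),
            -sum(f["pages_affected"] for f in kv[1]),
        ),
    )

for component, issues in prioritize(findings):
    print(component, [f["id"] for f in issues])
```

Note that the human auditor supplies every data point here; the automation only sorts what the audit already found.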
What AI cannot do
AI cannot evaluate a website for WCAG conformance. Conformance requires judgment about whether content meets each criterion in context, and that judgment depends on understanding intent, alternatives, and how assistive technology interprets the page. A scan, automated or AI-labeled, cannot make that call.
AI also cannot reliably fix issues on a production site without human review. Code that looks correct in isolation may break a component, change keyboard behavior, or introduce a new issue elsewhere. Every AI-suggested fix needs a developer to review and a validation step after.
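Parts of that validation step can themselves be automated, in keeping with the limits described above: a check can flag one narrow, mechanical failure mode but cannot confirm the fix is correct. As a minimal sketch (assuming the fix is plain HTML; the class and tag list are illustrative), here is a check for a common way an unreviewed fix silently removes controls from assistive technology:

```python
from html.parser import HTMLParser

INTERACTIVE = {"a", "button", "input", "select", "textarea"}

class HiddenFocusableCheck(HTMLParser):
    """Flag interactive elements marked aria-hidden="true", which hides
    a still-focusable control from screen readers."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in INTERACTIVE and attrs.get("aria-hidden") == "true":
            self.violations.append(tag)

checker = HiddenFocusableCheck()
checker.feed('<button aria-hidden="true">Save</button><a href="/">Home</a>')
print(checker.violations)  # the button is flagged; the link passes
```

A check like this catches one known pattern; it says nothing about keyboard order, focus management, or whether the repaired component still works, which is why the developer review remains mandatory.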
Which AI accessibility tools make a difference for most teams?
The ones that connect to real audit data and reduce hours of manual work. AI that drafts a VPAT from a completed audit, AI that writes remediation guidance for each identified issue, and AI that helps prioritize what to fix first. These are practical applications that respect what AI is good at and what it is not.
What to avoid
Skip any tool that claims to make a site WCAG conformant automatically. Skip any tool that promises full ADA compliance through a script or a scan. Skip any vendor that frames AI as a replacement for human evaluation. These claims have been around for years and have not held up in court or in audits.
The tools worth using are honest about what they do. They support a workflow that includes a (manual) audit, a remediation phase, and a validation step.
Can AI conduct an accessibility audit?
No. An audit requires a human auditor evaluating each WCAG success criterion in context. AI can support parts of the workflow before and after the audit, but it cannot conduct the evaluation itself.
Does AI help with remediation?
Yes, when paired with a real audit report. AI can draft code-level recommendations for each identified issue, which a developer reviews and applies. It speeds up the work without removing human judgment.
Can AI generate a VPAT?
AI can auto-generate the content of an ACR when given a completed audit report as input. The VPAT is the template; the ACR is the filled-in document. Without real audit data, AI-generated VPAT content is guesswork.
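Drafting ACR content from audit data is largely mechanical once the audit exists. A minimal sketch, assuming a hypothetical per-criterion audit result format (the criterion list, status values, and wording are illustrative; a real ACR covers every criterion in the chosen VPAT edition):

```python
# Hypothetical per-criterion results from a completed manual audit.
audit_results = {
    "1.1.1 Non-text Content": {"status": "fails", "notes": "12 images missing alt text"},
    "1.4.3 Contrast (Minimum)": {"status": "passes", "notes": ""},
}

# Map audit outcomes to the conformance terms used in ACRs.
SUPPORT_LEVELS = {
    "passes": "Supports",
    "partial": "Partially Supports",
    "fails": "Does Not Support",
}

def draft_acr_rows(audit_results):
    """Turn audit findings into draft ACR rows for human review."""
    rows = []
    for criterion, result in audit_results.items():
        rows.append({
            "criterion": criterion,
            "conformance": SUPPORT_LEVELS[result["status"]],
            "remarks": result["notes"] or "No issues identified in the audit.",
        })
    return rows

for row in draft_acr_rows(audit_results):
    print(f'{row["criterion"]}: {row["conformance"]} - {row["remarks"]}')
```

The point of the sketch is the direction of the data flow: audit results in, draft document out. Run it the other way, with no audit data, and the output is invention.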
Why do scan-based AI tools miss so much?
Automated scans, even those labeled AI, only flag approximately 25% of accessibility issues. The rest require human evaluation. Adding AI to a scan does not change the underlying limit of what code can detect without context.
AI is useful in accessibility when it sits on top of accurate human work. It is not useful as a shortcut around it.
Contact Kris to discuss a practical accessibility approach for your team.