We must manually conduct accessibility audits because it’s the only way to know all of the accessibility issues that exist on a website or other digital asset. During an audit, we’re manually evaluating a digital asset against the Web Content Accessibility Guidelines (WCAG). Automated scans can only reliably flag 13% (7/55) of WCAG 2.2 AA success criteria.
Because of that gap, an audit must be conducted fully manually so that every WCAG success criterion is evaluated. As a best practice, an automated scan can still be run as a secondary check to confirm that any issues it correctly flags are also captured in the audit report.
| Key Point | What It Means for You |
|---|---|
| Issue Detection Rate | Manual audits evaluate every WCAG 2.2 AA success criterion, while scans can reliably flag only about 13% (7 of 55) of them |
| Screen Reader Testing | Only human auditors can test how content works with NVDA, JAWS, and VoiceOver assistive technologies |
| Context Evaluation | Humans understand whether alt text actually describes images and whether error messages make sense to users |
| Keyboard Testing | Manual evaluation tracks focus order and ensures all functionality works without a mouse |
| Legal Compliance | ADA and European Accessibility Act (EAA) requirements call for human judgment that automated tools cannot provide |
What is an Audit?
An accessibility audit is a formal, comprehensive evaluation of a website’s accessibility conducted by technical accessibility experts. During a manual audit, these experts systematically grade your website or other digital asset against the Web Content Accessibility Guidelines, looking for any instances of non-conformance.
The word audit itself implies manual work. This means human expertise and real effort are being used to thoroughly and meticulously evaluate your digital asset. Accessible.org audits are always conducted this way — with technical experts who genuinely care about accessibility spending time to ensure that no accessibility issues are missed.
Manual audits employ multiple evaluation methodologies that automated tools simply cannot perform:
- Screen reader testing with NVDA, JAWS, or VoiceOver
- Keyboard testing to ensure all functionality is available without using a mouse
- Visual inspection of color contrast and visual focus indicators
- Code inspection of the underlying HTML, CSS, and JavaScript
- Automated scans used only as a secondary check
Each methodology is essential to find different types of accessibility issues. Keyboard testing uncovers issues with keyboard traps or focus order. Screen reader testing identifies problems with text alternatives or name, role, and value attributes.
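To make that contrast concrete, here is a minimal, hypothetical snippet (the control and its saveProfile() handler are invented for illustration). A scan may see nothing wrong with the first element, but keyboard and screen reader testing quickly reveal that it cannot receive focus and exposes no name, role, or value, while the native button passes both checks.

```html
<!-- Hypothetical example for illustration; saveProfile() is an assumed handler. -->

<!-- Fails manual testing: not reachable with the keyboard, and a screen
     reader announces no role or accessible name for this control. -->
<div class="btn" onclick="saveProfile()">Save</div>

<!-- Passes the same checks: a native button is keyboard-operable by default
     and exposes its role ("button") and name ("Save") to assistive technology. -->
<button type="button" onclick="saveProfile()">Save</button>
```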
Limitations of Automated Scans
Scans are automated tools that run through code looking for specific issues based on pre-programmed rules. While scans can be helpful as a review tool, they are extremely limited in what they can detect.
The limitations become clear when you examine what scans miss:
- Scans cannot evaluate the actual experience of users with disabilities
- Scans are prone to false negatives, missing real issues
- Scans generate false positives, flagging things that aren’t actually issues
- Scans cannot evaluate content for meaningful alternatives or proper structure
Free options like WAVE or axe give you the same results as paid versions for single pages. There's no value in premium scans because the technology itself has inherent limitations that money cannot overcome.
Human Element in WCAG Audits
WCAG 2.2 AA conformance requires human judgment for most success criteria. Consider these examples that demonstrate why manual evaluation is essential:
Alternative Text Quality: An automated scan might detect that an image has alt text, but it cannot determine if that alt text serves an equivalent purpose. A decorative image marked as “company logo” creates confusion when it should be marked as decorative and ignored by screen readers.
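The markup below is a minimal sketch of that scenario (the file name is invented for illustration): the first image carries alt text that satisfies a scan but misleads a human reviewer, while the second is correctly marked as decorative with an empty alt attribute so screen readers skip it.

```html
<!-- Hypothetical decorative flourish; the file name is invented for illustration. -->

<!-- A scan sees alt text and passes this, but the text is inaccurate and
     forces screen reader users to hear a meaningless description. -->
<img src="divider-flourish.png" alt="company logo">

<!-- Correct for purely decorative imagery: empty alt tells assistive
     technology to ignore the image entirely. -->
<img src="divider-flourish.png" alt="">
```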
Reading Order Logic: Software cannot determine if the sequence in which content is presented affects its meaning. A human must verify that the reading sequence makes sense when accessed through assistive technology.
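One way this shows up, sketched below with invented content, is when CSS visually reorders elements while the underlying DOM order, which screen readers follow, stays the same. A scan sees valid markup; only a person comparing the announced sequence with the visual one notices the mismatch.

```html
<!-- Invented order summary: the price is listed before the product in the DOM. -->
<style>
  .summary { display: flex; flex-direction: column; }
  .price   { order: 2; } /* Visually moved after the product name... */
  .product { order: 1; } /* ...but the DOM still lists the price first, so a
                            screen reader announces "$49" before "Annual plan". */
</style>

<div class="summary">
  <p class="price">$49</p>
  <p class="product">Annual plan</p>
</div>
```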
Error Message Clarity: While a scan might detect that an error message exists, only a human can determine if the error is described to the user in text that actually helps them understand and fix the problem.
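A sketch of the difference, using an invented signup field: both versions surface an error in text and associate it programmatically, so a scan is satisfied either way, but only the second tells the user what went wrong and how to fix it.

```html
<!-- Invented form field for illustration. -->

<!-- Version a scan accepts: an error message exists and is programmatically
     associated, but "Invalid input." gives the user nothing to act on. -->
<label for="email-1">Email address</label>
<input id="email-1" type="email" aria-invalid="true" aria-describedby="email-1-error">
<p id="email-1-error">Invalid input.</p>

<!-- Version a human judges as genuinely helpful: it names the problem
     and shows the expected format. -->
<label for="email-2">Email address</label>
<input id="email-2" type="email" aria-invalid="true" aria-describedby="email-2-error">
<p id="email-2-error">Please enter an email address in the format name@example.com.</p>
```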
Focus Order Evaluation: Automated tools cannot determine if focusable components receive focus in an order that preserves meaning and operability. This requires understanding the logical flow of the interface.
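The invented checkout fragment below illustrates why: every element is focusable and would pass a scan, but the positive tabindex values send keyboard focus to the submit button before the fields it depends on, something only a person tabbing through the page will notice.

```html
<!-- Invented checkout fragment for illustration. -->
<form>
  <!-- Positive tabindex values override the natural DOM order: pressing Tab
       lands on "Place order" (1) before the name (2) and card number (3)
       fields, breaking the logical flow of the task. -->
  <label for="name">Name</label>
  <input id="name" tabindex="2">

  <label for="card">Card number</label>
  <input id="card" tabindex="3">

  <button type="submit" tabindex="1">Place order</button>

  <!-- Removing the tabindex attributes restores a focus order that follows
       the DOM and preserves meaning and operability. -->
</form>
```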
Meeting Compliance Requirements
Organizations face increasing pressure to meet accessibility standards. The ADA in the United States and the European Accessibility Act (EAA) across Europe require digital accessibility. Shopify stores and other e-commerce platforms must ensure equal access for all users.
These regulations don’t accept automated scan results as proof of conformance. They require comprehensive evaluation that only manual audits provide. When accessibility issues lead to lawsuits, courts look for evidence of thorough manual evaluation, not automated scan reports.
Manual audits also support proper remediation. When you receive a detailed audit report with specific findings for each identified issue, your development team knows exactly what to fix. The report includes:
- Location and steps to reproduce each issue
- Associated WCAG success criteria
- Screenshots or code snippets
- Recommendations for remediation
After receiving your audit report, we always recommend using the Accessibility Tracker platform for tracking fixes and validation. This ensures your remediation stays organized and every issue gets properly addressed.
The Investment in Quality
Some digital accessibility companies sell “automated audits” or “automated testing” — but this is simply marketing language. If it’s automated, it’s a scan. If it’s an audit, it’s manual.
The marketplace confusion often comes from companies that suggest you can combine automated testing and manual testing. This hybrid approach still misses critical issues because it relies too heavily on automation for the initial evaluation.
Quality manual audits require:
- Experienced auditors with genuine accessibility expertise
- Time for thorough evaluation of every page in scope
- Multiple evaluation methodologies applied to each element
- Careful documentation of every issue found
Accessible.org’s approach demonstrates this commitment. Every audit uses fully manual evaluation with zero reliance on automated scans for primary issue identification. Scans serve only as a final review to ensure comprehensive coverage.
Making the Right Choice
When selecting an accessibility audit provider, watch for these red flags that indicate over-reliance on automation:
- Pricing that doesn't adjust after significant changes in audit scope
- Emphasis on automated testing or solutions
- Inflated claims about scan effectiveness
- Inability to detail their audit methodology
- Conflation of user testing with auditing
A legitimate manual audit tracks every accessibility issue through human evaluation. The auditor uses screen readers to experience your content as blind users do. They navigate using only a keyboard to ensure motor-impaired users can access all functionality. They inspect code to verify proper semantic structure.
This level of detail takes time and expertise. But it's the only way to identify all WCAG 2.2 AA issues and achieve true conformance. Your users with disabilities deserve this thoroughness, and compliance requirements demand it.
FAQ
How long does a manual accessibility audit take?
An audit typically takes 10 to 15 days depending on the scope and complexity of your digital asset. This timeline allows for thorough evaluation using every necessary methodology.
What’s included in a manual audit report?
A complete manual audit report includes issue descriptions, URL locations, specific element locations, testing environment details, applicable WCAG success criteria, related code, visual documentation, and remediation recommendations.
How often should I conduct accessibility audits?
It's situation-dependent. Once annually is a solid cadence, but twice-yearly audits can make sense for some organizations.
Get Started
Visit Accessible.org to get an estimate on your accessibility audit today.