How to Measure Accessibility Progress After Remediation

You measure accessibility progress after remediation by comparing your current conformance status against your original audit baseline. The number of open issues, their severity, and the percentage of WCAG success criteria now met give you a clear picture of where you stand and how far you have come.

Without a structured approach to measurement, remediation work can feel endless. Teams fix issues, but nobody confirms whether meaningful progress occurred. The path forward is a repeatable process that connects audit data to real outcomes.

Measuring Accessibility Progress After Remediation
- Baseline audit comparison: percentage of original issues resolved since the first evaluation
- Validation audit: whether specific fixes meet WCAG 2.1 AA or WCAG 2.2 AA criteria
- Issue severity tracking: whether high-impact issues are being addressed first
- Scan monitoring: surface-level regression detection between audits (scans only flag approximately 25% of issues)
- Conformance percentage: how close you are to full WCAG conformance across evaluated pages

Why a Baseline Audit Is the Starting Point

You cannot measure progress without a reference point. Your initial accessibility audit creates that reference point. It identifies every issue, maps each one to specific WCAG success criteria, and assigns severity ratings.

This baseline becomes the denominator in your progress equation. If an audit identified 140 issues and your team has resolved 95, you are at roughly 68% resolution. That number means something because it ties directly to documented, evaluated findings.
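The arithmetic above can be sketched in a few lines. The figures are the article's hypothetical example, not real audit data:

```python
# Hypothetical figures from the example above: a baseline audit found
# 140 issues, and the team has since resolved 95 of them.
baseline_issues = 140
resolved_issues = 95

# Resolution percentage: resolved issues over the baseline denominator.
resolution_pct = resolved_issues / baseline_issues * 100
print(f"Resolution: {resolution_pct:.0f}%")  # prints "Resolution: 68%"
```

Because the denominator is fixed at the baseline count, the number only moves when documented issues are closed, which is what makes it defensible to leadership.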

Organizations that skip the baseline and start fixing things ad hoc have no way to prove progress to leadership, procurement partners, or legal counsel. The audit report is the foundation.

What Does a Validation Audit Confirm?

A validation audit is a follow-up evaluation conducted after remediation. An auditor reviews the specific issues from the original report and confirms whether each fix meets the applicable WCAG success criterion.

This is different from a full re-audit. Validation focuses on previously identified issues rather than evaluating the entire digital asset from scratch. It is faster, more affordable, and directly tied to the remediation work your developers completed.

Tracking Issues Through the Remediation Lifecycle

Every issue from your audit report moves through a lifecycle: identified, assigned, in progress, fixed, validated. Tracking where each issue sits in that lifecycle at any given point gives you a real-time view of progress.

Spreadsheets work for smaller projects. For larger digital assets or ongoing compliance programs, a dedicated tracking system is more practical. The key is mapping every issue to its WCAG criterion, assigning severity using a user-impact prioritization formula, and letting project managers see exactly what percentage of work is complete.

The lifecycle view matters because not all fixes carry equal weight. Closing 30 low-severity issues is less meaningful than resolving 5 critical ones that block screen reader users entirely.
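A minimal lifecycle tracker can be a list of issue records. This is a sketch with invented example issues; the statuses mirror the lifecycle stages named above, and only "validated" counts as done:

```python
from collections import Counter

# Hypothetical issue records: each maps to a WCAG criterion, a severity,
# and a lifecycle status (identified/assigned/in_progress/fixed/validated).
issues = [
    {"id": 1, "criterion": "1.1.1", "severity": "critical", "status": "validated"},
    {"id": 2, "criterion": "1.4.3", "severity": "minor",    "status": "fixed"},
    {"id": 3, "criterion": "2.1.1", "severity": "critical", "status": "in_progress"},
    {"id": 4, "criterion": "1.3.1", "severity": "high",     "status": "validated"},
]

# Lifecycle snapshot: how many issues sit at each stage right now.
by_status = Counter(issue["status"] for issue in issues)

# Progress counts only validated fixes, not merely "fixed" ones,
# because a fix is not done until an auditor confirms it.
validated = by_status["validated"]
print(f"Validated: {validated}/{len(issues)} ({validated / len(issues):.0%})")
```

Treating "fixed" and "validated" as separate stages is the design choice that keeps developer self-reporting from inflating the progress number.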

How Severity Ratings Shape Your Progress Story

Raw issue count is a useful metric but an incomplete one. Severity ratings add context.

If your audit report categorizes issues by severity, your progress reporting should reflect that. Resolving 100% of critical and high-severity issues while 20 minor issues remain open is a strong position. Resolving 80% of total issues while 10 critical ones remain open is not.
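One way to make that comparison concrete is a severity-weighted resolution score. The weights below are illustrative assumptions, not a standard; pick values that match your own risk model:

```python
# Hypothetical severity weights (illustrative, not a WCAG-defined scale).
weights = {"critical": 8, "high": 4, "medium": 2, "minor": 1}

# Per-severity progress as (severity, issues found, issues still open),
# roughly matching the strong-position example above.
progress = [("critical", 10, 0), ("high", 15, 0), ("minor", 20, 20)]

total_weight = sum(weights[sev] * found for sev, found, _ in progress)
open_weight = sum(weights[sev] * still_open for sev, _, still_open in progress)

resolved_pct = (total_weight - open_weight) / total_weight * 100
print(f"Severity-weighted resolution: {resolved_pct:.0f}%")
```

Under these weights, closing every critical and high-severity issue while 20 minor ones stay open still scores well, which matches the risk-reduction framing decision-makers care about.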

When reporting progress to decision-makers, frame it in terms of risk reduction and user impact. A board member does not need to know what ARIA attributes do. They need to know that the most significant accessibility issues, the ones most likely to affect users or trigger ADA compliance concerns, are resolved.

Where Scans Fit in Progress Monitoring

Automated scans are useful for catching regressions between audits. If a developer pushes a code change that removes alt text from images or breaks heading hierarchy, a scan can flag that quickly.

But scans only flag approximately 25% of issues. They cannot confirm WCAG conformance, and they cannot validate that a fix actually resolves the underlying accessibility issue. A scan might report zero detectable issues on a page that still has keyboard traps, missing form labels only visible in certain states, or insufficient contrast in dynamic content.

Use scans as a monitoring layer. They sit between audits and catch the obvious regressions. They are not a measurement of conformance progress on their own.

Conformance Percentage as a North Star Metric

The clearest single metric for accessibility progress is conformance percentage: what portion of applicable WCAG 2.1 AA (or WCAG 2.2 AA) success criteria does your digital asset currently meet?

This is different from issue resolution percentage. One WCAG criterion might have produced 12 issues across your site. Fixing 11 of them still leaves that criterion in a state of nonconformance. Conformance percentage accounts for this by looking at the criterion level rather than the individual issue level.

Both metrics are valuable. Issue resolution tells your development team how much work remains. Conformance percentage tells your compliance team how close the product is to full WCAG conformance.
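The distinction between the two metrics can be sketched directly: a criterion only counts as met when every issue mapped to it is validated. The criterion numbers and the figure of 50 applicable criteria (the Level A plus AA count for WCAG 2.1) frame a hypothetical example:

```python
# Hypothetical issue statuses grouped by WCAG success criterion.
issues_by_criterion = {
    "1.1.1": ["validated", "validated"],
    "1.4.3": ["validated", "open"],   # one open issue keeps this nonconformant
    "2.4.6": ["validated"],
}
applicable_criteria = 50  # WCAG 2.1 Level A + AA success criteria in scope

# A criterion fails while any of its issues remains unvalidated.
failing = {criterion for criterion, statuses in issues_by_criterion.items()
           if any(status != "validated" for status in statuses)}

conformance_pct = (applicable_criteria - len(failing)) / applicable_criteria * 100
print(f"Conformance: {conformance_pct:.0f}%")  # 49 of 50 criteria met
```

Note how criterion 1.4.3 stays nonconformant even at near-complete issue resolution, which is exactly why the two percentages diverge.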

Reporting Progress to Different Audiences

Developers want issue-level detail: what is left to fix, what WCAG criterion it maps to, and what the expected behavior should be.

Project managers want lifecycle status: how many issues are open, in progress, or validated.

Leadership wants the summary: conformance percentage, risk posture, and timeline to completion.

Structuring your reporting to address each audience prevents the common problem where a progress update is either too granular for executives or too vague for the people doing the work.

How Often Should You Measure Progress?

For active remediation projects, review progress weekly. This keeps momentum and surfaces blockers early.

For ongoing maintenance after reaching conformance, a quarterly review paired with continuous scan monitoring is a reasonable cadence. Any significant product update, redesign, or new feature release should trigger a fresh evaluation.

Accessibility Conformance Reports (ACRs) should also be updated after significant product changes. If your organization completes a VPAT for procurement, the resulting ACR reflects a point-in-time conformance status. Measuring progress after remediation is directly connected to keeping that documentation current.

Do I need a full re-audit to measure progress?

Not always. A validation audit focuses on previously identified issues and confirms whether fixes meet WCAG criteria. A full re-audit evaluates the entire digital asset and is appropriate when major changes have occurred or when enough time has passed that the original scope may no longer be representative.

Can scans replace audits for tracking remediation progress?

No. Scans only flag approximately 25% of issues and cannot confirm whether a fix achieves WCAG conformance. They are useful for detecting regressions between manual evaluations, but they are not a substitute for an auditor reviewing the remediated content.

What metric should I report to leadership first?

Conformance percentage paired with severity-weighted resolution. Leadership needs to know how close the product is to full WCAG 2.1 AA or WCAG 2.2 AA conformance and whether the highest-risk issues have been addressed. Raw issue counts without severity context are less meaningful at the executive level.

Measuring accessibility progress after remediation is not about checking a box. It is about creating a repeatable, data-driven process that connects audit findings to verified outcomes, keeps your team accountable, and gives every audience the information they need to make good decisions.

Contact Kris Rivenburgh for guidance on structuring your accessibility remediation and measurement workflow.