
Testing & Audit Tools

Automated tools catch about 30–50% of accessibility issues—things like missing alt text, low contrast, and invalid ARIA. Recent machine learning enhancements have pushed automated detection to approximately 57%, with projections suggesting 70% by late 2025. But the rest requires human testing: navigating with a keyboard, listening with a screen reader, and understanding whether the experience actually makes sense.

No single tool or approach is sufficient. The W3C recommends a hybrid approach combining automated testing with manual evaluation. This gives you the speed of automated checks with the thoroughness of human judgment.

Automated testing excels at detecting:

  • Missing alternative text on images
  • Insufficient color contrast ratios
  • Invalid ARIA attributes and roles
  • Missing form labels and associations
  • Duplicate IDs and structural issues
  • Empty links and buttons
  • Missing language declarations
  • Invalid heading hierarchy (skipped levels)

Manual testing is essential for:

  • Keyboard navigation flow and logic
  • Screen reader experience quality
  • Dynamic content announcements
  • Focus management after interactions
  • Reading order versus visual order
  • Meaningful alternative text (not just present, but useful)
  • Understandable content and instructions
  • Error recovery experience
  • Context and purpose of interactive elements

Testing with people with disabilities reveals:

  • Real-world usability beyond conformance
  • Unexpected interaction patterns
  • Assistive technology compatibility issues
  • Cognitive and comprehension challenges
  • Fatigue and efficiency problems
  • Workarounds users actually employ

axe DevTools

Best for: Deep accessibility analysis, CI/CD integration, developer workflows

axe DevTools is the industry standard, with 85+ automated checks. The underlying axe-core library powers many other tools, including Lighthouse.

Strengths:

  • Most comprehensive rule coverage
  • Excellent CI/CD integration
  • Clear issue explanations
  • Guided manual testing (paid tier)
  • High accuracy with low false positives

Limitations:

  • Fix suggestions and dashboards require paid tier
  • Still only catches automated-detectable issues

How to use:

  • Browser extension for interactive testing
  • CLI for build pipeline integration
  • API for custom integrations (see the sketch below)
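
For example, the API can run inside an end-to-end test. A minimal sketch, assuming Playwright Test and the @axe-core/playwright package are installed (the URL is a placeholder):

// a11y.spec.ts: fail the test when axe-core finds WCAG A/AA violations
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com/'); // placeholder URL
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa']) // restrict to WCAG 2.0 A and AA rules
    .analyze();
  expect(results.violations).toEqual([]);
});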

Lighthouse

Best for: Quick audits, general web quality checks, development workflow

Lighthouse is built into Chrome DevTools and runs accessibility, performance, SEO, and best-practices audits in one pass.

Strengths:

  • No installation required (built into Chrome)
  • Quick overview of site quality
  • Good starting point for beginners
  • Performance and SEO combined with accessibility

Limitations:

  • Uses axe-core but runs a subset of tests (not the full 85+)
  • Too limited for comprehensive accessibility analysis
  • Think of it as a quick scan, not a deep audit

How to use:

  • Chrome DevTools → Lighthouse tab
  • CLI for automated builds (or the Node module; see the sketch below)
  • PageSpeed Insights online
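
Lighthouse can also be driven from Node for automated checks. A minimal sketch, assuming the lighthouse and chrome-launcher packages (the URL is a placeholder):

// lh-a11y.ts: run only the accessibility category against one URL
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
const result = await lighthouse('https://example.com/', {
  port: chrome.port,                 // reuse the launched Chrome instance
  onlyCategories: ['accessibility'], // skip performance, SEO, best practices
});
console.log('Accessibility score:', result?.lhr.categories.accessibility.score);
await chrome.kill();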

WAVE

Best for: Visual feedback, content creator education, quick checks

WAVE provides visual icons directly on your page showing where issues occur.

Strengths:

  • Excellent visual feedback
  • Free with no login required
  • Good for educating content teams
  • Shows issues in context

Limitations:

  • Overlay icons can be confusing with complex layouts
  • Struggles with absolutely positioned elements
  • Invisible elements are hard to locate
  • Less suited for CI/CD integration

How to use:

  • Browser extension (Firefox, Chrome)
  • Online at wave.webaim.org
  • API available for automation

Pa11y

Best for: CI pipelines, automated regression testing, command-line workflows

Pa11y is a command-line tool designed for automation; it can also be called from Node (see the sketch after this section).

Strengths:

  • Simple CLI interface
  • Easy CI/CD integration
  • Multiple test runners (axe-core, HTML_CodeSniffer)
  • Dashboard option for tracking over time

Limitations:

  • Less visual than browser tools
  • Requires technical setup
  • Same detection limits as all automated tools
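
Beyond the CLI, Pa11y exposes a Node API. A minimal sketch, assuming the pa11y package (the URL and options are placeholders):

// pa11y-check.ts: scan one URL and fail the process if issues are found
import pa11y from 'pa11y';

const results = await pa11y('https://example.com/', {
  standard: 'WCAG2AA',        // ruleset to test against
  runners: ['axe', 'htmlcs'], // combine axe-core and HTML_CodeSniffer checks
});

for (const issue of results.issues) {
  console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
}
process.exit(results.issues.length > 0 ? 1 : 0);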

Tool | Best For | Checks | Free
axe DevTools | Deep analysis, CI/CD | 85+ | Core free
Lighthouse | Quick audits | Subset | Yes
WAVE | Visual education | Good | Yes
Pa11y | CI pipelines | axe/HTML_CS | Yes

Best practice: Use 2-3 tools together. Different tools catch different issues. Run both Lighthouse (quick scan) and axe DevTools (deep scan) for better coverage.

Automated tools are necessary but not sufficient. After automated testing, manual evaluation uncovers severe accessibility barriers that automation misses.

Put down your mouse and complete key tasks using only the keyboard. This 5-minute test can reveal severe barriers.

What to test:

Key | Expected Behavior | Check
Tab | Move to next interactive element | Focus moves logically
Shift+Tab | Move to previous element | Can navigate backwards
Enter | Activate buttons and links | All controls work
Space | Activate buttons, toggle checkboxes | Controls respond correctly
Arrow keys | Navigate within components (menus, tabs) | Complex controls work
Escape | Close modals and popups | Can exit overlays

Common failures:

  • Focus not visible on some elements
  • Focus order doesn’t match visual layout
  • Can’t reach elements without a mouse
  • Trapped in a component with no escape
  • Modals don’t trap focus appropriately
  • Focus lost after dynamic updates
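
Parts of the keyboard pass can be scripted as a regression guard, though it never replaces testing by hand. A rough sketch with Playwright (the URL and tab count are placeholders):

// tab-order.spec.ts: press Tab repeatedly and check that focus is never lost
import { test, expect } from '@playwright/test';

test('focus is never dropped while tabbing through the page', async ({ page }) => {
  await page.goto('https://example.com/'); // placeholder URL
  for (let i = 0; i < 15; i++) {
    await page.keyboard.press('Tab');
    // document.activeElement falls back to <body> when focus is lost
    const focusedTag = await page.evaluate(() =>
      document.activeElement ? document.activeElement.tagName : 'NONE'
    );
    expect(focusedTag).not.toBe('BODY');
  }
});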

Test with at least one screen reader—ideally the most common pairings from the WebAIM survey:

Essential combinations:

  • VoiceOver + Safari (Mac/iOS): Cmd+F5 to start on Mac
  • NVDA + Firefox or Chrome (Windows): Free download
  • TalkBack + Chrome (Android): Built into Android

What to listen for:

  • Interactive elements announced with clear names and roles
  • State changes announced (expanded, selected, checked, disabled)
  • Reading order is logical
  • Dynamic content updates announced appropriately
  • Form errors are announced and associated with fields
  • Headings and landmarks help navigation

Testing approach:

  1. Navigate through the page using heading shortcuts (H key)
  2. Navigate through form fields (F key)
  3. Tab through interactive elements
  4. Complete key user tasks by listening only

Test with browser zoom and operating system magnification:

Browser zoom:

  • Increase to 200%, 400%
  • Check for horizontal scrolling (shouldn’t be required)
  • Verify content reflows appropriately (an automated spot check is sketched after this list)
  • Confirm nothing is cut off or overlapping
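
WCAG 1.4.10 treats 400% zoom on a 1280px-wide window as equivalent to a 320 CSS-pixel viewport, which makes the reflow check easy to automate. A minimal sketch with Playwright (the URL is a placeholder):

// reflow.spec.ts: content should not need horizontal scrolling at ~400% zoom
import { test, expect } from '@playwright/test';

test('no horizontal scrolling at a 320px-wide viewport', async ({ page }) => {
  await page.setViewportSize({ width: 320, height: 650 });
  await page.goto('https://example.com/'); // placeholder URL
  const overflows = await page.evaluate(
    // true when the page is wider than the viewport, i.e. scrolls sideways
    () => document.documentElement.scrollWidth > document.documentElement.clientWidth
  );
  expect(overflows).toBe(false);
});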

System magnification:

  • Use Windows Magnifier, macOS Zoom
  • Verify focus remains visible when magnified
  • Check that tooltips and popups appear near their triggers

The Website Accessibility Conformance Evaluation Methodology (WCAG-EM) provides a structured approach:

Step 1: Define scope

  • What pages/states are included?
  • Which WCAG version and level (typically 2.2 AA)?
  • What is the evaluation goal?

Step 2: Explore the target website

  • Identify key pages and functionality
  • Note technologies used
  • Find representative samples

Step 3: Select representative sample

  • Include home page and key entry points
  • Cover all templates and page types
  • Include critical user flows

Step 4: Audit the selected sample

  • Evaluate each page against all success criteria
  • Document issues with location and impact
  • Rate severity of failures

Step 5: Report findings

  • Summarize overall conformance level
  • Detail failures by success criterion
  • Provide remediation guidance

Organize findings by WCAG’s four principles:

POUR framework:

  • Perceivable: Can users perceive all content?
  • Operable: Can users operate all controls?
  • Understandable: Can users understand content and interface?
  • Robust: Does it work with assistive technologies?

For each issue, document the following (a structured sketch follows the list):

  • Success criterion violated
  • Location (page, component)
  • Description of failure
  • Impact on users
  • Recommended fix
  • Severity/priority
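
Teams that track findings in tooling sometimes formalize these fields. A purely illustrative sketch (the type and values are hypothetical):

// a11y-issue.ts: illustrative shape for recording one audit finding
interface AccessibilityIssue {
  criterion: string;      // e.g. "1.4.3 Contrast (Minimum)"
  location: string;       // page URL or component name
  description: string;    // what fails and how
  userImpact: string;     // who is affected and how severely
  recommendedFix: string;
  severity: 'critical' | 'serious' | 'moderate' | 'minor';
}

const example: AccessibilityIssue = {
  criterion: '1.4.3 Contrast (Minimum)',
  location: '/checkout, "Place order" button',
  description: 'Button text contrast is roughly 2.1:1 against the background',
  userImpact: 'Users with low vision may not be able to read the primary action',
  recommendedFix: 'Darken the button background to reach at least 4.5:1',
  severity: 'serious',
};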

Vision deficiency simulation:

  • Chrome DevTools: Rendering → Emulate vision deficiencies
  • Firefox DevTools: Accessibility → Simulate
  • Stark: Includes vision simulation
  • Polypane: Multiple simulations side-by-side

During development:

  • Linting with eslint-plugin-jsx-a11y or similar (config sketch after this list)
  • Automated checks on save/build
  • axe browser extension for quick checks
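
A minimal linting sketch for a React codebase, assuming eslint-plugin-jsx-a11y 6.9 or later (which ships flat-config presets):

// eslint.config.mjs: enable the plugin's recommended accessibility rules
import jsxA11y from 'eslint-plugin-jsx-a11y';

export default [
  jsxA11y.flatConfigs.recommended, // recommended a11y rules for JSX
  {
    files: ['**/*.{jsx,tsx}'],
    rules: {
      'jsx-a11y/alt-text': 'error', // example: escalate missing alt text to an error
    },
  },
];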

Pull request checks:

  • Automated accessibility scans in CI
  • Keyboard navigation verification
  • Screen reader spot checks for changed components
  • Require accessibility checklist completion

Build pipeline:

# Example: Fail build on critical issues
npm run test:a11y -- --tags wcag2a,wcag2aa --exit-on-error

Issue tracking:

  • Track accessibility bugs alongside other issues
  • Prioritize by user impact, not just conformance level
  • Include WCAG criterion reference
  • Document steps to reproduce with AT

Suggested testing cadence:

Frequency | Testing Type
Every commit | Automated linting
Every PR | Automated scans + keyboard check
Weekly | Manual spot checks
Quarterly | Comprehensive manual audit
Annually | External audit with user testing

User testing with people with disabilities

Conformance testing tells you if you meet standards. User testing tells you if people can actually use your product.

What you learn from users:

  • Real-world usability versus theoretical compliance
  • Workarounds and pain points
  • Preferences and expectations
  • Compatibility with their specific AT setup

Recruiting participants:

  • Partner with disability organizations
  • Use specialized recruiting services
  • Include people with diverse disabilities
  • Pay participants fairly for their time

Running sessions:

  • Let participants use their own devices and AT
  • Give tasks, not instructions
  • Observe without interrupting
  • Ask about their typical experience
  • Note workarounds and frustrations

According to 2024 accessibility testing research, recent machine learning enhancements to axe have increased automated test coverage to detect approximately 57% of accessibility issues by volume, with projections suggesting nearly 70% by end of 2025.

2024 tool comparisons confirm that while Lighthouse uses axe-core, it runs only a subset of the full axe rule set. For comprehensive accessibility analysis, dedicated tools like axe DevTools are recommended over general-purpose tools like Lighthouse.

Business owners across Europe face a 28 June 2025 deadline under the European Accessibility Act. Organizations selling products or services in EU markets must meet strict accessibility standards, with penalties reaching up to €500,000 in some countries.

The W3C and industry research continue to emphasize that a combination of automated tests and human expertise—the hybrid approach—is essential. Automated scanning alone cannot achieve full WCAG conformance.

2025 accessibility testing guides emphasize that the best tool is one you’ll actually use. Integration into existing workflows is critical—automated testing in CI/CD, with manual testing checkpoints at key stages.

  • Automated scanning: axe or equivalent in CI pipeline
  • Keyboard testing: Part of PR review process
  • Screen reader testing: Minimum VoiceOver + NVDA
  • Color/contrast checks: Tools available to designers
  • Issue tracking: Accessibility bugs prioritized appropriately
  • Audit schedule: Quarterly manual, annual comprehensive

Before shipping each change:

  • Automated scans pass
  • Keyboard navigation verified for new/changed features
  • Screen reader spot check completed
  • Color contrast verified for new UI
  • No new WCAG failures introduced
