Testing & Audit Tools
Automated tools catch about 30–50% of accessibility issues—things like missing alt text, low contrast, and invalid ARIA. Recent machine learning enhancements have pushed automated detection to approximately 57%, with projections suggesting 70% by late 2025. But the rest requires human testing: navigating with a keyboard, listening with a screen reader, and understanding whether the experience actually makes sense.
No single tool or approach is sufficient. The W3C recommends a hybrid approach combining automated testing with manual evaluation. This gives you the speed of automated checks with the thoroughness of human judgment.
The testing pyramid
What automated tools catch
Automated testing excels at detecting the following (a unit-test sketch follows this list):
- Missing alternative text on images
- Insufficient color contrast ratios
- Invalid ARIA attributes and roles
- Missing form labels and associations
- Duplicate IDs and structural issues
- Empty links and buttons
- Missing language declarations
- Invalid heading hierarchy (skipped levels)
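As an illustration of a machine-detectable failure, here is a minimal sketch using jest-axe in a unit test (assumes a Jest setup; the markup and the expected rule ID are illustrative):

```ts
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

test('a text input with no label is flagged automatically', async () => {
  // Machine-detectable failure: a form control with no accessible name
  const results = await axe('<input type="text" />');
  const ruleIds = results.violations.map((v) => v.id);
  expect(ruleIds).toContain('label'); // axe-core's missing-label rule
});
```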
What only humans can evaluate
Manual testing is essential for:
- Keyboard navigation flow and logic
- Screen reader experience quality
- Dynamic content announcements
- Focus management after interactions
- Reading order versus visual order
- Meaningful alternative text (not just present, but useful)
- Understandable content and instructions
- Error recovery experience
- Context and purpose of interactive elements
What only users can tell you
Testing with people with disabilities reveals:
- Real-world usability beyond conformance
- Unexpected interaction patterns
- Assistive technology compatibility issues
- Cognitive and comprehension challenges
- Fatigue and efficiency problems
- Workarounds users actually employ
Automated testing tools
axe DevTools
Best for: Deep accessibility analysis, CI/CD integration, developer workflows
axe DevTools is the industry standard, with 85+ automated checks. The underlying axe-core library powers many other tools, including Lighthouse.
Strengths:
- Most comprehensive rule coverage
- Excellent CI/CD integration
- Clear issue explanations
- Guided manual testing (paid tier)
- High accuracy with low false positives
Limitations:
- Fix suggestions and dashboards require paid tier
- Still only catches automated-detectable issues
How to use:
- Browser extension for interactive testing
- CLI for build pipeline integration
- API for custom integrations
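For the CI/CD route, here is a minimal sketch using the official @axe-core/playwright bindings inside a Playwright test; the URL and tag list are illustrative:

```ts
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.com'); // illustrative URL
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa']) // restrict to WCAG 2.x A and AA rules
    .analyze();
  expect(results.violations).toEqual([]); // any violation fails the build
});
```

Remember that this still only covers the automated-detectable subset discussed above.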
Lighthouse
Best for: Quick audits, general web quality checks, development workflow
Lighthouse is built into Chrome DevTools and runs accessibility, performance, SEO, and best-practices audits in a single pass.
Strengths:
- No installation required (built into Chrome)
- Quick overview of site quality
- Good starting point for beginners
- Performance and SEO combined with accessibility
Limitations:
- Uses axe-core but runs a subset of tests (not the full 85+)
- Too limited for comprehensive accessibility analysis
- Think of it as a quick scan, not a deep audit
How to use:
- Chrome DevTools → Lighthouse tab
- CLI for automated builds
- PageSpeed Insights online
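For scripted runs, Lighthouse also has a Node API. A minimal sketch of an accessibility-only audit (the URL is illustrative; assumes the lighthouse and chrome-launcher packages in an ESM module):

```ts
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
const result = await lighthouse('https://example.com', {
  port: chrome.port,                 // connect to the launched Chrome instance
  onlyCategories: ['accessibility'], // skip performance/SEO for a faster pass
});
// The category score is 0–1; multiply by 100 for the familiar display value
console.log('Accessibility score:', (result?.lhr.categories.accessibility.score ?? 0) * 100);
await chrome.kill();
```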
WAVE
Best for: Visual feedback, content creator education, quick checks
WAVE provides visual icons directly on your page showing where issues occur.
Strengths:
- Excellent visual feedback
- Free with no login required
- Good for educating content teams
- Shows issues in context
Limitations:
- Overlay icons can be confusing with complex layouts
- Struggles with absolutely positioned elements
- Invisible elements are hard to locate
- Less suited for CI/CD integration
How to use:
- Browser extension (Firefox, Chrome)
- Online at wave.webaim.org
- API available for automation
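For the API route, a minimal sketch (assumes Node 18+ for global fetch and a WAVE API key, which is a paid service; the response fields shown follow WAVE's JSON report format and may vary by report type):

```ts
// Request a WAVE JSON report for a page (WAVE_API_KEY is an assumed env var)
const apiKey = process.env.WAVE_API_KEY;
const target = 'https://example.com'; // illustrative URL

const res = await fetch(
  `https://wave.webaim.org/api/request?key=${apiKey}&url=${encodeURIComponent(target)}`
);
const report = await res.json();
// WAVE groups findings into categories such as "error", "contrast", and "alert"
console.log('Errors:', report.categories?.error?.count);
```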
Pa11y
Best for: CI pipelines, automated regression testing, command-line workflows
Pa11y is a command-line tool designed for automation.
Strengths:
- Simple CLI interface
- Easy CI/CD integration
- Multiple runners (Puppeteer, Playwright)
- Dashboard option for tracking over time
Limitations:
- Less visual than browser tools
- Requires technical setup
- Same detection limits as all automated tools
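Beyond the CLI, Pa11y exposes a Node API for custom scripts. A minimal sketch (the URL is illustrative):

```ts
import pa11y from 'pa11y';

const results = await pa11y('https://example.com', {
  standard: 'WCAG2AA',        // conformance target
  runners: ['axe', 'htmlcs'], // run both engines for broader coverage
});

for (const issue of results.issues) {
  console.log(`${issue.type}: ${issue.code}\n  ${issue.message}\n  at ${issue.selector}`);
}
```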
Comparison summary
| Tool | Best For | Rule Engine / Coverage | Free |
|---|---|---|---|
| axe DevTools | Deep analysis, CI/CD | axe-core, 85+ checks | Core features free |
| Lighthouse | Quick audits | axe-core subset | Yes |
| WAVE | Visual education | WAVE's own engine | Yes |
| Pa11y | CI pipelines | axe-core and/or HTML CodeSniffer | Yes |
Best practice: Use 2-3 tools together. Different tools catch different issues. Run both Lighthouse (quick scan) and axe DevTools (deep scan) for better coverage.
Manual testing essentials
Automated tools are necessary but not sufficient. After automated testing, manual evaluation uncovers severe accessibility barriers that automation misses.
Keyboard navigation testing
Put down your mouse and complete key tasks using only the keyboard. This 5-minute test can reveal severe barriers.
What to test:
| Key | Expected Behavior | What to Verify |
|---|---|---|
| Tab | Move to next interactive element | Focus moves logically |
| Shift+Tab | Move to previous element | Can navigate backwards |
| Enter | Activate buttons and links | All controls work |
| Space | Activate buttons, toggle checkboxes | Controls respond correctly |
| Arrow keys | Navigate within components (menus, tabs) | Complex controls work |
| Escape | Close modals and popups | Can exit overlays |
Common failures (a scripted spot check follows this list):
- Focus not visible on some elements
- Focus order doesn’t match visual layout
- Can’t reach elements without a mouse
- Trapped in a component with no escape
- Modals don’t trap focus appropriately
- Focus lost after dynamic updates
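A few of these failures can be caught in scripted form before manual passes. A minimal Playwright sketch checking that a dialog closes on Escape and returns focus to its trigger (the URL and button name are illustrative):

```ts
import { test, expect } from '@playwright/test';

test('dialog closes on Escape and restores focus to its trigger', async ({ page }) => {
  await page.goto('https://example.com'); // illustrative URL
  const trigger = page.getByRole('button', { name: 'Open dialog' }); // illustrative name
  await trigger.click();
  await expect(page.getByRole('dialog')).toBeVisible();
  await page.keyboard.press('Escape'); // Escape should exit the overlay
  await expect(page.getByRole('dialog')).toBeHidden();
  await expect(trigger).toBeFocused(); // focus returns to where the user was
});
```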
Screen reader testing
Test with at least one screen reader, ideally one of the most common pairings from the WebAIM survey:
Essential combinations:
- VoiceOver + Safari (Mac/iOS): Cmd+F5 to start on Mac
- NVDA + Firefox or Chrome (Windows): Free download
- TalkBack + Chrome (Android): Built into Android
What to listen for:
- Interactive elements announced with clear names and roles
- State changes announced (expanded, selected, checked, disabled)
- Reading order is logical
- Dynamic content updates announced appropriately
- Form errors are announced and associated with fields
- Headings and landmarks help navigation
Testing approach:
- Navigate through the page using heading shortcuts (H key)
- Navigate through form fields (F key)
- Tab through interactive elements
- Complete key user tasks by listening only
Zoom and magnification testing
Test with browser zoom and operating system magnification (an automated reflow spot check is sketched at the end of this section):
Browser zoom:
- Increase to 200%, 400%
- Check for horizontal scrolling (shouldn’t be required)
- Verify content reflows appropriately
- Confirm nothing is cut off or overlapping
System magnification:
- Use Windows Magnifier, macOS Zoom
- Verify focus remains visible when magnified
- Check that tooltips and popups appear near their triggers
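WCAG 1.4.10 Reflow effectively equates 400% zoom on a 1280px-wide window with a 320px viewport, so the browser-zoom checks above can be partially automated. A minimal Playwright sketch (URL illustrative):

```ts
import { test, expect } from '@playwright/test';

test('content reflows at 320px (~400% zoom) without horizontal scrolling', async ({ page }) => {
  await page.setViewportSize({ width: 320, height: 640 });
  await page.goto('https://example.com'); // illustrative URL
  const hasHorizontalScroll = await page.evaluate(
    () => document.documentElement.scrollWidth > document.documentElement.clientWidth
  );
  expect(hasHorizontalScroll).toBe(false); // no 2-D scrolling for text content
});
```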
WCAG conformance evaluation
WCAG-EM methodology
The Website Accessibility Conformance Evaluation Methodology (WCAG-EM) provides a structured approach:
Step 1: Define scope
- What pages/states are included?
- Which WCAG version and level (typically 2.2 AA)?
- What is the evaluation goal?
Step 2: Explore the target website
- Identify key pages and functionality
- Note technologies used
- Find representative samples
Step 3: Select representative sample
- Include home page and key entry points
- Cover all templates and page types
- Include critical user flows
Step 4: Audit the selected sample
- Evaluate each page against all success criteria
- Document issues with location and impact
- Rate severity of failures
Step 5: Report findings
- Summarize overall conformance level
- Detail failures by success criterion
- Provide remediation guidance
Audit report structure
Organize findings by WCAG’s four principles:
POUR framework:
- Perceivable: Can users perceive all content?
- Operable: Can users operate all controls?
- Understandable: Can users understand content and interface?
- Robust: Does it work with assistive technologies?
For each issue, document the following (a record-shape sketch follows this list):
- Success criterion violated
- Location (page, component)
- Description of failure
- Impact on users
- Recommended fix
- Severity/priority
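One way to keep these fields consistent across auditors is a shared record shape. A hypothetical TypeScript sketch (the interface and field names are illustrative, not a standard format):

```ts
// Hypothetical record shape for one audit finding; all names are illustrative
interface AuditFinding {
  criterion: string; // e.g. "1.4.3 Contrast (Minimum)"
  principle: 'Perceivable' | 'Operable' | 'Understandable' | 'Robust';
  location: string; // page URL or component name
  description: string; // what fails and how to reproduce it
  userImpact: string; // who is affected and how severely
  recommendedFix: string; // concrete remediation guidance
  severity: 'critical' | 'serious' | 'moderate' | 'minor';
}

const example: AuditFinding = {
  criterion: '2.4.7 Focus Visible',
  principle: 'Operable',
  location: '/checkout, "Place order" button',
  description: 'No visible focus indicator when the button receives keyboard focus.',
  userImpact: 'Keyboard users cannot tell where focus is during checkout.',
  recommendedFix: 'Add a high-contrast :focus-visible outline.',
  severity: 'serious',
};
```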
Color and contrast tools
Contrast checkers
Section titled “Contrast checkers”- WebAIM Contrast Checker: Quick manual checks
- Colour Contrast Analyser: Desktop application with color picker
- Browser DevTools: Built-in contrast checking in Chrome, Firefox
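All of these checkers implement the same WCAG 2.x math: contrast ratio = (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colors. A minimal sketch of that formula:

```ts
// WCAG 2.x relative luminance of an sRGB color (channels 0–255)
function relativeLuminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colors, always >= 1
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// #767676 on white is ~4.54:1, just past the 4.5:1 AA threshold for normal text
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));
```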
Design tool plugins
Section titled “Design tool plugins”- Stark: Figma, Sketch, Adobe XD plugin
- Use Contrast: macOS menu bar app
- Polypane: Browser with accessibility simulation
Color blindness simulation
Section titled “Color blindness simulation”- Chrome DevTools: Rendering → Emulate vision deficiencies
- Firefox DevTools: Accessibility → Simulate
- Stark: Includes vision simulation
- Polypane: Multiple simulations side-by-side
Integrating testing into workflow
Development workflow
During development:
- Linting with eslint-plugin-jsx-a11y or similar (a config sketch follows this list)
- Automated checks on save/build
- axe browser extension for quick checks
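For the linting item above, a minimal sketch of an ESLint flat config enabling the plugin's recommended rules (assumes eslint-plugin-jsx-a11y v6.9+, which ships flatConfigs):

```ts
// eslint.config.js (flat config); names follow the eslint-plugin-jsx-a11y docs
import jsxA11y from 'eslint-plugin-jsx-a11y';

export default [
  jsxA11y.flatConfigs.recommended,
  {
    rules: {
      // Tighten or relax individual rules as needed, e.g.:
      'jsx-a11y/alt-text': 'error',
    },
  },
];
```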
Pull request checks:
- Automated accessibility scans in CI
- Keyboard navigation verification
- Screen reader spot checks for changed components
- Require accessibility checklist completion
Build pipeline:
```bash
# Example: Fail build on critical issues
npm run test:a11y -- --tags wcag2a,wcag2aa --exit-on-error
```
Issue tracking
- Track accessibility bugs alongside other issues
- Prioritize by user impact, not just conformance level
- Include WCAG criterion reference
- Document steps to reproduce with AT
Testing cadence
Section titled “Testing cadence”| Frequency | Testing Type |
|---|---|
| Every commit | Automated linting |
| Every PR | Automated scans + keyboard check |
| Weekly | Manual spot checks |
| Quarterly | Comprehensive manual audit |
| Annually | External audit with user testing |
User testing with people with disabilities
Why user testing matters
Conformance testing tells you if you meet standards. User testing tells you if people can actually use your product.
What you learn from users:
- Real-world usability versus theoretical compliance
- Workarounds and pain points
- Preferences and expectations
- Compatibility with their specific AT setup
Recruiting participants
Section titled “Recruiting participants”- Partner with disability organizations
- Use specialized recruiting services
- Include people with diverse disabilities
- Pay participants fairly for their time
Conducting sessions
Section titled “Conducting sessions”- Let participants use their own devices and AT
- Give tasks, not instructions
- Observe without interrupting
- Ask about their typical experience
- Note workarounds and frustrations
Recent research (2024–2025)
Automated testing coverage improvements
According to 2024 accessibility testing research, recent machine learning enhancements to axe have increased automated test coverage to approximately 57% of accessibility issues by volume, with projections suggesting nearly 70% by the end of 2025.
Tool comparison research
2024 tool comparisons confirm that while Lighthouse uses axe-core, it runs only a subset of the checks available in the full axe DevTools rule set. For comprehensive accessibility analysis, dedicated tools like axe DevTools are recommended over general-purpose tools like Lighthouse.
EAA compliance deadline
Business owners across Europe face a June 2025 deadline for the European Accessibility Act. Organizations selling products or services in EU markets must meet strict accessibility standards, with penalties reaching up to €500,000 in some countries.
Hybrid testing approach
The W3C and industry research continue to emphasize that a combination of automated tests and human expertise (the hybrid approach) is essential. Automated scanning alone cannot achieve full WCAG conformance.
Testing tools evolution
2025 accessibility testing guides emphasize that the best tool is the one you’ll actually use. Integration into existing workflows is critical: automated testing in CI/CD, with manual testing checkpoints at key stages.
Implementation checklist
Testing infrastructure
Section titled “Testing infrastructure”- Automated scanning: axe or equivalent in CI pipeline
- Keyboard testing: Part of PR review process
- Screen reader testing: Minimum VoiceOver + NVDA
- Color/contrast checks: Tools available to designers
- Issue tracking: Accessibility bugs prioritized appropriately
- Audit schedule: Quarterly manual, annual comprehensive
Per-release checks
Section titled “Per-release checks”- Automated scans pass
- Keyboard navigation verified for new/changed features
- Screen reader spot check completed
- Color contrast verified for new UI
- No new WCAG failures introduced
References
Automated Tools:
- axe DevTools — Deque
- WAVE — WebAIM
- Pa11y
- Lighthouse
Testing Methodology:
- WCAG-EM Overview — W3C Web Accessibility Initiative
Recent Research:
- Comprehensive Guide to WCAG Testing 2024 — AudioEye
- Top Accessibility Testing Tools 2025 — TestGuild
- axe vs WAVE vs Lighthouse Comparison
Checklists:
- WCAG Checklist — DigitalA11Y
- Accessible.org WCAG Checklist
See Also
Section titled “See Also”- WCAG Guidelines — Standards to test against
- Assistive Technologies — AT to test with
- ARIA & Keyboard Patterns — Implementation to verify
- Accessibility Checklist — Quick reference