Testing with Screen Readers
Screen reader testing is the most direct way to experience your site the way blind and low-vision users do. This guide covers the setup, core commands, and repeatable workflow you need to test effectively.
Why Screen Reader Testing Matters
Automated accessibility tools catch approximately 25–35% of accessibility issues. Screen reader testing uncovers the rest, the issues no tool can detect: meaningful alt text, logical reading order, sensible focus management, and understandable announcements.
More importantly, screen reader testing reveals how your interface actually feels to blind users. A form may pass all automated checks and still be completely unusable if labels are confusing, error messages are unclear, or the reading order is illogical.
Important: Screen reader testing by sighted developers is not a substitute for user testing with blind users. It is, however, a necessary first step that will catch the majority of screen reader issues before real users encounter them.
Screen Reader Options
Five screen readers account for most usage: NVDA and JAWS on Windows, VoiceOver on macOS and iOS, TalkBack on Android, and Narrator on Windows. Each has preferred browser pairings: NVDA is most reliable with Chrome and Firefox, JAWS with Chrome and Edge, VoiceOver with Safari, and TalkBack with Chrome. Testing in the wrong combination will give misleading results.
Setting Up NVDA for Web Testing
NVDA is the recommended starting point because it is free, widely used, and works well with Chrome. Here is how to set it up for web testing:
Go to nvaccess.org/download. NVDA is free (donations appreciated). Install with default settings.
NVDA works best with Chrome. Ensure Chrome is up to date.
Open NVDA Settings (Insert+N → Preferences → Settings → Speech). Reduce the rate slider if NVDA speaks too fast to follow. Most testers use 50–70%.
NVDA automatically enters Browse Mode for web pages. In Browse Mode, you can use single-letter shortcuts to navigate by heading (H), link (K), landmark (D), form field (F), and more.
By default, the NVDA modifier key is the Insert key. On laptops without Insert, CapsLock is often set as the alternative NVDA key (check Keyboard settings in NVDA preferences).
Core Testing Commands
The following commands are the most important for web testing. NVDA and JAWS share most Browse Mode commands. VoiceOver (Mac) uses VO (Control+Option) as the modifier.
| Action | NVDA / JAWS | VoiceOver (Mac) |
|---|---|---|
| Start reading from top | Ctrl+Home then Insert+↓ | VO+A |
| Stop reading | Control | Control |
| Next heading | H | VO+Cmd+H |
| Next heading level 2 | 2 | Rotor → Headings → Level 2 |
| Next link | K | VO+Cmd+L |
| Next landmark | D | Rotor → Landmarks |
| Next form field | F | VO+Cmd+J |
| Next table | T | VO+Cmd+T |
| List all headings | Insert+F7 | Rotor → Headings |
| List all links | Insert+F7 | Rotor → Links |
| Enter / exit forms mode | Enter or Insert+Space / Escape | Automatic |
| Read current element | Insert+Tab | VO+F3 |
Common Testing Workflow
Use this structured workflow for each page you test. It mirrors how screen reader users typically explore a page and surfaces the most critical issues first.
1. Page load. What does the screen reader announce when you first load the page? Is the title descriptive? Does it include both the page name and the site name?
2. Landmarks. Press D to jump through landmarks. Can you quickly identify the main navigation, main content area, and footer? Are all major sections represented?
3. Headings. Press H to jump through headings. Does the heading structure outline the page content logically? Are any headings missing, or any heading levels skipped?
4. Links. Open the links list (Insert+F7 in NVDA). Do the link names make sense out of context? Are there multiple "Read more" or "Click here" links?
5. Interactive elements. Tab through all buttons, links, and form fields. Is every element reachable? Are names meaningful? Is focus always visible?
6. Critical tasks. Complete critical tasks using only the screen reader: fill out a contact form, complete a checkout, search for content. Note any points of confusion.
7. Dynamic behavior. Trigger modals, dropdowns, loading states, and dynamic messages. Are changes announced? Does focus move correctly?
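The first check in the workflow above can be approximated with a quick automated assertion on the document title. The helper below is a hypothetical sketch: the function name and the separator characters it strips are assumptions, so adapt them to your site's title convention.

```typescript
// Sketch: check that a page title contains both a page name and the site
// name (e.g. "Contact | Acme Co"). titleIsDescriptive is a hypothetical
// helper; the separator set is an assumption.
function titleIsDescriptive(title: string, siteName: string): boolean {
  if (!title.includes(siteName)) return false; // site name present?
  const pagePart = title
    .replace(siteName, "")
    .replace(/[|\u2013\u2014-]/g, "") // strip common title separators
    .trim();
  return pagePart.length > 0; // a distinct page name remains
}
```

A title that is only the site name ("Acme Co") fails this check, which matches what a screen reader user hears: every tab announces the same thing.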
What to Test
During your screen reader session, verify these specific behaviors:
Images
- Alt text is meaningful and descriptive
- Decorative images are skipped
- Complex images (charts) have full text alternatives
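The image checks above can be partly scripted. The lint-style helper below is an illustrative sketch, not a standard tool; its heuristics (and the phrase list) are assumptions you would tune for your content.

```typescript
// Sketch: flag common alt text failures. altTextIssues is a hypothetical
// helper; the heuristics are illustrative assumptions.
type ImgInfo = { alt: string | null; decorative: boolean };

function altTextIssues(img: ImgInfo): string[] {
  const issues: string[] = [];
  if (img.decorative) {
    // Decorative images need alt="" so screen readers skip them entirely.
    if (img.alt !== "") issues.push('decorative image should have alt=""');
    return issues;
  }
  if (img.alt === null) issues.push("missing alt attribute");
  else if (img.alt.trim() === "") issues.push("informative image has empty alt");
  else if (/\b(image|photo|picture) of\b/i.test(img.alt))
    issues.push('redundant phrase like "image of" (screen readers already announce "graphic")');
  return issues;
}
```

Heuristics like these only catch mechanical problems; whether the alt text is actually meaningful still requires listening to it in context.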
Forms
- Every field is announced with its label
- Required fields are indicated
- Error messages are announced and specific
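The three form behaviors above come from programmatic association in the markup: `for`/`id` for the label, `aria-required` for required state, and `aria-describedby` pointing at the error. The renderer below is a hypothetical sketch of that pattern, not a real framework API.

```typescript
// Sketch: markup that makes a screen reader announce label, required state,
// and error message together. renderField is a hypothetical helper.
function renderField(
  id: string,
  label: string,
  opts: { required?: boolean; error?: string } = {}
): string {
  const required = opts.required ? ' required aria-required="true"' : "";
  // aria-describedby ties the error text to the field, so it is read
  // when the field receives focus; aria-invalid flags the error state.
  const invalid = opts.error
    ? ` aria-invalid="true" aria-describedby="${id}-error"`
    : "";
  const errorHtml = opts.error
    ? `<p id="${id}-error" role="alert">${opts.error}</p>`
    : "";
  return (
    `<label for="${id}">${label}</label>` +
    `<input id="${id}" type="text"${required}${invalid}>` +
    errorHtml
  );
}
```

With this markup, NVDA announces something like "Email address, edit, required, invalid entry, Enter a valid email" rather than a bare "edit".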
Tables
- Headers are announced before data cells
- Table caption is present
- Navigation between cells is logical
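Those table behaviors depend on markup like the sketch below. The caption and data are invented for illustration; the parts that matter are the `<caption>` element and the `scope` attributes tying each data cell to its headers.

```typescript
// Sketch: the table pattern a screen reader needs in order to announce
// headers before data cells. Caption and figures are invented examples.
const accessibleTable: string = `
<table>
  <caption>Quarterly sales by region</caption>
  <thead>
    <tr><th scope="col">Region</th><th scope="col">Sales</th></tr>
  </thead>
  <tbody>
    <tr><th scope="row">North</th><td>$12,000</td></tr>
    <tr><th scope="row">South</th><td>$9,500</td></tr>
  </tbody>
</table>`;
```

Navigating to the "$9,500" cell, a screen reader can then announce "South, Sales, $9,500" instead of a bare number.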
Modals
- Focus moves into modal on open
- Focus is trapped inside
- Focus returns to trigger on close
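The "focus is trapped inside" behavior reduces to simple index arithmetic over the dialog's focusable elements: Tab wraps from last back to first, Shift+Tab wraps from first back to last. The function below sketches just that arithmetic; wiring it to real keydown events and DOM `focus()` calls is omitted.

```typescript
// Sketch: focus-trap index arithmetic for a modal dialog.
// current = index of the currently focused element, count = number of
// focusable elements inside the dialog.
function nextFocusIndex(current: number, count: number, shiftKey: boolean): number {
  if (count === 0) return -1; // nothing focusable: caller should focus the dialog itself
  return shiftKey
    ? (current - 1 + count) % count // Shift+Tab: wrap backwards
    : (current + 1) % count;        // Tab: wrap forwards
}
```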
Navigation
- Main nav has an accessible name
- Current page is identified with aria-current
- Dropdowns are operable via keyboard
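The `aria-current` check above corresponds to markup like the following. The helper is a hypothetical sketch; the attribute itself is standard ARIA and is announced by NVDA as "current page".

```typescript
// Sketch: render a nav link, adding aria-current="page" only when the link
// points at the page being viewed. navLink is a hypothetical helper.
function navLink(href: string, text: string, currentPath: string): string {
  const current = href === currentPath ? ' aria-current="page"' : "";
  return `<a href="${href}"${current}>${text}</a>`;
}
```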
Dynamic Content
- Status messages are announced
- Error alerts interrupt speech correctly
- Loading indicators are communicated
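The announcement behaviors above hinge on two live-region roles: `role="status"` is polite (waits for current speech to finish, right for loading indicators and confirmations) and `role="alert"` is assertive (interrupts immediately, right for errors). The helper below is a hypothetical sketch of that choice.

```typescript
// Sketch: pick the live-region role for a dynamic message.
// role="status" implies aria-live="polite"; role="alert" implies
// aria-live="assertive". liveRegion is a hypothetical helper.
function liveRegion(kind: "status" | "alert", message: string): string {
  // Note: in a real page the container should already exist in the DOM
  // before the message text is injected, or some screen readers will
  // not announce the change.
  return `<div role="${kind}">${message}</div>`;
}
```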
Interpreting Results
When documenting issues from screen reader testing, record:
- What was announced — the exact words the screen reader said
- What should have been announced — the expected behavior
- Which screen reader + browser combination revealed the issue
- The WCAG 2.2 criterion that is failing (e.g., SC 1.1.1, SC 4.1.2)
- Severity — does this block the user from completing a task, or just cause confusion?
Tip: Screen reader behavior varies between tools. An issue in NVDA+Chrome might not appear in JAWS+Chrome. Always document which combination revealed the issue, and test fixes across multiple screen readers before closing the issue.
Automation vs Manual Testing
Both automated and manual screen reader testing have their place. Here is how to think about when to use each:
Automated Tools (axe, WAVE)
- Fast — runs in seconds
- Catches structural HTML errors
- Good for CI/CD integration
- Finds ~25–35% of issues
- Cannot evaluate meaning or context
Manual Screen Reader Testing
- Finds issues automation misses
- Evaluates real user experience
- Tests dynamic behavior
- Requires trained testers
- Slower, but necessary
Best practice: Run automated tools in your CI pipeline on every pull request. Conduct manual screen reader testing on every major feature release and at least quarterly for stable pages.
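When wiring axe into a CI pipeline, one common pattern is to fail the build only on the higher-impact violations and log the rest. The sketch below assumes axe-core's violation shape (each violation carries an `impact` level); `failOnImpact` and the serious/critical threshold are assumptions to adapt to your own policy.

```typescript
// Sketch: gate a CI run on axe-core results. Each axe violation reports an
// impact level; here only serious and critical issues fail the build.
// failOnImpact is a hypothetical helper, not part of axe-core.
type Violation = {
  id: string;
  impact: "minor" | "moderate" | "serious" | "critical";
};

function failOnImpact(violations: Violation[]): Violation[] {
  return violations.filter(
    v => v.impact === "serious" || v.impact === "critical"
  );
}
```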
Need Professional Screen Reader Testing?
Our certified testers evaluate your site with NVDA, JAWS, VoiceOver, and TalkBack across all major browser pairings. Get a comprehensive report with prioritized findings.