There are widely available tools, like Google Lighthouse, which provide audits for web content accessibility.
None of the tools I am aware of today provides meaningful results: they are easily gamed into a perfect 100/100 score, which instils a false sense of confidence that your website is accessible when it is not.
These tools work by piping your HTML through a WCAG 2.0-based checklist. Claiming that a website is accessible because it passes WCAG 2.0 is about as meaningful as claiming a site works for all users and their browsers because it passes an HTML5 validator. It's a good prerequisite, but far from the full story.
Accessibility testing, being a branch of usability testing, is not easily automated. Here's one way to do it:
- Do user research with users with access needs;
- Fix the issues found;
- Do user research again to confirm the fixes work;
- Write automated tests to prevent regressions from occurring.
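The last step can be sketched in code. Here is a minimal, framework-agnostic example, assuming an illustrative "Save" button surfaced during user research; the simplified accessible-name lookup is mine and is nowhere near a real implementation of the accessible name computation:

```javascript
// Grossly simplified accessible-name lookup. The real computation follows
// the W3C accname algorithm and is far more involved; this is only a sketch.
function accessibleName(el) {
  return (el.getAttribute("aria-label") || el.textContent || "").trim();
}

// Product-specific regression test: the "Save" button found in user research
// must keep its accessible name, or voice-control users can no longer say
// "click Save". (The button name is a hypothetical example.)
function testSaveButtonHasName(button) {
  const name = accessibleName(button);
  if (name !== "Save") {
    throw new Error(`Expected accessible name "Save", got "${name}"`);
  }
}

// Minimal stand-in element so the sketch runs outside a browser;
// a real test would query the actual DOM (e.g. via jsdom).
const fakeButton = {
  textContent: "Save",
  getAttribute: () => null,
};
testSaveButtonHasName(fakeButton);
```

The point is that the assertion encodes a barrier you actually observed with real users, rather than a generic checklist item.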
Writing automated tests, as opposed to running accessibility audit tools, focuses on the barriers specific to your product.
No tool exists today that lets you write automated tests detecting accessibility problems in the way they actually impact users.
To give concrete technical examples: there is no automated way to check what is pronounced (if anything) as a result of your aria-live region update on IE11 with JAWS 14.
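What an automated test *can* verify is that the DOM mutation happened, not what any screen reader will say. A sketch, with the region and message names being illustrative:

```javascript
// Sketch: a typical aria-live update. A test can assert the text landed in
// the live region; it cannot assert what IE11 + JAWS 14 will pronounce.
function announce(liveRegion, message) {
  // Clearing first is a common workaround so that repeating the same
  // message is still (sometimes) re-announced by screen readers.
  liveRegion.textContent = "";
  liveRegion.textContent = message;
}

// In a page this would pair with markup like:
//   <div aria-live="polite" id="status"></div>
// and a call such as:
//   announce(document.getElementById("status"), "3 results found");
```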
There is no automated way to check whether your onclick handler, which triggers when tapping on iOS Safari, will still work when VoiceOver is on.
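For context: with VoiceOver enabled, a double-tap dispatches a synthetic click event, while touch-event-based "fast tap" handlers may never fire. A test can assert which listener you attached; it cannot prove assistive technology will trigger it. A sketch, with illustrative names:

```javascript
// Sketch: choosing an activation path that VoiceOver's double-tap can reach.
// "click" is fired by mouse, by keyboard on native buttons, and by
// VoiceOver's double-tap, so it is the safest single activation path.
function attachActivation(button, onActivate) {
  button.addEventListener("click", onActivate);
  // A touchstart-only handler would be invisible to VoiceOver users:
  // button.addEventListener("touchstart", onActivate); // avoid as the sole path
}
```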
There is no automated way to check if it is possible to click on your dynamically inserted button by telling Dragon NaturallySpeaking to "click button."
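There are DOM-level proxies you can check, though. Dragon matches spoken commands against visible labels and roles, so a dynamically inserted control is far more likely to respond to "click Save" if it is a real button element with a visible text label, and that much is testable. A sketch, assuming a hypothetical "Save" button and an injected document for testability:

```javascript
// Sketch: insert a button Dragon can plausibly target by name. A real
// <button> with visible text beats a styled <div> with only an icon.
// Whether Dragon actually clicks it still cannot be verified in CI.
function insertSaveButton(container, doc, onSave) {
  const btn = doc.createElement("button");
  btn.type = "button";
  btn.textContent = "Save"; // the visible label doubles as the speakable name
  btn.addEventListener("click", onSave);
  container.appendChild(btn);
  return btn;
}
```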
And until these things are possible, accessibility testing tools are lying to you.