We have covered some of the ways in which we build for accessibility in our previous blog post. However, just building what you think is an accessible website doesn’t imply that your website is, in fact, accessible—you need to test. Below, we’ll outline some of the approaches we use to confirm that our websites are what we think they are.
In general, there are two ways to test the accessibility features of a website—automatic and manual—and both should be used if possible.
Automate what you can
We love automated testing. And it’s not about being lazy—it’s about having consistent results whenever possible. This means that we do a good chunk of our testing using automated accessibility tools. Luckily, there are quite a few of these available.
Our primary development tool is Google’s Chrome browser. One of the main reasons for that is the quality of its developer-oriented tools, which are second to none, really. The fact that Chrome has become the de facto standard among modern browsers only strengthens its position as the developer’s go-to tool.
Chrome, along with Firefox, has some very nice tools and extensions which help in building and testing accessible websites. We’ll introduce some of them below.
The first of these is Google’s Lighthouse, available by opening Developer Tools and selecting the Audits tab in the tools panel. Lighthouse is a pretty powerful tool, useful for a lot more than just accessibility testing, but in our case we select the “Accessibility” checkbox and move forward with our testing.
The results generated by Lighthouse’s Accessibility report provide a wide array of accessibility-related information. The report checks element roles and color contrast, and gives a final accessibility score for easy evaluation and comparison.
It’s then pretty easy to jump into the page’s source code and check out what exactly is wrong in order to implement a fix.
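The contrast check that tools like Lighthouse run is based on the WCAG 2.x contrast-ratio formula, which is simple enough to sketch by hand. The function names below are our own illustration, not Lighthouse’s API.

```typescript
// Sketch of the WCAG 2.x contrast-ratio calculation (function names
// are illustrative, not part of any tool's API).

// Convert an 8-bit sRGB channel (0-255) to linear light.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a "#rrggbb" color.
function luminance(hex: string): number {
  const n = parseInt(hex.slice(1), 16);
  const r = linearize((n >> 16) & 0xff);
  const g = linearize((n >> 8) & 0xff);
  const b = linearize(n & 0xff);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// WCAG contrast ratio, always >= 1; AA body text requires >= 4.5.
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}
```

Black on white comes out at 21:1, the maximum possible, while identical colors give exactly 1:1, which is why failing text usually sits somewhere just under the 4.5:1 threshold.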
Another tool we find helpful is the AXE plugin from Deque.
One of the nicer features this tool offers is actual grading of the issues. It measures their impact, ranging from critical all the way down to minor, which allows you to prioritize efforts. This tool also has a pretty convenient way of inspecting elements that require attention, as well as some links to resources that expand upon certain checks that are failing.
AXE also has a Pro version, which is a rather complex tool. It guides you through the whole process of accessibility checks, which includes both automatic and manual tests. At Cloudberry, we don’t have a need for this specific functionality, but if you are just starting your accessibility journey, you might find it useful.
WCAG Accessibility Audit plugin
Sometimes simplicity is best, and this Chrome plugin provides just that. It is quite simple, and because of that, it might not be as thorough as the two tools we mentioned above. However, we like it specifically for its simplicity.
In order to use this plugin, all you need to do is navigate to a page you’d like to check, click one button, and receive a nice report with an overlay, allowing you to go in and focus on specific areas with accessibility issues.
As a bonus, the tool also exports a text version of its report, which might be helpful as well!
As of late, Firefox has made big strides in terms of improving its developer tools. Chrome is still our tool of choice, but when it comes to accessibility specifically, Firefox has a very useful test suite under the Accessibility tab in its Developer tools panel. And since this suite uses a different testing and reporting approach than Chrome and other tools, it could serve as a good secondary (or maybe even primary) tool for accessibility testing.
Manually test the rest
Automatic test suites are extremely advantageous, helping test many clearly defined scenarios. However, not everything can be automated—some things just need actual human interaction, and that’s where our manual testing comes in.
One of the most important things to test when it comes to accessible interaction with the page is keyboard navigation. Generally, you should be able to control all interactive elements on the page—such as buttons, menus, video players, etc.—with only your keyboard.
Core HTML5 elements already have this functionality built in, so you don’t have to worry about standard buttons, form fields, and the like. However, if your page uses custom controls, such as mega menus, tabs, or accordions, that’s when human interaction is needed in order to test things to make sure they work as expected.
Here are a few things we test for manually:
- We make sure components can be focused on by using the TAB key.
- Once they’re in focus, we make sure components can be controlled with just the keyboard. For example, submenus within larger navigation menus can be operated using only the Return and Spacebar keys, the arrow keys, and the Escape key. There are different expectations for how different components should be controlled, and you can find more information on that matter at webaim.org.
- We ensure we don’t trap users inside any single component. This means that a user can easily navigate out of a component—in most cases just by continuously pressing the TAB key.
- We make sure the order in which components are being focused on makes sense by following the visual flow of the page.
- During our testing, we also confirm that elements which are not supposed to be actionable or focusable don’t receive focus. An example could be a checkbox inside an inactive tab panel. This issue is normally resolved by either hiding such elements or removing them from the default navigation flow with tabindex="-1".
- Lastly, we make sure that elements with focus are distinguishable enough to the end user (either by allowing default outlines to be displayed by the browser or overriding these with our own design).
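The keyboard expectations above boil down to a small piece of state logic: given the key pressed and the currently focused item, decide which item should receive focus next. Here is a sketch for a horizontal menu; this is our own illustration of the pattern, not code from any of the tools mentioned, and the names are hypothetical.

```typescript
// Roving-focus logic for a horizontal menu: return the index that
// should receive focus next, a close signal, or null to let the
// browser handle the key. Names and shape are illustrative only.
type MenuAction = { focus: number } | { close: true } | null;

function menuKeydown(key: string, current: number, count: number): MenuAction {
  switch (key) {
    case "ArrowRight":
      return { focus: (current + 1) % count };         // wrap to first item
    case "ArrowLeft":
      return { focus: (current - 1 + count) % count }; // wrap to last item
    case "Home":
      return { focus: 0 };
    case "End":
      return { focus: count - 1 };
    case "Escape":
      return { close: true };                          // never trap the user
    default:
      return null;                                     // default browser behavior
  }
}
```

In the real component, the handler would set tabindex="0" on the target item, tabindex="-1" on the rest, and call .focus()—the roving-tabindex pattern described in the WAI-ARIA Authoring Practices.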
Structure of the document
This is one area where Firefox’s accessibility tools are indispensable. The accessibility inspector provides a visual overview of the document’s structure—its headings, landmarks, and other elements—and helps you quickly locate areas that require attention.
Sure, it is still possible to check the document’s structure just by looking at the code, but Firefox’s developer tools make this process much easier.
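One structural check worth automating yourself is heading order: an h3 directly following an h1 with no h2 between them is one of the most common findings in these audits. A minimal sketch of that check, under our own (hypothetical) naming:

```typescript
// Report heading-level skips in a document outline. Input is the
// ordered list of heading levels (1 for h1, 2 for h2, ...) as they
// appear in the page; output is a human-readable problem list.
function headingSkips(levels: number[]): string[] {
  const problems: string[] = [];
  let previous = 0; // no heading seen yet
  for (const level of levels) {
    if (level > previous + 1) {
      problems.push(
        previous === 0
          ? `document starts at h${level} instead of h1`
          : `h${previous} is followed by h${level}, skipping h${previous + 1}`
      );
    }
    previous = level;
  }
  return problems;
}
```

Going back up the outline (h3 to h2) is fine; only downward jumps that skip a level are flagged.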
The last (but not least) manual test in our suite is testing with actual screen readers. This is a required step, as using a screen reader on a webpage can (and does) change the way the navigation flow works and how various elements are interacted with in rather significant ways.
It is also important to experience how a webpage functions for visually impaired users. Can the content be comprehended just by reading it out loud? Are actionable elements getting announced properly? Do dynamic areas work as they should?
We normally test across these three commonly used screen readers:
- JAWS (Windows)
- NVDA (Windows)
- VoiceOver (macOS/iOS)
A full treatment of how screen readers work is beyond the scope of this article, so we recommend consulting the respective documentation and reviewing learning materials for each of them.
Checklists rule the world
We have one final piece of the puzzle to cover: how do you know exactly what you need to test, and is there an audit trail you can refer to for your client or for yourself? Our answer is a checklist—a living document of the checks described above that we run through for every project, giving both us and our clients a record of what was tested.
This concludes our quick overview of our testing practices. Thanks for reading! Let’s make the web accessible, together.