ARTICLE – 10 OCTOBER 2022

Why automated accessibility testing is not enough

There are many great automated accessibility testing solutions out there. So we don’t need to do manual accessibility testing any more, right? Not so fast! There are many issues which manual tests are better at finding.

Automated tests are great at picking up:

  • Colour contrast issues
  • Form inputs which are not linked to labels
  • Images which are missing alt text
  • Many more…

Here are some things automated accessibility checkers aren’t good at picking up.

Interactive elements which aren’t keyboard accessible

There are many reasons a person might use only a keyboard to access the web - for example, motor issues that make it difficult or impossible to use a mouse. Screen reader users also often use the keyboard.

When building a web page, there are multiple ways to achieve the same result. A button might be represented in HTML as:

  • A button tag - typically keyboard accessible by default; or
  • A div tag with additional styling and JavaScript to make it behave like a button. This requires additional attributes to make it keyboard accessible, and these attributes are often forgotten.

Both might look the same visually. If the additional attributes are missing from the div ‘button’, an automated accessibility checker has no way to tell it is supposed to behave like a button. A person navigating the page via the keyboard will quickly discover if the ‘button’ isn’t keyboard accessible.
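As a minimal sketch (the save() handler is hypothetical), here is roughly what the div version needs before it behaves like the native button for keyboard users:

```html
<!-- Keyboard accessible by default -->
<button onclick="save()">Save</button>

<!-- A div styled to look like a button. Without role, tabindex and the
     key handler below, keyboard users cannot reach or activate it.
     save() is a placeholder for your own click handler. -->
<div class="button" role="button" tabindex="0" onclick="save()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') { event.preventDefault(); save(); }">
  Save
</div>
```

Tabbing through the page and pressing Enter or Space reveals immediately whether a control like this actually works from the keyboard.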

Keyboard focus highlight and tab order

As mentioned above, it’s important for websites to be keyboard accessible. When tabbing through the page, there needs to be some kind of visual indication of where you’re up to - a focus highlight. It’s usually a visible outline around the currently focused element. The easiest way to test this is by manually tabbing through the page.
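A common cause of a missing highlight is CSS that removes the browser’s default outline without replacing it. A minimal sketch of a clearly visible focus style (the colour is an assumption - check it has good contrast against your own background):

```html
<style>
  /* :focus-visible applies when the browser would show focus,
     typically during keyboard navigation, so mouse users are unaffected. */
  :focus-visible {
    outline: 3px solid #005fcc; /* example colour only */
    outline-offset: 2px;
  }
</style>
```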

It’s also important that the tab order is sensible. It should match the structure of the page. The best order could be subjective and based on context, which makes assessing it a better task for a human than a machine.

Information conveyed only by colour

Information should be conveyed by other means in addition to colour. This supports people who are colour blind or have vision issues which affect seeing or discerning colours. This is why links are often underlined as well as being a different colour from the surrounding text.

One way to test this is to use a browser extension to review the site in greyscale to see if any context is lost without the colours. A human will be able to discern issues where an automated tool can’t, for example a form where mandatory fields are only indicated by different coloured text.
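A sketch of that mandatory-field case (the field names are illustrative):

```html
<!-- Mandatory field indicated only by red label text -
     the requirement disappears for anyone who can't perceive the colour -->
<label for="email" style="color: red;">Email</label>
<input id="email" type="email">

<!-- Better: state the requirement in text and in the markup,
     in addition to the colour -->
<label for="email2" style="color: red;">Email (required)</label>
<input id="email2" type="email" required>
```

An automated checker sees valid, labelled inputs in both cases; only a human notices that the first conveys “required” by colour alone.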

Poor quality alt text

Alt text on an image is read by screen readers so that people who are blind or visually impaired don’t miss out on important information or context from the image. It is also displayed if the image fails to load.

Automated checkers are very good at picking up when an image is missing alt text. However, they cannot tell you whether or not the alt text is meaningful, accurate, or even necessary. Consider manually inspecting the alt text for some of your images, particularly if the image contains critical information.
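A sketch of the difference (the chart description is invented for illustration):

```html
<!-- Present but unhelpful: an automated checker passes this -->
<img src="chart.png" alt="image">

<!-- Meaningful: describes the information the image conveys -->
<img src="chart.png" alt="Bar chart showing site visits doubling between January and June">

<!-- Purely decorative images can use an empty alt attribute
     so screen readers skip them entirely -->
<img src="divider.png" alt="">
```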


Automated accessibility checkers are fantastic for picking up common accessibility issues. However, manual testing can uncover issues that automated tests don’t. The issues listed here are some examples - there could be many more. While manual tests (like tabbing through the page) can be quick and easy to do, testing by a knowledgeable professional is always best. It is important to go beyond automated tests to ensure a good experience for all web users.

Get in touch with Annex to see how we can help you create beautiful, accessible websites.