User testing vs automated testing

Written by Zara Gemmell

Here at HeX, we are huge advocates of accessibility and pledge to make sure every website that we build is accessible for people with disabilities. This includes those with visual, hearing and mobility impairments as well as those with learning difficulties.

When it comes to accessibility, testing is crucial to making a website as usable as possible for everyone. Whichever methods you use, it's better to plan testing into a project early and carry it out throughout, rather than leaving it until the end and spending time fixing complex issues. Making sure things work is a widespread priority: when Stack Overflow, one of the largest developer forums in the world, surveyed 72,355 of its users, 60.2% said they review code every day, and a further 19.1% said they review code multiple times a week.

These statistics highlight the importance of quality-checking code, but what are the key differences between the two approaches? And who wins the argument: user testing or automated testing?

User testing

We use Shaw Trust Accessibility Service's disabled website testers to go through the website at each stage of the build, from the designs through to the final site, to ensure that everyone can access it using screen readers, keyboard tabbing and voice commands, to name a few. User testing is the less utilised of the two: in a PractiTest survey of over 1,600 self-described 'web testers', only 30% said they used coordinated user testing, and only slightly more (35%) said they used user simulations.

The biggest advantage of this method is the human element. A piece of software is great at spotting technical errors, but a person with a disability is best placed to tell you where a problem lies in the user interface and design. The downside, of course, is the time it takes: more than 47% of those 1,600 'web testers' said they found the timeframes of this method 'extremely challenging', and a further 31% labelled them 'challenging'.

Specific user-journey testing is also possible with this method: a tester can run through a journey, such as finding and buying a product on the website, or navigating to the contact page and filling out a form. This allows the interface, the experience and the technicalities to be tested together. Automated testing only runs explicit, pre-defined checks, which is why the human perspective matters – people often pick up on things that automated tests can't.
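To make that concrete, here's a minimal sketch of what a scripted version of such a journey can look like, written with Playwright (our choice purely for illustration; the URL, link names and form labels below are hypothetical, not taken from any real site):

```typescript
import { test, expect } from '@playwright/test';

// Hypothetical journey: reach the contact page and submit the form.
// The URL and field labels are placeholders for illustration only.
test('visitor can reach the contact page and send a message', async ({ page }) => {
  await page.goto('https://example.com/');

  // Navigate the way a user would: via the visible "Contact" link.
  await page.getByRole('link', { name: 'Contact' }).click();

  // Fill out the form using its accessible labels, much as a
  // screen-reader user would encounter them.
  await page.getByLabel('Name').fill('Test User');
  await page.getByLabel('Email').fill('test@example.com');
  await page.getByLabel('Message').fill('Hello!');
  await page.getByRole('button', { name: 'Send' }).click();

  // Confirm the journey completed.
  await expect(page.getByText('Thank you')).toBeVisible();
});
```

A script like this proves the journey works mechanically; only a human tester can tell you whether it felt clear and comfortable along the way.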

Automated testing

Automated testing uses an online or downloadable piece of software that scans the website's code and determines how technically sound the site is against the standards set out in the W3C's Web Content Accessibility Guidelines (WCAG). It is the most popular testing method: nearly 60% of 'web testers' use it to find faults. Pre-scripted tests compare the actual results against the expected results for an accessible website, fishing out bugs buried in the code, such as blank headings or empty links.
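As an illustration of this kind of pre-scripted check (the article doesn't name a specific scanner, so this is just one common option), the open-source axe-core engine can be run through Playwright to test a page against WCAG rules and report any violations it detects:

```typescript
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

// Scan a page against WCAG 2.0/2.1 A and AA rules and fail on any
// violation found. The URL is a placeholder.
test('home page has no detectable WCAG violations', async ({ page }) => {
  await page.goto('https://example.com/');

  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  // Each violation lists the rule broken (e.g. 'empty-heading',
  // 'link-name') and the offending elements, so developers can
  // locate the bug in the code.
  expect(results.violations).toEqual([]);
});
```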

Automated testing is great for checking that specific website functions are accessible to the user, and for making sure the site has been quality assured and that bugs picked up in earlier testing have been fixed. When user testers work with a website over an extended period, there is a chance they memorise certain user journeys and how to navigate them, which means issues can easily be missed. Automated testing removes that element of human error and alerts developers to faults that might otherwise slip through.

So, which is the most important?

So, in the argument of user testing vs automated testing, which one is best? One frees up time to carry out other tasks, whilst the other provides real-life results. But truthfully, neither method is better than the other: used in tandem, they are a recipe for a better, more usable website. When eMarketer quizzed marketers and webmasters, 86% considered testing a good practice that benefits the websites they run.

Despite the broad coverage of automated testing, user testing should never be underestimated as a way to assess the usability of the customer experience in relation to accessibility. Automated programmes are good at highlighting issues such as missing alt text on images or links with no description, but they can't reliably tell you whether a website's colour contrast actually works for a low-vision user.
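To show what a tool can and can't do here: the WCAG contrast ratio is a simple formula that software can evaluate exactly (a sketch below in TypeScript, using the relative-luminance definition from WCAG 2.x), yet the resulting number alone can't tell you how readable a page actually feels to a particular low-vision user:

```typescript
// WCAG 2.x relative luminance of an sRGB colour (channels 0-255).
function relativeLuminance(r: number, g: number, b: number): number {
  const linear = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linear(r) + 0.7152 * linear(g) + 0.0722 * linear(b);
}

// Contrast ratio between two colours: (L1 + 0.05) / (L2 + 0.05),
// where L1 is the lighter luminance. WCAG AA asks for at least
// 4.5:1 for normal text and 3:1 for large text.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number],
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Grey #767676 on white scores about 4.54:1, just passing AA; whether
// it is comfortable to read is a judgement only real users can make.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));
```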

What does HeX do?

At HeX, we use three different testing methods to be confident of a good result:

  • An automated test – to check for errors on the site.
  • A technical manual review – one of our expert team members goes through the site, checking the code for styling elements that an automated test would miss.
  • Shaw Trust disabled user testers – who fully run through the website to ensure that it is accessible and usable.

If you’d like to find out more about our accessible websites and user testing, get in touch with us: https://www.horlix.com/contact-us/