Steve Mathieson is a freelance analyst, journalist and editor covering IT, government and healthcare, often in combination. He writes for publications including The Guardian and I-D Information Daily, and edits the Society of IT Management's (Socitm) magazine. @samathieson
A third of councils failed a basic accessibility test for Socitm’s Better Connected research, according to experts from the Digital Accessibility Centre who carried out the work.
The Digital Accessibility Centre (DAC) ran an automated test on 416 local authority websites in late 2016, checking for basic accessibility. This included looking for visual ‘focus’ indicators that show which element of a page is active, visible when using the tab key to navigate.
“Stage one was intended to filter out those that didn’t want to focus on accessibility,” Gavin Evans, the centre’s operations director, told a workshop session at Connected Local Government Live in Birmingham on 29 June. He explained that having such visual indications is vital for those with impaired mobility who rely on the tab key to navigate through a webpage. Referring to a colleague who uses voice activation and a keyboard, he said: “If she’s tabbing through each of the elements on screen, if she can’t see where she is, she doesn’t know what she’s selecting,” adding: “For that user-group, it’s a show-stopper.”
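A common cause of the problem Evans describes is a stylesheet that deliberately switches off the browser's default focus outline. Below is a minimal sketch, in Python, of how an automated filter of the kind described above might flag that pattern; it is a rough heuristic for illustration only, not the DAC's actual tooling, and the function name is hypothetical.

```python
import re

def flags_suppressed_focus(css: str) -> bool:
    """Rough heuristic (not the DAC's actual test): flag stylesheets that
    switch off the visible focus outline, e.g. ':focus { outline: none }',
    without checking whether a replacement indicator is provided."""
    pattern = re.compile(r":focus[^{]*\{[^}]*outline\s*:\s*(?:none|0)\b",
                         re.IGNORECASE)
    return bool(pattern.search(css))

print(flags_suppressed_focus("a:focus { outline: none; }"))                # True
print(flags_suppressed_focus("a:focus { outline: 2px solid #005fcc; }"))  # False
```

A site that removes the outline but supplies its own focus styling would be a false positive here, which is one reason the centre's later stages rely on human testers rather than automation alone.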
Councils that failed the stage one test had the opportunity to make changes, and 275 of the 416 eventually passed. For the 195 councils that passed the first test and were members of Socitm Insight, the centre undertook a more comprehensive set of tests, reviewing five top-level pages and attempting to carry out a number of tasks; 134 passed this stage two test. For each task in stage two, the centre applied four ‘show-stoppers’ that led to a zero score. These included automatically-starting audio and flashing content, the latter of which can trigger fits in people with epilepsy. Evans said he hadn’t seen any council using either, but councils that carry advertising on their sites could fall foul of this.
The centre also awarded a zero score where it found keyboard traps, which blocked navigation through a page with the keyboard, or ‘captcha’ anti-spam tests without an accessible alternative. Audio captchas can be very hard to use, as the sound quality is deliberately poor, so the centre recommends alternatives such as honeypots – hidden form fields that spam-generating software will complete but humans will not – or two-factor verification using email or text messaging.
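The honeypot technique mentioned above needs only a trivial server-side check. The sketch below assumes a hypothetical decoy field named "website", hidden from sighted users with CSS and from assistive technology with `aria-hidden`; the field name and function are illustrative, not from the DAC's materials.

```python
def is_probably_spam(form_data: dict) -> bool:
    """Honeypot check: 'website' is a hypothetical decoy field that no
    real user ever sees, so any submission that fills it in is almost
    certainly automated."""
    return bool(form_data.get("website", "").strip())

# A human never sees the decoy field; a naive bot fills in every field it finds.
print(is_probably_spam({"name": "Pat Smith", "website": ""}))        # False
print(is_probably_spam({"name": "Bot", "website": "spam.example"}))  # True
```

Unlike a captcha, this approach imposes no extra task on legitimate visitors, which is why it sidesteps the accessibility problem entirely.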
The DAC gave councils a minimal score of one out of three for each task affected by a further list of issues, including movement on pages or the lack of a dedicated mobile design; such a design provides a stripped-down version of pages that visually-impaired users can magnify more easily. The same score was awarded if a process could not be completed through online forms and instead relied on non-HTML documents such as PDFs, which are very difficult to interact with on mobile devices. “It’s key to take into consideration, can we complete something online, all the way through from start to finish,” Evans said.
The centre’s tests rewarded the use of unique and informative page titles. “The page title is the very first thing the screen-reader user hears,” said Evans. “It’s so key for them to know that when they click on the button, they’re being taken to the correct page.” If a series of pages supporting a process all share the same title, the user may not realise they are making progress. Other factors rewarded included a good heading structure, such as having just one heading tagged ‘H1’; a functioning, visible ‘skip links’ option; and appropriate text alternatives for images, all factors that help those using screen-reader software. Sufficient colour contrast, which helps visually-impaired users, was also recognised.
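Both points above, unique page titles and a single H1 per page, lend themselves to simple automated checks. The following sketch, using only Python's standard-library HTML parser, shows one way such a check might work; it is a minimal illustration, not the centre's methodology, and the warning wording is invented.

```python
from html.parser import HTMLParser

class OutlineChecker(HTMLParser):
    """Collect the <title> text and count the <h1> elements on one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_pages(pages):
    """Warn about duplicate titles or a heading structure with more or
    fewer than one H1, across the pages of a multi-step process."""
    warnings, seen = [], set()
    for html in pages:
        checker = OutlineChecker()
        checker.feed(html)
        if checker.title in seen:
            warnings.append(f"duplicate title: {checker.title!r}")
        seen.add(checker.title)
        if checker.h1_count != 1:
            warnings.append(f"{checker.title!r} has {checker.h1_count} h1 elements")
    return warnings

pages = [
    "<title>Apply: step 1</title><h1>Your details</h1>",
    "<title>Apply: step 1</title><h1>Payment</h1><h1>Summary</h1>",
]
print(check_pages(pages))  # flags the repeated title and the double H1
```

The duplicate-title case is exactly the scenario Evans describes: a screen-reader user moving through the process hears the same announcement on every page and cannot tell they are making progress.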
The DAC, a not-for-profit social enterprise, carries out research and testing for a number of public and private-sector organisations. It also provides an accessible tool for collecting feedback from users called AccessIn, used by Nationwide Building Society and the Money Advice Service.
Mike Taylor, a senior accessibility analyst for the centre, demonstrated how screen-reading software works with the Tesco and Manchester United websites. The centre has worked with the supermarket chain, and Taylor showed how it is possible to fill in its online form to register for an account, something which requires carefully-designed error-handling messages that make sense when read aloud.
On Manchester United’s site, Taylor attempted to find accessibility information on booking tickets. At the time, the site’s home page was occupied by a splash screen that had to be clicked on to enter the site. However, the image used had no alternative text attribute, so the screen-reader software could only recite a meaningless file name, making it nearly impossible to enter the site without guessing what the image was.
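The missing-alt-text problem Taylor demonstrated is also straightforward to detect automatically. This sketch, again with Python's standard-library parser, records the source of every image that has no alt attribute at all (an empty `alt=""` is valid for decorative images, so only the absent attribute is flagged); the file names are made up for illustration.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Record the src of every <img> with no alt attribute, which leaves
    screen readers with only the file name to announce."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:  # alt="" (decorative) would be acceptable
                self.missing.append(attrs.get("src", "?"))

checker = AltTextChecker()
checker.feed('<img src="splash_0172.jpg"><img src="crest.png" alt="Club crest">')
print(checker.missing)  # ['splash_0172.jpg']
```

An image like the splash screen Taylor encountered, carrying the only route into the site, would be caught by exactly this kind of check.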