Why we test new content
There’s nothing more motivating for us as content designers (and probably for any Citizens Advice staff member or volunteer) than hearing the stories of hardship, frustration and battles against the system that some of our clients have.
Last Tuesday we heard from 5 people living near Salford, who gave feedback on some of our new content on Universal Credit. Listening to people recount their experiences and talk about what information would be useful to them gives us a fresh boost to make our digital advice content the best it can be: empowering claimants to get the best outcome for their circumstances. We’ve written previously about our approach to content design, but for us empowerment means giving our users the information, tools and arguments they need to make the system work for them.
Testing with real people is an essential part of the content design process. We have to know whether the content is meeting the needs of the people it’s designed for. And if it doesn’t work, we tweak it – again and again, until we get it right (iteration is a key part of Agile). In spite of lots of research, we still make assumptions when we write, so we have to test those assumptions. It’s a valuable learning tool. It’s also the most exciting part of the content production process because you get to hear from real people with real stories.
What we tested
We tested the content with 5 benefit claimants, who between them were on JSA, tax credits or Universal Credit. One even thought he was on JSA, but turned out to be on Universal Credit. One learning point: people don’t always know what benefits they’re on!
After recounting their personal stories, they were guided through various tasks and questions before looking at our online content and giving feedback. We tested:
- their understanding of the content – did it make sense? Was it useful to them? Would they read it? How did they read through the page?
- the language and terminology – was it clear? Was jargon kept to a minimum and explained where necessary?
- navigation and journeys through the content – did the structuring of the content make sense? Did they know how to get from one page to another?
- design of the pages – warning boxes, links, headings – did these help them to read and understand the content or were they a hindrance?
- tools and printable checklists – were these useful?
We’re still analysing the testing results in detail to see exactly what amendments we need to make to our content, but some of our initial learning has covered the following:
- Gaps in our content. For example, some claimants wanted to see content containing background information on what Universal Credit is and how it works. So we need to think about how we link our users to this information, or provide it ourselves if necessary.
- We can’t assume the audience has knowledge of the quirks of the benefit system. For example, several claimants did not always understand the definition of ‘savings’ or ‘childcare costs’. We need to define these terms so claimants don’t make mistakes when completing their claim form or putting numbers into a calculator.
- The navigation down the right-hand side of the page was mostly ignored – we need to revisit whether we include this or not.
- Claimants felt that Jobcentre Plus (JCP) staff were not familiar with Universal Credit. They thought that decisions made by JCP were sometimes discretionary or contradictory. They felt there was a lack of consistency between how different applicants were treated or what they were asked to do during the application process. So we need to make sure we’re fully explaining the process and the rules so that claimants have the knowledge to question and challenge potential mistakes made by the Jobcentre.
- Claimants wanted to use a calculator to see how much Universal Credit they could get, but they didn’t like being given a figure that was a ‘maximum’ amount. If they actually ended up getting much less than the maximum amount we gave them, they felt it would result in disappointment and would reduce their trust in our content. So we either need to provide a more accurate figure, or give people a range, eg ‘you could get between £350 and £988’.