User testing for content: when users skim read and when they don't

For at least two decades, I have been conducting think-aloud interviews with users to test their understanding of content.  

I have tested - and continue to test - a diverse range of topics: cycling safety, exchange-traded funds, denied insurance claims, superannuation fees and charges, and many others. I have watched how people read letters, landing pages, statements and brochures.

As part of this experience, I have learned how people actually read.

It is a myth that all users skim read information

Content creators are often advised to write for skim readers who navigate via headings.

My experience is that this assumption is false. In the user tests I have conducted, I have seen many times that while some users do skim read, other users read carefully, and some do not read at all.

Whether a person read, skimmed or avoided the content depended on:

  • How much they thought they knew about the topic already ('thought they knew' is a crucial phrase here)

  • How it was written

  • What it was about, and

  • How much there was to read.

Users who thought they knew the content

When the page or document I was testing was short, users who thought they knew the content already typically skim-read it using the headings as navigation. If it was long, many did not read it at all. Long Welcome packs were a good example - they were not only long but also seemed to contain no new information, so some users did not even open them.

Some users in one test avoided the section headed 'Important Information' because they assumed it was generic information that they had seen before.

Users who believed that they did not need to act on the content

In many of our tests, we found that users were more likely to skim read when they believed that they did not need to act on the content. For example, more people read more of a letter about an ongoing claim than read a letter about a price rise that had already happened.

Users who had not seen this kind of content before but thought it was important

Users in our tests who thought new information was important to them personally tried to read the content carefully word by word. They usually succeeded at this only if the content was written for their level of knowledge. For example, new investors who needed information about investment risks stopped reading when the content was written in a language that only experienced investors would understand.

What does this mean?

When testing, I make sure that the sample contains users with different levels of pre-existing knowledge. I watch where users start reading and where they stop, and in the interview afterwards I ask why. I have therefore been able to show clients how to rewrite their content to suit readers with different levels of knowledge and motivation. To find out more, contact me for a quote or a confidential discussion about your research needs.