One of the best articles I read at university was about the Roman mob.
Traditionally the Roman mob was seen as a mindless, destructive horde who, at the drop of a hat (helmet?), would rampage through Rome burning the place down.
The solution for keeping the mob quiet was ‘bread and circuses’: keep ’em fed and entertained – hence the ‘populist’ emperors.
However, one historian (apologies for not remembering the name) decided to find out who was actually in the Roman mob.
He came to the conclusion that the mob was not just a load of layabouts who liked to riot, but discrete groups of individuals with particular concerns.
They might be veteran soldiers without a pension, peasants who had just come to Rome and not yet found a job, or some of the poorer craftsmen suddenly affected by a rise in grain prices.
In fact the mob was not a mob at all; it was a collection of lots of smaller groups who, on a particular day, found that all their different grievances brought them together to take ‘direct action’ – that’s the mob-and-rioting stuff.
I was thinking about the Roman mob and website visitors.
How often do we say ‘ooh, we have a lot of visitors’ and just look at the numbers?
But have we:
- identified particular groups interested in specific content?
- worked out how many people are in these groups?
- counted how many of them came to our site?
- found out which sites they come from?
- discovered what they expected to find?
- checked whether they found it?
- asked whether they left any feedback?
- asked whether they want different content?
- asked whether they were satisfied?
What I am trying to say is that we don’t just have ‘website visitors’ or ‘traffic’. We have individuals, we have niche groups, who expect or are trying to find something specific.
I am not saying that they will go on a riot if they don’t get what they want, but they might not come back to your site, which could be worse.
Steph Gray @lesteph recently highlighted an article by Gerry McGovern on ‘The accidental website visitor’. Gerry talks about how a website significantly improved its satisfaction levels once it was attracting the right, but smaller, audience.
This made me think about who is the ‘right kind’ of website visitor? Are they one of us? Do they want to be? Should they wear ‘smart casual’? That’s a joke by the way.
So how would we define a ‘visitor of quality’?
Do we need to start by defining who our key audiences are? Perhaps:
- Think tanks
- Finance Directors
- Project managers etc?
Should these be ranked in an order of priority?
This sounds like stakeholder mapping to me unless I am mistaken?
How do we know if we are getting enough of these visitors?
Are they in the ‘right’ proportions to each other?
How do we change the proportions if we don’t like them?
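One rough way to answer the proportions question is to compare each audience segment’s share of visits against a target share. A minimal sketch in Python – the segment names, visit counts and target shares are all invented for illustration:

```python
# Hypothetical visit counts per audience segment
visits = {"think tanks": 1_200, "finance directors": 300, "project managers": 2_500}

# Hypothetical target shares, of the kind stakeholder mapping might produce
targets = {"think tanks": 0.30, "finance directors": 0.20, "project managers": 0.50}

total = sum(visits.values())
for segment, count in visits.items():
    actual = count / total
    gap = actual - targets[segment]
    # A positive gap means we are over-serving this segment relative to target
    print(f"{segment}: {actual:.0%} of visits (target {targets[segment]:.0%}, gap {gap:+.0%})")
```

Even a crude comparison like this turns ‘do we like the proportions?’ into something you can check each month.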
Should we be deterring visitors we do not want, er – photocopy salesmen?
What will be the overall impact on visitor numbers – will they go down?
If they go down will ‘management’ be upset as there will not be ‘big numbers’ to talk about?
What if you are a government site and one of your key metrics is costs of running the site divided by the number of visitors?
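To make that tension concrete, here is a quick sketch (with made-up figures) of how a cost-per-visitor metric punishes a site for deliberately shedding accidental traffic:

```python
# Illustrative figures only: running cost stays fixed while the site
# deliberately sheds 'accidental' visitors in favour of a targeted audience.
running_cost = 100_000      # annual running cost in pounds (hypothetical)

visitors_before = 500_000   # total visits, mostly accidental
visitors_after = 200_000    # smaller but better-targeted audience

cost_before = running_cost / visitors_before
cost_after = running_cost / visitors_after

# The headline metric gets 'worse' even though satisfaction improves.
print(f"Cost per visitor before: £{cost_before:.2f}")  # £0.20
print(f"Cost per visitor after:  £{cost_after:.2f}")   # £0.50
```

On this metric the site looks two and a half times less efficient, despite serving its real audience better – which is exactly why the metric needs a companion measure of visitor quality.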
So what did we find from our user testing?
Strangely enough, a lot of the findings were the opposite of what we expected, or were simply surprises:
- the NAO users we tested loved PDFs – they like knowing that they are looking at the final version of the information
- they also liked being able to search within PDFs using Ctrl+F
- there was very little interest in RSS feeds – in fact one said they had been actively switching them off
- the site search was thought to be quite effective
- very little interest in online forums or videos of discussions
- a modest interest in Twitter
- there was much praise for the House of Commons library as being a key service for Parliamentary researchers
Less unexpectedly, they wanted or favoured:
- shorter documents
- our content showing up higher in Google results, as they often search there before coming to us
- a site search that drills down across content more effectively
- links across the Parliamentary accountability cycle
- (for some) knowing more about what the NAO does
Our overall conclusion: the focus groups were invaluable.
We have avoided acting on some of our own assumptions and are much clearer about what these groups of users want and need.
The caveat being, as the facilitator said, that there is a difference between what users say and what they do.
So we still need to test some of the findings in a more direct manner before making large changes to the content, but at least we now know what to test.
We ran some user testing recently, and what an eye-opener it was.
So how did we go about this?
Well by trying to find users who were interested in our content.
I have already described how I helped to identify the persona of performance measurement specialists. The usability supplier then dug up the relevant users from their database and also the other group of Parliamentary researchers.
Most of the hard work went into the preparation – trying to be objective and thinking about parts of the site, or types of content we suspected did not really work, or could work better.
Equally, we tried to highlight some unique content formats that many users might not have found before on our site, to check whether they had any relevance for a wider range of users.
As well as looking at what we had, we tried to find potentially relevant examples from other sites, and ideas for formats used by other, similar kinds of organisation, to test their viability.
We were able to base a lot of our ideas on the previous research done to help form the basis of our digital communications strategy.
Since we wanted to get a bit more buy-in from colleagues in the organisation we also asked the owners of the relevant content what they would like to focus on.
Once the users were in the room and the observers were installed (including non-comms colleagues) we got into the meat of the event.
The most amusing bit is when the facilitator tells the users: ‘say what you think; the site owners will not be offended’. Behind the two-way mirror we all mouthed to each other ‘yes we will’ – though of course we were not, as all the comments were super useful.
After that the facilitator did a great job of teasing out information from the users, getting consensus where it existed or highlighting differences where they were evident. This was all based around following the script we had agreed in advance.
So what did we find? Read the next post – to follow…
How about doing something different?
Here are my very personal starters for ten:
- Opt out of the Official Publishing contract for House Papers e.g. for publishing VFM reports.
- Negotiate direct with the House of Commons and find out what they really want. After all the NAO is part of the House so why not talk direct?
- We could also talk direct with the various organisations who have subscriptions via their libraries and find out what they really want.
- Print on demand obviously.
- Become our own publishing house with an open publishing platform like the Guardian’s, not using The Stationery Office or anything similar.
- Use Digital Object Identifiers.
- Have our own ISBN.
- Make all content available via an API.
- Make all tables and charts easy for users to export into Word documents or slides.
- Build tracking into all digital content, so that if a chart is exported we can track where it goes. (Not sure this technology exists yet.)