Do you really want user feedback? Part 2

I wrote last week about the more automated kinds of feedback, so here are some thoughts and ideas on direct user feedback.

There are a number of approaches that we have followed so far:

Usability testing

In March we ran two days of user testing with people who use the NAO website. Many people see this as the gold standard for generating feedback. It is very direct: you can sit in a room and watch your users struggle to complete the tasks you set them. Almost everyone who has run user testing says the same thing: 'I did not expect that', 'I did not realise that our wording was misleading', and so on. One thing is sure – you never forget what users said if you hear them say it while you are watching.

Accessibility testing

We also ran some accessibility testing in March. As far as possible we set exactly the same tasks. As the site is relatively new we expected to get some comments, and indeed we did. Again, it is an eye-opener when you realise how small details can really damage a user's experience. Try to get someone to demonstrate how they use a screen reader on your website, or on one you are familiar with. It is often a horrifying experience.

Follow up

Of course you will need to take the necessary remedial action.

Feedback forms

Another approach that is proving really effective is feedback forms. We introduced these in November last year, and the trick is to keep them simple. So we ask the questions below and get users to say what type of user they are – government, private sector and so on:

[Screenshot of the feedback form]

We have also added a simple form to the publications filter to check that users find what they expect:

[Screenshot of the publications filter feedback form]

We want to add a similar form to the search results page.

What else?

We are also building a slightly bigger user survey that we will trigger to run, say, on the first day of every month. This will ask about user type, what sort of content they came to find (a guide, report, data, a survey and so on), a quick measure of satisfaction, and any other comments.
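The monthly trigger could be as simple as a date check. Here is a minimal sketch in Python; the once-per-month rule and the `last_shown` record are my assumptions for illustration, not a description of how any particular survey tool works:

```python
# Sketch: decide whether to show the monthly survey to a visitor.
# Assumption: the survey runs on the 1st of each month, and we keep a
# (hypothetical) record of when this visitor last saw it.
import datetime


def should_show_survey(today, last_shown=None):
    """Show the survey on the 1st of the month, at most once per month."""
    if today.day != 1:
        return False
    if last_shown is None:
        return True
    # Skip visitors who already saw the survey this month
    return (today.year, today.month) != (last_shown.year, last_shown.month)
```

In practice the "last shown" state would live in a cookie or similar, so a returning visitor is not asked twice in one month.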

Once the form is live it should give us a monthly profile of who our users are, what they are looking for and how satisfied they are. I look forward to seeing the results.

So there are some ideas. Have you done the same, or do you have other tips?

 

Do you really want user feedback? Part 1

One of the current mantras for government online services is to say that they are customer focussed.

This is an excellent aspiration and it is becoming a standard protocol to build services around iterative user testing.

However, once a service is up and running, how do we ensure that we are still meeting our users' needs?

This is not a definitive list but a description of some of the things that we have done with our website which might be worth considering.

Let’s break this down into indirect (this post) and direct user feedback (the next post).

You will probably be collecting some of this information already. For example, if you use an analytics package such as Google Analytics you should be monitoring the number of 'page not found' events.

Typically this happens when some content has been moved or deleted, and a person who bookmarked it follows the old link and arrives at a dead end. So every month I (and I do mean myself) go through these records and try to work out what the before and after for each of these links should be. I send these across to our hosting provider to be uploaded onto our server. If things go to plan – and so far they have – the number of these 'page not found' events continually decreases.
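That monthly before-and-after list can be turned into redirect rules with very little code. A minimal sketch in Python, assuming Apache-style `Redirect` directives; the paths in the mapping are made up for illustration:

```python
# Sketch: turn a hand-compiled old-path -> new-path mapping into
# redirect rules to send to the hosting provider. All paths below are
# hypothetical examples, not real NAO URLs.

redirects = {
    "/report/old-2011-report": "/reports/2011-report",
    "/about/old-team-page": "/about-us/our-team",
}


def redirect_rules(mapping):
    """Emit one Apache-style permanent-redirect line per old/new pair."""
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]


for rule in redirect_rules(redirects):
    print(rule)
```

The same mapping could just as easily be emitted in whatever format your server or hosting provider expects.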

Yes, this is incredibly tedious, but the benefit is that it really focusses on fixing the direct problems that users have, and it does give a sense of satisfaction from making something better. It also makes you more aware of which of your content is still of interest – 'why did we ever delete that page?' Any more fundamental problems also become evident if the same kind of event keeps recurring.

As an extra touch we have added a small feedback box on the 404 page to ask if the user can say what they were trying to find. Hopefully this shows that we are user focussed and genuinely want to help them find our valuable content.

What else?

Do not forget that your other analytics data will tell you how long people stay on your pages and whether they visit more of them or 'bounce' off to other sites. Did they come to your site by accident or deliberately? Was your content boring or fascinating?

There are also a number of remote monitoring tools that are useful, unobtrusive and do not collect personal data.

Google has a handy 'in page' view that will show where users clicked on an individual page. We use software called Clicktale that does something similar, though with a higher level of sophistication and range. It works very well, for example, for seeing how far users scroll down a page or what they click on. The latter can be very useful on forms, to see how far people progress through them and when they give up. Clearly we should then act on these insights.

Of course the other key data is page views and downloads. This is your users voting with their mouse as to what content of yours they find interesting.

Another neat trick, which I have not tried yet, is attempting to measure whether there are any pages that never get visited. To do this you need to be able to generate a list of all your pages and match it against an export from your analytics. Take a look – you might wonder why you ever set up that x page saying how great your organisation is about y.
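Once you have both lists, the comparison is a simple set difference. A sketch in Python, with made-up paths standing in for a real sitemap and analytics export:

```python
# Sketch: find pages with no recorded visits by comparing a full page
# list (e.g. from a sitemap) against the pages an analytics export says
# were viewed. All paths below are hypothetical.


def never_visited(all_pages, visited_pages):
    """Return the pages with no recorded visits, sorted for review."""
    return sorted(set(all_pages) - set(visited_pages))


all_pages = ["/", "/reports", "/about-us", "/how-great-we-are"]
visited = ["/", "/reports", "/about-us"]

for page in never_visited(all_pages, visited):
    print(page)
```

The only fiddly part in practice is making sure both lists use the same URL form (trailing slashes, query strings and so on) before comparing.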

You probably have other tricks so why not let me know your best ones.

Next week I will talk about the feedback that we have generated direct from users.