Project summary

Office for National Statistics

Allow survey respondents to manage their account

I conducted data analysis to categorise and address issues raised by survey respondents in a UK business data collection system. This led to a prototype for self-service account management, which improved the user experience. I also ran a navigation tree test that provided insight into the findability of the help sections. The successful implementation of the help and account sections in the service resulted in a substantial drop in service desk calls, from 42 a month to 4.

The challenge

Survey Data Collection is a digital service that allows UK businesses to complete and send survey data to the ONS. Surveys were traditionally completed by paper questionnaire, but more were being migrated to the online system.

The user self-service project was developed from a business requirement to categorise external communications so that they could be triaged effectively by a team in the ONS. Users were sending issues unrelated to data queries through the secure messaging service, which increased the workload for internal staff. The requirement could be met by enabling respondents to deal with common issues themselves.

Conduct data analysis

I gained an understanding of the issues raised through the secure messaging system by exploring the secure message data, categorising the non-data issues and measuring their volumes. I measured the impact of the issues on the business and identified where in the user journey issues were raised. I then determined which issues could be answered succinctly and which needed more complex solutions.

I then refined, simplified and amalgamated the categories so that the most succinct answers could be presented to users before they needed to send a message.

I also manually categorised one month’s worth of secure messages for three large surveys. I analysed referrals to and page views of the ‘contact us’ screen on the respondents’ site, the common reasons for respondent secure messages, and wrap-up codes from the call centre, which included complaints, grievances and queries.
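The categorisation step described above can be sketched as a simple keyword-matching pass over message text followed by a volume count. This is a minimal illustration only; the category names and keywords below are hypothetical, not the taxonomy actually used in the project:

```python
from collections import Counter

# Hypothetical categories and trigger keywords -- illustrative only,
# not the actual ONS taxonomy.
CATEGORIES = {
    "account": ["password", "email", "login", "sign in"],
    "access": ["share", "transfer", "colleague"],
    "technical": ["error", "broken", "not loading"],
}

def categorise(message: str) -> str:
    """Assign a message to the first category whose keyword it contains."""
    text = message.lower()
    for category, keywords in CATEGORIES.items():
        if any(kw in text for kw in keywords):
            return category
    return "data-query"  # anything unmatched is treated as a genuine data query

def volumes(messages):
    """Count how many messages fall into each category."""
    return Counter(categorise(m) for m in messages)

sample = [
    "I forgot my password",
    "How do I transfer the survey to a colleague?",
    "The page shows an error when I submit",
    "Question 4 asks for turnover -- which period?",
]
print(volumes(sample))
```

In practice the real analysis was manual, but a pass like this shows how non-data issues can be separated out and their volumes measured to prioritise which answers the help section should cover.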

Determine baseline to measure success

It was possible to measure the time taken to deal with respondents’ issues, which provided a baseline against which to measure the success of any intervention.

Create a hypothesis

A help section would allow users to solve issues on the site themselves. If respondents were unable to solve an issue on the site, they could send the ONS a message, which would be categorised as they routed through the ‘help’ user journey. The aim was to reduce contact with the ONS, freeing staff to spend more time dealing with data queries and response chasing.

Create a prototype and test with respondents

I designed and built a prototype that would allow users to resolve the most common issues internal staff had to deal with:

  • Change user details such as name, email and password
  • Share and transfer survey access
  • Find information about the surveys and the ONS

I tested the prototype with five participants in separate 45-minute sessions. Participants were given scenarios in the form of issues to solve on the site, and I recorded how the interface allowed them to solve each issue.

I created a fully functioning prototype in HTML using the ONS front end design system comprising:

  • a help section to allow users to find answers to issues quickly, before suggesting they contact the ONS via secure message.
  • an account help section, built as separate micro-services, to fulfil needs such as changing account details or sharing and transferring surveys to colleagues.

Test the performance of the navigation tree with respondents

Working with a business analyst, I ran a tree test survey which asked users to complete tasks based on the 10 most popular scenarios. This gave us results on the performance of the navigational tree and insights into the findability of the help sections.

I created the tree test in Treejack and linked to it from a notification on the external site for one month; the notification was seen by approximately 90,000 respondents, and we received around 100 completed surveys.

Answers four levels deep in the navigation mostly performed poorly; technical issues were the worst-performing scenarios; and observed participants tended to look in ‘contact details’ for their business address.
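Tree-test results like these are usually summarised as per-task success rates, distinguishing participants who found the correct node directly from those who found it after backtracking. A minimal sketch of that summary step, with hypothetical task names and figures (not the actual study data):

```python
# Hypothetical per-task counts: participants who found the correct node
# directly, indirectly (after backtracking), or not at all.
results = {
    "change password": {"direct": 70, "indirect": 15, "fail": 15},
    "find business address": {"direct": 20, "indirect": 25, "fail": 55},
    "report technical issue": {"direct": 10, "indirect": 20, "fail": 70},
}

def success_rate(counts):
    """Overall success: direct plus indirect finds, over all attempts."""
    total = sum(counts.values())
    return (counts["direct"] + counts["indirect"]) / total

def directness(counts):
    """Share of attempts that went straight to the correct node."""
    return counts["direct"] / sum(counts.values())

for task, counts in results.items():
    print(f"{task}: {success_rate(counts):.0%} success, "
          f"{directness(counts):.0%} direct")
```

Low directness on an otherwise successful task points to a label or placement problem rather than a missing answer, which is the kind of distinction that informed the findings above.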

Gain insights

There was a balance to be struck between the cost of adding a navigational element to the tree and the benefit of including an answer to an issue. Participants understood the concept of sharing survey access but navigated there when they wanted to transfer access. Issues relating to specific survey questions may be best dealt with in context (corroborated by observations from face-to-face testing in Version 3 research).

The results were presented to the team. A video I created highlighting key insights was distributed among senior management and was key in changing the approach to how respondents change their business address.

The help and account sections were added to the service. I provided design reviews to ensure consistency in the delivery process and adherence to the design system. The result was a dramatic drop in service desk calls: from 42 calls from respondents requiring help recorded in January 2019 to only 4 in April 2021.