BLS Uses Meta-Feedback to Hone Its Surveys

Think about the last time you wanted to look up a statistic, apply for a new passport, or ask a tax question. The federal government provides vital services to citizens via the Internet, and through a website your questions can be answered in a matter of seconds.

Naturally, the usability of a website is crucial to the government's ability to serve the public online. We have all been frustrated by a site that loads too slowly, has confusing navigation labels, or uses a color scheme so flashy it makes your head hurt. It is important to ensure that every website has clear, concise content, engaging visuals, accessible contact information, and easy navigation. If a site does not offer good usability, discouraged users soon give up. So how does the government ensure that its own websites are top-notch, user-friendly, and accessible?

An excellent option is crowdsourcing (a blend of crowd and outsourcing), which harnesses the collective brainpower of the public. The steps are simple: the government defines a task it needs completed or an idea it wants feedback on, uses a crowdsourcing agency to reach its target demographic, and soon receives detailed feedback from individual testers, who are compensated for their work. The government can gauge what people want and analyze the data for more insightful and accurate results: in this case, to tweak its websites to make them the best they can be.

Here’s one case study: the Bureau of Labor Statistics (BLS) is an agency of the Department of Labor and part of the U.S. Federal Statistical System. Its primary duty is to provide relevant facts and statistics on labor and employment in the United States, so its data must be consistently high quality. Among other methods, the BLS uses online surveys and evaluations targeted at the general population to gather its fundamental data, and presents that data as written material on various websites. The BLS needed to test the usability of its online survey questions and written web content in order to weed out errors and confusing passages.

BLS worked with TryMyUI, a crowdsourced usability testing company, to gain feedback on its surveys and data-dissemination sites. The BLS tasked specially selected testers (in its desired demographic) with narrating their thoughts and actions as they performed a series of tasks on a statistical dissemination site or while taking a survey. The crowd also answered pre-set questions for the BLS. Because TryMyUI’s software captured the screen and microphone of each tester’s computer, the BLS received screen-recording videos with vocal narration that gave it real-time feedback.

Once tests were done, the results were delivered to the BLS in as little as two hours. These sessions enabled the BLS to understand and analyze the user experience and to take action based on this data. TryMyUI calls the real-time experience of people interacting with the surveys meta-feedback: it is feedback about feedback, measuring the efficacy of feedback systems (in this case, surveys).

The BLS got to see what users gravitated toward and where they got stuck, and was able to revise certain survey questions and modify other written materials. With the help of crowdsourced usability testing, this federal agency was able to make sure its online services were at their best.

Durga Jayaraman is a psychology and sociology student at Middlebury College. She is interested in all forms of human interaction and spent the summer working at TryMyUI on user experience, public relations, and marketing.

Editor’s Note: For details about the U.S. Bureau of Labor Statistics’ experience with crowdsourcing and usability testing, download the full BLS/TryMyUI case study. We feature compelling case studies from across the industry and welcome submissions.
