Tuesday, January 16, 2007

 

Using focus groups to define tasks for usability testing

You have a new project to do a usability test of a web site. After defining the site's audience groups, you face your biggest challenge — defining the tasks to use in the testing. You've interviewed the site's key stakeholders and gained a solid picture of what they think their site's visitors want to do. (You certainly know what they want the visitors to do.) But do you have a complete and accurate picture of the visitors' goals and objectives? You have only one way to ensure that you do — obtain input directly from people who might visit the site.

One common approach to eliciting such input involves focus groups. Jakob Nielsen has called the use of focus groups "voodoo usability," but he was criticizing their use to gather design ideas or to evaluate a site. I agree with Jakob that, for those purposes, focus groups can do more harm than good. However, I am convinced that, done well, they can provide valuable information about what site visitors want to do.

And I think I've done it well. In a recent project at UserWorks, I introduced a focus group technique that I learned from a nonprofit organization. Here's how it works: the group brainstorms a list of the things participants would want to do on the site, and then each participant privately votes for only his or her top choices. The items that draw the most votes become the tasks used in testing.

I used this technique in a usability assessment project that UserWorks conducted for a professional membership organization. We ran two focus groups, one for members of the organization and one for nonmembers. Most of the nonmember participants worked in the profession and thus had substantial interest in, and familiarity with, many of the topics on which the site aimed to provide information; membership information ranked near the top of their list of priorities.

Each focus group thus produced a list of six primary tasks for its audience segment and identified potential design issues that could affect task performance. In the usability testing that followed, which involved the same two audience segments, we designed our test procedures around those six tasks for the participants from the respective audiences.

We at UserWorks found this technique both effective and cost-effective. In particular, it makes the voting somewhat private, which avoids much of the social pressure that public voting can carry, and it tends to concentrate the votes, because only items that rank among someone's top choices receive any votes at all.
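For readers who like to see the mechanics, here is a minimal sketch, in Python, of how that kind of top-choice vote tally might work. The ballot contents, the number of choices per participant, and the helper name tally_top_choice_votes are all hypothetical; only the idea of counting private top-choice votes and keeping the six most popular tasks comes from the procedure described above.

    from collections import Counter

    def tally_top_choice_votes(ballots, num_tasks=6):
        """Count private ballots on which each participant lists only
        his or her top-choice tasks, then keep the most popular ones."""
        counts = Counter()
        for ballot in ballots:
            counts.update(ballot)  # one vote for each item a participant picked
        return [task for task, _ in counts.most_common(num_tasks)]

    # Hypothetical ballots from a member focus group (for illustration only).
    ballots = [
        ["Renew my membership", "Find a conference", "Read the journal"],
        ["Find a conference", "Renew my membership", "Look up a colleague"],
        ["Read the journal", "Renew my membership", "Find a conference"],
    ]
    print(tally_top_choice_votes(ballots))

Because a task earns a vote only when it lands on someone's short list, the totals separate the handful of tasks nearly everyone cares about from the long tail of one-off suggestions.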

Best of all, it ensures that usability testing focuses on tasks that are important to the target audiences.

Monday, October 30, 2006

 

The Usability of Credit Card Disclosure Materials

Earlier this year we conducted a usability evaluation of credit card disclosure materials for the Government Accountability Office. With credit card debt and personal bankruptcy at an all-time high, the GAO wanted to find out if there were any usability issues in the printed documents that card issuers use to communicate information about their rates and fees to consumers.

We used three methods to evaluate the disclosure documents: a readability analysis, a heuristic evaluation, and a 12-person usability test.

From the readability analysis, we learned that most disclosure documents are written at a 10th to 12th grade level. That may sound pretty reasonable until you realize that nearly half of the U.S. adult population reads at or below an 8th grade level. We also discovered that the sections dealing with interest rates and how much you owe are written at a much more difficult level; some were estimated to require graduate-level education to understand.
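To give a sense of what lies behind a grade-level figure, here is a small sketch of one widely used readability formula, the Flesch-Kincaid grade level, which estimates grade level from average sentence length and average syllables per word. It is meant only to illustrate how such estimates are computed in general; it isn't necessarily the exact formula or tool behind the numbers above, and the syllable counter is a rough heuristic.

    import re

    def count_syllables(word):
        # Rough heuristic: count runs of vowels, discounting a silent final 'e'.
        groups = re.findall(r"[aeiouy]+", word.lower())
        count = len(groups)
        if word.lower().endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def flesch_kincaid_grade(text):
        # FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / len(sentences))
                + 11.8 * (syllables / len(words)) - 15.59)

    sample = "The terms of this credit card can change at any time for any reason."
    print(round(flesch_kincaid_grade(sample), 1))

A score of 10 to 12 means the text reads roughly like material written for high-school sophomores through seniors, which is why the gap between these documents and the reading level of nearly half of U.S. adults matters.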

The usability test identified things you would expect: difficulty finding information in the documents and difficulty interpreting it correctly. But it also identified some things we didn't really expect. For example, the phrase "the terms [of this credit card] can change at any time for any reason" is repeated so frequently in the disclosure documents that it appeared to become a deterrent to using the documents at all. If the terms can change at any time for any reason, many of our participants reasoned, why bother reading the cardmember agreement?

Read the GAO report Credit Cards: Increased Complexity in Rates and Fees Heightens Need for More Effective Disclosures to Consumers

See the Washington Post article Credit Cards' Hidden Costs: GAO Study Finds Confusing, Sometimes Misleading, Practices

Thursday, October 12, 2006

 

A UserWorks project makes the news!

UserWorks did a study a few months ago for the Government Accountability Office (GAO) on how understandable credit card disclosure materials are to the general population. Our research project was part of a larger GAO study that has just been reported in The Washington Post. Angela Colter, who led UserWorks' portion of the study, will be blogging about it in the next few days. Stay tuned...

Thursday, April 13, 2006

 

Introducing the UserWorks Blog

Placeholder first post. Lorem ipsum, and all that jazz.
