Tuesday, January 16, 2007
Using focus groups to define tasks for usability testing
You have a new project to do a usability test of a web site. After defining the site's audience groups, you face your biggest challenge — defining the tasks to use in the testing. You've interviewed the site's key stakeholders and gained a solid picture of what they think their site's visitors want to do. (You certainly know what they want the visitors to do.) But do you have a complete and accurate picture of the visitors' goals and objectives? You have only one way to ensure that you do — obtain input directly from people who might visit the site.
One common approach to eliciting such input involves focus groups. Jakob Nielsen has called the use of focus groups "voodoo usability," but he was criticizing their use to gather design ideas or to evaluate a site. I agree with Jakob that for those purposes, focus groups can do more harm than good. However, I am convinced that they can (if done well) provide valuable information about what site visitors want to do.
And I think I've done it well. In a recent project at UserWorks, I introduced a focus group technique that I learned from a nonprofit organization. Here's how it works:
- Spend 15-20 minutes brainstorming all the things that the group might want to do on the site. Make this a pure brainstorming session, with no assessing or arguing about what people contribute.
- Spend another five minutes or so looking for items that may be combined into one.
- Hand out lined 4x6 index cards and ask participants to write down the five items that they would place at the top of their list. Have them write each item on a separate line, with blank lines between them.
Here's where it gets tricky.
- When you say "Go," all participants simultaneously pass their cards one person to the left.
- After each pass, participants look at the list of five items that they're holding and place a mark (|) by their top three items from that list.
- When all participants have marked their top three, pass again. (It can help to instruct them to make their mark diagonally if the card they're holding already has a group of four marks.)
- When the cards have gone all the way around and participants have their own cards back, stop passing. (Participants do not get to make the three marks on their own cards.)
- Total the results and select the top five or six. These will become your tasks for usability testing.
- Then run the group through the top tasks, and ask targeted questions. This can give insights into design issues that you will need to watch out for during testing.
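As a rough sketch of the counting step (the task names and marks below are invented for illustration), the tally amounts to summing the marks each item collected across all cards and passes, then keeping the top few:

```python
from collections import Counter

# Hypothetical marks collected from the cards after all the passes.
# Each inner list is the set of items one participant marked on one
# card (their top three of the five items written there).
marks_per_card_pass = [
    ["find store hours", "compare products", "track an order"],
    ["track an order", "contact support", "find store hours"],
    ["compare products", "find store hours", "read reviews"],
]

# Total the tally marks across every card and pass.
totals = Counter()
for marked in marks_per_card_pass:
    totals.update(marked)

# Select the top five or six items; these become the usability-test tasks.
top_tasks = [item for item, _ in totals.most_common(6)]
print(top_tasks)
```

With real groups you would simply read the marks off the returned cards, but the principle is the same: an item's score is how many participants placed it among their top three.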
Thus, each focus group produced a list of six primary tasks for its audience segment, and identified potential design issues that could affect task performance. In the usability testing that followed the focus groups — which involved the same two audience segments — we designed our test procedures to use those six tasks with the participants from the respective audiences.
We at UserWorks found this technique both effective and cost-effective. In particular, it makes voting somewhat private and thus avoids much of the social pressure that "public" voting can carry, and it tends to concentrate the votes by ensuring that only those items that are among someone's top choices receive any votes at all.
Best of all, it ensures that usability testing focuses on tasks that are important to the target audiences.
Monday, October 30, 2006
The Usability of Credit Card Disclosure Materials
Earlier this year we conducted a usability evaluation of credit card disclosure materials for the Government Accountability Office. With credit card debt and personal bankruptcy at an all-time high, the GAO wanted to find out if there were any usability issues in the printed documents that card issuers use to communicate information about their rates and fees to consumers.
We used three methods to evaluate the disclosure documents: a readability analysis, heuristic evaluation, and a 12-person usability test.
From the readability analysis, we learned that most disclosure documents are written at a 10th to 12th grade level. That may sound pretty reasonable until you realize that nearly half of the U.S. adult population reads at or below an 8th grade level. We also discovered that the sections dealing with interest rates and how much you owe are written at a much more difficult level; some were estimated to require graduate-level education to understand.
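Grade-level estimates like these come from readability formulas such as the Flesch-Kincaid grade level (one common choice; the specific measure isn't named above). As a rough illustration, using a crude vowel-group syllable counter and a made-up sample passage, the computation looks like:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels (incl. y).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    #   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words))
            - 15.59)

# Invented sample text, loosely in the style of disclosure language.
sample = ("The terms of this credit card can change at any time for any reason. "
          "Your annual percentage rate may increase if a payment is late.")
print(round(fk_grade(sample), 1))
```

Production readability tools use dictionary-based syllable counts and more careful sentence segmentation, so treat this only as a sketch of the arithmetic behind a grade-level score.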
The usability test identified things you would expect: difficulty finding information in the documents and difficulty interpreting the information correctly. But it also identified some things we didn't really expect. For example, the phrase "the terms [of this credit card] can change at any time for any reason" is repeated frequently in the disclosure documents. So much so that it appeared to become something of a deterrent to even using the documents. If terms can change at any time for any reason, many of our participants reasoned, why bother reading this cardmember agreement?
Read the GAO report Credit Cards: Increased Complexity in Rates and Fees Heightens Need for More Effective Disclosures to Consumers
Read the Washington Post article Credit Cards' Hidden Costs: GAO Study Finds Confusing, Sometimes Misleading, Practices
Thursday, October 12, 2006
A UserWorks project makes the news!
UserWorks did a study a few months ago for the Government Accountability Office (GAO) on the understandability of credit-card disclosure materials to the general population. Our research project was part of a larger GAO study that has just been reported in The Washington Post. Angela Colter, who led UserWorks' portion of the study, will be blogging about it in the next few days. Stay tuned...
Thursday, April 13, 2006
Introducing the UserWorks Blog
Placeholder first post. Lorem ipsum, and all that jazz.