Session Summary: Usability Testing–Down, Dirty, and Doable!

TechWhirl’s coverage of WritersUA 2012 is sponsored by Madcap Software.

Presenter: Leah Guren, CowTC

The Usability Testing presentation started with some humor. An attendee, looking at Leah Guren’s slides featuring her cow insignia for CowTC, asked, “Would cows be the midpoint between pigs and chickens in an Agile environment?” Guren then used the remainder of the break before the official session time to play her company-appropriate song, I Am Cow, by Arrogant Worms.

Guren’s session focused on how to start doing usability testing on your own at your company for the first time. She began by describing what usability testing isn’t, such as quality assurance (QA) or looking for faulty materials. She offered the example of a pressure test on a chair, which does a good job of showing how well the product will hold up, but not how well people can use the chair as intended: whether it is ergonomic, intuitive to use, or properly supports the back.

Guren warned attendees not to be put off by the myths about usability testing—that it’s too expensive, too time-consuming, or unnecessary because the engineers know what they’re doing. She again pointed to a chair that may look good, even artistic, but is actually uncomfortable to use. She also pointed out that while usability testing may add time to production, it takes far more time and money to fix a product retroactively.

So with the negatives out of the way, she asked what user assistance (UA) usability testing would uncover:

  • Do users succeed in finding the facts they want in the help?
  • Can users navigate the help?
  • Do users understand the information they find?
  • Can users be successful with their goals for using the help?

Guren contends that testing help systems is actually easier than testing graphical user interfaces (GUIs), because the help system can be tested later in the development cycle, the tests can be simpler, and the effort is independent of the product QA budget.

Nine Step Approach to UA Usability Testing

  1. Have a Plan: Decide why you are testing; define the key workflows and tasks that testers will perform; and determine how they will rate the ease of completing them. Always think in terms of user workflow scenarios, and make the work plan mirror real user workflows.
  2. Get Some Tools: You don’t have to spend a lot or have the best equipment to conduct usability testing, but you will need a way to capture mouse and screen actions—minimally a voice or video recorder, even something as simple as your cell phone, or shareware like EZ Screen Recorder, which is available as a free trial version.
  3. Find Testers: You don’t have to go far to find testers who match your user profiles; you can look to another department or new hires. You really only need 5 to 6 testers for each persona, and each needs to perform only 3 to 4 tasks. Do not ask users to assess their own abilities with computers; survey them only for quantifiable responses.
  4. Adapt for Targeted Viewing: Perform usability testing on all likely user platforms, and consider the product’s appearance on different devices or with different software.
  5. Set up the Environment: Reserve a space that is quiet, well-lit, and shielded from onlookers. Give volunteers something in exchange for their time, even just some food or lunch. Also, consider swapping roles (interviewer/facilitator/equipment operator) so that everyone on the testing team gets the full experience.
  6. Follow Protocols: Talk to each tester individually, explaining each task but avoiding key navigation words or pushing them toward one path. Present only one task at a time, and remind testers to think out loud for good feedback and insight. Have subject matter experts (SMEs) observe testing to enhance their understanding. Remember, if a tester turns to the facilitator for an answer, the tester has given up—and that is all the feedback required.
  7. Avoid Common Cultural Pitfalls: Understand that in some cultures people will not offend their hosts or question the system, and some testers cannot speak English or have limited English proficiency. Consider giving these testers a comfortable role to play where they can act, telling them you’ve purposely planted bugs in the system for them to catch, or telling them what something means in their own language.
  8. Conduct Exit Interviews: Refer back to specific moments in the testing experience to draw out fuller answers than testers gave during the test itself.
  9. Present Your Recommendations: Examine the test data on a benefit-to-effort basis, rating each item from low to high, to identify the viable fixes with the most return on investment (ROI) or that are necessary regardless of cost. Instead of a report, consider delivering your recommendations as a presentation with key supporting video clips from the testing sessions.
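
The benefit-to-effort prioritization in step 9 can be sketched in a few lines of code. This is a minimal illustration, not anything Guren presented; the findings and 1–5 scores below are invented examples, and the ratio is simply benefit divided by effort.

```python
# Hypothetical sketch: ranking usability findings by benefit-to-effort
# ratio, as in step 9. Issues and scores (1 = low, 5 = high) are invented.

findings = [
    {"issue": "Search misses common synonyms", "benefit": 5, "effort": 2},
    {"issue": "TOC nesting confuses navigation", "benefit": 4, "effort": 4},
    {"issue": "Jargon in installation topic", "benefit": 3, "effort": 1},
]

# Compute a simple ROI proxy for each finding.
for f in findings:
    f["roi"] = f["benefit"] / f["effort"]

# Present the findings with the best return on investment first.
for f in sorted(findings, key=lambda f: f["roi"], reverse=True):
    print(f"{f['roi']:.2f}  {f['issue']}")
```

Even a crude ranking like this makes the final presentation concrete: the low-effort, high-benefit items float to the top, which is exactly where a first usability effort should spend its credibility.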

In making an argument for usability testing, point out that customer feedback surveys (which usually get only about a 3% response rate) are typically answered by those who have something negative to say, and rarely by those who want to offer constructive feedback.

Guren closed by encouraging attendees to make friends with tech support as in-house allies, to add metrics by testing an old system to compare to a new system, and to examine reduction in calls for assistance. When in doubt about moving ahead with usability testing, remember how affordable it can actually be, and that it is sometimes easier to ask for understanding than permission.
