Game Usability: Advice from the Experts for Advancing the Player Experience (Ch. 6)

Schaffer, N. (2008). Heuristic Evaluation of Games. In K. Isbister & N. Schaffer (Eds.), Game Usability: Advice from the Experts for Advancing the Player Experience (pp. 79-90). New York: Morgan Kaufmann.

“Discount methods” of usability testing are used to find usability problems quickly and cheaply. Schaffer discusses one such method, heuristic evaluation, which uses heuristics, or “shortcuts,” to find usability problems. An important disadvantage of the method is that it does not involve representative users, which raises a validity issue.

Heuristics have also been created specifically for games. The experience-oriented nature of games raises different usability issues than those of task-oriented interfaces.

Nielsen’s 10 Heuristics (from Nielsen, J. (1993). Usability Engineering)

  • Visibility of system status
  • Match between system and the real world
  • User control and freedom
  • Consistency and standards
  • Error prevention
  • Recognition rather than recall
  • Flexibility and efficiency of use
  • Aesthetic and minimalist design
  • Help users recognize, diagnose, and recover from errors
  • Help and documentation

One criticism of this list is that it tends to be used to categorize the usability problems that experts find, rather than actually prompting the discovery of those problems.

Heuristic lists for games:

  • Nielsen’s set, as applied by Sauli Laitinen
  • Melissa Federoff’s set of usability heuristics for games.
  • Heuristic Evaluation for Playability (HEP) – game play, game story, mechanics, usability.
  • Nokia’s Heuristics – game usability, mobility, game play.
  • Schaffer’s set – general, graphical user interface, game play.

Implementation:

  • Best to test early (less expensive to make changes) and often. Heuristic evaluation can be used in the beginning to screen out as many issues as possible. User testing can then be used to find problems that were missed and to polish.
  • General process: 1) 3-5 evaluators evaluate the game independently, using previously agreed-upon heuristics as a checklist. Each problem gets a severity rating and a link to the specific heuristic it violates; 2) the evaluators meet and combine their individual lists into a rough master list; 3) prepare a report that orders problems by severity and offers recommendations for fixes, using screenshots to clarify wherever necessary. (A small sketch of steps 2-3 follows this list.)
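The merging and reporting steps lend themselves to a simple script. Below is a minimal sketch in Python (my own illustration; the data structure, severity scale, and example findings are hypothetical, not something the chapter prescribes):

    from collections import namedtuple

    # Hypothetical record for one finding: the heuristic it violates, a short
    # description, and a severity rating (here 1 = cosmetic ... 4 = catastrophic).
    Finding = namedtuple("Finding", ["heuristic", "description", "severity"])

    # Step 1: each evaluator works independently and produces their own list.
    evaluator_reports = [
        [Finding("Visibility of system status", "No feedback when a save completes", 3),
         Finding("Error prevention", "Quit sits next to Save with no confirmation", 4)],
        [Finding("Error prevention", "Quit sits next to Save with no confirmation", 4),
         Finding("Recognition rather than recall", "Crafting recipes must be memorized", 2)],
    ]

    # Step 2: combine the individual lists into a rough master list,
    # collapsing duplicates reported by more than one evaluator.
    master = {}
    for report in evaluator_reports:
        for f in report:
            master.setdefault((f.heuristic, f.description), f)

    # Step 3: order the report by severity (most severe first) so the team
    # sees the highest-impact problems at the top; screenshots are added by hand.
    for f in sorted(master.values(), key=lambda f: f.severity, reverse=True):
        print(f"[severity {f.severity}] {f.heuristic}: {f.description}")

Running it prints the combined findings with the most severe problems first, which is the order the written report should follow.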

Single experts have expertise in either games or usability; double experts have expertise in both (Nielsen, 1993). According to Schaffer, a novice evaluator will find about 22% of the usability problems in an interface, while a double expert will find about 60%.
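As a rough, back-of-the-envelope illustration of why a small team of 3-5 evaluators is recommended (my own arithmetic, under the simplifying assumption that each evaluator finds a fixed fraction of the problems independently of the others; the chapter does not make this claim):

    # Illustrative only: if each evaluator independently finds a fraction p of the
    # problems, a team of n evaluators is expected to find about 1 - (1 - p)**n of them.
    def team_coverage(p: float, n: int) -> float:
        return 1 - (1 - p) ** n

    for n in (1, 3, 5):
        print(f"{n} novice(s) (p=0.22): {team_coverage(0.22, n):.0%}   "
              f"{n} double expert(s) (p=0.60): {team_coverage(0.60, n):.0%}")

Under these assumptions, three novices would cover roughly half of the problems, while three double experts would cover over 90%, which is consistent with the advice to use the most expert evaluators available.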
