Chapter 16: Evaluation: Inspections, Analytics, and Models


Objectives

The main goals of this chapter are to accomplish the following:

  • Describe the key concepts associated with inspection methods.
  • Explain how to do heuristic evaluation and walkthroughs.
  • Explain the role of analytics in evaluation.
  • Describe how A/B testing is used in evaluation.
  • Describe how to use Fitts’ law — a predictive model.


Introduction

The evaluation methods described in this book so far have involved interaction with, or direct observation of, users. In this chapter, we introduce methods that are based on understanding users through one of the following:


  • Knowledge codified in heuristics
  • Data collected remotely
  • Models that predict users’ performance

None of these methods requires users to be present during the evaluation. Inspection methods often involve a researcher, sometimes known as an expert, role-playing the users for whom the product is designed, analyzing aspects of an interface, and identifying potential usability problems. The best-known inspection methods are heuristic evaluation and walkthroughs. Analytics involves logging users' interactions, and A/B testing is an experimental method; both are usually carried out remotely. Predictive modeling involves analyzing the various physical and mental operations needed to perform particular tasks at an interface and operationalizing them as quantitative measures. One of the most widely used predictive models is Fitts' law.
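To give a flavor of how A/B testing results are analyzed, the sketch below compares the conversion rates of two interface variants using a standard two-proportion z-test. The counts are made-up illustrative numbers, not data from any real study:

```python
import math

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """z-statistic comparing the conversion rates of variants A and B.
    A large positive z suggests variant B converts better than A."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: 1,000 users see each variant;
# variant A converts 120 of them, variant B converts 150.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))
```

With these illustrative figures, a z-value near the conventional 1.96 threshold would prompt a larger sample before declaring a winner; real A/B platforms also handle issues such as sample size planning and repeated looks at the data.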
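As a preview of predictive modeling, the Shannon formulation of Fitts' law predicts the time to move to a target of width W at distance D as MT = a + b * log2(D/W + 1), where a and b are constants fitted empirically for a given input device. The constants below are placeholder values chosen only to make the example runnable:

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predicted movement time in seconds, using the Shannon
    formulation of Fitts' law: MT = a + b * log2(D/W + 1).
    a and b are illustrative placeholders; real values are
    obtained by fitting the model to measured pointing data."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A small, distant target is predicted to take longer to reach
# than a large, nearby one.
hard = fitts_movement_time(distance=400, width=20)
easy = fitts_movement_time(distance=100, width=80)
print(hard > easy)
```

The model captures the intuition that targets become harder to acquire as they get smaller or farther away, which is why it is often used to compare candidate layouts for buttons and menus.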