Sehrish Nizamani
Heuristic Evaluation Versus Guideline Reviews: A Tale of Comparing Two Domain Usability Expert's Evaluation Methods
Abstract
Background: The usability of university websites is important to ensure that they serve their intended purpose. Their usability can be evaluated either by testing methods that rely on actual users or by inspection methods that rely on experts. Heuristic evaluation and guideline reviews are two inspection methods of usability evaluation. A heuristic evaluation uses a small set of general heuristics (rules), which are limited to checking general flaws in the design. A guideline review uses a much larger set of guidelines/suggestions tailored to a specific business domain. Literature review: Most of the literature has equated usability studies with testing methods and has given less attention to inspection methods. Moreover, those studies have examined usability in a general sense rather than in domain- and culture-specific contexts. Research questions: 1. Do domain- and culture-specific heuristic evaluation and guideline reviews work similarly in evaluating the usability of applications? 2. Which of these methods is better in terms of the nature of evaluation, time needed for evaluation, evaluation procedure, templates adopted, and evaluation results? 3. Which method is better in terms of thoroughness and reliability? Research methodology: This study uses a comparative methodology. The two inspection methods, guideline reviews and heuristic evaluation, are compared in a domain- and culture-specific context in terms of their nature, time required, approach, templates, and results.
Results: The results indicate that both methods identify similar usability issues; however, they differ in the nature, time duration, evaluation procedure, templates, and results of the evaluation. Conclusion: This study contributes by providing insights for practitioners and researchers about the choice of an evaluation method for domain- and culture-specific evaluation of university websites.