Heuristic Evaluation Versus Guideline Reviews: A Tale of Comparing Two Domain Usability Expert's Evaluation Methods

Sehrish Nizamani, Saad Nizamani, Nazish Basir, Gulsher Laghari, Khalil Khoumbati, and Sarwat Nizamani (University of Sindh)

Abstract

Background: The usability of university websites is important to ascertain that they serve their intended purpose. Their usability can be evaluated either by testing methods, which rely on actual users, or by inspection methods, which rely on experts. Heuristic evaluation and guideline reviews are two inspection methods of usability evaluation. A heuristic evaluation uses a small set of general heuristics (rules), which are limited to checking general flaws in the design. A guideline review uses a much larger set of guidelines and suggestions tailored to a specific business domain. Literature review: Most of the literature has equated usability studies with testing methods and has given less focus to inspection methods. Moreover, those studies have examined usability in a general sense rather than in domain- and culture-specific contexts. Research questions: 1. Do domain- and culture-specific heuristic evaluation and guideline reviews work similarly in evaluating the usability of applications? 2. Which of these methods is better in terms of the nature of evaluation, time needed for evaluation, evaluation procedure, templates adopted, and evaluation results? 3. Which method is better in terms of thoroughness and reliability? Research methodology: This study uses a comparative methodology. The two inspection methods, guideline reviews and heuristic evaluation, are compared in a domain- and culture-specific context in terms of their nature, time required, approach, templates, and results.
Results: The results show that both methods identify similar usability issues; however, they differ in the nature, time duration, evaluation procedure, templates, and results of the evaluation. Conclusion: This study contributes insights for practitioners and researchers on choosing an evaluation method for domain- and culture-specific evaluation of university websites.

Journal
IEEE Transactions on Professional Communication
Published
2022-12-01
DOI
10.1109/tpc.2022.3201732
Open Access
Closed
