Writing Quality Predictive Modeling: Integrating Register-Related Factors

Heqiao Wang, Michigan State University; Gary A. Troia, Michigan State University

Abstract

The primary purpose of this study is to investigate the degree to which register knowledge, register-specific motivation, and diverse linguistic features are predictive of human judgment of writing quality in three registers—narrative, informative, and opinion. The secondary purpose is to compare the evaluation metrics of register-partitioned automated writing evaluation models in three conditions: (1) register-related factors alone, (2) linguistic features alone, and (3) the combination of these two. A total of 1,006 essays (n = 327, 342, and 337 for informative, narrative, and opinion, respectively) written by 92 fourth- and fifth-graders were examined. A series of hierarchical linear regression analyses controlling for the effects of demographics were conducted to select the most useful features for capturing text quality, scored by humans, in the three registers. These features were in turn entered into automated writing evaluation predictive models with parameter tuning in a tenfold cross-validation procedure. The average validity coefficients (i.e., quadratic-weighted kappa, Pearson correlation r, standardized mean score difference, score deviation analysis) were computed. The results demonstrate that (1) diverse feature sets are utilized to predict quality in the three registers, and (2) the combination of register-related factors and linguistic features increases the accuracy and validity of all human and automated scoring models, especially for the registers of informative and opinion writing. The findings from this study suggest that students' register knowledge and register-specific motivation add predictive information when evaluating writing quality across registers beyond that afforded by linguistic features of the paper itself, whether using human scoring or automated evaluation.
These findings have practical implications for educational practitioners and scholars: they strengthen the case for attending to register-specific writing skills, and to the cognitive and motivational forces, that are essential components of effective writing instruction and assessment.
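The validity coefficients named in the abstract (quadratic-weighted kappa, Pearson r, and standardized mean score difference between human and automated scores, averaged over tenfold cross-validation) can be sketched as follows. This is a minimal illustration with simulated essay scores, not the authors' pipeline: the Ridge regression model, the 0–6 score scale, and the eight-feature design are assumptions made only for the sketch.

```python
# Minimal sketch of the validity metrics described above, using simulated
# data. Ridge regression, the 0-6 score scale, and the feature set are
# illustrative assumptions, not the study's actual model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))  # hypothetical linguistic/register features
# Simulated human holistic scores on a 0-6 scale, driven by three features
human = np.clip(np.round(X[:, :3].sum(axis=1) + rng.normal(size=300) + 3), 0, 6)

# Tenfold cross-validated predictions from a regression-based scoring model
pred = cross_val_predict(Ridge(), X, human,
                         cv=KFold(n_splits=10, shuffle=True, random_state=0))
pred_round = np.clip(np.round(pred), 0, 6).astype(int)

# Quadratic-weighted kappa on the rounded (discrete) scores
qwk = cohen_kappa_score(human.astype(int), pred_round, weights="quadratic")
# Pearson correlation between human and predicted scores
r = np.corrcoef(human, pred)[0, 1]
# Standardized mean score difference (machine minus human, in human SD units)
smd = (pred.mean() - human.mean()) / human.std(ddof=1)
print(f"QWK={qwk:.2f}  r={r:.2f}  SMD={smd:+.2f}")
```

In operational AWE validation, thresholds such as QWK ≥ 0.70 and |SMD| ≤ 0.15 are commonly used benchmarks for human–machine agreement.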

Journal: Written Communication
Published: 2023-10-01
DOI: 10.1177/07410883231185287
Open Access: Closed
