Abstract
A particular application of corpus analysis, automated essay scoring (AES) can reveal much about students’ writing skills. In this article we present research undertaken at Educational Testing Service (ETS) as part of its ongoing commitment to developing effective AES systems. AES systems have certain advantages. They can: (a) produce scores similar to those assigned by trained human raters, (b) provide a single consistent metric for scoring, and (c) automate linguistic analyses. However, to understand student writing, we may need to look beyond the final essay in various ways, to consider both the process and the product. By broadening our definition of corpora to capture the dynamics of written composition, it may become possible to identify profiles of writing behavior.
- Journal: Journal of Writing Research
- Published: 2010-08-01
- DOI: 10.17239/jowr-2010.02.02.4
- Open Access: OA PDF (Diamond)
Related Articles
- Journal of Writing Research, Feb 2026. Daniël Janssen; Henri Raven; Lisanne Van Weelden; Yohannes Den Hertog
- Journal of Business and Technical Communication, Apr 2023. Corpus Linguistics and Technical Editing: How Corpora Can Help Copy Editors Adopt a Rhetorical View of Prescriptive Usage Rules. Jordan Smith
- IEEE Transactions on Professional Communication, Dec 2021. Catherine G. P. Berdanier; Mary McCall; Gracemarie Mike Fillenwarth
- Journal of Academic Writing, Dec 2020. Caroline Anne Dyche; Jessie Antwi-Cooper
- Rhetorica, Mar 2018. Luigi Spina