Abstract

A particular application of corpus analysis, automated essay scoring (AES) can reveal much about students’ writing skills. In this article we present research undertaken at Educational Testing Service (ETS) as part of its ongoing commitment to developing effective AES systems. AES systems have certain advantages. They can: (a) produce scores similar to those assigned by trained human raters, (b) provide a single consistent metric for scoring, and (c) automate linguistic analyses. However, to understand student writing, we may need to look beyond the final essay and consider both the process and the product. By broadening our definition of corpora to capture the dynamics of written composition, it may become possible to identify profiles of writing behavior.

Journal
Journal of Writing Research
Published
2010-08-01
DOI
10.17239/jowr-2010.02.02.4
Open Access
OA PDF Diamond
