Assessing Writing

20 articles
Topic: digital rhetoric

January 2026

  1. The effects of online resource use on L2 learners’ computer-mediated writing processes and written products
    Abstract

    While previous studies on online resource use in L2 writing have focused on overall writing quality, limited attention has been paid to its effects on linguistic complexity and real-time writing processes. Addressing this gap, the present study explored how online resource use influences both the processes and products of L2 writing. Forty-nine intermediate L2 learners completed two computer-mediated argumentative writing tasks, either with or without the use of online resources. Writing behaviors were captured via keystroke logging and screen recording, and analyzed for search activity, fluency, pausing, and revision quantity. Cognitive processes were examined through stimulated recall interviews, and written products were evaluated for both quality and linguistic complexity. The results showed that participants spent an average of 14% of task time using online resources, with considerable individual variation. Mixed-effects modeling revealed that resource use facilitated the production of more sophisticated words, with marginal influence on writing quality or syntactic complexity. Resource use was also associated with longer between-word pauses, fewer within-word pauses, and reduced revisions. These findings highlight the potential of online resource use to enhance the authenticity of L2 writing assessment tasks without compromising test validity, while encouraging the use of more advanced vocabulary in writing.

    Highlights

    • Learners spent 14% of the total writing task time using online resources.
    • Online resource use had no significant impact on L2 writing quality.
    • Online resource use improved lexical sophistication, not syntactic complexity.
    • Online resource use reduced within-word pauses and aided spelling retrieval.
    • Online resource use led to fewer revisions but did not affect fluency.

    doi:10.1016/j.asw.2025.100994

July 2024

  1. Comparing Chinese L2 writing performance in paper-based and computer-based modes: Perspectives from the writing product and process
    doi:10.1016/j.asw.2024.100849

July 2023

  1. The development of teacher feedback literacy in situ: EFL writing teachers’ endeavor to human-computer-AWE integral feedback innovation
    doi:10.1016/j.asw.2023.100739
  2. What skills are being assessed? Evaluating L2 Chinese essays written by hand and on a computer keyboard
    doi:10.1016/j.asw.2023.100765
  3. Comparing computer-based and paper-based rating modes in an English writing test
    doi:10.1016/j.asw.2023.100771

January 2022

  1. Cognitive validity evidence of computer- and paper-based writing tests and differences in the impact on EFL test-takers in classroom assessment
    doi:10.1016/j.asw.2021.100594
  2. Constructing a data-based analytic rubric for an academic blog post
    doi:10.1016/j.asw.2021.100602

October 2021

  1. Investigating the authenticity of computer- and paper-based ESL writing tests
    doi:10.1016/j.asw.2021.100548

October 2020

  1. TOEIC® Writing test scores as indicators of the functional adequacy of writing in the international workplace: Evaluation by linguistic laypersons
    Abstract

    This study examines the extent to which TOEIC Writing test scores relate to an external criterion: evaluations by linguistic laypersons of the functional adequacy of writing in the international workplace. Test-taker responses to two representative tasks from the TOEIC Writing test (e-mail requests, opinion surveys) were adapted for workplace role-play scenarios that laypersons read and evaluated in an online survey. After reading each role-play scenario, laypersons evaluated the text produced by their imagined interlocutor using functional adequacy scale items (comprehensibility, content adequacy, effectiveness, support and coherence). Overall functional adequacy evaluations were obtained by averaging the ratings for each of the two tasks. Layperson ratings of functional adequacy were strongly correlated with TOEIC Writing test scores (r = 0.76). Results suggested that test-takers’ writing performance is likely to be perceived as functionally adequate for test scores at which important decisions are typically made. Study results are discussed in terms of their implications for claims about the generalizability of TOEIC Writing test score interpretations with respect to those made in the international workplace, as well as the potential benefits, challenges, and limitations involved in this approach to validation.

    doi:10.1016/j.asw.2020.100492

April 2018

  1. Examining the comparability between paper- and computer-based versions of an integrated writing placement test
    doi:10.1016/j.asw.2018.03.006
  2. Paper-based vs computer-based writing assessment: divergent, equivalent or complementary?
    doi:10.1016/j.asw.2018.04.001
  3. Researching the comparability of paper-based and computer-based delivery in a high-stakes writing test
    doi:10.1016/j.asw.2018.03.008
  4. The effects of writing mode and computer ability on L2 test-takers' essay characteristics and scores
    doi:10.1016/j.asw.2018.02.005

January 2014

  1. The effects of computer-generated feedback on the quality of writing
    doi:10.1016/j.asw.2013.11.007

January 2011

  1. Effects of computer versus paper administration of an adult functional writing assessment
    doi:10.1016/j.asw.2010.11.001

January 2009

  1. Classroom computer experiences that stick: Two lenses on reflective timed essays
    doi:10.1016/j.asw.2009.09.001

January 2006

  1. Effects of composition mode and self-perceived computer skills on essay scores of sixth graders
    doi:10.1016/j.asw.2006.11.003

January 2004

  1. A comparative study of ESL writers’ performance in a paper-based and a computer-delivered writing test
    doi:10.1016/j.asw.2004.01.001

January 2002

  1. A comparison of composing processes and written products in timed-essay tests across paper-and-pencil and computer modes
    doi:10.1016/s1075-2935(03)00003-5

January 1995

  1. Animadversions on writing assessment and hypertext
    doi:10.1016/1075-2935(95)90003-9