Assessing Writing
20 articles · January 2026
The effects of online resource use on L2 learners’ computer-mediated writing processes and written products
Abstract
While previous studies on online resource use in L2 writing have focused on overall writing quality, limited attention has been paid to its effects on linguistic complexity and real-time writing processes. Addressing this gap, the present study explored how online resource use influences both the processes and products of L2 writing. Forty-nine intermediate L2 learners completed two computer-mediated argumentative writing tasks, either with or without the use of online resources. Writing behaviors were captured via keystroke logging and screen recording, and analyzed for search activity, fluency, pausing, and revision quantity. Cognitive processes were examined through stimulated recall interviews, and written products were evaluated for both quality and linguistic complexity. The results showed that participants spent an average of 14% of task time using online resources, with considerable individual variation. Mixed-effects modeling revealed that resource use facilitated the production of more sophisticated words, with marginal influence on writing quality or syntactic complexity. Resource use was also associated with longer between-word pauses, fewer within-word pauses, and reduced revisions. These findings highlight the potential of online resource use to enhance the authenticity of L2 writing assessment tasks without compromising test validity, while encouraging the use of more advanced vocabulary in writing.

Highlights
• Learners spent 14% of the total writing task time using online resources.
• Online resource use had no significant impact on L2 writing quality.
• Online resource use improved lexical sophistication, not syntactic complexity.
• Online resource use reduced within-word pauses and aided spelling retrieval.
• Online resource use led to fewer revisions but did not affect fluency.
TOEIC® Writing test scores as indicators of the functional adequacy of writing in the international workplace: Evaluation by linguistic laypersons
Abstract
This study examines the extent to which TOEIC Writing test scores relate to an external criterion: evaluations by linguistic laypersons of the functional adequacy of writing in the international workplace. Test-taker responses to two representative tasks from the TOEIC Writing test (e-mail requests, opinion surveys) were adapted into workplace role-play scenarios that laypersons read and evaluated in an online survey. After reading each role-play scenario, laypersons evaluated the text produced by their imagined interlocutor using functional adequacy scale items (comprehensibility, content adequacy, effectiveness, support, and coherence). Overall functional adequacy evaluations were obtained by averaging the ratings for each of the two tasks. Layperson ratings of functional adequacy were strongly correlated with TOEIC Writing test scores (r = 0.76). Results suggested that test-takers’ writing performance is likely to be perceived as functionally adequate at the test scores where important decisions are typically made. Study results are discussed in terms of their implications for claims about the generalizability of TOEIC Writing test score interpretations to the international workplace, as well as the potential benefits, challenges, and limitations involved in this approach to validation.