Assessing Writing

15 articles

January 2026

  1. Assessing the effects of explicit coherence instruction on EFL students’ integrated writing performance
    Abstract

    As a key attribute of effective writing, coherence remains challenging to teach in language classrooms, with traditional writing instruction frequently overlooking coherence in favor of discrete, rule-based features. This mixed-methods study investigates the effectiveness of explicit coherence instruction on English-as-a-Foreign-Language (EFL) students’ performance on integrated writing tasks. The study employed a controlled experimental design with 64 upper-intermediate undergraduate students at a Chinese university, drawing on Hasan’s Cohesive Harmony theory as the theoretical framework. The experimental group (n = 32) received explicit instruction on coherence, with a focus on cohesive chains and cohesive devices in integrated writing, while the control group (n = 32) received standard paraphrasing instruction. Quantitative analysis revealed that the experimental group showed significant improvements in coherence scores and in multiple cohesive chain measures. Qualitative discourse analysis of six students’ writing samples from the experimental group demonstrated varying levels of improvement in writing coherence, with high-performing students showing better use of identity chains and pronoun references. The findings indicate that explicit coherence instruction significantly improved students’ ability to produce coherent integrated writing, particularly through the development of cohesive chains and the appropriate use of cohesive devices. This study underscores the pedagogical value of teaching coherence to enhance writing quality and provides concrete strategies for developing more effective teaching approaches for integrated writing tasks in EFL contexts.

    • The study examined 64 Chinese EFL students using a mixed-methods experimental design.
    • Cohesive Harmony theory served as the framework for assessing writing coherence.
    • Explicit instruction significantly improved coherence in integrated writing tasks.
    • High-performing students demonstrated superior identity chain development.

    doi:10.1016/j.asw.2026.101019

April 2025

  1. The influence of working memory and proficiency on phraseological growth: A longitudinal study of adjective-noun combinations in Chinese EFL learners’ argumentative writing
    doi:10.1016/j.asw.2025.100915
  2. Designing a rating scale for an integrated reading-writing test: A needs-oriented approach
    Abstract

    To meet current trends in higher education, EAP programmes are held accountable for preparing students for, and assessing their readiness for, higher education. Thus, multimodal tasks, including integrated writing (IW) assessments, have seen a resurgence because they arguably mirror academic writing closely. However, test practicality constraints and variability in the use and format of these assessments mean that rating scales often fall short of substantiating the central claims of IW assessment. We developed an integrated reading-writing scale, taking into account reading-writing requirements and empirical research on IW tests, designed to assess readiness for first-year humanities and social science courses. We approached test development as part of ongoing validation efforts, detailing the considerations involved in the scale development process. We argue that alignment with academic writing requirements should guide the development of IW tests, thereby acknowledging the nuances of academic writing. The paper demonstrates the considerations and decisions involved in scale design as a validation process from the start, a reminder that assessment is not just a quantitative exercise but a multifaceted process.

    • The design of a rating scale for first-year undergraduate academic writing is detailed.
    • Emphasis is placed on the role of reading in integrated writing scales.
    • Academic argumentation, rather than solely source-use mechanics, is considered.
    • Implications for construct operationalisation in academic evaluations are offered.

    doi:10.1016/j.asw.2025.100918

October 2024

  1. A comparative study of voice in Chinese English-major undergraduates’ timed and untimed argument writing
    doi:10.1016/j.asw.2024.100896

April 2023

  1. Human scoring versus automated scoring for English learners in a statewide evidence-based writing assessment
    doi:10.1016/j.asw.2023.100719

April 2022

  1. The trajectory of syntactic complexity development in L1 Chinese narrative writings of primary school children: A systematic 5-year longitudinal study
    doi:10.1016/j.asw.2022.100622

October 2021

  1. Perceptions of the inclusion of Automatic Writing Evaluation in peer assessment on EFL writers’ language mindsets and motivation: A short-term longitudinal study
    doi:10.1016/j.asw.2021.100568

April 2021

  1. Improving student feedback literacy in academic writing: An evidence-based framework
    doi:10.1016/j.asw.2021.100525

July 2020

  1. Feedback scope in written corrective feedback: Analysis of empirical research in L2 contexts
    doi:10.1016/j.asw.2020.100469

January 2019

  1. An investigation of the text features of discrepantly-scored ESL essays: A mixed methods study
    doi:10.1016/j.asw.2018.10.003

April 2017

  1. Improvement of writing skills during college: A multi-year cross-sectional and longitudinal study of undergraduate writing performance
    doi:10.1016/j.asw.2016.11.001

October 2015

  1. Building a better rubric: Mixed methods rubric revision
    doi:10.1016/j.asw.2015.07.002
  2. Examining instructors’ conceptualizations and challenges in designing a data-driven rating scale for a reading-to-write task
    doi:10.1016/j.asw.2015.06.001

January 2010

  1. Investigating learners’ use and understanding of peer and teacher feedback on writing: A comparative study in a Chinese English writing classroom
    doi:10.1016/j.asw.2010.01.002

January 2004

  1. A comparative study of ESL writers’ performance in a paper-based and a computer-delivered writing test
    doi:10.1016/j.asw.2004.01.001