College Composition and Communication
36 articles
September 2023
-
Abstract
Cultural rhetorics—as orientation, methodology, and practice—has made meaningful contributions to writing pedagogy (Brooks-Gillies et al.; Cedillo and Bratta; Baker-Bell; Cedillo et al.; Cobos et al.; Condon and Young; Powell). Despite these contributions, classroom teachers and writing program administrators can struggle to conceptualize assessment beyond bureaucratic practice and their role in assessment beyond standing in loco for the institution. To more fully realize the potential of cultural rhetorics in our classrooms and programs, the field needs assessment models that seek to uncover the counterstories of writing and meaning-making. Our work, at the intersections of queer rhetorics and writing assessment, provides a theoretical framework called Queer Validity Inquiry (QVI) that disrupts stock stories of success—a success that is always available to some at the expense of others. Through four diffractive lenses—failure, affectivity, identity, and materiality—QVI prompts us to determine what questions about student writers and their writing intrigue us, why we care about them, and whose interests are being served by those questions.
December 2022
-
Interchanges: A Kairotic Moment for CLA? Response to Anne Ruggles Gere et al.’s “Communal Justicing: Writing Assessment, Disciplinary Infrastructure, and the Case for Critical Language Awareness”
February 2021
-
Communal Justicing: Writing Assessment, Disciplinary Infrastructure, and the Case for Critical Language Awareness
Abstract
Critical language awareness offers one approach to communal justicing, an iterative and collective process that can address inequities in the disciplinary infrastructure of Writing Studies. We demonstrate justicing in the field’s pasts, policies, and publications; offer a model of communal revision; and invite readers to become agents of communal justicing.
December 2017
-
Collaborative Ecologies of Emergent Assessment: Challenges and Benefits Linked to a Writing-Based Institutional Partnership
Abstract
This essay reports on a writing-based formative assessment of a university-wide initiative to enhance students’ global learning. Our mixed (and unanticipated) results show the need for enhanced expertise in writing assessment as well as for sustained partnerships among diverse institutional stakeholders so that public programming—from events linked to classroom-level learning to broader cross-unit mandates like accreditation—can yield more rigorous, responsive, and mixed-method assessments.
February 2016
-
Abstract
This article shares our experience designing and deploying writing assessment in English Composition I: Achieving Expertise, the first-ever first-year writing Massive Open Online Course (MOOC). We argue that writing assessment can be effectively adapted to the MOOC environment and that doing so reaffirms the importance of mixed-methods approaches to writing assessment and drives writing assessment toward a more individualized, learner-driven, and learner-autonomous paradigm.
June 2014
-
The Legal and the Local: Using Disparate Impact Analysis to Understand the Consequences of Writing Assessment
Abstract
In this article, we investigate disparate impact analysis as a validation tool for understanding the local effects of writing assessment on diverse groups of students. Using a case study data set from a university that we call Brick City University, we explain how Brick City’s writing program undertook a self-study of its placement exam using the disparate impact process followed by the Office for Civil Rights of the US Department of Education. This three-step process includes analyzing placement rates through (1) a threshold statistical analysis, (2) a contextualized inquiry to determine whether the placement exam meets an important educational objective, and (3) a consideration of less discriminatory assessment alternatives. By employing such a process, Brick City re-conceptualized the role of placement testing and basic writing at the university in a way that was less discriminatory for Brick City’s diverse student population.
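The three-step process described in this abstract begins with a threshold statistical analysis of placement rates. As a rough illustration of what that first step can look like, the sketch below applies the four-fifths rule, one common disparate-impact threshold; the group names, numbers, and 0.8 cutoff are invented for illustration and are not drawn from the Brick City data or from the article’s own procedure.

```python
# A minimal, hypothetical sketch of step (1), the threshold statistical
# analysis: compare rates of placement into the mainstream (non-basic)
# first-year writing course across groups using the four-fifths rule.
# All names and numbers are illustrative, not Brick City data.

tested = {"Group A": 400, "Group B": 150}      # students who took the exam
mainstream = {"Group A": 360, "Group B": 105}  # placed into mainstream FYW

# Favorable-outcome (mainstream placement) rate for each group
rates = {group: mainstream[group] / tested[group] for group in tested}

# Compare each group's rate to the highest rate; a ratio below 0.8
# flags possible disparate impact, prompting the contextualized
# inquiry of step (2) and the search for alternatives in step (3).
highest = max(rates.values())
for group, rate in sorted(rates.items()):
    ratio = rate / highest
    status = "possible disparate impact" if ratio < 0.8 else "within threshold"
    print(f"{group}: mainstream rate {rate:.1%}, ratio {ratio:.2f} ({status})")
```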
February 2014
-
Abstract
Writing Assessment in the 21st Century: Essays in Honor of Edward M. White, Norbert Elliot and Les Perelman, eds.
Race and Writing Assessment, Asao B. Inoue and Mya Poe, eds.
Writing Assessment and the Revolution in Digital Texts and Technologies, Michael R. Neal
Digital Writing: Assessment and Evaluation, Heidi A. McKee and Danielle Nicole DeVoss, eds.
June 2013
-
Abstract
Grounded in the principle that writing assessment should be locally developed and controlled, this article describes a study that contextualizes and validates the decisions that students make in the modified Directed Self-Placement (DSP) process used at the University of Michigan. The authors present results of a detailed text analysis of students’ DSP essays, showing key differences between the writing of students who self-selected into a mainstream first-year writing course and that of students who self-selected into a preparatory course. Using both rhetorical move analysis and corpus-based text analysis, the examination provides information that can, in addition to validating student decisions, equip students with a rhetorically reflexive awareness of genre and offer an alternative to externally imposed writing assessment.
June 2012
-
Abstract
Books discussed in this essay:
Reframing Writing Assessment to Improve Teaching and Learning, Linda Adler-Kassner and Peggy O’Neill
Going Public: What Writing Programs Learn from Engagement, Shirley K. Rose and Irwin Weiser, editors
The Public Work of Rhetoric: Citizen-Scholars and Civic Engagement, John M. Ackerman and David J. Coogan, editors
Activism and Rhetoric: Theories and Contexts for Political Engagement, Seth Kahn and JongHwa Lee, editors
February 2012
February 2011
-
Abstract
I use Burkean analysis to show how neoliberalism undermines faculty assessment expertise and underwrites testing industry expertise in the current assessment scene. Contending that we cannot extricate ourselves from our limited agency in this scene until we abandon the familiar “stakeholder” theory of power, I propose a rewriting of the assessment scene that asserts faculty and student agency and leadership for writing assessment.
June 2010
-
Abstract
Effective Grading: A Tool for Learning and Assessment in College, 2nd ed., Barbara E. Walvoord and Virginia Johnson Anderson. San Francisco: Jossey-Bass, 2010. 255 pp.
A Guide to College Writing Assessment, Peggy O’Neill, Cindy Moore, and Brian Huot. Logan: Utah State University Press, 2009. 218 pp.
Organic Writing Assessment: Dynamic Criteria Mapping in Action, Bob Broad, Linda Adler-Kassner, Barry Alford, Jane Detweiler, Heidi Estrem, Susanmarie Harrington, Maureen McBride, Eric Stalions, and Scott Weeden. Logan: Utah State University Press, 2009. 167 pp.
Teaching and Evaluating Writing in the Age of Computers and High-Stakes Testing, Carl Whithaus. Mahwah, NJ: Erlbaum, 2005. 169 pp.
Composition in Convergence: The Impact of New Media on Writing Assessment, Diane Penrod. Mahwah, NJ: Erlbaum, 2005. 184 pp.
September 2009
-
Abstract
As writing-program administrators and faculty are being called upon more frequently to help design and facilitate large-scale assessments, it becomes increasingly important for us to see assessment as integral to our work as academics. This article provides a framework, based on current historical, theoretical, and rhetorical knowledge, to help writing specialists understand how to embrace assessment as a powerful mechanism for improved teaching and learning at their institutions.
February 2009
-
Abstract
This essay describes Louisiana State University’s search for an alternative to available placement protocols. Under the leadership of Les Perelman at MIT, LSU collaborated with four universities to develop iMOAT, a program for administering online assessments of student writing. This essay focuses on LSU’s On-line Challenge, which developed from the iMOAT project. The On-line Challenge combines direct and indirect writing assessments with student choice while freeing students from the constraints of time and place to invite new possibilities for assessing writing.
December 2008
-
Abstract
In a FIPSE-funded assessment project, a group of diverse institutions collaborated on developing a common, course-embedded approach to assessing student writing in our first-year writing programs. The results of this assessment project, the processes we developed to assess authentic student writing, and individual institutional perspectives are shared in this article.
September 2008
-
Abstract
“Closed Systems and Standardized Writing Tests” by Chris M. Anson; “Information Illiteracy and Mass Market Writing Assessments” by Les Perelman; “Genre, Testing, and the Constructed Realities of Student Achievement” by Mya Poe; “The Call of Research: A Longitudinal View of Writing Development” by Nancy Sommers.
December 2007
-
Portfolio Partnerships between Faculty and WAC: Lessons from Disciplinary Practice, Reflection, and Transformation
Abstract
In portfolio assessment, WAC helps other disciplines increase programmatic integrity and accountability. This analysis of a portfolio partnership also shows composition faculty how a dynamic culture of assessment helps us protect what we do well, improve what we need to do better, and solve problems as writing instruction keeps pace with programmatic change.
June 2005
-
Abstract
Assessment, including writing assessment, is a form of social action. Because standardized tests can be used to reify the social order, local assessments that take into account specific contexts are more likely to yield useful information about student writers. This essay describes one such study, a multiple-measure comparison of accelerated summer courses with nonaccelerated courses. We began with the assumption that the accelerated courses would probably not be as effective as the longer courses, but our assessment found that assumption to be largely incorrect. Contextual information made it clear that students were taking summer accelerated courses strategically, for reasons we had been unaware of and in ways that forced us to reinterpret their writing and our courses.
-
Abstract
Although most portfolio evaluation currently uses some adaptation of holistic scoring, the problems with scoring portfolios holistically are many, far greater than for essays, and not readily resolvable. Indeed, many aspects of holistic scoring work against the principles behind portfolio assessment. We have from the start needed a scoring methodology that responds to and reflects the nature of portfolios, not merely an adaptation of essay scoring. I here propose a means for scoring portfolios that allows for relatively efficient grading where portfolio scores are needed and where time and money are in short supply. It is derived conceptually from portfolio theory rather than essay-testing theory and supports the key principle behind portfolios: that students should be involved in reflection on and assessment of their own work. It is time for the central role that reflective writing can play in portfolio scoring to be put into practice.
February 2004
-
Reviews: (Re)Articulating Assessment: Writing Assessment for Teaching and Learning by Brian Huot
February 1999
-
Looking Back as We Look Forward: Historicizing Writing Assessment
December 1996
October 1995
-
Review: Uncovering Possibilities for a Constructivist Paradigm for Writing Assessment
-
Writing Assessment: A Position Statement
May 1994
-
Adventuring into Writing Assessment
May 1992
-
A Selected Bibliography on Postsecondary Writing Assessment, 1979-1991
May 1990
-
Abstract
I recently attended a conference previously unknown to me and to most college English faculty: the Assessment Forum of the American Association for Higher Education (AAHE). (I was there to give a paper on the measurement of writing ability and on the evaluation of writing programs.) The experience of that conference ought to have been routine; after all, I have directed a variety of large-scale writing programs and have been speaking and publishing on writing assessment for over fifteen years; I have also spent many years as chair of an English department and as a writing program administrator. But the experience of hearing papers and discussions at that conference was not at all routine; it was both troubling and enlightening, as well as quite new in unexpected ways.

My first reaction to the sessions on writing measurement at AAHE was that I had entered a new world. The papers not only made different assumptions about writing than I, as a writing teacher, writer, and researcher, normally make, but came out of a wholly different scholarly community of discourse, one that calls itself the assessment movement. The references were entirely unfamiliar, the procedures were different, and the approach to the subject struck me as insensitive to what writing is all about. But all of these differences seemed to center on the way people spoke (and hence thought) about measurement: I was in a foreign country, the language was different, and that difference changed everything. I had entered a new discourse community in a field in which I was a well-published specialist, and none of my knowledge or experience seemed to matter. And yet the discourse was about measuring writing ability and evaluating writing programs, that is, about what has (however accidentally) become my specialty. I felt disoriented.

When I returned home from AAHE I found a flier from Jossey-Bass, publisher of my 1985 book, Teaching and Assessing Writing. I don't expect the book to appear on every flier the marketing division puts out, but this little
May 1987
October 1986
-
A Procedure for Writing Content-Fair Essay Examination Topics for Large-Scale Writing Assessments