
The Self-Revised Essays

24 Jul 2017

At Kingsborough Community College, developmental English and ESL courses address both reading and writing, reflecting the view that the two are inextricably connected, and curricula are sometimes organized around a course theme.

For example, one semester, my ESL students and I explored the theme of trust through reading and discussing stories where a character’s trust is betrayed. That semester we read Anita Shreve’s The Pilot’s Wife, which focuses on trust between a husband and wife; Mark Haddon’s The Curious Incident of the Dog in the Night-time, where a breach of trust damages the relationship between a teenager with Asperger’s Syndrome and his father; and S. E. Hinton’s That Was Then, This Is Now, where a teenager betrays his best friend’s trust to do what he believes is right. Reading and discussing these texts provided the class with opportunities to think about the notion of trust by exploring breaches of trust that occur for a variety of reasons and within a variety of relationships.

At the end of the semester, reading ability is assessed through short-answer departmental reading exams given at each level, and writing ability is assessed through portfolios. Portfolios consist of two drafted essays with a minimum of three drafts each, a letter or essay in which students reflect on themselves as writers, and an in-class, end-of-the-semester, timed essay exam. One week in advance of the essay exam, students are given a reading, which they are free to discuss, annotate, and bring with them to the exam. The exam asks students to respond to one of two essay prompts based on the reading, and students must cite from the reading to support their responses.

Portfolios are assessed by instructors who are normed in order to increase assessment reliability.

Norming is especially important, as the purpose of portfolio assessment is not only to determine whether students’ work meets course objectives, but also to inform decisions about students’ next placement within the developmental sequence. In addition, when assessing portfolios at the end of each semester, instructors pair up with “portfolio partners” and exchange class portfolios so that instructors’ assessments are not biased by background knowledge of their own students.

Although readers respond to a single rubric and are expected to assess the portfolio as a whole, it has been my experience that such holistic scoring is extremely difficult. Hamp-Lyons and Condon, who surveyed portfolio readers at the University of Michigan, discuss the difficulty of holistic portfolio assessment: “Multiple texts, unless texts are so close in kind and quantity that they are virtually identical, inevitably force readers to consider one text in the light of another, to weigh one against the other, and to make a decision that, while representing a judgment about the whole portfolio, is grounded in a weighing of the parts, rather than in a dominant impression of the whole. In such cases, decisions become harder, not easier” (180).

As a result, they found that readers may look for short cuts to decision making—short cuts that often involve not considering all parts of the portfolio. Hamp-Lyons and Condon consider this strategy for reducing the cognitive load of holistic portfolio assessment “a human trait”, not an indication of a lack of training or professionalism. I found their view comforting, since I have to admit to taking such a short cut myself. In my case, the short cut was motivated by a lack of confidence in all but the timed essay. My lack of faith in the drafted essays and reflective writing stems from a number of concerns.

First, for none of these writing samples is sole authorship by the student guaranteed. Even if we dismiss, for the sake of discussion, the very real possibility that students sometimes hand in work that is not their own, sole authorship may be compromised by instructor feedback. Hamp-Lyons and Condon recognized this problem of instructor input, reporting that portfolio readers in their study “were aware of the part they played as instructors in improving their own students’ texts, and that this led them to be suspicious when they saw significantly better revised texts than impromptu writing in portfolios from other classes”.

Hamp-Lyons and Condon speculated that inclusion of all drafts might solve the problem, but, in my case, it was looking at the earlier drafts themselves that made the problem salient.

I found that, depending on the instructor, feedback on student drafts can range from scant to ubiquitous; it can take the form of questions for students to consider, suggestions for revision, directives for revision, or, in some cases, rewrites in the handwriting of the instructor (though I realize that this last form of feedback does not necessarily reflect the words of the instructor, as it can instead represent those of the student during instructor-student conferencing).

Editing feedback is likewise variable, with some instructors suggesting that students review papers for certain mechanical problems, others identifying each problem directly, and still others actually making the corrections for the students. Variability also exists with respect to when editing comments appear; while most English faculty at Kingsborough refrain from editing until the penultimate draft (except in cases where global errors severely impede understanding), others begin editing comments in earlier drafts.

If instructor feedback on portfolio work raises questions about what students can do independently, variability across sections of a course poses an additional problem. Assuming, as is the case at Kingsborough, that faculty are normed for portfolio assessment, they are presumably assessing student work across individual classes taught by different instructors using the same criteria at the same course level. It seems to me that a student’s work can be privileged or disadvantaged when compared to the work of other students taking the same course, depending on the nature of the feedback the student received from his or her instructor. Program directors at Kingsborough are currently addressing this issue through faculty development focused on instructor comments, but given the large number of instructors, both full-time and part-time, who teach portfolio courses, establishing a departmental approach to feedback presents a real challenge.

Even in those cases where it is clear that instructor feedback is not compromising assessment, portfolio readers can be confident only that final drafts do not reflect the work of the instructor; they still do not know the degree to which final drafts—or any drafts for that matter—reflect what students can do on their own. This is not to say that students who are not the sole authors of their essays are necessarily being dishonest.

Portfolios reflect a pedagogy that recognizes the social aspect of writing; the notion of “workshopping” essays, and the peer review and discussion of student papers that is inherent in it, promotes student collaboration.

However, students, especially those at community colleges, often need time to become acculturated to the practices and values of academia; they do not always recognize the sometimes subtle distinctions between collaboration and plagiarism.

And, unfortunately, I have found that given the high-stakes nature of portfolios and the frustration that often accompanies working through developmental course sequences, some students, in desperation, do at times intentionally resort to plagiarism.

So, every time I faced a portfolio, I found myself reading the timed essay first. I thought that if students could successfully integrate and cite source material (a benefit of the practice of giving students a prior reading) and demonstrate some level of analysis in a coherent and well-organized essay under timed testing conditions, then I had no reason to read any further. My decision to pass them was made.

I did read the rest of their portfolios, however, so that I could provide more informed feedback that recognized and encouraged each student’s strengths and offered suggestions for addressing weaknesses.

But the drafted essays did not inform my ultimate decision.

If a student’s timed essay did not merit a “pass” on its own, I truly struggled. If instructor feedback was not very directive, I simply trusted that what I was reading was the student’s own work, and assessed it accordingly. If, however, instructor feedback was highly directive, I often resorted to comparing first drafts to each other, hoping that first drafts written later in the semester were stronger than those written earlier in the semester, so that I could identify student progress.

Reflective writings did not offer much help. When these are done well, authorship is not likely to be in question, but it has been my experience that most students in developmental courses struggle with these pieces. Most of the reflections I have read focused less on the students’ own work and more on the English courses the students were taking at the time—an issue that program directors at Kingsborough are currently addressing through faculty development focused on preparing students to write and revise reflective pieces—but I believe that the metacognitive skills needed for reflection and self-assessment make these activities particularly challenging for students in developmental courses.

In other cases, genuine self-assessments were not always supported by the student’s work. Even White, arguing for grounding assessment in reflective writing in “The Scoring of Writing Portfolios: Phase 2,” notes that “the reflective letter is a genre itself, and a difficult one to do well; thus it adds a new burden to both the preparation and scoring of portfolios” (594).

In short, with the timed essay, the two drafted essays, and the reflective piece, I found that portfolios often offered “too much information” and “not enough information” at the same time. Taking to heart White’s admonition that “it is wasteful and intrusive to gather more information than we can well use” (“An Apologia” 33) and Brian Huot’s call for the development of new procedures for writing assessment that link “instruction and practical purposes with the concept of measuring students’ ability to engage in a specific literary event or events” (561), I suggest a method of assessing student writing that captures, I believe, the best of portfolios and timed writing exams—the self-revised essay.

Reference

Graziano-King, Janine. Journal of Basic Writing, Vol. 26, No. 2, 2007, p. 75.
