The Internet TESL Journal

Correcting Students' Writing

Bryan Murphy
elczin [at]
English Language Centre, Assumption University, Bangkok, Thailand

A version of this report appeared in 1994 in LIPS (Language Institute People Speak), an in-house publication at City Polytechnic of Hong Kong.

Little research seems to have been done on the effectiveness or otherwise of the ways in which teachers "correct" student compositions. This is hardly surprising, since it is hard enough to measure progress in writing skill, let alone relate it to specific teacher behaviour. The only relevant study I have come across concerned itself with whether the kinds of correction and comment matched the students' expectations.

Since most of us habitually spend hours dealing with our students' compositions, and I for one often wonder to what extent that precious time is being wasted, I decided to try to get more than merely anecdotal feedback on various correcting strategies.

Rather than eliciting spoken comments, I asked a small first-year class at City Polytechnic which had recently had a mid-semester composition-in-class returned to it to take out that composition a week later, look at my corrections and comments, and highlight any they had found useful. I hoped, incidentally, that this "second look" would prove a useful activity for them in itself.

Table 1, below, gives a breakdown of the types of "correction" they found useful.

Table 1. Number of students who deemed a particular type of correction useful

  Verb error corrected                     5
  Explanation                              5
  Vocabulary error corrected               4
  Morphological error corrected            3
  Style adjusted                           2
  Ticks                                    2
  Comment in margin                        1
  Final comment                            0
  Mistake indicated but not corrected      0
  Final mark                               0

The most striking result was that comparatively few of the corrections, fewer than 10%, were considered useful. Mistakes had to be corrected for my intervention to have a chance of being considered useful: challenging the students to think by indicating a mistake but not offering a correction was not seen as useful. My attempts to encourage students by ticking "good bits" were rarely seen as useful, and the attempt to encourage them in final comments appeared to have made no impact.

I did not analyse the scripts in sufficient detail to turn the raw scores into percentages, which would have been even more useful for comparative purposes, but a glance was enough to suggest that the most appreciated strategy was to offer an explanation as well as a correction.

I do not wish to read too much into these results. Apart from the inherent statistical unreliability of such a small sample, they beg the question of whether what students perceive as being useful is what actually helps them most. Nevertheless, in the future, I think I shall spend a bit more time writing cryptic explanations, and less time searching for something positive and helpful to say at the end.
In order to justify spending some class time on this minor survey, I did it all in English. It is hard enough writing questionnaires and other inquiry tools that respondents clearly understand in their own language. When they have to read and respond in a language foreign to them, the pitfalls are even greater. I discovered this when I discussed the results with the class in question, and learnt that some of them had not understood that they were to consider final comments as a type of "correction" for the purposes of the survey. It transpired that they felt such comments were indeed valuable. So it looks as though I shall have to give up something other than final comments in order to have more time for cryptic explanations!

Please send any comments by e-mail to the author.

The Internet TESL Journal, Vol. III, No. 2, February 1997