2005 IEEE International Professional Communication Conference Proceedings

0-7803-9028-8/05/$20.00 © 2005 IEEE.

The Effect of a Timed Writing Assessment on ESL Undergraduate Engineering Students

Joanne Lax, School of Electrical and Computer Engineering, Purdue University, U.S.A., [email protected]

Abstract

Pressure from engineering faculty and external accreditors to improve the quality of undergraduates' communication skills led to the development of a timed writing assessment, which has resulted in a disproportionately high failure rate among English as a Second Language (ESL) students. This paper explains why some ESL writers are unable to negotiate a timed writing instrument successfully, describes the types of rhetorical and linguistic errors they commonly display in this context, and offers suggestions for instructors of ESL writing and engineering courses to become more proactive in preparing students to write under pressure as undergraduates and in their future professional lives.

Keywords: writing tests, international students, engineering communication

The Writing Sample

There is a consensus that engineers must have good communication skills as part of the diverse skills toolbox they acquire in college. ABET 2000 requires this in criterion 3g[1]; industrial employers expect it from new hires; and engineering alumni say that undergraduates need more preparation.[2] Engineers easily spend over 50 percent of their time communicating orally and in writing, and those professionals who get promoted have the most effective communication skills.[3]

Armed with this knowledge, and with comments from engineering professors about the inadequacy of undergraduate writing, I developed a large-scale timed writing sample for students in the mandatory sophomore- and senior-level seminars in the School of Electrical and Computer Engineering (ECE) at Purdue University in West Lafayette, Indiana (for a detailed description of the writing sample, see [4]). Although large-scale, timed essay exams are not appropriate for all purposes, they are expedient for dealing with several hundred students per semester. The purpose of the writing sample is to identify, and ultimately help, those students who are unable to communicate effectively in a short time span. The 30-minute time constraint was dictated by the length of the seminars and by the writing construct being tested (the ability to communicate clearly in English).

The single prompt for the writing sample ("Physically describe a … [some type of consumer electronic device] and analyze how it works. Your audience is an educated non-engineer.") was devised largely on the basis of a faculty survey of communication activities that I conducted. Based on the survey results, description [of objects] and analysis were chosen as the rhetorical modes for the writing sample. These choices are corroborated by a large-scale survey of faculty from 34 universities, which showed that description, summary, and analysis were the modes most often needed to complete writing assigned by science and technology faculty.[5] In addition, I asked the ECE faculty to indicate the frequency with which they saw a list of major writing problems (content, organization, grammar, punctuation, spelling, vocabulary, and cohesion) in their students' writing. Cohesion and organization, in that order, were their top two concerns.

In collaboration with the coordinator of ECE's undergraduate labs, several popular consumer electronic items were selected as possible topics for the prompt. Because Purdue University has enrolled the largest number of international students of any public research university in the United States for the past several years, and because undergraduate international enrollment in ECE has been hovering around the 30 percent mark, cultural bias is considered in the selection of potential prompts: the consumer electronic device chosen must be one with which all of these students would be familiar. One topic is used at each writing sample administration, and both classes receive the same topic in the same semester. After the initial topic (the light bulb) was employed, subsequent prompts were selected based on trials with the smaller groups of test-takers doing make-up writing samples later in the semester.

The writing sample is scored holistically on a four-point scale, similar to the GRE Analytical Writing measure and the Test of Written English.[6] I hire additional raters from the Department of English; they tend to be graduate students in rhetoric and composition, often have taught English as a Second Language composition, and many have prior experience with large-scale holistic assessment of writing. After extensive norming sessions, in which all raters compare and discuss their scores on identical papers, each rater is given a stack of ten papers to evaluate. Once all of the papers have been rated once, the stacks are redistributed so that each paper receives a second rater, who does not have access to the first rater's scores. When the two scores differ, they are averaged unless the difference is larger than one point, in which case the paper goes to a third rater. A passing score on the writing sample is 1.5.
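To make the adjudication rule concrete, the following is a minimal sketch in Python of the two-rater logic described above. The function name is my own, and the way the third rating is folded in is an assumption; the paper specifies only that discrepancies larger than one point go to a third rater.

from typing import Optional

PASSING_SCORE = 1.5  # on the four-point holistic scale

def final_score(rater1: float, rater2: float,
                rater3: Optional[float] = None) -> float:
    """Combine independent holistic ratings as described in the text.

    Two scores that differ by one point or less are averaged; a larger
    discrepancy requires a third rating.
    """
    if abs(rater1 - rater2) <= 1.0:
        return (rater1 + rater2) / 2
    if rater3 is None:
        raise ValueError("scores differ by more than one point; third rating needed")
    # Assumption: the paper does not say how the third rating resolves the
    # disagreement, so here it is averaged with the closer of the first two.
    closer = min((rater1, rater2), key=lambda s: abs(s - rater3))
    return (closer + rater3) / 2

print(final_score(1, 2) >= PASSING_SCORE)  # True: averages to the 1.5 threshold
print(final_score(1, 3, rater3=3))         # 3.0 after third-rater adjudication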

The holistic scoring rubric encompasses content, organization, vocabulary, language use, and mechanics as criteria. The highest-scoring papers clearly demonstrate competence on both the rhetorical and syntactic levels, according to the rubric. Although students in the senior-level seminar may know more about the technical operation of the gadget in any given prompt, the rubric does not prize technical knowledge. In the oral instructions the students receive just prior to writing, they are told to write about "how to use" the gadget if they do not know exactly how it works.

Because the papers are identifiable only by the students' university identification numbers, the authors' names are matched to their numbers after the rating process is complete. Students receiving non-passing scores are contacted and referred to me for an individual writing tutorial before making another attempt to write a passing paper (the process is repeated if they fail the second writing sample). I use this opportunity to point out patterns of "serious" errors (discussed in a later section) and to give the students strategies for identifying and correcting them. Students have three chances to pass the writing sample in a semester.

Why Students Don’t Pass

Generally, no more than ten percent of the students fail on their first attempt at the writing sample. Of these, many have made errors that are fairly easy to overcome, such as use of an inappropriate genre or an incomplete description or analysis. Yet three-fourths of the way into a recent semester, 12 students remained who were taking the writing sample for the third time. [The original writing sample prompt was to describe and analyze a light bulb.] They were the only ones out of over 500 sophomores and seniors who had not yet passed. I wondered what causes some students every semester to have such difficulty negotiating a writing sample that an overwhelming majority of their classmates pass on the first attempt. An examination of these students' personal characteristics, their past writing experiences, the errors they make, and the assessment instrument itself shows that each of these factors plays a role in determining their scores.

Learner Differences

First, the students who typically require three tries (and even then do not always pass) are, not surprisingly, nonnative speakers of English. Within this category, they can be divided into "true" international students (holding F1 visas) and resident aliens. Of this specific group of students, only one, a native speaker of Korean, held a green card; the remainder, four Indonesians, four Malaysians, two Taiwanese, and one Thai, were holders of F1 visas. As some research points out, there is a difference in the English writing proficiency of students who have learned the language by "eye" versus by "ear."[7] The former may know written grammar but may not have had extensive experience writing extended prose in English while in their native countries. On the other hand, "ear" learners of English, who acquire the language aurally and orally on the playground and in the neighborhood of an English-speaking country, may have gaps in their knowledge of grammar and even incorrect assumptions based on inferences from what they hear. Thus, the writing of these semi-fluent English speakers may contain errors not present in their speech.[7]

The students' differences in residency may be related to differing motivations for improving their written English. International students who plan to return to their native countries, whether by choice or under the conditions of their sponsorship, may not improve their English as much as students who plan to remain in an English-speaking country. Because many of the Malaysian and Indonesian students are government-sponsored, most will return home after receiving their undergraduate degrees. These students probably have "instrumental" motivation to learn English—just enough to get their degree.[8] In contrast, the Korean-speaking resident alien should be spurred by "integrative" motivation to learn English well enough to succeed in his adopted country.

Finally, according to Schumann's Acculturation Model, language learners' attitudes toward the target language may be responsible for their linguistic progress.[8] Students who feel socially and psychologically alienated from the new culture are more likely to hit an early plateau in their language acquisition, resulting in some second-language errors becoming permanently "fossilized."

Past English Experiences

Several of this group of students took a special section of first-year composition designated for international students. ESL instructors tend to emphasize providing a "secure" and "successful" environment for their students; one researcher claims that "…ESL writing experiences are typically too easy."[9] In addition, because of the predominance of the process approach to writing in North American classrooms, many student writers are unaccustomed to producing relatively polished text within a short time period. Anecdotally, students have told me that they often consult with staff of the Writing Lab at Purdue or with native-speaking roommates or friends before submitting final drafts of papers. With such resources at their disposal, it is not surprising that all of the students who were taking the third writing sample had earned an "A" or "B" in their prior English composition courses.

There also can be a mismatch between the rhetorical modes favored in composition classes and those preferred in engineering classes. When different genres are prized in English departments, ESL students may not have the opportunity to practice the types of writing needed in engineering classes.[9] However, the engineering departments also bear their share of the responsibility for not providing more explicit writing instruction in engineering genres and ongoing opportunities for practice.

Error Gravity

It is important to note that not all writing errors are considered equally serious by all readers. Research shows that humanities instructors tend to be much more tolerant of the writing errors of nonnative-speaking students than are their colleagues in engineering and the sciences. The latter are more likely to form a negative opinion of ESL writers because they consider their errors "careless," "irritating," and "unacceptable." In one hierarchy of writing errors, native speakers' "acceptability" ratings range from the relatively tolerable (spelling and article errors) to the more serious (verb form errors).[10] Examples from the writing samples of this group of students illustrate such serious errors [italics added]:

“The electric power source will generate the light bulb, makes the chrome wire produce the light and heat.” (Indonesian speaker)

“They are invented since 19th century and were widely use in everywhere in the world.” (Chinese speaker)

Thus, errors that have little effect on comprehensibility (such as spelling and articles) are less significant than those that distort meaning.

Interestingly enough, a language learner's writing can show error variability: the writer may use a verb form correctly in one clause and incorrectly in another clause of the same sentence.[8]

Timed Writing Assessments

A timed writing test can cause specific problems for nonnative speakers. In a survey of Malaysian and Indonesian students at Indiana University, 72.3 percent identified "writing quickly" as a problem; 83.2 percent, "writing concisely"; and 73.3 percent, vocabulary. The researcher found that these problems correlated with shorter stays in the United States and with a lack of prior use of English as a medium of instruction in the students' home countries.[11]

In a timed writing assessment, ESL writers tend to concentrate on the content of the prompt, to the detriment of their control of language.[8] Initial concern with content before turning to language is the way many ESL (and first-language) instructors teach writing, so it is not surprising that these students would approach a timed writing in this manner. Yet in a timed writing, these same students are likely to bypass some of the useful steps they are taught in composition courses [12], namely prewriting (despite the space prominently provided for this purpose on the first page of the writing sample) and revision. Unfortunately, ESL students are often unable to pace themselves appropriately to be able to plan, write, and revise within the time limit.

Standardized writing tests produced by Educational Testing Service for both native and nonnative speakers of English vary in length from 20 minutes for the SAT II: Writing subject test to 45 minutes for one of the GRE writing tasks, depending upon the construct being tested. The Test of Written English (TWE), given along with the Test of English as a Foreign Language (TOEFL) and used in admitting international students to North American universities, is a 30-minute essay, similar to the ECE writing sample. Research has shown a moderate 0.3-point score increase on the TWE's six-point holistic scale when examinees were given an extra 15 minutes to write. However, the importance of the mean-score increase was cancelled out by the fact that the papers were rank-ordered identically despite the time-limit differences.[13]
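As a toy numeric illustration of one simple way that finding can arise (the numbers below are invented for illustration, not data from [13]): if every examinee's score rises by the same amount with extra time, the mean increases but the ordering of examinees is untouched, so the extra time makes no practical difference to selection.

from statistics import mean

# Hypothetical six-point-scale scores; invented for illustration only.
scores_30min = [2.0, 3.5, 4.0, 4.5, 5.0]
scores_45min = [s + 0.3 for s in scores_30min]  # uniform 0.3-point gain

def rank_order(scores):
    """Return examinee indices ordered from lowest to highest score."""
    return sorted(range(len(scores)), key=lambda i: scores[i])

assert rank_order(scores_30min) == rank_order(scores_45min)  # identical ranking
print(round(mean(scores_45min) - mean(scores_30min), 1))     # 0.3 mean gain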

Of course, test anxiety also may negatively affect the performance of an international student who must write under time pressure.[14] This is especially true when the stakes of the writing assessment are high, for instance in an entrance or exit exam. However, research has indicated that shorter time limits have no real effect on "students who identified themselves as slower-than-average writers who usually feel pressured on timed test[s]."[13]

The Need for Timed Writing

There has been some debate in the ESL academic community as to whether timed writing is "real" writing; another view holds that any writing is "real."[14] Certainly, short-answer items appear in many engineering exams, and longer prose responses commonly are required in many of the humanities courses engineering students take to fulfill general education requirements.

Another reason why ESL university students need to be able to write under pressure is the continuing presence of timed essays on the major standardized tests they must take to apply to graduate and professional programs in English-speaking countries. The recent introduction of the GRE's Analytical Writing section, with its two timed essays, appears to respond to the desire of many graduate programs for an indicator of prospective students' writing ability.

Beyond the demands of higher education for writing under time constraints, it already has been established that today's engineers often spend a majority of their time on the job communicating. Several studies have indicated that email is a major form of workplace communication, both within the corporate/industrial environment and with the outside world.[2, 3, 15-17] Email is increasingly used to exchange general and technical information with individuals as well as groups, especially in companies trying to go "paperless."[3] It is also used to create an "electronic paper trail" of progress on projects.[17]

Of all the forms of professional communication, email perhaps is closest to the timed writing sample in several shared characteristics. Both are composed quickly, with little time for the lengthy revision processes of writing-process pedagogy. Each must be crafted to meet the expectations of a specified audience; in the case of the ECE writing sample, the audience is an "educated non-engineer." When student authors violate these expectations, perhaps by writing in an inappropriate linguistic register, by using technical formulas, or, alternatively, by "dumbing down" the analysis, the paper receives a lower score. Interestingly enough, along with control of syntax, grammar, punctuation, and spelling, the ECE professors who completed my survey ranked "know[ing] how to address different audiences" as a very important communication skill for both undergraduate and graduate-level engineering students.

Finally, email, like other forms of written communication, is more permanent than oral communication; errors that occur in print are less easily forgotten. Authors of email messages need to be aware that, despite the speed with which they may have composed a message, they must check it for content and grammatical accuracy before clicking "send." As many email users have discovered, sent email usually cannot be retrieved, and poorly written messages may stigmatize the writer. In the engineering profession, where good communication skills are one criterion for promotion, the cost of ineffective online communication may be a lot more painful than a low grade on a paper.[3]

Recommendations

Despite the predictable unpopularity of the following suggestion among some ESL writing instructors, I believe there are two reasons why ESL students need more practice with in-class, spontaneous writing. First, timed writing experiences are not likely to disappear from higher education anytime soon. For ESL students in particular, performance on timed writing exams may determine their placement into, and exit from, some English programs. With the recent addition of two essays to the GRE, most graduate and professional programs now require some type of timed writing test.

Second, because of the predominance of email as a means of communicating in the 21st century, all students—and not only ESL engineering majors—need to be able to write quickly, clearly, concisely, and as nearly error-free as possible. They need to be made aware of the consequences of their grammatical errors in the "real world." While some errors, such as the omission of articles, may be considered "acceptable," they may simultaneously be regarded as "careless."[10] ESL students, in particular, need help in identifying and learning to edit their frequent errors. It has been pointed out that many ESL writers do not edit their work because they think it is "tedious" or "unimportant," or because they have come to rely on others (instructors, tutors, or friends) to do it for them.[18] Unfortunately, the speed of the "computer culture" makes some students even less willing to revise text that has already scrolled off-screen.[16]

Given the sustained effort it takes to become communicatively competent in a new language, requiring international students to take only one writing course, usually during their first or second year at the university, does not seem adequate. Many ESL students, such as those who could not pass the ECE writing sample on their first or second attempt, could benefit tremendously from ongoing institutional support as their writing in English matures.

In addition, engineering schools need to provide many more opportunities for writing in the curriculum. Although engineering professors often insist that they already have more content than they can cover, occasional five-minute in-class writing assignments would help students develop the "thinking-on-their-feet" skills they will need on the job. To simulate future workplace communication activities, engineering professors can require email updates from students on their progress on long-term engineering projects.[17]

Finally, I think that engineering professors often give international (and, for that matter, native-speaking) students mixed messages about English. On the one hand, these are the professors who are most critical of nonnative errors in writing.[10] On the other hand, they do not consistently evaluate students on the quality of their writing when they make assignments that require it. For longer writing assignments such as lab reports and senior design reports, research suggests that professors should hold students responsible for the "clarity, form, grammar, and style" of their writing as well as its technical content.[3] Use of a grading rubric spelling out specific criteria can provide feedback to student writers and make grading less of a burden for engineering professors who may feel uncomfortable evaluating student writing.[19] Unless international students perceive that accurate, concise writing is important to their engineering professors, they are not likely to be motivated to improve their skills.

Whether ESL engineering students remain in an English-speaking country after graduation or return to their native countries, English will always be part of their professional lives. The vast majority of engineering journals and conference proceedings, especially those considered most prestigious, are published in English, and email correspondence outside their country likely will take place in English.

References

[1] Engineering Accreditation Commission. (2004, Nov.). Criteria for accrediting engineering programs. Accreditation Board for Engineering and Technology, Inc., Baltimore, MD. [Online]. Available: http://www.abet.org.

[2] P. Sageev and C. J. Romanowski, “A message from recent engineering graduates in the workplace: Results of a survey on technical communication skills,” Journal of Engineering Education, vol. 90, no. 4, pp. 685-693, Oct. 2001.

[3] D. Vest, M. Long, and T. Anderson, "Electrical engineers' perceptions of communication training and their recommendations for curricular change: Results of a national survey," IEEE Trans. Prof. Commun., vol. 39, no. 1, pp. 38-42, Mar. 1996.

[4] J. Lax, "'Engineering' better writing for undergraduate students," presented at the ASEE Annual Conference and Exposition, June 2001. [Online]. Available: http://www.ASEE.org.

[5] J. Reid, "Advanced EAP writing and curriculum design: What do we need to know?" in On Second Language Writing, T. Silva and P. K. Matsuda, Eds. Mahwah, NJ: Lawrence Erlbaum Associates, 2001, pp. 143-160.

[6] Educational Testing Service. [Online]. Available: http://www.ets.org.

[7] J. Reid, "Which non-native speaker? Differences between international students and U.S. resident (language minority) students," New Directions for Teaching and Learning, no. 70, pp. 17-27, Summer 1997.

[8] J. Carson, "Second language writing and second language acquisition," in On Second Language Writing, T. Silva and P. K. Matsuda, Eds. Mahwah, NJ: Lawrence Erlbaum Associates, 2001, pp. 191-200.

[9] W. Grabe, “Notes toward a theory of second language writing,” in On Second Language Writing, T. Silva and P. K. Matsuda, Eds. Mahwah, NJ: Lawrence Erlbaum Associates, 2001, pp. 39-57.

[10] R. J. Vann, F. O. Lorenz, and D. M. Meyer, "Error gravity: Faculty response to errors in the written discourse of nonnative speakers of English," in Assessing Second Language Writing in Academic Contexts, L. Hamp-Lyons, Ed. Norwood, NJ: Ablex, 1991, pp. 181-195.

[11] M. S. Ali. (1995, Jan.). The major quantitative findings of a study of the English-language-based study skills problems of two groups of foreign students at an American university. ERIC doc. 384 028, p. 26.

[12] C. Polio and M. Glew, “ESL writing assessment prompts: How students choose,” Journal of Second Language Writing, vol. 5, no. 1, pp. 35-49, 1996.

[13] H. M. Breland, B. Bridgeman, and M. E. Fowles. (1999). Writing assessment in admission to higher education: Review and framework. College Board Report no. 99-3 [Online]. Available: http://www.ets.org.

[14] L. Hamp-Lyons, “Fourth generation writing assessment,” in On Second Language Writing, T. Silva and P. K. Matsuda, Eds. Mahwah, NJ: Lawrence Erlbaum Associates, 2001, pp. 117-127.

[15] J. Mackiewicz, “Which rules for online writing are worth following?: A study of eight rules in eleven handbooks,” IEEE Trans. Prof. Commun., vol. 46, no. 2, pp. 129-137, June 2003.

[16] M. H. Abdullah. (2003, Dec.). "The impact of electronic communication on writing," ERIC doc. 477 614, p. 4.

[17] D. Vest, M. Long, and T. Anderson, “Relating communication training to workplace requirements: The perspective of new engineers,” IEEE Trans. Prof. Commun., vol. 38, no. 1, Mar. 1995.

[18] D. Ferris, Treatment of Error in Second Language Student Writing. Ann Arbor, MI: U. of Michigan Press, 2002.

[19] J. Lax, "Issues in having international teaching assistants in engineering evaluate undergraduate writing," presented at the ASEE Annual Conference and Exposition, June 2002. [Online]. Available: http://www.ASEE.org.

About the Author

Joanne Lax is the communications specialist for the School of Electrical and Computer Engineering at Purdue University, West Lafayette, Indiana, where she teaches graduate courses in academic and written communication for international students, and coordinates undergraduate communications activities. She received B.S. and M.S. degrees in journalism from Northwestern University in 1977 and 1978, respectively, and an M.A. in English linguistics, with a specialization in English as a Second Language writing, from Purdue University in 1994.
