JALT Testing & Evaluation SIG Newsletter
Vol. 8 No. 1. March 2004. (p. 2 - 8) [ISSN 1881-5537]

An Interview with Andrew Cohen

by Gholamreza Hajipour Nezhad

Andrew Cohen is a professor in the MA in ESL Program at the University of Minnesota Twin Cities Campus. Since 1993 he has been Director of the National Language Resource Center at the Center for Advanced Research on Language Acquisition. From 1996 to 2002 he also served as Secretary General of the International Association of Applied Linguistics (AILA). This interview was conducted electronically in January 2004.


How did you become interested in assessment and in pragmatics?

My interest in language assessment dates back many years. When I went to grad school at Stanford's International Development Education Center, I became interested in bilingual education and found myself assuming the role of Internal Evaluator for a federally-funded bilingual education program. In that capacity I had to test Mexican-American and Anglo-American children regularly.
". . . in the late 1970s, I noticed that we weren't testing functional language behavior such as speech acts in the way we were testing other areas of language behavior."

At UCLA one of the first classes I taught was language testing. As an outgrowth of teaching that course for a few years, I wrote Testing Language Ability in the Classroom in 1980. I then revised it extensively, and it appeared as Assessing Language Ability in the Classroom with Heinle & Heinle in 1994. My interest in language assessment has not wavered since then. The area that I have continued to have particular interest in is how respondents produce answers on language tests - the language strategies and test-taking strategies that they use. In fact, I am currently co-project investigator of a research study funded by Educational Testing Service to determine how respondents produce answers to items on the reading section of a prototype of the New TOEFL.

[ p. 2 ]

I came to the field of pragmatics more recently, but out of my interest in language assessment. As I was writing my language testing book in the late 1970s, I noticed that we weren't testing functional language behavior such as speech acts in the way we were testing other areas of language behavior. It seemed like a gap to me. It was at that point that I shared my concerns with Prof. Elite Olshtain. Together we decided to try our hand at constructing a measure of speech act ability. Our efforts were published in Language Learning in 1981.
That marked the beginning of my work in the field of pragmatics. After that initial study, Elite and I have continued to do research on speech acts for many years. And although the concern has often been for improving research methods and measures, our interests have taken us pretty far away from testing as well. For example, we did a study, published in TESOL Quarterly in 1993, that looked simply at the strategies learners were using to produce speech acts in role-play situations. And most recently, a doctoral student in education, Noriko Ishihara, and I have been engaged in efforts to enhance the strategies that intermediate learners of Japanese use in learning speech acts through self-access, web-based instruction. Prof. Olshtain has served as a curriculum consultant on this project. The learner site with self-access units is at http://www.carla.umn.edu/speechacts/japanese/introtospeechacts/index.htm.
At the same time, I continue to pursue the issue of constructing tests of pragmatic performance since we still do not have robust collections of such tests. My latest publication on this will appear shortly in a book Diana Boxer and I have edited.

What factors make the interface between the two fields significant for research?

Producing language assessment measures that adequately assess pragmatic performance in a second language is a real challenge. Part of the problem is that in order for teachers to test for, say, speech act ability in speaking or in writing, they must have benchmarks or norms for appropriate behavior in those areas. This information has until recently been hard to find, and textbooks don't usually provide very much help. This, by the way, is why we set up a teacher website on speech acts at www.carla.umn.edu/speechacts/ — namely, to give teachers and curriculum writers a place to go where they can find descriptions of the common speech acts, with suggestions for how they might teach this information.

[ p. 3 ]

". . . knowledge of speech acts such as requesting, refusing, complimenting, thanking, apologizing, and complaining, can be crucial for successful communication in second and foreign language situations."


And most of the research done in the area of pragmatics is intended for research purposes and not for classroom applications. So, for example, Thom Hudson has made this disclaimer with regard to the fine speech act batteries he and colleagues have constructed at the University of Hawai'i. He has gone on record as not recommending them for classroom use. I guess the problem is that since sociolinguistic data vary, it is difficult (and sometimes even impossible) to have definitive answers as to whether a given form is right or wrong. It will depend. So testing experts may feel the safest approach is not to propose any tests of pragmatics at all. But then, since teachers often teach to the test, they will not feel the need to teach pragmatics, and here we have a problem. It has been demonstrated time and again that knowledge of speech acts such as requesting, refusing, complimenting, thanking, apologizing, and complaining can be crucial for successful communication in second and foreign language situations.

What do you plan to talk about during the 2004 JALT Pan-SIG Conference in Tokyo?

My plenary, "The Interface among Pragmatics, Assessment, and Language Teaching," will entail a PowerPoint presentation aimed at calling attention to the burgeoning field of pragmatics and to how it is still underrepresented in the area of language assessment. My message to teachers is that they can do something about it. After providing a brief description of early approaches to assessing speech act performance, I will consider more recent efforts at assessing speech act ability. Then I will highlight some factors in assessing speech acts: contextual parameters (i.e., setting, participants, purposes for the interaction, form and content, tone, language, norms of interaction, and genre), research findings with regard to those parameters, and insights about raters and ratings. I'll end the talk with recommendations for L2 teachers regarding the assessment of pragmatics in the classroom.
I will also invite participants to join me for a workshop entitled "Constructing Speech Act Tasks for Classroom Assessment." Working in groups or individually, participants will design and construct one or more tasks intended to assess the speech act ability of classroom language learners (focusing on one or more of the following: requests, complaints, apologies, compliments, thanking, or refusals). Constructing a task will include determining the elicitation format, the response format, and the method for assessing the responses.
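By way of illustration, the following minimal sketch (in Python) shows one possible way to organize the decisions involved in such a task: the contextual parameters mentioned above, an elicitation format, a response format, and criteria for assessing the responses. The field names and the example apology task are invented for illustration and do not come from the talk or workshop materials.

    # A hypothetical sketch of a speech act assessment task specification.
    # Field names and example content are invented for illustration only.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SpeechActTask:
        speech_act: str            # e.g., "apology", "request", "refusal"
        setting: str               # contextual parameter: where the exchange takes place
        participants: str          # who is talking to whom, and their relative status
        purpose: str               # what the interaction is meant to accomplish
        tone: str                  # e.g., formal, casual, urgent
        elicitation_format: str    # e.g., "written DCT", "oral role play"
        response_format: str       # e.g., "open written response", "spoken turn(s)"
        rating_criteria: List[str] = field(default_factory=list)

    # Example: a classroom apology task (content invented for illustration).
    apology_task = SpeechActTask(
        speech_act="apology",
        setting="university office hours",
        participants="student speaking to a professor (lower to higher status)",
        purpose="apologize for missing a scheduled appointment",
        tone="polite, somewhat formal",
        elicitation_format="oral role play with the teacher as interlocutor",
        response_format="spoken turn(s) produced by the learner",
        rating_criteria=[
            "appropriate choice of apology strategy",
            "appropriate level of directness and formality",
            "linguistic accuracy of the forms used",
        ],
    )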

[ p. 4 ]

What are the main points you emphasize in your testing research?

I have maintained for many years a keen interest in how respondents produce answers to language tests. I have looked at how they respond to multiple-choice items (as we are now doing with the ETS study), how they write summaries of reading texts, how they compose essays (as well as how they respond to feedback on these essays), how they arrive at utterances on spoken role-play tasks, and the like. I pursue this interest largely because it is a valuable means for determining the validity of the measures. We can number-crunch a test endlessly and still lack insight as to why respondents answered items and tasks as they did. It is through verbal report methods that we are able to explore the respondent's thinking processes. For example, why did a respondent choose "c" rather than "a," when "a" was the correct choice? Just as respondents might get an item correct purely by chance, they may also use excellent powers of logic in producing the wrong response. It can be helpful to get behind the scenes a bit to find out what is going on when respondents take a test. It may give the teacher or test constructor real insights into the strengths and weaknesses of their tests.
There are actually three kinds of strategies that respondents may use in responding to reading comprehension questions: reading strategies, test-management strategies (e.g., reading all the choices before choosing one, watching the clock, etc.), and test-wiseness strategies (rejecting a distractor because it couldn't be true based on their knowledge of the world, matching material from a distractor with words from the text, etc.).
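For readers who work with verbal report data, one way to picture this three-way distinction is as a simple coding scheme. The minimal Python sketch below uses the category labels from the interview; the individual codes and excerpts are invented for illustration.

    # Hypothetical coding scheme for classifying verbal report excerpts into the
    # three strategy types described above. The excerpts are invented examples.
    from enum import Enum

    class StrategyType(Enum):
        READING = "reading strategy"            # e.g., rereading, guessing from context
        TEST_MANAGEMENT = "test-management"     # e.g., reading all choices first, pacing
        TEST_WISENESS = "test-wiseness"         # e.g., eliminating an implausible distractor

    # Mapping of invented verbal report excerpts to strategy types.
    coded_excerpts = {
        "I read the paragraph again to check the detail.": StrategyType.READING,
        "I looked at all four options before deciding.": StrategyType.TEST_MANAGEMENT,
        "Option (b) repeated words from the text, so I picked it.": StrategyType.TEST_WISENESS,
    }

    for excerpt, strategy in coded_excerpts.items():
        print(f"{strategy.value}: {excerpt}")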
I am also interested in trying out innovative formats for language tests. The one I have been playing around with of late is the multiple-rejoinder discourse completion test. The purpose here is to convert an indirect, written measure of speech act ability into something that more closely simulates an actual conversation, in which the interlocutor offers a series of rejoinders.
In other words, the respondent must read through a full set of rejoinders and "fill in the blanks" with discourse that makes sense across the entire exchange. In the past, discourse completion tasks generally gave the respondent a single turn in which they could take the discourse in whatever direction they liked. The multiple-rejoinder approach puts more constraints upon them to adhere to the given discourse, and is thus more demanding.
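To make the format concrete, here is a minimal sketch of how a multiple-rejoinder item might be represented, with the interlocutor's rejoinders fixed and slots left for the respondent. The scenario and rejoinder wording are invented for illustration and are not taken from any published instrument.

    # Hypothetical representation of a multiple-rejoinder discourse completion item.
    # The scenario and rejoinder wording are invented for illustration only.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Turn:
        speaker: str
        text: Optional[str]  # None marks a blank the respondent must fill in

    @dataclass
    class MultipleRejoinderItem:
        scenario: str
        turns: List[Turn]

    item = MultipleRejoinderItem(
        scenario=("You borrowed a classmate's notes and returned them a week "
                  "late, just before the exam."),
        turns=[
            Turn("You", None),                                  # opening apology
            Turn("Classmate", "A week? I needed those to study."),
            Turn("You", None),                                  # response to the complaint
            Turn("Classmate", "Well, I suppose I can still look them over tonight."),
            Turn("You", None),                                  # closing move
        ],
    )

    # The respondent must supply every turn marked None so that the whole
    # exchange, including the fixed rejoinders, reads coherently.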

[ p. 5 ]

What are your future projects?

I have already spoken about two of my ongoing projects. The first is the project with ETS to determine how respondents produce answers to the reading section of the New TOEFL.
A second project involves the work on speech acts with teachers and learners. This research project (2002-2006) was designed to determine the effects of training second language speakers of Japanese and Spanish to learn and use speech acts more successfully while communicating in those languages. This past fall semester (2003), we conducted an experiment to determine the effects of training non-natives to learn and use pragmatic information more successfully in speaking Japanese. The web-based materials were made a component of the regular third-year Japanese curriculum at the University of Minnesota; three modular units are being assigned on a trial basis as homework in each of the intermediate Japanese classes. The next phase will involve the design of a Spanish study in replication of the Japanese one – in other words, the development of strategies-based instructional materials for enhancing the learning and effective use of pragmatic knowledge about Spanish speech acts, as well as the field testing of this material (2005-2006).
A third project has involved my colleague Michael Paige and me, along with a team of colleagues and graduate students, in the production of a series of guidebooks intended to respond to the need for study-abroad enhancement materials. The result was a three-part guidebook series entitled Maximizing Study Abroad Through Language and Culture Strategies, which employs a user-friendly, strategies-based approach to language and culture learning. The series includes the self-access Students' Guide, the Program Professionals' Guide, and the Language Instructors' Guide. All three guides are now published and are available through the University of Minnesota bookstore.

[ p. 6 ]

Research is currently being conducted to field test the three guidebooks in the Maximizing Study Abroad series. The primary research question is: To what degree and in what ways can a strategies-based approach to developing language skills and enhancing the ability to function in a new culture, transmitted by means of a set of integrated study abroad guides for students, program professionals, and instructors respectively, promote language gain and cultural adaptation by students abroad? We have collected most of the data from two cohorts of 88 students altogether who were on study abroad in either Spanish- or French-speaking countries. Half received the guidebook and half did not. In addition, during the 2003-2004 academic year, we are conducting a case study of nine study abroad advisors and five on-site coordinators using the Program Professionals' Guide and four instructors using the Instructors' Guide. So there is going to be lots of material to write up for this study in the form of reports and articles.
Some of my forthcoming publications include:
Boxer, D., & Cohen, A. D. (Eds.). (2004). Studying speaking to inform second language learning. Clevedon, England: Multilingual Matters.

Hamilton, H. E., & Cohen, A. D. (2005). Creating a playworld: Motivating learners to take chances in a second language. To appear in J. Frodesen & C. Holten (Eds.), The power of context in language teaching and learning. Boston: Heinle & Heinle.

Cohen, A. D., & Gómez, T. (2008). Towards enhancing academic language proficiency in a fifth-grade Spanish immersion classroom. In D. M. Brinton & O. Kagan (Eds.), Heritage language acquisition: A new field emerging. Mahwah, NJ: Lawrence Erlbaum.

I have just completed a major rewrite of a manual I co-authored with S. J. Weaver on styles and strategies-based instruction. We have been using this manual for eight years in our CARLA summer institutes at the University of Minnesota.
I am also co-authoring, with Rafael Salaberry, a chapter on assessing Spanish second language ability for a volume edited by Barbara Lafford and Rafael Salaberry entitled Spanish Second Language Acquisition: State of the Art of Application, to be published by Georgetown University Press.

Works Cited

Cohen, A. D. (1980). Testing language ability in the classroom. Rowley, MA: Newbury House.

Cohen, A. D. (1994). Assessing language ability in the classroom (2nd ed.). Boston: Newbury House / Heinle & Heinle.

[ p. 7 ]

Cohen, A. D. (2004). Assessing speech acts in a second language. In D. Boxer & A. D. Cohen (Eds.), Studying speaking to inform second language learning (pp. 302-327). Clevedon, England: Multilingual Matters.

Cohen, A. D., & Olshtain, E. (1981). Developing a measure of sociocultural competence: The case of apology. Language Learning, 31 (1), 113-134.

Cohen, A. D., & Olshtain, E. (1993). The production of speech acts by EFL learners. TESOL Quarterly, 27 (1), 33-56.

Cohen, A. D., Paige, R. M., Kappler, B., Demmessie, M., Weaver, S. J., Chi, J. C., & Lassegard, J. P. (2003). Maximizing study abroad: A language instructors' guide to strategies for language and culture learning and use. Minneapolis, MN: Center for Advanced Research on Language Acquisition (CARLA).

Cohen, A. D., & Shively, R. L. (2002/2003). Measuring speech acts with multiple-rejoinder DCTs. Language Testing Update, 32, 39-42.

Cohen, A. D., & Weaver, S. J. (2004). Styles and strategies-based instruction: A teachers' guide. Minneapolis, MN: Center for Advanced Research on Language Acquisition, University of Minnesota.

Paige, R. M., Cohen, A. D., Kappler, B. J., Chi, J. C., & Lassegard, J. (2002). Maximizing study abroad: A program professionals' guide to strategies for language and culture learning and use. Minneapolis, MN: Center for Advanced Research on Language Acquisition.

Weaver, S. J., & Cohen, A. D. (1997). Strategies-based instruction: A teacher-training manual (CARLA Working Paper Series #7). Minneapolis, MN: Center for Advanced Research on Language Acquisition, University of Minnesota.


HTML: http://jalt.org/test/coh_haj.htm   /   PDF: http://jalt.org/test/PDF/Cohen.pdf

[ p. 8 ]