
Composition Forum 44, Summer 2020
http://compositionforum.com/issue/44/

Super-Diversity as a Framework to Promote Justice: Designing Program Assessment for Multilingual Writing Outcomes

Mya Poe and Qianqian Zhang-Wu

Abstract: While Writing Studies scholars have embraced research on multilingualism, writing scholars have not developed program assessment methods that are informed by that scholarship. This profile describes a program assessment design that was informed by research on multilingualism, super-diversity, and consequential validity. This design included student survey data, student interviews, scoring data, and institutional data with specific attention to language and mobility. Such a design allowed us to capture multiple sources of evidence to make valid inferences about the writing of a complex population. Moreover, the bottom-up collaborative process used in this assessment design echoed the program’s deep-rooted commitment to social justice in ongoing program research.

The history of Northeastern University (NU), founded in 1898, is often told as one originating from working-class roots--an educational institution established to “merge, coordinate, and improve the effectiveness of the classes ... providing part-time and supplementary education for young men” (Marston 9). Yet the institution was also born at a time of enormous global change, resulting in both curricular innovations such as co-operative education and the recruitment of international students. By the 1970s, for example, the international student presence at NU meant that the university had one of “the largest [international student] groups in U.S. higher education” (Frederick 346). In response to the growth of international students, the university developed non-credit-bearing courses offered through the English department, such as English for International Students, which was described in the 1967 course catalog as “an intensive review of the basic mechanics of English grammar and punctuation” with readings “related to American life” (146). The development of such non-credit-bearing courses, which students were required to complete before entering first-year writing courses, was a clear signal that the university identified international students as English language learners in need of basic language instruction. But as the university would change over the next 100 years, so would its international student population.

By the end of the twentieth century, the university had embraced internationalization as part of its institutional identity, not just as a student recruitment strategy. Currently, the university enrolls more than 14,000 international students from over 140 countries (Northeastern 2020). International students comprise 20% of the university's undergraduate population and upwards of 50% of its graduate population (Northeastern 2019). The latest growth in international students at NU has coincided with the university’s rise in college rankings. Now listed at #40 in the U.S. News and World Report college rankings, the university attracts international students who might be characterized as global citizens, along with highly competitive domestic students. One impact of recruiting highly competitive international students is that multilingual international students are no longer placed in ESL programs as a form of “linguistic containment” (Matsuda 638). In fact, most international students at NU today enter with TOEFL scores greater than 99 (Northeastern University Undergraduate Admissions), which indicates high-intermediate to advanced English proficiency according to the ETS proficiency standard (ETS 2020). Many of these students matriculate directly into first-year writing courses. Other high-performing international students exempt out of first-year writing because they have AP or IB credit; these students move directly into a required upper-level writing course called Advanced Writing in the Disciplines (AWD). The result is writing classrooms that include domestic “monolingual” students, domestic multilingual students, and international students with a range of English language instructional experiences--a reality that creates a rich linguistic ecology.

In response to this demographic change in writing program classes, the NU Writing Program began a series of research projects and research-based pedagogical changes in 2009 to better understand and address the needs of multilingual students. One undertaking was a study of multilingual writers called the Multilingual Writers Research Project (MWRP). The MWRP brought forth important data about NU students and fostered theoretical, curricular, and methodological changes to the NU Writing Program. The MWRP has also helped the program re-envision what we mean by linguistic diversity in our local context--specifically, the recognition that our students’ linguistic identities are very much tied to mobility.

Through the work of the MWRP, we have found the concept of “super-diversity” to be especially helpful because it highlights the ways that migratory flows result in communities whose members draw on linguistic repertoires in quite varied ways (Vertovec, “Talking Around”). For the NU writing program, the concept of super-diversity has been formative as we think about research design, placement testing, course offerings, and tutoring (Benda et al. Confronting; Benda et al. Confronting Again). Super-diversity reminds us that our students are not merely L1/L2 students but a highly mobile population with complex linguistic identities. The NU writing program, however, had not addressed how this evolving understanding of our students’ linguistic identities might shape program assessment. Given the enormous potential of assessment to harm multilingual students through incorrect placement in basic writing courses, over-remediation, or mischaracterization of their potential through poor reader training (Gomes; Scott-Clayton; Sternglass), we felt compelled to turn our attention to questions of linguistic diversity in our program assessment. In this program profile, we describe our most recent program assessment, in which our goal was not merely to assess student outcomes but to do so with a focus on linguistic diversity, mobility, and justice. To those ends, the questions that guided our program assessment were:

  • How do AWD students characterize their linguistic identities? How do they characterize those identities in different contexts?

  • Are AWD students meeting the general education learning goals for writing courses? How are students who identify as multilingual meeting those goals?

  • How do multilingual students who enroll in our first-year writing course for multilingual students perform in AWD? How do multilingual students who exempt out of first-year writing (FYW) because of AP or IB credit perform in AWD?

  • What can students tell us about their linguistic identities that expands, challenges, or reshapes our answers to the above questions?

1. The Multilingual Writers Research Project

The Multilingual Writers Research Project (MWRP) was established in 2009 by Chris Gallagher, who was then Director of the Writing Program, and Neal Lerner, who was then Director of the Writing Center. Gallagher and Lerner brought together writing faculty, teaching professors, and graduate students to better understand the shifting demographics at NU, with the goal of better supporting students in writing classes (Gallagher and Noonan). The MWRP included several strands of research, including a survey, student interviews and focus groups, course redesign proposals, and a faculty working group. The purpose of the MWRP survey and interviews, which have been repeated three times in the past 10 years, was to “better understand multilingual students’ expectations, experiences, and aspirations for writing in English at Northeastern and beyond” (Benda et al. Confronting 85). Survey results showed that NU’s multilingual student population has a rich linguistic profile. Of the students who completed the 2018 survey (n=1093), for example, 35% reported writing and speaking two or more languages. Additionally, 35% reported writing in a language other than English to their friends or other students at Northeastern, and 20% reported composing in languages other than English even when the final product was submitted in English. Finally, 86% said that most of their schooling experiences had been in English and that they expect to write in English in the workplace. Such survey results pointed to the global force of English in school and work contexts, even for writers whose first language is not English.

The Multilingual Writers Research Project, though, also revealed another reality of NU students: through interviews, it became clear that NU students are highly mobile, and their multilingual identities are complex and layered because of that mobility. Their identities are deeply shaped by the super-diverse worlds in which they live. Super-diversity is a term coined by Steven Vertovec, anthropologist and Director of the Max Planck Institute for the Study of Religious and Ethnic Diversity, to describe modern migration flows (World Migration Report) and forms of communication in the world (Super-diversity). As a conceptual framework, super-diversity “jettisons the rather rigid toolkit of speech communities, ethnolects and mother tongues in favour of notions of truncated repertoires and resources that better capture the plurality of styles, registers and genres of people living in a globalized world” (Mutsaers and Swanenberg 65). Simply put, super-diversity is a way to acknowledge that linguistic identity is deeply shaped by mobility and that with mobility comes a linguistic identity shaped by fragments of languages. This is not to say that we are uninterested in the languages students know and how well they think they know them; rather, super-diversity is the recognition that the worlds our students will live in are linguistically messy, that they must shuttle across languages and dialects to get things done, and that correctness alone is not how we should assess what they can do with language.

In the end, as Benda et al. write, “the findings from the MLWRP remind us to resist broad categorizations of multilingual student writers at Northeastern and make us more aware of the diversity of languages, cultures, experiences, and expectations students bring to bear in our work with them in and out of the classroom” (Confronting 89). As a result, we now see our linguistically diverse, highly mobile students as the norm in our classrooms. To those ends, over the last 10 years we have redesigned our placement test and moved to guided self-placement, changed our tutor training practices in the Writing Center, and added credit-bearing first-year and advanced writing courses that multilingual students can self-select into. But there remained more to do: we had not changed our outcomes assessment practices.

2. Designing Program Assessment for Multilingual Writing

In designing a program assessment plan for learning outcomes, we recognized that conventional outcomes assessment would be inadequate because it does not account for linguistic diversity in identifying traits to be scored or in setting scoring categories. Traditional outcomes assessment limits the construct of writing or speaking to variables that are measurable at a moment in time. Outcomes assessment eschews nuance for measurability because writing is not simply described; it is evaluated in relation to a standard (criterion-referenced) or against other samples of writing (norm-referenced). With that said, for the last two decades writing studies researchers have been advocating for locally informed assessments (Condon; Huot), which would allow us to embrace super-diversity in our design. Yet, while super-diversity offered us a robust theoretical framework, it did not lend itself easily to program assessment. Super-diversity was developed as a descriptive term, and methodologically, most of the research that has followed from it has been ethnographic in origin (see research on linguistic landscapes, such as Gorter; Pennycook and Otsuji). Ethnographic research is expansive and can account for large spans of time. Additionally, the ethnographic approaches often adopted in the study of super-diversity can capture stratified differences that are nuanced and situated (Blommaert). So, while it was relatively easy for us to place our local values at the center of our program assessment, it was difficult to figure out how to capture the nuances of our students’ linguistic identities. Our solution to this problem was to design an assessment study that used multiple sources--student survey data, student interviews, scoring data, and institutional data--each with the goal of providing more nuance to our understanding of students’ linguistic identities. Each method would be informed by recent research on multilingualism; together, we hoped they would create a portrait of students whose literacy experiences were informed by the super-diverse spaces in which they study and live. This composite design would allow us to capture multiple sources of evidence to make fair, valid, and reliable inferences about our complex population.

In looking at recent research on multilingual writers, we looked specifically at the assessment scholarship. Many studies have investigated how texts produced by multilingual writers are evaluated compared to those written by so-called native speakers of English (e.g., Friginal, Li, and Weigle; Johnson and VanBrackle; Lindsey and Crusan; O’Hagan and Wigglesworth; Rubin and Williams-James). For example, Eric Friginal, Man Li, and Sarah Weigle compared linguistic clusters of highly rated essays from native and nonnative speakers. Two studies, one by Donald Rubin and Melanie Williams-James and another by Peggy Lindsey and Deborah Crusan, examined how writing assessment can be affected by instructors’ perceptions of students’ nationalities. Rubin and Williams-James found that while nonnative English speaking students tend to have more surface errors in their texts, instructors tended to favor Asian students’ compositions over those produced by native English speakers. Contrary to those findings, Lindsey and Crusan discovered that essays produced by multilingual international students tend to receive lower scores when graded analytically, although when scored holistically, the same essays are likely to be awarded higher grades by instructors. Such studies suggest that the construct of multilingual writing may be unstable because of the assumptions that raters bring to scoring sessions. These assumptions certainly are informed by cultural stigmas around language variation (Lippi-Green), which lead to the ways that “racialization and racism underlie and reproduce inequalities in language education practices and policies” (Kubota 3).

We had one other concern in our study design: we wanted to ensure that, whatever assessment approach we ended up using, we paid attention to questions of fairness and justice. We were influenced by recent interventions in writing assessment related to antiracism and social justice (see Hammond; Poe and Elliot for reviews). The spirit of such interventions might be best captured in the recent Poe, Inoue, and Elliot collection, in which they write, “As a form of research, writing assessment best serves students when justice is taken as the ultimate aim of assessment; once adopted, that aim advances assessment as a principled way to create individual opportunity through identification of opportunity structures” (5). In other words, assessment processes must be directed toward addressing social inequality, not merely fulfilling institutional mandates. Moreover, a justice-oriented approach to assessment ensures that questions about inequality are considered at the beginning of the study, not after data have been collected. For example, placement testing results may be reported by subgroups (e.g., African American students performed x while White students performed y), but such reporting fails to account for how the construct itself or scoring practices might have shaped those results. Let us offer a specific case to illustrate this point. Many writing programs conduct outcomes assessment, and they distill “writing” to traits such as “organization,” “argument,” and “correctness.” Those traits are just a few of the ways we might think about what writing is. Writing is also about source use, revision processes, strategy, empathy, and so on. If we expand our notion of the writing construct and consider its social construction, then we recognize how our assessments are laden with ideology. Asao B. Inoue has made this point in his critiques of grading, in which he argues that traditional standards for grading writing are racist and that labor-based contract grading offers an alternative.

While Inoue offers contract grading as a justice-based classroom assessment practice, there are few program assessment models that are justice-based. One such model is proposed by David Slomp, Julie Corrigan, and Tamiko Sugimoto, who advance “a systematic approach to collecting and weighing consequential validity evidence, one that is sensitive to racial and sociocultural realities (Solano-Flores, 2011) and that examines both intended and unintended outcomes” (278). What Slomp and colleagues propose is a program assessment design that accounts for the effects, or consequences, of that assessment. For each dimension of program design, they offer “consequential validity questions” that help researchers attend to the social and ideological aspects of program assessment that often remain hidden. Table 1 shows how we used the Slomp, Corrigan, and Sugimoto model.

Table 1. Consequential validity questions (Adapted from Slomp, Corrigan, and Sugimoto)

Construct Definition
  Consequential validity questions: How well is the construct understood? How stable is this construct across social, cultural, or racial contexts?
  Our application: What features of multilingual writing might not be captured in traditional outcomes assessment?
  Solutions: Add additional traits to the scoring rubric; score multiple genres of writing from the same students.

Construct Irrelevant Variance
  Consequential validity question: Are related yet distinct constructs interfering with the construct definition as it is embedded in the construct sample?
  Our application: What aspects of writing assignments might lead to additional difficulties for multilingual writers?
  Solution: Interview students about their experiences in writing classes.

Design Process
  Consequential validity question: Does the assessment design contribute to potentially adverse impacts, impacts on populations demonstrated to be at risk, and the educational systems serving those students?
  Our application: What harm might come to our students through program assessment?
  Solution: Ensure that writing program faculty are involved in the design and scoring processes and that the study has IRB approval so that students can opt out.

Scoring Procedures
  Consequential validity question: How do scoring procedures influence assessment outcomes, student populations, and the educational systems serving those students?
  Our application: Will the scoring process answer our research questions and attend to justice?
  Solution: Add traits for correctness and for diversity so that external stakeholders’ concerns about correctness can be answered and so that we can surface potential positive outcomes related to diversity.

Sampling Plan
  Consequential validity question: Does the sampling plan ensure that each population is represented in sufficient quantity to allow descriptive and inferential analysis? If not, what justification is provided for limiting the sampling plan?
  Our application: How might we have enough samples for statistical analysis?
  Solution: Collect writing samples three times from the same students.

Disaggregated Performance
  Consequential validity question: Can differences in performance between all populations be attributed to actual differences in ability in relation to the construct being measured?
  Our application: How might we gather a more nuanced portrait of multilingual identity for the purposes of analysis?
  Solution: Use the survey to construct a more nuanced portrait of multilingual identity and then use that portrait in disaggregating the scoring data.

Construct Remodeling
  Consequential validity question: Do differences in student response processes lead both to inaccurate ratings of their performance and to improper decisions on those ratings?
  Our application: Do multilingual students perform differently in different genres?
  Solution: Collect three types of writing samples from the same students.

Implications: Intended
  Consequential validity question: What are the intended consequences both for each population impacted by the assessment and for the educational systems serving those students?
  Our application: How can we use our assessment results to inform future writing program assessment?
  Solution: Involve faculty stakeholders, present our findings, and instantiate this design for future assessment projects.

Implications: Unintended
  Consequential validity question: What are the unanticipated positive, negative, and unknown consequences both for each population impacted by the assessment and for the educational systems serving those students?
  Our application: What gains or harms might come from our program assessment that we do not intend?
  Solution: Attempt to provide a nuanced portrait of multilingual writers’ ability so that results cannot be easily mischaracterized.

To narrow the scope of our study, we focused on Advanced Writing in the Disciplines (AWD). AWD is the second course in NU’s two-course required writing sequence and is taken by all NU students (in comparison to first-year writing, which is taken by only about 60% of NU students). Typically taken after students have obtained 64 credits and completed one co-op experience, AWD is designed to give students sustained practice in the genres of disciplinary writing for public and scholarly audiences. The writing program runs about 150 sections of AWD each academic year, including sections specifically for multilingual writers who seek additional support. AWD courses are offered with various foci, including writing in the technical professions, writing in business, writing in the social sciences, writing in the sciences, and writing in the health professions.

2.1 Survey Design

Rather than design a new survey, we drew on the results of the previous MWRP survey from 2014. We revised the MWRP survey questions to align more closely with our evolving understanding of linguistic identity at NU (Appendix). For example, we did not decontextualize linguistic identity in the survey questions; rather, we attempted to ask students about their linguistic identities in different contexts, such as home, school, and work. We avoided terms such as “first language” and “second language” in favor of “strongest language” and allowed students to have multiple “strongest languages.” {1} Finally, we retained the term “multilingual” in the title of our survey. In doing so, we adopted Butler’s definition of multilingual users: “individuals or groups of people who obtain communicative competence in more than one language, with various degrees of proficiencies, in oral and/or written forms, in order to interact with speakers of one or more languages in a given society” (112).

The survey was distributed to all AWD students (N=2035) via a Qualtrics link after spring break. Students were sent three reminders to complete the survey, which remained open until June. When students completed the survey, they added their email addresses so that we could later link their survey responses to the scoring data, as sketched below.
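
For readers interested in the mechanics of that linkage, the following is a minimal sketch of how survey responses might be joined to scoring data, assuming both exports share a student email column. The file and column names here are hypothetical, not the program’s actual exports.

    import pandas as pd

    # Load the two exports (file and column names are hypothetical).
    survey = pd.read_csv("mwrp_survey_2020.csv")     # Qualtrics export, one row per respondent
    scores = pd.read_csv("awd_scoring_results.csv")  # one row per scored paper

    # Normalize emails before joining so case or whitespace differences don't drop matches.
    for df in (survey, scores):
        df["email"] = df["email"].str.strip().str.lower()

    linked = scores.merge(survey, on="email", how="inner")
    print(f"{len(linked)} scored papers matched to survey responses")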

2.2 Scoring of Student Work

Scoring of student work was meant to provide information about how students were meeting the NU general education (called NUPath) goals for the advanced writing courses (Appendix). Twenty-one AWD instructors were contacted and invited to participate in the study. In selecting courses, we sampled across the different AWD “flavors.” This selection of courses and instructors (graduate students, part-time, and full-time instructors) yielded more than 900 samples of student texts, allowing us to do statistical analysis on the results at a 95% confidence level.
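
To give a sense of why a sample of this size supports such analysis, here is a minimal sketch of the margin of error for a proportion estimate at 95% confidence, assuming the roughly 900 scored texts can be treated as a simple random sample of the approximately 2,035 enrolled AWD students (an assumption, since the profile does not describe the sampling as random).

    import math

    def margin_of_error(n, p=0.5, z=1.96, N=None):
        """Half-width of a 95% CI for a proportion, with an optional
        finite-population correction when sampling from N students."""
        se = math.sqrt(p * (1 - p) / n)
        if N is not None:
            se *= math.sqrt((N - n) / (N - 1))
        return z * se

    # ~900 scored texts drawn from ~2,035 AWD students
    print(round(margin_of_error(900, N=2035), 3))  # ~0.024, i.e., about +/-2.4 percentage points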

In designing the scoring protocol, we were careful to consider that the construct of writing provided in the general education goals did not reflect our evolving understanding of our students’ linguistic identities or super-diversity (see Table 2). Specifically, there were two dangers here: first, the construct of writing might be too narrow to capture the linguistic repertoires of multilingual writers; second, because of that limitation, we might draw incorrect inferences from our results about how students were meeting the learning goals. For guidance on these potential problems, we turned to the research in language testing (Cumming et al.; Lindsey and Crusan; Knoch et al.; Li and Casanave). The Knoch et al. study, for example, confirmed our sense that adding traits related solely to grammar and syntax might not provide useful information. Lindsey and Crusan’s work confirmed our sense that we needed to train readers on texts that evidenced a variety of linguistic features.

In addition to the NUPath learning goals, we added two other traits to the rubric as well as an overall holistic score. The trait “displays correctness of grammar, punctuation, sentence construction” was added in response to potential critiques by external stakeholders that multilingual students might lack English language proficiency. We sensed that this was not the case but wanted to gather empirical evidence to confirm (or refute) our hypothesis. The trait “displays awareness of multicultural or diverse perspectives” was added after much conversation by the assessment committee: the committee sensed that Northeastern’s multilingual student population might evidence diverse perspectives in their writing. We did not want to reduce multilingual writers to their linguistic identities alone but also wanted to embrace their experiences of moving across contexts.

Table 2. Scoring Rubric

Each trait was scored on a four-point scale: Yes-High, Yes-Medium, Yes-Low, No.

NUPath Learning Goals
  • Adapt writing for multiple academic, professional, and public occasions and audiences.
  • Display familiarity with the writing conventions of genres in an academic field or profession.
  • Identify credible, relevant sources and engage and cite them appropriately in their written work.
  • Draft, revise, and edit their writing using feedback from readers.*

Diversity trait
  • Displays awareness of multicultural or diverse perspectives

External stakeholder trait
  • Displays correctness of grammar, punctuation, sentence construction

Overall holistic score: On a scale of 1 to 6, how would you rate the quality of this project?

*For the descriptive and research assignments, raters marked this trait as n/a.

Writing Program Assessment Committee (WPAC) members, along with a graduate student research assistant and an undergraduate work-study student, met three times over the semester for scoring--once to score descriptive assignments, once to score research assignments, and once to score reflective assignments.{2} At each session, raters were normed on training samples that reflected the range of genres raters might encounter as well as papers that reflected different kinds of linguistic variation. After each training session, each rater was given a set of 20-30 papers to score. Each paper was scored once, with 20% of papers re-scored to verify consistency in scoring.
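
The profile does not specify which consistency statistic was used for the re-scored papers; exact and adjacent agreement (and Cohen’s kappa) are common choices in writing assessment. The following is a minimal sketch of an exact/adjacent agreement check, assuming trait ratings are coded numerically (No=0, Yes-Low=1, Yes-Medium=2, Yes-High=3); the sample ratings are hypothetical.

    def exact_and_adjacent_agreement(first, second):
        """Share of double-scored papers whose two ratings match exactly,
        and share that match within one scale point."""
        pairs = list(zip(first, second))
        exact = sum(a == b for a, b in pairs) / len(pairs)
        adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
        return exact, adjacent

    # Hypothetical ratings for papers scored twice (the 20% re-scored sample).
    first_ratings = [3, 2, 2, 1, 3, 0, 2, 3]
    second_ratings = [3, 2, 1, 1, 3, 1, 2, 2]
    print(exact_and_adjacent_agreement(first_ratings, second_ratings))  # (0.625, 1.0)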

These shorter scoring sessions allowed us to look closely at how raters evaluated multilingual student writing across a range of genres and provided professional development opportunities for the raters, who were also teachers in the writing program.

2.3 Institutional Data

Because we had student IDs associated with papers and survey answers, we could correlate those data with information about students’ course grades, GPAs, majors, and residency. While course grades are imprecise measures of student writing proficiency (because they also reflect attendance, effort, participation, etc.), they do offer a window into performance in one course in relation to overall GPA. Moreover, while we recognized that writing researchers have often castigated assessment scholars who rely on course grades as markers of proficiency, we speculated that course grades, set alongside assignment sheet analysis and scoring results, might reveal the effects of monolingual or multilingual course policies. Other institutional data, including TOEFL scores, course histories, and co-op experiences, allowed us to draw on evidence from multiple sources to examine students’ holistic development across institutional contexts.
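
As one illustration of how such correlations might be computed, here is a minimal sketch using rank correlation, assuming a merged table keyed by student ID with hypothetical column names. Spearman’s rho is a reasonable default here because holistic scores (1-6) and course grades are ordinal rather than interval measures.

    import pandas as pd
    from scipy.stats import spearmanr

    data = pd.read_csv("merged_assessment_data.csv")  # hypothetical merged export

    # Overall association between holistic scores and course grades.
    rho, p = spearmanr(data["holistic_score"], data["course_grade"], nan_policy="omit")
    print(f"holistic score vs. course grade: rho={rho:.2f}, p={p:.3f}")

    # The same comparison disaggregated by residency status.
    for group, sub in data.groupby("residency"):
        rho, p = spearmanr(sub["holistic_score"], sub["gpa"], nan_policy="omit")
        print(f"{group}: rho={rho:.2f}, p={p:.3f}")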

2.4 Interviews

A final method employed in our study was “in-depth, phenomenologically based interviewing” (Seidman 9). This method allowed “an openness to changes of sequence and forms of questions in order to follow up on the answers given and the stories told by the subjects” (Kvale 124). In considering the kinds of questions to address in interviews, we took inspiration from writing studies scholars who have advocated for translingual approaches to classroom teaching (what applied linguists might call “linguistically responsive teaching”). Horner et al. capture this sensibility in their recent white paper:

Against the prevailing language ideology of monolingualism, which presents languages as stable and discrete entities for teachers to enable students to write within, a translingual approach to language grants students agency and responsibility for language as the emerging outcome of their writing practices, with language difference thus an inevitability rather than a choice (as in repetitions unavoidably differing from what they repeat). Acknowledging students’ agency as writers for sustaining and revising language(s) changes the pedagogical dynamics of the writing classroom and, necessarily, the work of writing program administration. (2)

Thus, translingual scholarship reminded us to pay attention to student agency in questions about classroom experiences with language as well as students’ broader linguistic experiences at NU. In other words, we needed to remind ourselves that students do not simply enter already-made linguistic contexts; they also work to create and change those contexts. As a result, we strove to tap into students’ contextualized multilingual experiences through the interview protocol. For example, we asked whether the instructor acknowledged students’ linguistic diversity and, if so, how: was it treated as a personal trait or as rooted in social and cultural contexts?

Although students’ experiences in their writing courses were a key focus of the interview, we also took into consideration their language journeys beyond academic settings (e.g., professional and social contexts) and across time (e.g., future goals, previous experiences). Furthermore, while AWD courses are mostly about academic writing in English, our interview questions were designed to examine students’ multilingual experiences, as the English language only represents a small part of multilingual writers’ entire linguistic repertoire. In the end, through this interview process, we intended to prompt our participants to explore their multilingual identities by reflecting on various forms of communication through different languages in multiple contexts.

3. Next Steps and Observations

While it can certainly be argued that Northeastern’s history is rooted in globalization, NU today reflects a global world in which our students are on the move. Our students, often global citizens before coming to Boston, do not come to NU simply to stay in the U.S. More likely, our students will move throughout the NU global network, drawing on their linguistic repertoires in different ways in each of those contexts. For writing programs like ours that serve such students, the design of any assessment of student writing must be done with this reality in mind.

Informed by super-diversity and grounded in a social justice framework, our program assessment attempts to challenge static views of multilingualism. The super-diversity framework has pushed us to transcend the native/nonnative English speaker divide and consider the fluidity and dynamism of students’ language and writing experiences; the social justice framework prevents us from approaching our research participants without attention to the consequences of assessment.

At the time of this publication, we have begun to analyze the findings. Consistent with institutional data, approximately 20% of students on the survey (n=271) reported being international students. Yet, more than 50% of all students surveyed identified as multilingual. When we asked students how confident they are writing in their strongest language other than English, only 25% said they were “very confident.” And when we asked students where they would live after graduation and whether they would use English or another language, most students reported that they would stay in Boston or the U.S. (93%) right after graduation for work purposes and would use English for work (90%) and at home (85%). The results of the survey, to date, are consistent with previous MWRP surveys in showing that the NU community witnesses tremendous cultural and linguistic diversity. Despite the high percentage of international students at NU, echoing the university’s vision of cultivating globalized citizens, the vast majority of these multilingual students demonstrated an ambition to stay in the U.S. upon graduation and adopt English as a means of professional communication. The scoring data from an analysis of descriptive assignments (n=377) demonstrate that international writers met or exceeded general education learning goals related to audience, genre, and sources, although they did not achieve them at the highest levels as often as domestic students did. Moreover, like their domestic peers, they struggled most with the learning goal related to source use. On the learning goal related to awareness of diversity, international students outperformed their domestic peers. Finally, on the learning goal related to correctness, most international students met or exceeded the goal. Our next step, then, is to connect the survey data to the scoring data so that we can draw conclusions about multilingual students regardless of their residency status. To date, the findings suggest a more complicated portrait of multilingual writing outcomes and the need for nuanced rubrics and survey instruments for program assessment purposes.

Our work, however, is not without challenges. In survey design, for example, we struggled to craft questions about linguistic identity that were not based on static language assumptions. We even struggled to define what we meant by multilingual writer. On one hand, one could argue that all NU students are multilingual because the university has a foreign language requirement. On the other hand, we knew from previous iterations of the survey that many students whom scholars might identify as multilingual did not see themselves in the traditional L1/L2 template. After multiple iterations of survey questions, we decided to use terms like “strongest language” on the survey to put emphasis on actual language usage rather than the order in which languages were acquired (e.g., first vs. second language). We hope to use the data analysis tools in Qualtrics to help us filter students into different categories for further analysis.
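
That filtering could also happen outside Qualtrics. Here is a minimal sketch of deriving coarse linguistic-identity categories from a survey export, assuming hypothetical column names and category definitions; the actual categories would come from the program’s own analytic decisions.

    import pandas as pd

    survey = pd.read_csv("mwrp_survey_2020.csv")  # hypothetical Qualtrics export

    def classify(row):
        """Assign a coarse linguistic-identity category from survey answers."""
        strongest = str(row["strongest_languages"]).split(";")  # multi-select answer
        if len(strongest) > 1:
            return "multiple strongest languages"
        if row["writes_other_language_to_friends"] == "Yes":
            return "multilingual, English strongest"
        return "reports English only"

    survey["identity_category"] = survey.apply(classify, axis=1)
    print(survey["identity_category"].value_counts())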

We also struggled with the interview protocol as we recognized the limitations of a one-time window into students’ linguistic identities. Language scholars such as Jones, Martin-Jones, and Bhatt and Zhang-Wu (Exploring; Unpacking) have employed language journals in the study of multilingualism. Such daily accounts of multilingual language use in various contexts have proven to be informative about the shifting ways that participants navigate linguistic landscapes. Language journals might be a more productive method in future program research because they can tell us how students are negotiating different linguistic contexts and meeting writing program goals in ways that we do not see in outcomes assessment or in single interviews.

Finally, the design of the scoring rubric highlighted how, in an attempt to move away from assessment driven by grammatical correctness, the linguistic dimensions of writing may be entirely erased. In order to “see” linguistic diversity in student writing, we had to add traits to the scoring rubric beyond the NUPath learning goals. In our discussions about those traits, we opted to add a trait for conventions as well as a trait for awareness of multicultural or diverse perspectives. Our thinking was that super-diversity might be best captured through evidence of linguistic or discoursal diversity. We added the trait for grammar and stylistic conventions in anticipation of external faculty concerns about “ESL students.” Assessment research, at the end of the day, is evaluative, and the evaluation of language and literacy in the U.S. is always caught up in the cultural assumptions that stakeholders hold about students. To ignore those assumptions is to do potential harm to students.

In conclusion, program assessment that attends to multilingual writing outcomes must be done in a way that considers the potential harm done to students through poor theorization, overly reductive assessment practices, or inattention to unintended consequences. The consequential validity framework of Slomp, Corrigan, and Sugimoto helped us keep these issues at the forefront of the study design. That framework prompted us to consider the construct of multilingualism in the various ways the term has been defined, how the scoring and data collection procedures would influence what we would find in our data, how disaggregation of data into subgroups needed to relate to the construct, and the intended and unintended consequences of our program assessment. It is only through the adoption of such advancements in program assessment that we do justice to our students’ super-diverse lived realities.

Acknowledgements: We would like to thank the guest editors--Eunjeong Lee, Norah Fahim, Jennifer Johnson, and Brooke R. Schreiber--of this Composition Forum special issue for their support. We would also like to thank Norbert Elliot for his comments on an earlier version of this profile as well as the Writing Program Assessment Committee, chaired by Thomas Akbari, and writing program faculty who participated in this assessment research. Finally, we would like to thank the College of Social Sciences and Humanities for funding the multigenerational research grant that supported this research and the hiring of Greg Palermo and Devon Regan as research assistants.

Appendix: Comparison of 2014 vs. 2020 Multilingual Writers’ Surveys

2014 survey: Put emphasis on the specific languages students speak.
2020 survey: Puts emphasis on students’ exposure to multilingual contexts and multilingual usage.

2014 survey: Used yes/no questions to examine students’ language usage.
2020 survey: Uses questions where students may select more than one option.

2014 survey: Asked students about the length of time learning a language.
2020 survey: Asks students about the length of time learning and using languages.

2014 survey: Included general questions regarding students’ confidence in writing.

  Q16. Now rate your confidence as a writer:
    1. High
    2. Somewhat high
    3. Somewhat low
    4. Low

2020 survey: Includes specific questions evaluating students’ confidence in writing by providing specific contexts.

  Q16. Before AWD, how successful are you able to write for everyday and professional purposes (e.g., cover letter, professional writing during co-op, internship, email, social media)?
    1. Very successful
    2. Successful
    3. Acceptable
    4. Unsuccessful

2014 survey: Established little association between the AWD courses and students’ learning outcomes; attention was directed instead to what students had done in their writing courses.

  Q24. Which of the following kinds of writing have you been assigned to do at Northeastern?
    • Narrate or describe one of your own experiences
    • Summarize something you read, such as articles, books, or online publications
    • Analyze or evaluate something you read, researched, or observed
    • Describe your methods or findings related to data you collected in lab or field work, a survey project, etc.
    • Argue a position and/or make a proposal for action using evidence and reasoning
    • Explain in writing the meaning of numerical or statistical data
    • Write in the style and format of a specific field (engineering, history, psychology, etc.)
    • Include drawings, tables, photos, screen shots, or other visual content in your writing
    • Create a project with multimedia (web page, poster, slide presentation such as PowerPoint, etc.)
    • Create a portfolio that collects writing
    • Submit writing to an organization or group outside of Northeastern
    • Other kinds of writing (please describe)

2020 survey: Pays specific attention to the association between the AWD courses and students’ learning outcomes.

  Q21. After AWD, which of the following are you currently able to do as a writer? (Please consider your English writing capability in both academic and non-academic contexts, and select all that apply to you.)
    • Adapt writing for multiple academic, professional, and public occasions and audiences
    • Display familiarity with the writing conventions of genres in an academic field or profession
    • Identify credible, relevant sources and engage and cite them appropriately
    • Draft, revise, and edit writing using reader feedback
    • Include multicultural references, resources, or diverse perspectives
    • Use correct grammar, punctuation, sentence construction

2014 survey: Asked about future contexts for writing after graduation.
2020 survey: Asks about future contexts for writing after graduation.

Notes

  1. For example, in crafting survey questions, we especially wrestled with definitions of multilingualism. While the terms multilingual and multilingualism have been frequently used in the literature, their definitions vary, with some scholars paying more attention to the number of languages one can master at a certain proficiency level, while other scholars put more emphasis on language usage and exposure. Traditionally, for example, bi/multilingualism has been associated with native or near-native proficiency in languages in addition to one’s mother tongue (Bloomfield). The threshold of near-native proficiency was later relaxed to the capability to produce “complete meaningful utterances in the other language” (Haugen 7). More recent scholars have embraced broader definitions of bi/multilingualism, such as “those people who need and use two or more languages (or dialects) in their everyday lives” (Grosjean 4) and individuals who have “more than one language competence” (Valdés and Figueroa 8), to capture language users’ various degrees of competence across domains.

  2. Although the NU writing program does not have a common curriculum, more than 70% of instructors in AWD courses teach at least one descriptive assignment, one research assignment, and one reflective assignment each semester. Within each of these categories, what counts as a given assignment may vary; a research assignment, for example, may range from a research-based proposal to a literature review.

Works Cited

Benda, Jonathan, et al. Confronting Superdiversity in US Writing Programs. The Internationalization of US Writing Programs, edited by Shirley K. Rose and Irwin Weiser, University Press of Colorado, 2018, pp. 79-96.

Benda, Jonathan, et al. Confronting Superdiversity Again: A Multidimensional Approach to Teaching and Researching Writing at a Global University. Writing Across Difference: Theory and Intervention, edited by James Daniel, Katie Malcolm, and Candace Rai, Utah State UP, forthcoming.

Blommaert, Jan. Ethnography, Superdiversity and Linguistic Landscapes: Chronicles of Complexity. Vol. 18. Multilingual Matters, 2013.

Bloomfield, Leonard. Language. London, Allen and Unwin, 1933.

Butler, Yuko. Bilingualism/Multilingualism and Second-Language Acquisition. The Handbook of Bilingualism and Multilingualism, edited by Tej Bhatia and William Ritchie, Wiley-Blackwell, 2012, pp. 109-136.

Condon, William. Large-Scale Assessment, Locally-Developed Measures, and Automated Scoring of Essays: Fishing for Red Herrings? Assessing Writing, vol. 18, no. 1, 2013, pp. 100-108, doi:10.1016/j.asw.2012.11.001.

Cumming, Alister, et al. Students’ Practices and Abilities for Writing from Sources in English at Universities in China. Journal of Second Language Writing, vol. 39, 2018, pp. 1-15, doi:10.1016/j.jslw.2017.11.001.

Frederick, Antoinette. Northeastern University. An Emerging Giant: 1959-1975. Boston, Northeastern University, 1982.

Friginal, Eric, et al. Revisiting Multiple Profiles of Learner Compositions: A Comparison of Highly Rated NS and NNS Essays. Journal of Second Language Writing, vol. 23, 2014, pp. 1-16, doi:10.1016/j.jslw.2013.10.001.

Gomes, Matthew. Writing Assessment and Responsibility for Colonialism. Writing Assessment, Social Justice, and the Advancement of Opportunity, edited by Mya Poe, Asao B. Inoue, and Norbert Elliot, University of Colorado UP, 2019, pp. 201-225.

Gorter, Durk. Introduction: The Study of the Linguistic Landscape as a New Approach to Multilingualism. Linguistic Landscape: A New Approach to Multilingualism, edited by Durk Gorter, Multilingual Matters, 2006, pp. 1-6, doi:10.21832/9781853599170-001.

Grosjean, François. Bilingual: Life and Reality. Cambridge, Harvard UP, 2010.

Hammond, J. W. Making Our Invisible Racial Agendas Visible: Race Talk in Assessing Writing, 1994-2018. Assessing Writing, vol. 42, 2019, article 100425, doi:10.1016/j.asw.2019.100425.

Haugen, Einar. The Norwegian Language in America. Philadelphia: University of Pennsylvania Press, 1953.

Horner, Bruce, Emily Yuko Cousins, Jaclyn Hilberg, N. Claire Jackson, Rachel Rodriguez, and Alex Way. Translingual Approaches to Writing and Its Teaching. WPA-CompPile Research Bibliography No. 28, 2019.

Huot, Brian. Toward a New Theory of Writing Assessment. College Composition and Communication, vol. 47, no. 4, 1996, p. 549, doi:10.2307/358601.

Information for International Applicants. Northeastern University, 2020. https://www.northeastern.edu/admissions/application-information/international-student-admissions/.

Interpreting TOEFL® Scores. Educational Testing Service, 2020. https://www.ets.org/toefl/score-users/scores-admissions/interpret.

Johnson, David, and Lewis VanBrackle. Linguistic Discrimination in Writing Assessment: How Raters React to African American ‘Errors,’ ESL Errors, and Standard English Errors on a State-Mandated Writing Exam. Assessing Writing, vol. 17, no. 1, 2012, pp. 35-54, doi:10.1016/j.asw.2011.10.001.

Jones, Kathryn, et al. Multilingual Literacies: Constructing a Critical, Dialogic Approach to Research on Multilingual Literacies: Participant Diaries and Diary Interviews. J. Benjamins Pub., 2000.

Kane, Michael T. Explicating Validity. Assessment in Education: Principles, Policy & Practice, vol. 23, no. 2, 2015, pp. 198-211, doi:10.1080/0969594x.2015.1060192.

Knoch, Ute, et al. What Happens to ESL Students’ Writing after Three Years of Study at an English Medium University? Journal of Second Language Writing, vol. 28, 2015, pp. 39-52, doi:10.1016/j.jslw.2015.02.005.

Kvale, Steinar. Interviews: An Introduction to Qualitative Research Interviewing. Thousand Oaks, Sage, 1996.

Li, Yongyan, and Christine Pearson Casanave. Two First-Year Students’ Strategies for Writing from Sources: Patchwriting or Plagiarism? Journal of Second Language Writing, vol. 21, no. 2, 2012, pp. 165-180, doi:10.1016/j.jslw.2012.03.002.

Lindsey, Peggy, and Deborah J. Crusan. How Faculty Attitudes and Expectations toward Student Nationality Affect Writing Assessment. Across the Disciplines: A Journal of Language, Learning, and Academic Writing, vol. 8, 2011.

Lippi-Green, Rosina. English with an Accent: Language, ideology, and Discrimination in the United States. London and New York, Routledge, 1997.

Marston, Everett C. Origin and Development of Northeastern University, 1898-1960. Boston, Northeastern University, 1961.

Matsuda, Paul Kei. The Myth of Linguistic Homogeneity in U.S. College Composition. College English, vol. 68, no. 6, 2006, p. 637, doi:10.2307/25472180.

Mutsaers, Paul and Jos Swanenberg. Super-Diversity at the Margins? Youth Language in North Brabant, The Netherlands. Sociolinguistic Studies, vol. 6, no. 1, 2012, doi:10.1558/sols.v6i1.65.

NUPath Learning Goals. Northeastern University, 2020. http://catalog.northeastern.edu/undergraduate/university-academics/nupath/learning-goals/.

O’Hagan, Sally Roisin, and Gillian Wigglesworth. Who’s Marking My Essay? The Assessment of Non-Native-Speaker and Native-Speaker Undergraduate Essays in an Australian Higher Education Context. Studies in Higher Education, vol. 40, no. 9, 2015, pp. 1729-1747, doi:10.1080/03075079.2014.896890.

Pennycook, Alastair, and Emi Otsuji. Metrolingualism: Language in the City. Abingdon and New York, Routledge, 2015.

Poe, Mya, and Norbert Elliot. Evidence of Fairness: Twenty-Five Years of Research in Assessing Writing. Assessing Writing, vol. 42, 2019, pp. 1-21, doi:10.1016/j.asw.2019.100418.

Poe, Mya, Asao B. Inoue, and Norbert Elliot, editors. Writing Assessment, Social Justice, and the Advancement of Opportunity. Boulder, University Press of Colorado, 2018.

Rubin, Donald L. and Melanie Williams-James. The Impact of Writer Nationality on Mainstream Teachers’ Judgments of Composition Quality. Journal of Second Language Writing, vol. 6, no. 2, 1997, pp. 139-154.

Seidman, Irving. Interviewing as Qualitative Research: A Guide for Researchers in Education and the Social Sciences. New York, Teachers College Press, 1998.

Scott-Clayton, Judith. Do High-Stakes Placement Exams Predict College Success? CCRC Working Paper, No. 41, 2012.

Slomp, David, Julie Corrigan, and Tamiko Sugimoto. A Framework for Using Consequential Validity Evidence in Evaluating Large-Scale Writing Assessments: A Canadian Study. Research in the Teaching of English, vol. 48, no. 3, 2014, pp. 276-302.

Sternglass, Marilyn. Time to Know Them: A Longitudinal Study of Writing and Learning at the College Level. Mahwah, Lawrence Erlbaum Associates, 1997.

U.S. News and World Report. Northeastern University, 2020, https://www.usnews.com/best-colleges/northeastern-university-2199, accessed 28 Apr. 2020.

Valdés, Guadalupe and Richard Figueroa. Bilingualism and Testing: A Special Case of Bias. Norwood, NJ: Ablex Publishing, 1994.

Vertovec, Steven. Super-Diversity and Its Implications. Ethnic and Racial Studies, vol. 30, no. 6, 2007, pp. 1024-1054, doi:10.1080/01419870701599465.

Vertovec, Steven. Talking around Super-Diversity. Ethnic and Racial Studies, vol. 42, no. 1, 2017, pp. 125-139, doi:10.1080/01419870.2017.1406128.

World Migration Report. Geneva, International Organization for Migration, 2018.

Zhang-Wu, Qianqian. Exploring the Bilingual Linguistic Functioning of First-Semester Chinese International Students. 2019. Boston College, PhD dissertation.

Zhang-Wu, Qianqian. Unpacking the Bilingual Linguistic Functioning of Chinese International Students: Myths and Realities. Bristol, Multilingual Matters, under contract.
