Multiple facets in the assessment of English learners : challenges and possibilities


Abstract
Tests are woven into the fabric of the most fundamental educational processes. Yet testing programs, models, and practices do not systematically account for the unique and varied needs of students categorized as English Learners (ELs), defined as students who are developing English as a second language (Linquanti et al., 2016). Although these students enter school halls with a rich array of linguistic, cultural, and academic strengths, the EL label misleads teachers into approaching these students with a deficit framing (Valdés, 2004). An obvious contributing factor is that these students tend to underperform on tests of academic achievement, which stems, in part, from the lack of design considerations for ELs' needs (Durán, 2008). Another contributing factor is that there is still much room for improvement in how educational practitioners use ELs' assessment data. This dissertation addresses multiple facets of applied issues in assessment that relate to fairness and validity, for which educators and assessment experts urgently require guidance. Specifically, it examines educational assessment issues and their impacts on K-12 students who have been categorized as ELs. In this dissertation, I address gaps in the literature at the intersection of language, learning, and assessment, weaving together interdisciplinary concepts and methods that span sociolinguistics, sociology, psychology, and the learning sciences. This dissertation consists of three related studies with distinct motivations and origins. While the first chapter relates to linguistic and cultural issues in assessment, the other two chapters address disparities in opportunities to access rigorous academic content. These issues are interrelated, given that disparities in students' opportunities to learn are strongly shaped by inappropriate test design, development, and use.
The first study is a conceptual framework originally commissioned by the Smarter Balanced Assessment Consortium, which sought conceptual guidance in developing culturally sustaining assessment systems. Thus, the target audience for this first chapter is assessment developers. The second and third studies, on the other hand, were developed as part of a five-year research-practice partnership. These two studies would not have been possible without funding from Stanford's Graduate School of Education and facilitation from California Education Partners. Collectively, these two studies examine the consequences of test misuse. The primary audience for these two papers comprises educators and researchers. In the first study, I report on a critical synthesis of the literature that explores the complexities of considering cultural relevance and the provision of choice in assessments. I synthesize this information into key considerations for flexible assessment designs that allow students to more fully utilize their personal, cultural, and linguistic skills in demonstrating what they know and can do on assessments. In this paper, I ask: What considerations about ways of knowing and choice should be integrated into test design and development? I also ask: What actionable steps can assessment experts take to design assessments that promote flexibility and personalization? Although the literature to date on culturally sustaining pedagogy and students' ways of knowing is growing, the extensions to individual assessments and to comprehensive and coherent assessment systems are more recent developments. To my knowledge, this is the first paper that describes how to integrate culturally relevant and sustaining assessment practices into all levels of assessment systems. I conclude this study by offering practical recommendations that assessment development organizations may adopt to promote equity in their assessment systems.
In the second study, I examine disparities in opportunities to access rigorous academic content for five cohorts of students in one California high school district. I ask the following questions: Which combinations of mathematics courses (trajectories) are most common for students across language proficiency designations? How do remedial and 'bridge' mathematics courses relate to college preparation outcomes by Grade 12? How do parent education levels relate to students' course trajectories and subsequent college enrollment eligibility? Using a novel base-2 enumeration approach (Solano-Flores, 2021), I enumerate the thousands of possible trajectories (combinations) of mathematics courses taken by students during high school. I observe significant disparities in college preparedness across students by English language designation. In comparison to their non-English Learner (non-EL) English-speaking peers, long-term English Learners (LTELs) were funneled into fewer trajectories, which predominantly included lower-level mathematics courses. I find that enrollment in remedial mathematics courses in Grade 9 appears to restrict access to advanced courses, with grave consequences for Grade 12 college preparation. Finally, in Chapter 3, I examine the use of mathematics placement test scores to assign students to either a regular Algebra I course or a remedial Algebra Readiness course in Grade 9. I ask: What kind of evidence is available to support the use of the CAASPP, MDTP, and DOMA tests for mathematics course placement decisions? I also ask: What evidence supports the utility of mathematics test cut scores for Algebra I placement in Grade 9? In this two-part study, I answer the first question by examining assessment frameworks and item specification documents developed by the test publishers. I compare information about the publishers' intended purposes of the tests with the enacted uses of those tests in a high school district.
I answer the second question by developing regression discontinuity models to estimate the effect of using math placement tests on students' Grade 9 GPA and course completion rates. I find that the use of these tests for course placement was inappropriate, given their poor psychometric properties and the lack of alignment between the educators' enacted test uses and the publishers' intended test purposes. I also find that the use of mathematics placement tests for Algebra yielded different impacts depending on whether students were classified as ELs, the specific test used, and the achievement outcome. In combination, these chapters offer nuanced perspectives on the consequences of assessment for ELs as well as potential future assessment design considerations. My findings can inform the design, development, and use of educational assessments as they relate to students who are linguistically, racially, and culturally diverse. These findings also point to how policies can be adjusted to better support ELs' academic outcomes, including access to college preparatory coursework.

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place California
Place [Stanford, California]
Publisher [Stanford University]
Copyright date ©2022
Publication date 2022
Issuance monographic
Language English

Creators/Contributors

Author Biernacki, Paulina Julia
Degree supervisor Solano-Flores, Guillermo
Degree supervisor Valdés, Guadalupe
Thesis advisor Solano-Flores, Guillermo
Thesis advisor Valdés, Guadalupe
Thesis advisor Bettinger, Eric
Thesis advisor Martínez, Ramón, 1972-
Degree committee member Bettinger, Eric
Degree committee member Martínez, Ramón, 1972-
Associated with Stanford University, Graduate School of Education

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Paulina Julia Biernacki.
Note Submitted to the Graduate School of Education.
Thesis Ph.D., Stanford University, 2022.
Location https://purl.stanford.edu/sd618jg8767

Access conditions

Copyright
© 2022 by Paulina Julia Biernacki
License
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).
