EXAMINING THE EFFECT OF AN OVERT TRANSITION INTERVENTION ON THE READING DEVELOPMENT OF AT-RISK ENGLISH-LANGUAGE LEARNERS IN FIRST GRADE by DARCI A. BURNS A DISSERTATION Presented to the Department of Special Education and Clinical Sciences and the Graduate School of the University of Oregon in partial fulfillment of the requirements for the degree of Doctor of Philosophy June 2011 ii DISSERTATION APPROVAL PAGE Student: Darci A. Burns Title: Examining the Effect of an Overt Transition Intervention on the Reading Development of At-Risk English-Language Learners in First Grade This dissertation has been accepted and approved in partial fulfillment of the requirements for the Doctor of Philosophy degree in the Department of Special Education and Clinical Sciences by: Dr. Edward J. Kame‘enui Chair Dr. Roland H. Good, III Member Dr. Elizabeth Harn Member Dr. Doris A. Baker Member Dr. Robert R. Davis Outside Member and Richard Linton Vice President for Research and Graduate Studies/Dean of the Graduate School Original approval signatures are on file with the University of Oregon Graduate School. Degree awarded June 2011 iii © 2011 Darci A. Burns iv DISSERTATION ABSTRACT Darci A. Burns Doctor of Philosophy Department of Special Education and Clinical Sciences June 2011 Title: Examining the Effect of an Overt Transition Intervention on the Reading Development of At-Risk English-Language Learners in First Grade Approved: ________________________________________________ Dr. Edward J. Kame‘enui Although there is arguably substantial evidence in the literature on what works for students at risk of reading failure, the evidence on effective interventions for English- language learners (ELs) is rather meager. Moreover, there are limited curriculum programs and instructional materials available to support schools in the inclusion of ELs in reading-reform efforts. This study examined the efficacy of a systematic transition intervention designed to increase the early literacy achievement of Spanish-speaking ELs in transitional bilingual programs. The intervention included a set of 12 scripted transition lessons that made explicit for ELs the orthographic, lexical, and syntactic differences between Spanish and English. In addition, the lessons addressed the story content knowledge and vocabulary and academic language necessary to ensure that ELs could access the English literacy curriculum and classroom discourse. Seventy-eight first-grade ELs identified as at risk for reading difficulty were randomly assigned to receive either the transition lessons in the treatment condition or the standard school-based intervention v in the control condition. Students in both conditions received 60 thirty-minute sessions of small-group instruction as a supplement to their first-grade core reading program. Instruction in both conditions was explicit and focused on the core reading components (i.e., phonemic awareness, phonics, word work, fluency, vocabulary, and comprehension). Student performance was measured on the following dimensions of early reading: (a) phonemic decoding and word reading, (b) oral reading fluency, (c) vocabulary development, and (d) comprehension. In addition, fidelity of implementation, time devoted to the different literacy components, and feasibility of implementation data were collected during and after the study. 
A gain-score analysis was employed in this study to compare the effect of the treatment (transition lessons) and control (standard school-based intervention) conditions on scores obtained from the pretest and posttest measures of reading achievement. The results indicated that the difference in gain scores between the treatment and control conditions was not statistically significant on any of the measures utilized in the study. Therefore, the transition intervention did not appear to be more effective than the typical school-based intervention. Findings are discussed in light of current research on improving the academic performance of ELs. vi CURRICULUM VITAE NAME OF AUTHOR: Darci A. Burns GRADUATE AND UNDERGRADUATE SCHOOLS ATTENDED: University of Oregon, Eugene Simmons College, Boston, Massachusetts Eastern Illinois University, Charleston DEGREES AWARDED: Doctor of Philosophy, Special Education, 2011, University of Oregon Master of Science, Special Education, 2002, Simmons College Bachelor of Science, Elementary Education, 1989, Eastern Illinois University AREAS OF SPECIAL INTEREST: Literacy Instruction and Assessment School-Wide Literacy Change Models Response to Intervention Early Intervention and Prevention PROFESSIONAL EXPERIENCE: Director and Senior Facilitator, HILL for Literacy, Inc., Woburn, Massachusetts, 2004-present Regional Professional Development Provider, Partnership for Achievement in Reading (PAR), Boston, Massachusetts, 2003-2004 Literacy Specialist, J. W. Hennigan School, Boston, Massachusetts, 1999-2003 vii Coordinator, Intergenerational Literacy Tutoring Program, Boston Partners in Education, Boston, Massachusetts, 1996-1999 Classroom Teacher, Christa McAuliffe Elementary School, Palm Bay, Florida,1990-1996 Teaching Assistant and Substitute Teacher, Farmingdale Elementary School, Pleasant Plains, Illinois, 1989-1996 GRANTS, AWARDS AND HONORS: Research Fellowship, 2008-2010 Partners in Excellence Award, Institute of Health Professions, Massachusetts General Hospital, 2005 PUBLICATIONS: Nelson-Walker, N. J., Burns, D. A., Turtura, J., & Munir-McHill, S. (2010). Dynamic indicators of basic early literacy skills: A comparison of Nonsense Word Fluency 6th Edition and Next. Eugene, OR: University of Oregon, College of Education, DIBELS/IDEL Research Team. Jones, S., Burns, D., & Pirri, C. (2009). Leading literacy change. Longmont, CO: Sopris West. viii ACKNOWLEDGMENTS I wish to express sincere appreciation to Dr. Ed Kame‘enui for his unwavering support, intellectual integrity and thoughtful guidance throughout my doctoral program and dissertation process. Additionally, I would like to thank the SETR team, especially Drs. Doris and Scott Baker for providing me with the opportunity, continued support and encouragement to complete this dissertation study. Special appreciation goes to Tigard- Tualatin School District, Hillsboro Public Schools, Hood River School District and Walla Walla School District for their participation in this study. I would also like to acknowledge the support I received in preparing this dissertation, from my committee, Dr. Roland H. Good, III, Dr. Beth Harn, and Dr. Robert Davis. Finally, I want to thank Steve and Malcolm for the sacrifices they made on my behalf and for the constant support they provided throughout this process. ix TABLE OF CONTENTS Chapter Page I. INTRODUCTION ......................................................................................................... 
1 Components of Effective Intervention for ELs .......................................................... 3 Instructional Variables That Mediate Student Performance ...................................... 5 Purpose of the Study and Research Questions ........................................................... 7 II. LITERATURE REVIEW ............................................................................................. 11 English-Language Learners in the United States ....................................................... 11 Academic Achievement of English-Language Learners ............................................ 12 Instructional Models for English-Language Learners ................................................ 14 Developing Literacy Skills in English-Language Learners ....................................... 24 Academic Language in the Context of This Dissertation Study ................................ 42 III. METHODOLOGY ...................................................................................................... 44 Participants ................................................................................................................ 44 Procedures .................................................................................................................. 73 Data Analysis ............................................................................................................. 77 IV. RESULTS ................................................................................................................... 80 Descriptive Statistics ................................................................................................. 81 Preliminary Analyses ................................................................................................. 83 Research Question 1 .................................................................................................. 83 Research Question 2 .................................................................................................. 89 Research Question 3 .................................................................................................. 90 x Chapter Page Research Question 4 ................................................................................................ 91 Fidelity of Implementation ...................................................................................... 97 Feasibility Survey .................................................................................................... 100 V. DISCUSSION ............................................................................................................ 103 Summary of Results ................................................................................................. 104 Lack of Statistically Significant Effects................................................................... 106 Differential Time on Core Reading Components .................................................... 110 Limitations ............................................................................................................... 112 Future Research ....................................................................................................... 114 Summary .................................................................................................................. 115 APPENDICES A. TRANSITION LESSON MAPS ................................................................................ 118 B. 
TRANSITION LESSONS PRE- AND POSTTEST ASSESSMENT........................ 131 C. ADMINISTRATION FIDELITY CHECKLIST ........................................................ 152 D. FIDELITY OF IMPLEMENTATION CHECKLIST FOR SETR TRANSITION LESSONS .......................................................................................... 155 E. FEASIBILITY OF THE READING INTERVENTION WITH SPANISH- SPEAKING STUDENTS ........................................................................................... 158 REFERENCES CITED ................................................................................................... 160 xi LIST OF TABLES Table Page 1. Characteristics of School and Student Participants ...................................................... 45 2. Characteristics of Teachers by School .......................................................................... 46 3. Frequencies and Percentages for School Demographics for Each Group (Treatment and Control) ............................................................................................... 50 4. Scope and Sequence for the Literacy Component and English Elements Covered in the Transition Lessons .............................................................................................. 54 5. Example of Phonemic Awareness, Lesson 1, Day 1 ..................................................... 55 6. Teacher Script for Phonics Lesson (Lesson 1, Day 1) .................................................. 57 7. Teaching Script for Word Work (Lesson 1, Day 1) ...................................................... 59 8. Example of Vocabulary Instruction in Phonics Section (Lesson 1, Day 1) .................. 60 9. Example of Sentence Reading (Lesson 1, Day 2) ......................................................... 61 10. Example of Reading Aloud Instructions (Lesson 2, Day 3) ....................................... 62 11. Literacy Components and Instructional Features of the Standard Intervention Programs Implemented in the Control Condition ....................................................... 66 12. Means and Standard Deviations for All Test Scores by Group (Treatment and Control) ................................................................................................................ 84 13. ANOVAs for Pretest and BVAT Bilingual Verbal Abilities Scores by Group (Treatment vs. Control) ............................................................................................... 86 14. ANOVA for Word Reading Gain Scores by Group (Treatment vs. Control) ............. 87 15. ANOVA for Passage Reading Gain Scores by Group (Treatment vs. Control) ......... 88 16. ANOVA for SAT-10 Word Reading Gain Scores by Group (Treatment vs. Control) ................................................................................................................. 88 xii Chapter Page 17. ANOVA for SAT-10 Sentence Reading Gain Scores by Group (Treatment vs. Control) ............................................................................................................... 89 18. ANOVA for DOK Vocabulary Gain Scores Group (Treatment vs. Control) ........... 90 19. ANOVA for Overall Reading Gain Scores Group (Treatment vs. Control) ............. 91 20. Correlations Between BVAT Bilingual Verbal Abilities Score and GRADE Listening Comprehension Score, GRADE Word Meaning Score, and SAT-10 Reading Comprehension Score ................................................................................. 92 21. 
ANCOVA for GRADE Listening Comprehension Scores by Group After Controlling for BVAT Bilingual Verbal Abilities Score .......................................... 96 22. ANOVA for GRADE Word Meaning Scores by Group ........................................... 97 23. ANCOVA for SAT-10 Reading Comprehension Scores by Group After Controlling for BVAT Bilingual Verbal Abilities Score .......................................... 97 24. Instructional Components by Condition ................................................................... 98 25. Time Spent on Core Components of Reading by Condition .................................... 100 xiii LIST OF FIGURES Figure Page 1. Cummins‘ Iceberg Model ............................................................................................. 17 2. Transition Lessons ........................................................................................................ 52 3. Scatterplot With Regression Lines (Treatment and Control) for BVAT Bilingual Verbal Abilities Score and GRADE Listening Comprehension Score ......................... 94 4. Scatterplot With Regression Lines (Treatment and Control) for BVAT Bilingual Verbal Abilities Score and SAT-10 Reading Comprehension Score ........................... 95 5. Scatterplot With Regression Lines (Treatment and Control) for BVAT Bilingual Verbal Abilities Score and GRADE Word Meaning Score .......................................... 96 1 CHAPTER I INTRODUCTION The present dissertation study examines the efficacy of an intervention for first-grade English-language learners (ELs) who are learning to read in English and Spanish. In this chapter, I provide a context for studying early reading development in ELs and in doing so, I highlight the following: (a) characteristics of ELs in U.S. schools, (b) components of effective intervention, (c) instructional variables that mediate student performance, and (d) the purpose of the study and the research questions. Providing high-quality reading instruction for English-language learners in the early grades is a critical educational objective (August & Shanahan, 2006; Slavin & Cheung, 2005). Over 3 million ELs attend elementary schools, representing more than 11.5% of the elementary school population. English-language learners are the fastest growing student population in U.S. schools. Indications are that this trend will continue in the short and long term (August & Shanahan, 2006). Although many different language groups represent English learners—there are approximately 440 different home languages spoken by children in U.S. schools (National Center for Educational Statistics [NCES], 2004)—Spanish speakers, comprise approximately 80% and are by far the largest EL group in the country (Hubler, 2005). In U.S. schools, children learning English as an additional language—the vast majority of whom are Spanish-speaking English-language learners—lag behind their monolingual English-speaking peers in reading performance (NCES, 2005; U.S. Department of Education, 2007). To compound the problem, 2 Spanish-speaking ELs represent a substantial and growing part of the population in virtually all states (Chapa & De La Rosa, 2004). These shifting population demographics mean that classroom teachers, who as a group have rarely taught ELs, now face these students on a daily basis in their classrooms. Many teachers have become, often by default and without careful preparation, teachers of ELs. 
Arguably, effective instruction invokes a more complex set of instructional issues for ELs than for native English speakers. Not only are ELs expected to master academic content like their peers in reading, writing, mathematics, and science (a significant challenge for a large percentage of native English speakers), but they are expected, at roughly the same time, to develop proficiency in a second language. These "double demands" (Gersten, 1996, p. 18) increase the importance of optimal instructional design and delivery features in literacy instruction for ELs. In early literacy research, there are important empirical findings and insights that provide some direction. Accumulating evidence indicates that the rate of learning English among ELs can be equal to the learning rate of native English speakers when effective instruction is provided (Chiappe, Siegel, & Wade-Woolley, 2002; Gersten, 1999). This rate of learning is most apparent in the early grades and for some specific areas of literacy development. For example, ELs appear to learn important foundational literacy skills, such as phonological awareness and phonological recoding, at the same rate as native English speakers (Baker, Gersten, Haager, Dingle, & Goldenberg, 2006; Chiappe et al., 2002; Lesaux & Siegel, 2003). Furthermore, this commensurate early literacy success for ELs is not predetermined by level of language proficiency in English (Lesaux & Siegel, 2003). Understanding the alphabetic principle (i.e., letter-sound correspondence, consonant and vowel digraphs, consonant blends, etc.) does not require developed oral native language proficiency. Students with limited English proficiency can identify letter sounds and accurately read words that include those letter sounds without knowing the meanings of the words they are decoding (Baker & Baker, 2008). However, skills requiring syntactic processing and working memory are more difficult for ELs than for English-only students. One explanation for the difference is that syntactic awareness and working memory require substantial language proficiency skills, whereas phonological awareness skills do not (Lesaux & Siegel, 2003).

Components of Effective Intervention for ELs

It is imperative to provide effective intervention in the early grades for at-risk readers, including those who are learning to read in a second language. In the United States, if students fail to learn to read adequately in first grade, there is approximately a 90% probability that these struggling readers will remain poor readers in Grade 4 (Juel, 1988; Torgesen & Burgess, 1998) and a 75% probability that they will be poor readers in high school (Francis, Shaywitz, Stuebing, Shaywitz, & Fletcher, 1996). For students in transitional bilingual programs, the process of making the transition to English reading is crucial for subsequent school success in English-only environments.

Decoding Instruction

Rigorous research evidence suggests that students who speak English as a second language and are learning to read in English benefit from systematic, explicit instruction in English phonology (Gunn, Biglan, Smolkowski, & Ary, 2000; Quiroga, Lemos-Britton, Mostafapour, Abbott, & Berninger, 2002), with attention given to elements of English that differ from a student's native language (Jiménez, 1994). For example, Quiroga et al.
(2002) found that four first-grade English-language learners who were at risk for reading difficulties significantly improved in English word reading after receiving individual intervention that included phonological awareness instruction in both English and Spanish and explicit decoding instruction in English. Skills instruction appears most effective when coupled with practice in reading connected text (August & Hakuta, 1997; Gersten & Baker, 2000).

Vocabulary Instruction

Vocabulary development is vital for ELs to make progress in reading English (August & Shanahan, 2006). Researchers agree that insufficient vocabulary knowledge is a critical problem for many young children, especially English-language learners (August & Shanahan, 2006; Snow, Burns, & Griffin, 1998). Children need to know a wide range of words to understand the texts they encounter in school. Many ELs who come to school with a limited English language background find that vocabulary is their most frequently encountered obstacle in attempting to gain information from classroom texts (August & Hakuta, 1997; Carlo et al., 2004; Jiménez, 1994). Effective vocabulary instruction is directed toward a deep, integrated understanding of words and must be systematic and repetitive (Beck, Perfetti, & McKeown, 1982). For ELs, instruction that facilitates vocabulary development includes the preteaching of selected key words and the use of visuals, including networks of words and the integration of words with students' prior knowledge (Gersten & Baker, 2000; Saunders, O'Brien, Lennon, & McLean, 1998).

Comprehension Instruction

The teaching of cognitive and metacognitive strategies has been shown to improve language-minority students' comprehension of text (Jiménez, 1997; Klingner & Vaughn, 1996). This approach is most effective when students have adequate decoding skills and adequate verbal proficiency (Klingner & Vaughn, 1996). Bilingual readers can be taught to use comprehension strategies that competent monolingual English readers also use, but some effective strategies appear to be specific to bilingualism (Jiménez, 1997). Bilingual readers can be taught to take advantage of similarities between their two languages and to use transfer strategies and processes from Spanish to English. It is important to note that EL students may need explicit instruction to facilitate this transfer (Jiménez, 1997).

Instructional Variables That Mediate Student Performance

Knowing what needs to be taught is necessary, but it fails to account sufficiently for effective early reading instruction. The manner in which the instructional content is presented, or the instructional delivery and design, is also vital. Kame‘enui and Carnine (1998) have developed a set of empirically derived instructional design principles that can be used to compare instructional approaches across academic areas. The following six features are the organic basis for the design of explicit instructional supports for diverse learners, including ELs (Coyne, Kame‘enui, & Carnine, 2011). The first design principle is to focus on the big ideas in a skill or content area. Big ideas in beginning reading refer to skills and strategies that facilitate the most efficient and broadest acquisition of reading knowledge. Critical content includes phonological awareness, phonics, fluency, vocabulary, and comprehension strategies.
All learners, most especially diverse learners, including ELs, will benefit from instruction focused on these areas to ensure that their early literacy skills are fully developed (Coyne et al., 2011). The second design principle is providing conspicuous strategies for learners to apply when learning. Conspicuous strategies refer to a series of overt teaching events and teacher actions that make abstract learning clear and concrete. Strategies are made explicit by using visual models, verbal directions, full and clear explanations, and outlined steps. Conspicuous strategies in early reading instruction involve teacher modeling of key reading skills and providing student practice and corrective feedback on these essential skills. The third design principle is providing mediated scaffolding for the learner. Mediated scaffolding provides temporary instructional supports for students as they learn new material. Scaffolding is faded over time as students assume more control of their learning. The ease or difficulty of the task, the materials, and the selection of teacher examples are methods of mediating instruction to maximize student success. The fourth design principle is strategic integration of instructional goals that promote a full understanding of the big idea or concept. Strategic integration is the careful sequencing of instruction that makes connections between new material and previously taught material. For ELs, this involves making explicit the connections between English and the sounds, letters, and words of a student's native language. The fifth instructional design principle is priming background knowledge. Primed background knowledge includes the general knowledge that students must already possess in order to understand and acquire new knowledge. The likelihood of successfully learning new information is highly dependent on what the learners bring to the instructional task (Simmons & Kame‘enui, 1998). This principle is related to the strategic integration principle in that moving students through more difficult tasks requires linking previously taught information and skills with new information and skills. The final design principle is providing judicious review. Judicious review involves reviewing materials sequentially, adequately, and cumulatively. Review includes sufficient variety so that students do not memorize answers but can generalize the information learned to other similar content. Providing judicious review requires that there is enough practice for the learner to become automatic with new skills. Review and practice opportunities should be distributed regularly, be cumulative, and provide enough variation to demonstrate to the learner how the concept or skills are applied to a range of different tasks (Coyne et al., 2011).

Purpose of the Study and Research Questions

Although there is substantial evidence in the literature on what works for students at risk of reading failure, less is known about intervention effectiveness with ELs. Moreover, there are limited programs and materials available to support schools in the inclusion of ELs in reading-reform efforts. The current project was part of a larger research study, Reading Intervention With Spanish-Speaking Students: Maximizing Instructional Effectiveness in English and Spanish, that was funded by the Institute of Education Sciences (IES; Baker, Thompson, & Santoro, 2007).
The study was designed to examine the impact of Systematic and Explicit Teaching Routines (SETR) in Spanish and in English on the reading performance of first- and second-grade Spanish-speaking English learners (ELs) in Oregon and Texas, in schools with early transition or paired bilingual programs. The study was a randomized controlled trial with assignment to condition at the school level; 37 schools in Oregon and Texas participated in the study. The SETR templates provide a framework for effectively delivering explicit instruction in target reading areas and academic language. The SETR are a series of "packaged" teaching templates or lesson cards rather than lesson scripts or detailed lesson plans. The one- to two-page lesson cards have specific, explicit teaching routines that teachers integrate into existing whole-class and small-group instruction. The SETR work across different reading programs and cohesively link critical reading skills within a reading program. These SETR templates are in both Spanish and English and are intended to be used with core reading programs in both languages (see Appendix A for examples). The purpose of the SETR national study was to test the effectiveness of the SETR templates with Spanish core reading programs in transitional bilingual first-grade classrooms and with English core reading programs in second-grade classrooms.

As part of the SETR study, during the last quarter of first grade, students were introduced to a set of transition lessons intended to (a) build student academic language in English and (b) help students make the transition from learning to read in Spanish to learning to read in English. The transition intervention included a set of 12 scripted transition lessons for ELs that explicitly taught the orthographic, lexical, and syntactic differences between Spanish and English. In addition, the lessons addressed the story content knowledge, as well as the vocabulary and academic language necessary to ensure that ELs can access the English literacy curriculum and classroom discourse. The transition lessons were developed using a conceptual framework based on research on effective instruction delineated earlier (Coyne et al., 2011). The transition lessons provided a framework for teachers to do the following: (a) explicitly model the use of learning strategies and new skills, (b) control task difficulty by scaffolding instruction, (c) provide multiple opportunities for students to respond in groups and individually, and (d) provide ongoing corrective feedback. The transition lessons were designed to help ELs learn the necessary academic language that would enable them to focus attention on accurate inferential skills and elicitation of background knowledge by making teacher directions and task explanations more conspicuous.

The purpose of this study was to provide a more in-depth examination of the efficacy of the transition lessons for students performing at the strategic or intensive levels on DIBELS benchmark assessments. The student participants in this dissertation study were first-grade ELs from the treatment schools in the larger SETR study. The student participants were randomly assigned to receive either the transition lessons in the treatment condition or a standard school-based intervention in the control condition. In both conditions, the students received small-group intervention in addition to instruction in their core Spanish reading program with the SETR templates.
The following specific research questions were addressed in this dissertation study:

1. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on word reading and passage reading development as measured by the SAT-10 word reading and sentence reading subtests and the DIBELS NWF and ORF subtests?

2. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on vocabulary development as measured by the Depth of Knowledge (DOK) subtest of the transition assessment?

3. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on overall reading achievement as measured by the transition pre-post assessment?

4. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on vocabulary development and listening comprehension as measured by the GRADE word meaning and listening comprehension subtests and the SAT-10 reading comprehension subtest?

CHAPTER II
LITERATURE REVIEW

In this chapter, I review the literature on the complexity of addressing the needs of English-language learners (ELs) in a school setting. The review focuses on the following topics: (a) ELs and achievement in U.S. schools, (b) instructional models for ELs, (c) developing literacy skills of ELs, and (d) intervention studies for both monolingual and bilingual students.

English-Language Learners in the United States

The number of ELs in public schools continues to increase, and the gap in achievement between White and Hispanic students continues to grow. ELs are the fastest growing population in public schools. According to the Condition of Education Report, between 1972 and 2007, the percentage of public school students who were Hispanic increased from 32% to 44% (Planty et al., 2009). In addition, between 1979 and 2007, the number of school-age children (children ages 5-17) who spoke a language other than English at home increased from 3.8 to 10.8 million, or from 9% to 20% of the population in this age range. Of the school-age children who spoke a language other than English at home, 75% (or 2.1 million) spoke Spanish (Planty et al., 2009). It is estimated that by the year 2030, 40% of the school population will speak English as a second language (U.S. Department of Education & National Institute of Child Health and Human Development, 2003). According to the Oregon Department of Education, the increase in the number of ELs enrolled in public schools is consistent with national trends. The enrollment of EL students in the state of Oregon increased 133% from 1994 to 2002 (Kindler, 2002).

Academic Achievement of English-Language Learners

The National Assessment of Educational Progress (NAEP) is the only nationally administered assessment that measures student achievement in various subject areas, including reading. The NAEP results provide a common metric for states and school districts as well as a general picture of student progress over time (Lee, Grigg, & Donahue, 2007). Descriptive statistics reported in the Reading Report Card (2007) revealed that the percentage of Hispanic students assessed on the fourth- and eighth-grade reading tests increased from 7% in 1992 to 19% in 2007 (Lee et al., 2007). This statistic provides further evidence of the growing population of Hispanic students in public schools.
The three achievement levels, or performance standards, on the NAEP are basic, proficient, and advanced. The basic level denotes partial mastery of the prerequisite knowledge and skills that are required to achieve grade-level proficiency. Results of the 2007 National Assessment of Educational Progress revealed that 73% of English-language learners in fourth grade and 71% in eighth grade scored below the basic level on English reading measures (Lee et al., 2007). Moreover, national statistics suggest that the achievement gap between Whites and Hispanics is not closing. The 2007 White-Hispanic achievement gap was not measurably different from 2005 or 1992. Furthermore, there did not appear to be measurable changes in the eighth-grade White-Hispanic reading achievement gap in 2007 when compared with 1992 or 2005 (Planty et al., 2009). In addition to the rise in the number of ELs in public schools and the evidence documenting widespread underachievement among ELs, the No Child Left Behind Act of 2001 requirement to report achievement by subgroup has forced local and state agencies to examine the achievement rates of ELs. For example, the Oregon Department of Education reported that non-English speakers achieved at lower levels than students overall in 2001 (Kindler, 2002). In addition, English-language learners have the highest dropout rates of all public school students (McCardle, Mele-McCarthy, Cutting, Leos, & D'Emilio, 2005). According to the National Center for Education Statistics (NCES, 2004), over 31% of Latino ELs drop out of high school. In addition, NCES reported that Latinos with limited English proficiency are more likely to drop out of high school than Latino students who are proficient in English (NCES, 2004). Educational agencies at the federal, state, and local levels have made closing the achievement gap between ELs and English-only students a top priority. Thus, quality instruction is essential not only to increase reading performance among ELs but also to close the achievement gap. However, determining the best approach to increase reading performance among ELs remains a heavily debated topic, and evidence supporting a particular model of instruction remains inconclusive (Baker & Baker, 2008).

Instructional Models for English-Language Learners

ELL Instructional Models

According to the Center for Research on Education, Diversity, and Excellence (CREDE, 2003), there are several instructional models for English-language learners that have been studied and implemented in public schools. The various instructional models include (a) Two-way Bilingual Immersion programs that focus on bi-literacy (e.g., 50% of instructional time devoted to English and 50% devoted to Spanish instruction) for ELs and native English speakers; (b) One-way Developmental Bilingual Education programs, with goals similar to those of the Two-way Bilingual Immersion program but designed for language-minority students from one language background who will be instructed in only one language; (c) Transitional Bilingual Education programs, or Early Exit models, that teach English language development through academic programs and native language instruction for at least 2 or 3 years, after which ELs receive all-English instruction; and (d) English Language Development (ELD) or English as a second language (ESL) instructional models that focus only on teaching English to ELL students.
A review of these instructional models suggests that bilingual or native language instruction is incorporated in the majority of existing programs. Cummins‘ Iceberg Hypothesis Educators of English-language learners commonly refer to two types of English language proficiency: Basic Interpersonal Communication Skills (BICS) and Cognitive 15 Academic Language Proficiency (CALP). Cummins (1980) coined these terms and found that while most students learned sufficient English to engage in social communication in about 2 years, they typically needed 5-7 years to acquire the type of language skills needed for successful participation in content classrooms (Cummins, 1979). Limited English proficient students‘ language skills are often informally assessed and rely upon the ability of the student to comprehend and respond to conversational language. However, children who are proficient in social situations may not be proficient or prepared for the academic, context-reduced, and literacy demands of mainstream classrooms (Cummins, 1980). Judging students‘ language proficiency based on oral and/or social language assessments becomes problematic when the students perform well in social conversations but do poorly on academic tasks. The students may be incorrectly identified as having learning deficits or may even be referred for special education evaluation and eligibility under the category of learning disability (Cummins, 1980). The acronyms BICS and CALP tend to be imprecise and misused with English- language learners (Baker, 1993). Cummins (1984) addressed this problem through a theoretical framework that embeds the CALP language proficiency concept within a larger theory of Common Underlying Proficiency (CUP). The three terms are discussed in the next section. Basic Interpersonal Communication Skills (BICS) The commonly used acronym BICS (Basic Interpersonal Communication Skills) describes social, conversational language used exclusively for oral communication. Also 16 described as social language, this type of communication offers many cues to the listener and is considered context-embedded language. Typically, this type of communication, according to Cummins (1980), requires approximately 2 years of study before students from different linguistic backgrounds can readily comprehend context-embedded social language. English-language learners can comprehend social language by (a) observing speakers‘ nonverbal behavior (gestures, facial expressions and eye actions); (b) observing others‘ reactions; (c) using voice cues such as phrasing, intonations, and stress; (d) observing pictures, concrete objects, and other contextual cues that are present; and (e) asking for statements to be repeated, and/or clarified. Cognitive Academic Language Proficiency (CALP) According to Cummins (1980), CALP is the context-reduced language of the academic classroom and, he asserts, takes approximately 5-7 years for English-language learners to become proficient in the language of the classroom. Cummins argues that this amount of time is required because nonverbal clues are typically absent in the academic classroom as there is less face-to-face interaction, and academic language is often abstract. Additionally, literacy demands are high (i.e., narrative and expository text and textbooks are written beyond the language proficiency of the students), and cultural/linguistic knowledge is often needed to comprehend fully (Cummins, 1984). 
Common Underlying Proficiency (CUP)

Cummins' (1981) Common Underlying Proficiency model of bilingualism is generally represented pictorially in the form of two icebergs, as shown in Figure 1. The two icebergs are viewed as separate above the surface; that is, the two languages are visibly different in outward conversation. Underneath the surface, the two icebergs are fused such that the two languages function together and not separately. Both languages operate through the same central processing system. The common underlying proficiency model was formally expressed as the interdependence hypothesis (Cummins, 1980) as follows:

To the extent that instruction in the first language (L1) is effective in promoting proficiency in the second language (L2), transfer of this proficiency to L2 will occur provided there is adequate exposure to L2 (either in school or environment) and adequate motivation to learn L2. (p. 310)

FIGURE 1. Cummins' Iceberg Model.

Thus, according to Cummins (1980), in a Spanish-English bilingual program intended for native speakers of Spanish, Spanish instruction designed to develop Spanish reading and writing skills is not just developing Spanish skills; it is also developing a deeper conceptual and linguistic proficiency that is strongly related to the development of literacy in the majority language (English). In other words, although the surface aspects (e.g., pronunciation, fluency) of different languages are clearly separate, there is an underlying cognitive/academic proficiency that is common across languages. This common underlying proficiency, according to Cummins, makes possible the transfer of cognitive/academic or literacy-related proficiency from one language to another. As empirical evidence for the iceberg model, the interdependence hypothesis points to the consistently significant correlations between L1 and L2 reading abilities. These correlations exist even across quite dissimilar languages and writing systems (e.g., Japanese and English; Cummins et al., 1984), suggesting that the common underlying proficiency is both linguistic and conceptual. Therefore, in the case of cognate languages that are derived from similar source languages (e.g., Greek and Latin in the case of Romance languages), transfer will consist of both linguistic and conceptual elements. However, in the case of dissimilar languages, transfer will consist primarily of conceptual and cognitive elements (e.g., learning strategies). To illustrate, Cummins (2005) offers the word photosynthesis as an example. In languages such as Spanish, French, and English, the term is derived from Greek roots, and a student who knows the term in L1 and understands the concept will be able to transfer both linguistic and conceptual elements from L1 to L2. By contrast, in a situation of very dissimilar languages, only the conceptual elements will transfer. For example, in Japanese the word photosynthesis does not share the same alphabetic system or root word to assist the student in reading the word. However, if students understand the meaning of the word in Japanese, they do not need to relearn the concept of the word; only the linguistic (surface) aspects of how to read the word will need to be learned.
According to Cummins (2005), there are five types of literacy and preliteracy skills that transfer across languages: (a) transfer of conceptual elements (e.g., understanding the concept of photosynthesis); (b) transfer of metacognitive and metalinguistic strategies (e.g., strategies of visualizing, use of graphic organizers, mnemonic devices, vocabulary acquisition strategies, etc.); (c) transfer of pragmatic aspects of language use (e.g., willingness to take risks in communication through L2, ability to use paralinguistic features such as gestures to aid communication); (d) transfer of specific linguistic elements (e.g., knowledge of the meaning of photo in photosynthesis); and (e) transfer of phonological awareness—the knowledge that words are composed of distinct sounds. Therefore, Cummins suggests that it is critical to build a foundation of skills in a students‘ first language to transfer those skills to a second language (Cummins, 2004). Bilingual Education There is considerable controversy among policymakers, researchers, and educators about how best to ensure the reading success of English-language learners. While there are many aspects of instruction that are important in the reading success of ELs, one question has dominated all others: What is the appropriate role of the native language in the actual, day-to-day instruction of English-language learners when teaching them to read in English? 20 In the 1970s and 1980s, policies and practice favored bilingual education in which children were taught partially or entirely in their native language, and then transitioned at some point during the elementary grades to English-only instruction. Such programs are still widespread, but from the 1990s to the present, the ―political tide‖ has turned against all types of bilingual education. For example, California, Arizona, Massachusetts, and other states have enacted policies to greatly curtail bilingual education. Recent federal policies have restricted the amount of time children can be taught in their native language. Among researchers, the debate between advocates of bilingual and English-only reading instruction has been fierce, and ideology has often trumped evidence on both sides of the debate (Hakuta, Butler, & Witt, 2000). Some experts assert that students are best served by receiving instruction in their native language, while others suggest that students should be taught simultaneously in both English and their native language (Greene, 1998; Slavin & Cheung, 2005). Proponents of bilingual instruction argue that while children are learning to speak English, they should be taught to read in their native language first, ostensibly to avoid the failure that is likely if children are asked to learn both oral English and reading in English at the same time. Programs based on this philosophy transition children to English-only instruction when their English is ―sufficient‖ to ensure success, typically in second or third grade. Alternatively, many bilingual programs teach young children to read both in their native language and in English at different times of the day. There is reliable evidence that children‘s reading proficiency in their native language is a strong predictor of their ultimate English reading performance (August & Shanahan, 2006; 21 Garcia, 2000; Lee & Schallert, 1997; Reese, Garnier, Gallimore, & Goldenberg, 2000), and that bilingualism itself does not interfere with performance in either language (Yeung, Marsh, & Suliman, 2000). 
Bilingual advocates also argue that without native language instruction, English-language learners are likely to lose their native language proficiency, or fail to learn to read in their native language, losing skills that are of economic and social, if not generational and cultural, value in the world today. Opponents of bilingual education, on the other hand, argue that native language instruction interferes with or delays English language development and relegates children who receive such instruction to a second-class, separate status within the school and, ultimately, within society. They reason that more time on English reading should translate into more learning (Rossell & Baker, 1996). Many studies have examined the relationship between L1 and L2 development and have suggested that literacy in a student's native language provides a conceptual and skill base that transfers to reading development in a second language, especially in alphabetic writing systems (Cummins, 1979; Thomas & Collier, 2002). In addition, there is evidence to support the claim that a student's reading proficiency in his or her native language is a strong predictor of his or her later reading performance in English (Garcia, 2000; Reese et al., 2000). For example, in 2006, Francis, Lesaux, and August carried out a meta-analysis to evaluate the impact of bilingual education compared with English-only instruction on ELs' reading achievement. They reviewed the most methodologically rigorous studies that had been cited in prior reviews (Greene, 1998; Rossell & Baker, 1996; Slavin & Cheung, 2005; Willig, 1985), as well as additional studies they identified in a new search of the literature. Analyses of the effect sizes from these studies revealed a small but statistically significant advantage regarding the impact of bilingual education on English reading outcome measures for school-age children. Moreover, these researchers did not report any evidence that bilingual instruction hindered ELs' academic achievement in their L1 or in English (L2). Ramirez, Pasta, Yuen, Billings, and Ramey (1991) conducted a study comparing native Spanish-speaking students in two different bilingual program models with students in an English-only program. The students in the bilingual program were either in an early-exit program (e.g., transition to English in Grades 2-4) or a late-exit program (e.g., transition to English in Grades 5-6). In this 4-year longitudinal study, students from four schools were matched based on pretest scores and socioeconomic status. On the English reading posttest, the students in the early-exit bilingual program scored significantly better than the students in the English-only program (Ramirez et al., 1991). Moreover, a study conducted by Thomas and Collier (2002) found that reading proficiency in a student's first language is a strong indicator of reading proficiency in his or her second language. The study focused on the academic outcomes of students in Grades K-12 who participated in either English immersion or bilingual programs in five school districts in Maine, Texas, and Oregon. The bilingual program in the study involved students receiving 90% of instruction in Spanish and 10% in English at the beginning of kindergarten, with English instruction increasing 10% each year.
Thomas and Collier (2002) found that students who participated in a bilingual model where they received instruction in Spanish and English 23 performed at or above grade level and at the 51st percentile on standardized reading tests in Grades 1-5 in English. Students in the English immersion program showed decreases in math and reading achievement (i.e., three quarters of a standard deviation), when compared to EL students participating in a bilingual program. Most recently, Slavin, Madden, Calderon, Chamberlain, and Hennessy (2010) conducted a 5-year longitudinal study in which three successive years of kindergarteners were randomly assigned to bilingual or English-only conditions, and then followed to Grade 4. Early-exit transitional bilingual education (TBE) and structured English immersion (SEI) were compared. According to the authors, this was the first randomized study to compare TBE and SEI reading approaches over a period as long as 5 years. On the Peabody Picture Vocabulary Test (PPVT) and its Spanish equivalent (TVIP) and on English and Spanish versions of three Woodcock Reading Scales, kindergartners and first graders in TBE performed significantly better in Spanish and poorer in English than their SEI counterparts, controlling for PPVT and TVIP. After transitioning to English, TBE children in Grades 2-4 scored significantly lower than those in SEI on the measure of receptive vocabulary, the PPVT, but there were no statistically significant differences on most English reading measures. On the Spanish language (TVIP) and reading measures, TBE students scored significantly higher than SEI in Grades K-3, but not Grade 4. Both groups gained substantially in English receptive language skills over the years. These findings suggest that Spanish-dominant students learn to read in English (as well as Spanish) equally well in TBE and SEI. One conclusion that can be drawn from reviewing the research on bilingual programs compared to English-only programs is that what 24 appears to matter most in the education of English-language learners is the quality of instruction, not the language of instruction (August & Shanahan, 2006; Slavin & Cheung, 2005). Developing Literacy Skills in English-Language Learners A synthesis report from the National Literacy Panel on Language Minority Children and Youth (Francis, Lesaux, et al., 2006) and other research efforts have revealed the following strategies as effective for both monolingual students and English- language learners: (a) explicit instruction in core reading competencies, (b) controlling for task difficulty through systematic scaffolding, (c) teaching students individually or in small groups, (d) modeling, and (e) providing ongoing and systematic feedback (Foorman & Torgesen, 2001; Lyon, Fletcher, Fuchs, & Chhabra, 2006; Swanson, Harris, & Graham, 2003; Swanson, Hoskyn, & Lee, 1999; Vaughn, Gersten, & Chard, 2000). Interventions that emphasize these components are associated with improved outcomes in reading- related language skills, such as phonological awareness, rapid naming, and letter and sound identification, as well as in reading skills involving decoding, fluency, and reading comprehension, with large effects from early interventions in the foundation skills of phonological awareness and the alphabetic principle (Torgesen, 2002). 
Therefore, in addition to the converging research on the benefits of interventions that are based on the principles of explicit and systematic instruction, it is equally important that the focus of the instruction include the key components of literacy.

Key Components of Reading

Preventing early literacy failure and promoting high rates of English learning can be accomplished with English-language learners if instruction is focused on the key components of reading. Consensus reports of research summaries on effective reading instruction and effective practices for teaching students with reading difficulties concur that learning to read requires explicit instruction in components of reading involving decoding words, fluency, vocabulary, and comprehension (National Reading Panel, 2000; Snow et al., 1998). For example, in Preventing Reading Difficulties in Young Children (Snow et al., 1998), the National Research Council reported that for students to become successful in reading, teachers must integrate instruction involving the alphabetic principle, teaching for meaning, and opportunities to read. The following five essential components for learning to read were identified by the National Reading Panel: phonemic awareness, phonics, fluency, vocabulary, and reading comprehension. Research indicates that the core components of reading instruction for English-speaking students (phonemic awareness, phonics, fluency, vocabulary, and reading comprehension) are the same for ELs, whether they are instructed to read in Spanish or English (August & Shanahan, 2006). Shanahan and Beck (2006) indicated that the literacy components and instructional practices that are effective for English-only students are equally effective with ELs. In the early stages of reading instruction, phonemic awareness and phonics appear to be critical because these skills transfer from L1 to L2. In addition, fluency, vocabulary, and reading comprehension skills can be lacking in ELs and appear to be highly significant factors in successful reading outcomes (August & Shanahan, 2006). The research on effective interventions for struggling readers includes interwoven elements such as building skills in the alphabetic principle, from beginning decoding, to regular and irregular word reading, to reading sentences and longer text (short stories), combined with ongoing instruction in vocabulary and comprehension (August & Shanahan, 2006; Gersten et al., 2007; Vaughn, Mathes, Linan-Thompson, & Francis, 2005). Although the research on effective interventions that focus on the key components of literacy with ELs is limited, a few studies suggest that effective interventions for struggling readers are also effective with ELs. For example, Vaughn, Cirino, et al. (2006) conducted a study examining the effects of a reading intervention focused on the key components of literacy, utilizing a pretest-posttest design. Spanish-speaking EL first graders (n = 361) were screened on measures of early reading development, and students who scored below the 25th percentile on a word-reading subtest were selected to receive the treatment intervention.
The 35 students in the treatment condition received a 7-month reading intervention focused on the following six instructional practices: (a) phonemic awareness and decoding, (b) vocabulary development, (c) promotion of English language learning, (d) explicit teaching, (e) interactive teaching that maximized student engagement, and (f) opportunities for student response with teacher feedback (Vaughn, Cirino, et al., 2006). The students in the comparison group received the school‘s standard intervention for first graders, Reading Recovery in English and Other Languages (Clay, 1993). 27 At the end of the 7-month intervention period, students in the treatment condition significantly outperformed the students in the comparison condition on posttest measures of phonological awareness, word attack, word reading, and comprehension skills. The researchers concluded that ELs struggling with reading increased their scores on reading measures when they received systematic and explicit instruction that focused on the key components of literacy. Systematic and Explicit Instruction Research suggests that the architecture of the curriculum matters and certain teaching routines are effective at improving outcomes of at-risk students, including ELs. The research has highlighted explicit instruction in core reading competencies, controlling for task difficulty through systematic scaffolding, teaching students individually or in small groups, modeling, feedback, teaching when and where to apply strategies, ongoing and systematic feedback, and ongoing progress monitoring (Foorman & Torgesen, 2001; Lyon et al., 2006; Swanson et al., 2003; Swanson et al., 1999; Vaughn et al., 2000). Interventions that emphasize these components are associated with improved outcomes in reading-related language skills, such as phonological awareness, rapid naming, and letter and sound identification, as well as in reading skills involving decoding, fluency, and reading comprehension, with large effects from early interventions in the foundation skills of phonological awareness and the alphabetic principle (Torgesen, 2002). The research on the effectiveness of supplemental instruction for Spanish-speaking children is neither robust nor prominent. Emerging research indicates that English- 28 language learners and native English speakers follow similar paths in the development of early literacy skills (Gunn, Smolkowski, Biglan, Black, & Blair, 2005). Moreover, findings indicate that English-language learners can learn phonemic awareness and word identification skills in English at the same rate as native English speakers (Gersten & Geva, 2003). For example, Linan-Thompson and Hickman-Davis (2002) found that low- SES second-grade Spanish-speaking children who received explicit and systematic supplemental reading instruction improved their English reading skills as much as native- English-speaking children. The following is a review of the research on effective interventions that employed systematic and explicit instruction in the key components of reading for monolingual and bilingual students. Interventions With Monolingual Students Torgesen et al. (2001) conducted a longitudinal study that followed a group of children from kindergarten through third grade. These researchers were interested in examining the relation between the intensity of two interventions on later reading achievement. 
Children were selected to participate based upon the following criteria: (a) serious word-learning deficits, (b) standard scores on a word-reading test at least 1.5 SD below average, (c) estimated verbal intelligence above 75, and (d) below-minimum average on a phonological awareness test. The final group of 60 children between the ages of 6-10 was randomly assigned to two instructional programs that incorporated principles of effective instruction but differed in the depth and extent of instruction in phonemic awareness and phonemic decoding skills. All children received 67.5 hours of one-to-one instruction in two 50-minute sessions per day for 8 weeks. The dependent measures included assessments of phonetic decoding, word reading, and phonological awareness (e.g., blending, segmenting and elision). Both instructional programs produced very large improvements in generalized reading skills that were stable over a 2-year follow-up period. The growth during the intervention compared to previous growth in learning-disability resource rooms produced effect sizes of 4.4 for one of the interventions and 3.9 for the other. Importantly, 40% of the children were no longer considered to require special education services. In another study, Simmons et al. (2007) examined the effects of beginning reading interventions on early phonemic, decoding, and spelling outcomes. Ninety-six kindergartners identified as at risk for reading disability participated in the study. Students were randomly assigned to one of the three interventions and received 108 thirty-minute sessions of small-group instruction as a supplement to their core reading instruction in the classroom. The three instructional interventions varied systematically along two dimensions—time and design of instructional specificity: (a) 30 minutes with high design specificity (30/H), (b) 15 minutes with high design specificity plus 15 minutes of non-code-based instruction (15/H+15), and (c) a comparison condition that included the use of a commercially available reading program. The dependent measures utilized in this study included measures of early phonemic, decoding, spelling and vocabulary outcomes. Results indicated that 30 minutes of high design specificity in small-group intervention that focused on phonemic awareness and decoding were comparable to 30 minutes of moderately specified instruction in increasing at-risk kindergarten students' phonemic awareness proficiency in initial sound isolation and phonemic segmentation. Thirty minutes of high design specificity also proved significantly more effective than 30 minutes of moderately specified design of instruction in increasing levels of fluent phonemic decoding, spelling fluency, and automatic retrieval and production of handwritten letters for all at-risk students. Furthermore, a program that employed high design specificity was significantly more effective than a program that employed 30 minutes of a moderately specified design-of-instruction approach in increasing levels of word attack and word identification. Interventions With Bilingual Students There is substantial evidence on instructional approaches to teaching students with reading difficulties, including English-language learners (Cirino et al., 2009). For example, the What Works Clearinghouse (Gersten et al., 2007) reports the findings from four randomized control trial studies conducted with English-language learners.
Two of the four studies were found to have lasting effects on the reading achievement of ELs (Gersten et al., 2007). The common threads in the instructional approaches used in these studies included (a) explicit instruction in core reading competencies, (b) controlling for task difficulty through systematic scaffolding, (c) teaching students individually or in small groups, (d) modeling, (e) feedback, (f) teaching reading strategies, and (g) ongoing progress monitoring (Cirino et al., 2009; Gersten et al., 2007; Linan-Thompson & Ortiz, 31 2009). The following is a review of the research related to the effectiveness of early intervention with Spanish-speaking ELs at risk for reading problems. Chambers, Slavin, Madden, Cheung, and Gifford (2004) conducted a pre-post study investigating the effects of an adaptive version of the Success for All program (Slavin & Madden, 2001) on the reading outcomes of English-language learners in first grade. First-grade ELs (n = 172) were matched based on initial reading level and then randomly assigned to either the treatment or control condition. Those in the treatment condition received the Success for All program (Slavin & Madden, 2001) with embedded video segments. The embedded video included four types: (a) animations to present letter sounds, (b) puppet vignettes to present sound blending, (c) live-action skits to present vocabulary, and (d) a variety of segments from the television program Between the Lions to reinforce various skills. The brief video segments were interspersed in teachers‘ lessons intended to provide direct instruction and clear visual reinforcements of reading skills. Students in the control condition received a different core reading curriculum. The dependent measures were measures of phonetic decoding (word attack), real word reading (word identification), and comprehension. Results indicated that, controlling for PPVT, using Success for All with embedded video scored significantly higher than controls on Woodcock Word Identification (ES = +0.40), Word Attack (ES = +0.36), and Passage Comprehension (ES = +0.21). In another study, Gunn et al. (2000) evaluated the effects of supplemental instruction in reading for students in kindergarten through third grade. ELs and English- speaking students in kindergarten through second grade were screened on measures of 32 early reading skills and oral reading fluency. Students were then randomly assigned to receive or not receive supplemental reading instruction focused on phonological awareness and decoding skills. The daily supplemental reading instruction was delivered for 30-45 minutes in small groups. The lesson content was delivered using a teaching routine that employed modeling new content, providing guided practice, and implementing independent practice. Gunn et al. (2000) found that the intervention had statistically significant effects on reading achievement. The students who received the intervention demonstrated significantly higher scores on measures of reading fluency, letter-word identification and word attack than students who did not receive the intervention. One year later, treatment students outperformed comparison students on word reading, oral reading fluency, vocabulary and comprehension measures (Gunn et al., 2000). In a follow-up study, Gunn et al. (2005) found continued effects in favor of treatment students relative to comparison students. In addition, Gunn et al. 
(2005) found that ELs who did not speak English at the onset of the study profited as much from the interventions as ELs who spoke English at the onset of the study. Denton, Anthony, Parker, and Hasbrouck (2004) examined the effectiveness of two English reading tutoring interventions for Spanish-dominant English-language learners. Students in Grades 2 through 5 were selected to participate based on the following criteria: (a) recommended for tutoring by teacher, (b) enrolled in a bilingual-Spanish program, and (c) adequate oral English proficiency to benefit from tutoring in English. The final group of 93 students was assigned to one of the four conditions: (a) Read Well (Sprick, Howard, & Fidanque, 1998) treatment group; (b) Read Well comparison group; (c) Read Naturally (Ihnot, 1992) treatment group; and (d) Read Naturally comparison group. The Read Well program (Sprick et al., 1998) provided explicit, systematic instruction in English decoding along with sustained practice of skills in decodable text and vocabulary development. A modified version of the Read Naturally program (Ihnot, 1992) provided instruction in fluency, contextualized vocabulary and comprehension strategies. Students in the treatment conditions received tutoring three times per week for 40 minutes over a 10-week period. The dependent measures of phonetic decoding, word attack and reading comprehension were used to compare progress for students in each of the two experimental and two comparison groups. The students in the Read Well treatment group made significant progress in word identification compared to the students in the comparison condition. There were no statistically significant effects for students in the Read Naturally treatment group. From the studies reviewed, it is evident that there are interventions designed for monolingual students that are showing promise for improving reading achievement for ELs. Across all studies reviewed, the interventions utilized in the treatment condition included explicit instruction in phonetic decoding, reading fluency and vocabulary. Results from the studies suggest that this type of instruction leads to improved student outcomes on measures of word reading and word attack. The findings in these studies are consistent with the findings of the National Reading Panel (2000). Foorman and Moats (2004) provide a concise summary of the findings of the NRP, which suggest four key points relevant to early reading instruction. First, explicit instruction in the alphabetic principle in conjunction with reading comprehension is important to effective reading instruction. Second, small-group instruction has been shown to be as effective as one-to-one instruction. Third, the use of skilled paraprofessionals is as effective as the utilization of teachers. Finally, the interventions implemented in later grades are not as effective as those implemented in Grades 1 and 2 (Foorman & Moats, 2004). Reading Reform Efforts and English-Language Learners One of the ways schools are addressing the instructional needs of English-language learners is by including them in comprehensive reading reform efforts. No Child Left Behind (NCLB) and other policy initiatives at federal and state levels require that ELs be full participants in school-wide reform efforts. Increasingly, school reform efforts include a multitiered instructional framework for delivering reading instruction and monitoring student progress.
In this framework, Tier 1 comprises a core reading program and a benchmark assessment system to determine when students are not meeting grade-level benchmarks. Students not meeting benchmarks receive Tier 2 instruction, which is typically provided in small homogeneous groups, in which the teacher utilizes the supplemental materials from the core reading program to reinforce or provide more opportunities for students to reach proficiency on important skills (Vaughn et al., 2005). Unfortunately, in general, core reading programs and supplemental interventions are not designed explicitly to support the reading acquisition of ELs. Intervention programs intended for English-speaking students do not include instruction on the similarities and differences between English and Spanish. Nor do these programs explicitly teach the academic language and literacy content necessary for ELs to participate fully in classroom discourse (Gersten, 1999). To further complicate the problem, there is a lack of instructional programs developed specifically for ELs available to schools. On the other hand, it has been demonstrated that intervention programs that are designed based on the principles of explicit and systematic instruction coupled with English-language development are effective with EL students (Gersten, Santoro, & Jiménez, 2006; Linan-Thompson & Ortiz, 2009; Santoro, Jitendra, Starosta, & Sacks, 2005). Linguistic Transfer Research suggests that instruction focused on supporting linguistic transfer and language development is effective for promoting English language learning for ELs. Most studies of early reading acquisition have focused on the development of word reading and phonological decoding skills, reflecting a widespread assumption that much of the variability in reading comprehension is due to printed word identification and phonological decoding skill (Ehri, 1998; Hoover & Gough, 1990; Perfetti, 1985). There is considerable evidence that phonological processing is one of the major cognitive determinants of the development of word-level reading skills in the early phases of learning to read (Goswami & Bryant, 1990; Share & Stanovich, 1995; Wagner & Torgesen, 1987). Furthermore, relationships have been found between phonological awareness and word reading in a wide variety of alphabetic languages (Jiménez, Gonzalez, & Haro García, 1996; Manis & Freedman, 2001; Wolf, Pfeil, Lotz, & Biddle, 1994). Phonological decoding is influenced to varying degrees by a given language's speech sounds (i.e., phonology) and written symbols (i.e., orthography), as well as how those sounds and symbols are processed cognitively (i.e., sound to symbol). The key phonological and orthographic factors that influence word-level reading and reading fluency in Spanish are the simplicity of sound-to-symbol correspondences and the syllabic structure of the language (Seymour, Aro, & Erskine, 2003). At the phoneme level, Spanish is parsimonious relative to English. In contrast with the approximately 42 phonemes of English (e.g., 15 vowel sounds and 24 consonant sounds), Spanish has approximately 24 (e.g., five distinct vowel sounds, 19 consonant sounds; August & Shanahan, 2006). The relative succinctness of the phonemes in Spanish streamlines the quantity of sounds that the learner must match with letters. With respect to orthographies, both English and Spanish have 26 graphemes.
The orthographies can be analyzed according to their "transparency"—the degree to which they adhere to the alphabetic principle of one-to-one correspondence between sound and grapheme (Seymour et al., 2003). According to this metric, the orthography of Spanish is highly transparent; that is, the 24 phonemes in Spanish can be represented by 26 individual graphemes and three digraphs (e.g., ch, ll, rr). There are few exceptions to the alphabetic principle. For example, in all American dialects of Spanish, /s/ can be spelled with c as in centavo (cent) or s as in sentir (to feel), or the c can make the sound /k/ as in cuaderno (notebook) or /s/ as in conocer (to be acquainted with). In sum, the child learning to read Spanish encounters a highly predictable system for reading and spelling. In contrast to Spanish, English is highly opaque, or inconsistent in its grapheme-to-phoneme correspondences. In many cases, an individual English letter has multiple possible pronunciations and a given sound can be spelled with several different letters. For example, in most dialects of U.S. English, the sound /sh/ can be spelled with sh as in ship, ti as in nation, su as in insurance, ci as in special, ch as in charlatan, and sch as in borscht. With respect to sound-symbol relationships for spelling alone, the approximately 39 phonemes of English can be represented by literally hundreds of graphemes or grapheme combinations. The larger phoneme inventory of English, combined with the extraordinarily high number of options for representing phonemes, complicates the English-language learner's choices for sound-symbol reading and spelling. Cummins (1979) and Cummins et al. (1984) discussed the relationship between L1 abilities and L2 acquisition. Cummins' linguistic interdependence hypothesis suggests that the acquisition of L2 is mediated by the level of L1 competence at the time the child begins to acquire the L2. Transfer would be expected for skills that are thought to be fundamental for reading acquisition in any language, such as phonological awareness and lexical access. Transfer should be enhanced when a child has received some instruction in L1 and has made a transition to L2 reading and language instruction (August, Calderon, & Carlo, 2001). Although the research on transfer of reading-related skills from one language to another has not been extensive, there is growing evidence for cross-language transfer of phonological awareness, single-word reading, and fluency (August et al., 2001; Jiménez et al., 1996). For example, phonological awareness in Spanish or Korean appears to transfer to phonological awareness in English. This awareness can also predict reading and spelling development in both languages, even when the two languages are different from each other (e.g., English-Hebrew, French-English, and Spanish-English). Each language has different phonological characteristics, and ELs often encounter specific difficulties related to their native language (August et al., 2001; Jiménez et al., 1996). The purpose of the transition lessons in this dissertation study was to focus teacher instruction on the early reading skills that allowed ELs to utilize their knowledge of phonology and orthographies across two languages, Spanish and English. The lessons made explicit for students the similarities and differences between linguistic features of English and Spanish.
In addition, the lessons provided ELs with the necessary scaffolding to link what they knew about decoding words in their native language with decoding new words in English. Language Development In addition to problems at the word-reading level, other potential problems that affect students‘ reading comprehension include the inability to activate word meanings. These problems are particularly apparent in ELs (Proctor, Carlo, August, & Snow, 2005). For example, Slavin and Cheung (2005) suggest that English learners need to learn many words to catch up with their native-English-speaking peers‘ word knowledge. As a result of converging evidence that vocabulary instruction is essential for teaching ELs to read, the authors of an IES practice guide on English-language learners recommend that 39 evidence-based vocabulary instruction should be a strong part of reading instruction and an integral part of English language development (Gersten et al., 2007). In addition to explicit vocabulary instruction, the research suggests that instruction for ELs should include a focus on developing academic language. Several researchers argue that knowledge of academic language is key to ELs‘ academic success (August & Shanahan, 2006; Carlo et al., 2004; Gersten et al., 2007). Although academic language has been considered an important factor in students‘ academic success (Francis, Rivera, Lesaux, Kieffer, & Rivera, 2006), it is a complex concept that has been defined and operationalized from a variety of perspectives and for a variety of purposes. According to Baumann and Graves (2010), there are a constellation of terms surrounding academic language. In the literature, academic language has been referred to as general academic vocabulary, academic literacy, academic background, general academic words, domain knowledge, academic competence, linguistic knowledge, domain-specific vocabulary, and content vocabulary (Baumann & Graves, 2010). Furthermore, the definitions of the various terms of academic language are often inconsistent and redundant. In their literature summary, Anstrom et al. (2010) identify three primary challenges in defining academic language. First, varying perspectives on the nature of language and academic language have resulted in multiple systems for understanding the construct. Because researchers from different philosophies and educational backgrounds approach academic language in very different ways, the range of conceptual frameworks and models vary from those with a primarily linguistic focus, to those that emphasize the social context, to those that emphasize use in specific content areas. 40 Second, defining academic language is further complicated by the complex nature of the academic language construct itself. In general, the linguistic elements that comprise the construct include discourse features such as language functions, grammar/structure, and vocabulary across the language modalities (listening, speaking, reading, and writing) and content areas (science, mathematics, language arts, and history/social studies). In addition, the increased complexity of linguistic features and sophistication of language used from year to year as students progress through the grades present unique challenges. Finally, as previously mentioned, the nature of the information that is available varies in kind and completeness. A growing number of definitions and discussions about academic language have appeared in the literature. 
For example, academic literacy is used by several theorists as a broad term that refers to the language used in school to help students acquire and use knowledge (Bailey & Heritage, 2008; Lea & Street, 2006). Proponents of academic literacy suggest that language used in schools is developmental with trajectories of increased sophistication from grade to grade, with specific linguistic details that can be the same or vary across content areas. In other literature, academic English is used and defined as part of overall English language proficiency that also includes more social uses of language both inside and outside the school environment. It is referred to as a variety of English, as a register, or as a style, and is typically used within specific sociocultural academic settings (Bailey & Butler, 2007; Gutierrez, 2008). The term academic language often appears in the literature in discussions of linguistic registers. Scarcella (2008) discusses both the types of language and the types of 41 cognitive knowledge, skills and strategies students must have to perform well in content classes. She describes the foundational knowledge of English and the basic skills in English as important for communication both outside and within the school setting (e.g., knowing how to read and write, how to produce key types of sentences, how to use verb tenses). Scarcella (2008) argues that prerequisite to the teaching and learning of subject- specific language, ELs should have a foundational knowledge of English. In addition, she asserts that basic vocabulary is critical; a large number of commonly known words must be acquired, including academic words, complex sentence structures, and discourse features that provide cohesion (p. 6). Snow and Uccelli (2009) provide a recent inventory of social and academic uses of language that draws on linguistic features already identified in the literature as a starting point. They suggest organizing linguistic features into the following categories: interpersonal stance, information load, organization of information, lexical choices, and representational congruence (i.e., how grammar is used to depict reality) with specific vocabulary and grammar structures necessary to actualize the features. They offer a pragmatic heuristic based on context and social interaction as the core for characterizing academic language that captures the specifics of lexicon, grammar, and discourse features. In contrast to the notion that academic language is a linguistic register, Pilgreen (2007) argued that academic language involves the knowledge of specific words, including the multiple meanings of words within and across content areas. Furthermore, 42 Pilgreen includes terms that teachers use as part of reading instruction or that writers of textbook programs use to describe instructional processes and tasks (p. 241). According to Gersten et al. (2007), academic language is defined as follows: ―Academic English is the language of the classroom, of academic disciplines (science, history, literary analysis) of texts and literature, and of extended, reasoned discourse. It is more abstract and decontextualized than conversational English‖ (p. 24). Given this complexity, it is not surprising that academic language is an evolving construct on which little agreement can be found in the literature. 
Although the research base is lean and inconsistent, there is consensus that students must be able to understand and use language in a variety of situations to be successful in school, though Valdes (2004) indicates that much more work needs to be done by the profession in understanding the kinds of language that will result in school success (p. 102). Furthermore, two prominent documents articulate that curricula designed to promote academic language are lacking; therefore, teachers are left on their own to modify instruction to include this focus (August & Shanahan, 2006; Gersten et al., 2007). Academic Language in the Context of This Dissertation Study Vocabulary instruction, including academic language, is one of the literacy components addressed in the transition lessons. Academic language in the context of the transition lessons was defined as word knowledge deemed necessary for understanding the teacher instruction and student texts throughout the lessons. There were three types of academic language addressed in the transition lessons: (a) instructional terminology, 43 (b) literacy/story content words, and (c) transition words. The first category is instructional terminology, which refers to the instructional language that the teacher uses to teach different skills and strategies in the lessons. Instructional terminology is addressed in the lessons to enable ELs to understand and participate in instruction. For example, words such as sound, blend, and consonant are explicitly taught at the beginning of the phonics lesson. Another type of academic language covered in the lessons was literacy content words. Literacy content words were defined as words that the student encountered in the stories and in the teacher instruction during the read-aloud section of the lessons. For example, ELs learn the meaning of the word title during prereading strategy instruction. Other literacy content words covered in the lessons included author, noun, verb, adjective and question. Finally, transition words and basic vocabulary were explicitly taught in the lessons. The words selected in the lessons were basic vocabulary and transition words that occurred frequently and uniformly across a wide range of reading material. For example, the words first, last, next were repeatedly taught in the lessons as part of the story sequencing activity. The overall purpose of teaching academic language in the transition lessons was to prepare ELs for the early reading instruction and text reading they would encounter in second-grade English classrooms. A more detailed description of the vocabulary and academic language addressed in the transition lessons can be found in Chapter III of this dissertation. 44 CHAPTER III METHODOLOGY The first and second chapters of this dissertation reviewed relevant literature and provided a rationale for the present study, which examines the effects of the transition lessons on the English reading achievement of students in a transitional bilingual program. As described in Chapter I, this dissertation study was part of a larger 4-year randomized control study and serves as the context for this study (Baker et al., 2007). In this chapter, I describe the sampling frame and the procedures used to assign children to treatment and control conditions and teacher participants. In addition, a description of the independent variable (i.e., treatment and control groups) and dependent variable (i.e., measures of reading outcomes) are provided. 
Lastly, the data analysis procedures are described in relation to the proposed research questions. Participants Student Participants Participants were recruited from four school districts from the SETR national study. The principal investigator from the larger study and student researcher presented to school administrators an overview of the transition lesson study, including goals and procedures of the project. Of the eight treatment schools in the four districts, seven schools agreed to participate. The school districts are labeled School District 1, School District 2, School District 3, and School District 4 in order to maintain the anonymity of 45 the participants in this study. The participating elementary schools within the school districts are labeled School A, School B, School C . . . School G. Three schools were from School District 1 (School A, School B, and School C) and two schools were from School District 2 (School D and School E). In addition, one school from School District 3 (School F) and one school (School G) from School District 4 participated in the study. The participating schools shared similar student demographics and qualified for Title I services. Student enrollment across schools ranged from 471-592 with the exception of one school (which had 236 students). All schools shared similar demographics: 22-41% were ELL students and 39-63% of the students qualified for free or reduced lunch, as noted in Table 1. The student participants were identified based on early literacy performance levels. Students performing in the strategic and intensive category on the TABLE 1. Characteristics of School and Student Participants School Enrollment % of ELL % of free or reduced lunch Number of student participants A 432 27 44 9 B 471 24 46 10 C 592 22 47 18 D 555 41 48 7 E 584 34 53 11 F 236 23 39 8 G 481 30 63 18 46 DIBELS Nonsense Word Fluency (NWF) and Oral Reading Fluency (ORF) subtests were deemed eligible to participate in the study. Students performing in the strategic and intensive category had to score less than 49 on the DIBELS NWF subtest and less than 19 on the ORF subtest. A description of the early literacy DIBELS measures is provided in the next section. Following review of the winter DIBELS data in the schools that agreed to participate, 78 students were identified to participate in the study. Teacher Participants The overall number of teachers per participating school ranged from 18-36 with the average years of experience ranging from 4.4-14.7, as noted in Table 2. The number of educational assistants across schools ranged from 5-17. TABLE 2. Characteristics of Teachers by School School Number of teachers Number of educational assistants Average years of experience % of teachers with master‘s degree A 24 9 9.8 62 B 28 12 4.4 57 C 36 17 7.5 63 D 35 14 11.3 54 E 29 5 13.1 44 F 18 9 14.7 56 G 27 11 10.6 74 47 In this study, the teacher participants were certified teachers and educational assistants who were responsible for teaching English-language learners in the participating schools. Of the 14 teacher participants, 11 (or 79%) were educational assistants. In School District 1, the teacher participants (n = 6) were educational assistants, with the exception of one who was a certified kindergarten teacher. The educational assistants were responsible for assisting the classroom teacher with providing small-group instruction in the first-grade transitional bilingual program during the school day and were hired as instructors in the after-school program. 
The certified teacher was a kindergarten teacher during the school day and was hired as an instructor in the after- school program. In School District 2, the teacher participants (n = 4) were educational assistants responsible for assisting the Title I department with providing small-group instruction for first-grade ELs. In School District 3, the teacher participants (n = 2) were certified teachers in the first-grade transitional bilingual program. In School District 4, the teacher participants (n = 2) were educational assistants responsible for providing intervention for ELs as part of the first-grade two-way bilingual program. Instructional Setting In this study, the instructional setting varied across school districts. The intervention was implemented during the after-school program in three of the schools and during school hours in four of the schools. In School District 1 (i.e., School A, School B, and School C), the instruction occurred during the after-school program. Thus, the 48 students were enrolled in the after-school program and the teachers were employed as instructors in the after-school program. In the remaining school districts (i.e., School District 2, School District 3 and School District 4), the instruction occurred during the regular school day. Thus, students were enrolled in the schools (i.e., School D, School E, School F, and School G) as first graders in the transitional bilingual program and teachers were employed as instructors at the school. As mentioned earlier, all participating schools were considered treatment schools in the larger SETR study. Therefore, the instructional setting for the larger study was the general education classroom, specifically during Spanish core reading instruction. In this smaller study, the instructional setting was 30 minutes of additional instruction outside of the regular classroom reading block and was conducted either after school or during school but after the scheduled ―daily‖ reading block. Assignment of Subjects to Condition Once participants meeting the selection criteria were identified, their scores on the fall SAT-10 Word Reading subtest were obtained. The SAT-10 word reading subtest was administered in the fall as part of the larger study. It was determined that the SAT-10 Word Reading subtest would be the most reliable indicator of the initial reading level of the students. Within each school, the students were rank-ordered according to the fall SAT-10 Word Reading subtest score. Adjacent scores were used to form matched pairs, an approach that followed the matched-ability, random assignment procedures discussed in Shadish, Cook, and Campbell (2002). Then a random assignment process was used to 49 assign individual students to either the treatment or control condition. A coin toss procedure was used to randomly assign students to condition. If heads came up on the coin, then the first student in the pair was assigned to the treatment condition and the other student was assigned to the control condition. On the other hand, if tails came up on the coin first, then the first student was assigned to the control condition and the other was assigned to the treatment condition. This process for assigning students was used in an attempt to equalize instructional groups on word-reading performance. Seventy-eight participants were matched and then randomly assigned to condition. Half of the participants were in the treatment group (n = 39) and the other half were in the control group (n = 39). 
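To illustrate the matched-ability random assignment procedure described above, the following is a minimal sketch, not part of the original study materials, of how students in one school could be rank-ordered on the fall SAT-10 Word Reading score, paired on adjacent scores, and assigned by coin toss within each pair. The student identifiers, scores, and random seed shown here are hypothetical and included only to make the logic of the procedure concrete.

import random

# Hypothetical records for one school: (student_id, fall SAT-10 Word Reading score).
students = [
    ("S01", 512), ("S02", 498), ("S03", 530), ("S04", 505),
    ("S05", 521), ("S06", 517), ("S07", 540), ("S08", 526),
]

random.seed(0)  # reproducible illustration only

# Rank-order students by their word-reading score.
ranked = sorted(students, key=lambda s: s[1])

treatment, control = [], []

# Form matched pairs from adjacent scores, then "toss a coin" for each pair:
# heads sends the first member of the pair to treatment, tails to control.
for i in range(0, len(ranked) - 1, 2):
    first, second = ranked[i], ranked[i + 1]
    if random.random() < 0.5:      # heads
        treatment.append(first)
        control.append(second)
    else:                          # tails
        control.append(first)
        treatment.append(second)

print("Treatment:", [sid for sid, _ in treatment])
print("Control:  ", [sid for sid, _ in control])

Pairing adjacent scores before the coin toss is what keeps the two resulting groups balanced on initial word-reading performance; a school with an odd number of eligible students would need an additional rule for the unpaired student, which is not specified in this sketch.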
Participants were from a variety of schools, with many of them coming from School G (n = 16, 20.5%) or School C (n = 16, 20.5%). A large number of the participants were in School District 1 (n = 36, 46.2%). Frequencies and percentages for the school demographics are presented for each group (treatment and control) in Table 3. Independent Variable There was one between-subjects factor with two levels in this study: One level was the implementation approach of the SETR transition lessons (i.e., Treatment Group), and the other level was the implementation of the standard school-based curriculum for English-language learners (i.e., Control Group). Students were randomly assigned to one of the two instructional groups and received instruction for at least 30 minutes per day from March 1, 2010, to June 1, 2010, which amounted to approximately 60 days of 50 instruction for a total of approximately 1,800 minutes or 30 hours of instruction. Instruction was delivered by trained instructional assistants or teachers in small groups ranging from four to seven students in both conditions. The student participants received the 30-minute intervention in addition to Spanish reading instruction in the core reading program and SETR templates, which on average was conducted for 60 minutes each day. The student participants in both the treatment and control conditions received instruction in the SETR templates as part of the core Spanish reading program in the larger study. TABLE 3. Frequencies and Percentages for School Demographics for Each Group (Treatment and Control) School demographics Treatment (n = 39) Control (n = 39) n % N % School A 4 10.3 4 10.3 B 6 15.4 6 15.4 C 8 20.5 8 20.5 D 4 10.3 3 7.7 E 5 12.8 6 15.4 F 4 10.3 4 10.3 G 8 20.5 8 20.5 District 1 18 46.2 18 46.2 3 9 23.1 9 23.1 2 4 10.3 4 10.3 4 8 20.5 8 20.5 51 Transition Lessons Transition Lessons Overview The Transition Lessons were a set of 12 lessons; each of the 12 lessons consisted of a 5-day ―daily lesson‖ plan (60 daily lessons). The daily lesson was designed to be implemented in 30-45 minutes. Each daily lesson was comprised of two sections designed to develop student decoding skills and language proficiency. The decoding section was designed to develop phonemic awareness, letter sound knowledge, word reading, vocabulary words introduced in the decodable books, and sentence reading. The language proficiency section was designed to build student academic language, content vocabulary and comprehension strategies through the use of read-alouds. All lessons were scripted and followed the principles of effective instructional design (Coyne et al., 2011). The lessons included a script for teachers to do the following: (a) explicitly model the use of learning strategies and new skills, (b) control task difficulty by scaffolding instruction, (c) provide multiple opportunities for students to respond in groups and individually, and (d) provide ongoing corrective feedback (see Figure 2). Instructional Scope and Sequence The instructional approach remained consistent throughout and across all 12 lessons. While each lesson template was designed to enhance previously learned skills, each one focused on a different skill or concept. Throughout the lessons, the teacher followed instructional routines that students were familiar with, in a systematic and 52 FIGURE 2. Transition lessons. 53 explicit manner. 
In addition, the explicit instruction was designed to help students learn the content-specific, instructional terminology necessary to fully participate in an English reading lesson. Thus, the teacher (a) introduced the objective; (b) reviewed previously taught skills (familiar elements) so that students were made aware of the similarities between English and Spanish; (c) taught linguistic differences (new elements) between the two languages (e.g., the letter ―z‖ makes a different sound in English compared to Spanish); (d) introduced vocabulary, including academic vocabulary from the decodable and read-aloud stories; (e) explained instructional terms relevant to the reading component addressed in the lesson; and (f) provided guided and independent practice in decoding, reading fluently and developing basic comprehension skills. One purpose of the transition lessons was to focus teacher instruction on the early reading skills that allow ELs to utilize their knowledge of phonology and orthographies across two languages, Spanish and English. The lessons made explicit for students the similarities and differences between linguistic features of English and Spanish (e.g., the letter ―v‖ does not exist in Spanish and is therefore a new sound for Spanish-speaking ELs). In addition, the lessons provided ELs with the necessary scaffolding to understand the instructional terminology relevant to the skills and literacy components covered in the lesson. For example, as part of the phonemic awareness instruction, students need to understand and know the word sound to complete segmenting and blending phoneme activities. Table 4 lists English elements and instructional terminology by literacy components covered in the lessons. 54 TABLE 4. Scope and Sequence for the Literacy Component and English Elements Covered in the Transition Lessons Literacy Component English Element covered in lessons Consonant sounds English consonant sounds with no direct equivalent in Spanish: /d/ - dig /j/ - juice /r/ - rope /v/ - van /z/ - zipper /sh/ - shell /zh/ - treasure /th/ (voiceless) thin final /dg/ - lodge Vowel Sounds English vowel sounds not present in Spanish: Short vowels: /a/- man /e/- pen /i/- tip /u/ -up Long vowels: /a/- ake /u/-ute /i/-ime /e/-meat Instructional Terminology Letters consonants vowels sounds decodable words irregular words fluent accurately Decoding Description The decoding section of the transition lessons began with a phonemic awareness warm-up activity. During the phonemic awareness activity, explicit instruction was 55 provided to help students produce individual speech sounds. The scope and sequence for the phonemic awareness instruction across the 5-day plan was as follows: (a) Day 1, isolating initial sounds; (b) Day 2, segmenting sounds; (c) Day 3, blending sounds; (d) Day 4, manipulating sounds; and (e) Day 5, review. Table 5 provides an example of the phonemic awareness lesson from Lesson 1, Day 1. TABLE 5. Example of Phonemic Awareness (Lesson 1, Day 1) Instructional focus T: Today we are going to learn the sounds of letters in English. Instructional language terms T: Sound is the English word for the Spanish word sonido. What is sonido in English? (make signal) Ss: Sound. Teaching routine T: I will say a word and you are going to tell me the first sound in the word. My turn: Mat. First sound? (wait 1 second then make a signal) /m/ Sat. First sound? (wait 1 second then make a signal) /s/ T: Let‟s do it together. Mat. First sound? T & Ss: /m/ T: Sat. First Sound? 
T & Ss: /s/ Practice T: Your turn. Continue with: Sam, Sal, Al, last, at Students should practice this activity as a whole group. Provide individual turns. Error correction If a student makes an error, stop the student and say, “My turn, the first sound in __________, is “/___/ .” Everybody, what is the first sound in __________? The phonics section continued with activities focused on explicitly teaching the letter names and sounds introduced in the phonemic awareness activity. During the phonics activity, explicit instruction was provided to help students read sound-symbol 56 representations. Letter cards (i.e., laminated cards that are 4.25 inches by 3.67 inches with a single letter of the English alphabetic printed on each card) were included with the teacher materials. The scope and sequence of instruction in the letter and sounds portion of the decoding section was as follows: (a) Day 1, letter names and sounds introduction; (b) Day 2, letter names and sound practice; (c) Day 3, letter and sound dictation; (d) Day 4, Road Race fluency game; and (e) Day 5, fluency practice with letter cards. The ―Road Race‖ fluency game was a template resembling a racetrack with start and stop points, containing the letters and words taught in the lesson. Table 6 provides an example of the phonics section focused on letter names and sounds introduction from Lesson 1, Day 1. The word work portion of the decoding section provided students the opportunity to read and write words containing the letter-sound correspondences taught in the current and previous lessons. In addition, students were taught a select number of sight words in each lesson. In the transition lessons, sight words were considered words that contained irregular letter patterns or included letter patterns that had not yet been introduced in the lessons. The sight words were selected because they were included in the decodable stories in the sentence-reading portion of the lesson. Word cards were included in the teaching materials. The scope and sequence of the word work portion was as follows: (a) Day 1, sounding out and reading decodable and sight words; (b) Day 2, reading decodable and sight words; (c) Day 3, writing decodable and sight words through dictation; (d) Day 4, ―Road Race‖ fluency game with decodable and sight words; and (e) Day 5, fluency 57 TABLE 6. Teacher Script for Phonics Lesson (Lesson 1, Day 1) Instructional Focus T: Now we are going to learn the names of some letters in English. Instructional language terms T: Letter is letra in Spanish. L and M are letters (show students these letter cards). T: Say it with me. T & Ss: L and M are letters. Teaching routine T: I will show you the letter card and then I will ask you to say the name of the letter in English. T: My turn, this is L. What is the name of this letter? T: L T: Let‟s do it together. What is the name of this letter? T & Ss: L T: Your turn. What is the name of this letter? Ss: L Continue with other letters: m, s, t, a Practice T: Let‟s practice together. Show students the cards in random order and ask: What is the name of this letter? Remember Letter is letra in Spanish. T: Now let‟s do individual turns. Error correction If a student makes an error, stop the student and say, “My turn, the name of the letter is ___. Everybody, what is the name of the letter? _____ Instructional focus T: Now we are going to learn the sounds of these letters in English. 
Teaching routine T: I will show the letter card and then I will ask you, “What is the sound?” You will say the sound. T: My turn, the sound of this letter is /m/. What is the sound? /m/ T: Let‟s do it together. What is the sound? T & Ss: /m/ T: Now it‟s your turn. What is the sound? Ss: /m/ Repeat with each letter, except a. T: Did you notice that these letters have the same sound in Spanish and in English? Yes, they do! 58 TABLE 6. (Continued) Practice T: I will show you the letter cards and you will tell me the sound of the letter. Ready? Show students the cards in random order. Provide individual turns. Error correction If a student makes an error, stop the student and say, “My turn, the sound of the letter is ___. Everybody, what is the sound of the letter? You can remind students that the sound is the same as in Spanish. Instructional focus T: Now we are going to learn a new sound in English. Teaching routine Show the card with the letter a. T: My turn, the sound of this letter in English is /a/. What is the sound? /a/ T: Let‟s do it together. What is the sound? T & Ss: /a/ T: Your turn, what is the sound? Ss: /a/ (The sound of this letter is different in Spanish. In Spanish the sound is pronounced /a/ as in “father.” In English, it is pronounced /a/ as in “apple.”) Practice T: Let‟s practice with all the letters we learned today. I will show a letter card and you will say the sound. Show students the cards in random order and ask students, “What is the sound?” Provide individual turns. Error correction If a student makes an error, stop the student and say, “My turn, the sound of the letter is ___. Everybody, what is the sound of the letter? practice with word cards. Table 7 provides an example of the teaching script for word work from Lesson 1, Day 1. 59 TABLE 7. Teaching Script for Word Work (Lesson 1, Day 1) Instructional focus T: Now we are going to use the sounds we just learned to read words. Instructional language terms T: Word is palabra in Spanish. What does Palabra mean in English? Make a signal, then have students respond. Ss: Word. Teaching routine T: We will say each sound, then we will read the word. Watch me as I read this word. My turn. (Model, placing your finger under each letter and say each sound. Sweep your finger under the word and read the word.) T: /m/ /a/ /t/, mat. T: Let‟s do it together. (Point to each letter and say the sound. Sweep your finger under the word and read the word.) T & Ss: /m/ /a/ /t/, mat. T: Your turn. (Point to each letter and say the sound. Sweep your finger under the word and read the word.) Ss: /m/ /a/ /t/, mat. Practice Continue with remaining words Sam, Sal, Al, last, sat, at Provide individual turns. Error correction If students make an error, stop the students and repeat the model, say each sound and then the word together. Then continue with practice. The vocabulary portion of the decoding section introduced the meaning of simple decodable words or Tier 1 words (Beck, McKeown, & Kucan, 2002) that might not be in the student‘s oral expressive vocabulary. Explicit instruction with word and picture cards was provided to help students learn words needed for understanding the decodable stories. The decodable vocabulary words were introduced on Day 1 and practiced throughout the week. Table 8 provides an example of the vocabulary instruction in the phonics section of the lessons. 60 TABLE 8. 
Example of Vocabulary Instruction in Phonics Section (Lesson 1, Day 1) Instructional focus T: We are going to learn the meaning of some of the words we just read. Teaching vocabulary T: (Show a picture of a mat.) This is a mat. A mat is a small rug. (Point to the picture of the mat.) This is last. Last in Spanish is último. (Show a picture of a line of children and point to the last child.) This is the last child in the line. Are you sometimes last in the line? Sat is when somebody sat down in the past. I sit down now, but I sat down yesterday.* (Model for students. * Students will likely not understand the definition, but they will understand the meaning of the word after reading the story several times.) Wrap-up T: Now we are going to read a story that has some of these words. The sentence-reading portion of the decoding section of the lessons gave the student an opportunity to practice reading the sound-symbol correspondences, decodable words, and irregular words in connected text. The sentence-reading activities involved students reading sentences on cards and answering brief comprehension questions after reading. The sentence-reading activities began on Day 2 in the lessons, and continued through Day 5.The structure and explicit directions and feedback provided by the teacher were consistent across all lessons, as noted in Table 9. Read-Aloud Description The read-aloud section was organized by strategies and skills a reader used before, during and after reading a story. During the explicit vocabulary and comprehension instruction, students were taught unknown vocabulary words (i.e., academic language) so they could access the story meaning and actively participate in the reading lesson. 61 TABLE 9. Example of Sentence Reading (Lesson 1, Day 2) Instructional focus T: Let‟s read the sentences with the words we learned. Instructional language terms T: A sentence is oración in Spanish. What is Oración in English? Ss: Sentence. T: Let‟s say the word “sentence” together. T & Ss: Sentence. T: Let‟s say the syllables for sentence. T & Ss: _________ _______ Teaching routine Place sentence strips in front of students: Sam sat. Sal sat. Al sat last. T: My turn, I will read each sentence. Read each sentence. T: Let‟s read together. T & Ss read sentence together. T: Your turn. Ss read sentences chorally. T: I will ask you a question about the sentences. Listen: Who sat last? Students turn to partners to answer question. Students should respond in complete sentences: Al sat last. Error correction If students make an error, stop the students and sound out the word that was incorrect; then read it fast. Return to the beginning of the sentence and read it again. Then continue with practice. Teachers were provided visual aids such as picture and word cards to teach vocabulary. In addition, comprehension cards and scripted comprehension questions including recommended feedback were provided. All lessons in the read-aloud section followed the same format with only the content changing across lessons. Table 10 includes an example of the reading aloud instruction from Lesson 2, Day 3. 62 TABLE 10. Example of Reading Aloud Instructions (Lesson 2, Day 3) Instructional focus T: Today we are going to read another text. It is about masks. Teaching routine introducing the text (title, masks, feel) T: The title is “The Masks.” Show students the passage title. What is the title? Title means título in Spanish. The title tells what we will read about. It tells us that we will learn about masks. 
Show word card for mask. A mask is something you wear over your face so people cannot see who you are (show students pictures or props of masks). Masks make you look like someone or something else. Allow students to respond in a way to develop background knowledge. Ask questions like: When have you seen masks before? Have you ever worn a mask? Model responses as needed. Discuss how different masks show different feelings. Show the word card for feel. Masks show different feelings. We feel happy, sad, mad, or tired. When we feel we show an emotion or mood with our face. Let‟s look at these masks. This mask looks like it feels happy. Show and discuss other feelings. Ask questions like: What does sad look like? What does surprised look like? What does mad look like? Do you ever feel happy, sad, or mad? Vocabulary (glad, author, favorite/least favorite) T: Before we read, we will learn the meaning of some words For each word: (1) Show students the word card. (2) Say each word or phrase. (3) Have students say the word or phrase after you. (4) Say the word’s definition. (5) Have students repeat the definition. (6) Demonstrate the meaning of words with actions, objects or pictures. Have students do the actions after you. T: The word is glad (show word card). What is the word? Glad means happy. What does glad mean? T: The word is author (show word card). What is the word? An author is the person who writes the story. An author is a person who does what? T: The last word is favorite (show word card). What is the word? Favorite is something you like the best. What does favorite mean? Now watch. Here are some pictures of food. The picture of ice cream is my favorite because I like ice cream the best. The picture of spinach is my least favorite (emphasize least). I don‟t like spinach. Model more examples. Here are some color squares. I will show you my favorite and my least favorite. Model example. What is my favorite color? What is my least favorite color? Have students practice using favorite and least favorite for some of the examples. 63 TABLE 10. (Continued) Read Aloud T: I will read the text. When I read, raise your hand when you hear the words: glad, favorite, and least favorite. T: I will read the passage again. You will follow along with your finger. Give students their student booklet. Students open booklet to the story on page 6. Students follow along. Read the text to the students as they follow along. If students are able to read most of the words in the passage, they may read chorally with you. T: While you read, think about the following questions (write questions on the board): T: Does the author have a brother or a sister? T: How many masks does the brother have? T: Which mask is the author‟s favorite mask? T: Which mask is the author‟s least favorite mask? Comprehension Use the text to facilitate discussion. For example, re-read sentences and discuss content from individual sentences. Use “How do you know?” as a follow-up prompt. For example, “Does the author have a brother or a sister? How do you know?” Note. The teaching sequences for the Read Aloud Instruction were designed without reference to appropriate the design-of-instruction guidelines and procedures (Englemann & Carnine, 1982). The read-aloud instruction was centered around a read-aloud story. There were two read-aloud stories per week. Days 1 and 2 were devoted to the first story and Days 3, 4, and 5 were devoted to the second story. 
The stories were developed not only to provide a rich context for developing vocabulary knowledge and academic language but also to provide an opportunity for students to practice reading decodable words. Therefore, the read-aloud text included targeted vocabulary, academic and story content, as well as decodable words containing the spelling patterns taught during the decoding section. In the teacher script, the different types of words (e.g., targeted vocabulary, academic language and decodable words) were identified through italics and bold font. In addition, in the student copy of the text, the decodable words were highlighted so the student was prompted to try 64 reading them on his or her own. Of the read-aloud stories, five were fiction texts and nineteen were nonfiction texts. Appendix A includes detailed lesson maps reflecting the scope, sequence and content of the 12 transition lessons. Standard School-Based Instruction for English-Language Learners Teachers in the control condition implemented the standard school-based instruction for ELs. The type of instruction or program utilized in the control groups varied across school districts. In School District 1 (i.e., School A, School B, and School C), the teachers used a variety of instructional strategies from their Houghton Mifflin (Reading, 2003) core reading curriculum and supplemental materials for English-language learners. The teachers used leveled reading books to build vocabulary, reinforce comprehension strategies, and teach word-attack skills. "Leveled" books are a series of short paperbacks that have been assigned a reading level according to the verbiage on the page and that introduce a number of new words of increasing difficulty with each advancing level. As part of the core reading program, they are intended to be used to reinforce reading skills and strategies during small-group instruction (Fountas & Pinnell, 1996). In School District 2 (i.e., School D, School E), the teachers used the intervention program Fast Track Phonics with the students in the control condition. Fast Track Phonics is a highly visual, activity-based program designed for students who are learning to read English. Each unit contains carefully controlled high-frequency words embedded in the context of simple, decodable sentences, with clear, colorful illustrations to bolster 65 new readers' comprehension and confidence. Instructional features of the program include activities that highlight vowels, blends, digraphs and diphthongs in words and opportunities for students to build fluency with reading words, sentences and decodable text. At School F, the teacher of the control group used the program DISTAR (Adams & Engelmann, 1996). The DISTAR program is a direct instruction reading program that incorporates the following features: engaged time, frequent student response, immediate teacher feedback, and error correction. The DISTAR program provides opportunities for the students to learn letter patterns through word and sentence reading practice. In addition, the program combines oral language development with vocabulary and grammar instruction. At School G, the teacher implemented the Harcourt intervention program (Trophies, 2005) with the students in the control condition. The Harcourt intervention is an instructional guide that supplements the Harcourt core reading program (Trophies, 2005). The guide includes lessons that reinforce content in the areas of phonemic awareness, phonics, fluency, vocabulary and comprehension.
Table 11 provides a summary of literacy components and instructional features for each program. Dependent Variable The dependent variable was the reading achievement of English-language learners in first grade. Student data were collected on the following dimensions of early reading: (a) phonemic decoding and word reading, (b) oral reading fluency, (c) vocabulary 66 development, and (d) comprehension. The measures listed and described below were employed in the study as pretest and posttest measures.

TABLE 11. Literacy Components and Instructional Features of the Standard Intervention Programs Implemented in the Control Condition
Houghton Mifflin supplemental instruction: PA No; Phonics No; Fluency Yes; Vocabulary Yes; Comprehension Yes. Instructional features: leveled readers aligned with core program; instructional strategies from core program.
Fast Track Phonics: PA Yes; Phonics Yes; Fluency Yes; Vocabulary No; Comprehension No. Instructional features: scope and sequence of skills; teacher script; explicit and systematic; decodable readers.
DISTAR reading: PA No; Phonics Yes; Fluency Yes; Vocabulary Yes; Comprehension No. Instructional features: scope and sequence of skills; teacher script; explicit and systematic; decodable readers.
Harcourt intervention: PA No; Phonics Yes; Fluency Yes; Vocabulary Yes; Comprehension Yes. Instructional features: scope and sequence; teacher's manual; aligned with core program; student books aligned with core program.

Nonsense Word Fluency (NWF, DIBELS; Good & Kaminski, 2002) is a one-minute measure of nonsense word reading. Students are presented with a list of randomly ordered vowel-consonant and consonant-vowel-consonant nonsense words (e.g., uk, puj). The words are all decodable, and the students may read the words sound by sound, 67 with partial blends, or as whole words. Two scores are derived from this test: (a) total number of correct letter-sounds produced in one minute (CLS), and (b) total number of words recoded completely and correctly (WRC) in one minute. Students must produce the most common sound for each letter to receive credit. Accurate recoding of nonsense words results in 2 or 3 points for the letter-sounds score (depending on whether the word is a two- or three-letter word). Alternate-form reliability for the NWF subtest ranges from .67 to .88. NWF predictive validity coefficients range from .73 to .91 (Good & Kaminski, 2002). Oral Reading Fluency (DORF, DIBELS; Good, Kaminski, & Dill, 2002) is a General Outcome Measure of students' ability to accurately and fluently read connected text. Students read a passage aloud for one minute, and the score is the number of words read correctly. Omitted or substituted words and words where the student hesitates longer than 3 seconds are scored as errors. If a student self-corrects a word within 3 seconds, the word is scored as correct. The student is given three passages to read, and the final score recorded is the median correct words per minute from the three passages. Alternate-form reliability for administration of a single passage ranges from .89 to .96. Concurrent correlations with the Test of Reading Fluency (1987) range from .91 to .96 across alternate forms of first-grade DORF passages (Good, Simmons, & Kame'enui, 2001). Stanford Achievement Test, Tenth Edition (SAT-10; Harcourt Brace Educational Measurement, 2003) is a group-administered, norm-referenced test of overall reading proficiency. The measure is not timed, although guidelines with flexible time recommendations are given.
The word-reading, sentence-reading subtests of the SAT-10 68 were administered as part of the pre-post intervention and served as a measure of reading achievement in the areas of word reading and reading comprehension. The SAT-10 reading comprehension subtest was administered as part of the posttesting battery of assessments. The internal consistency reliability coefficients for the total reading score were .97 at Grade 1. The correlations between the SAT-10 total reading scale and the Otis-Lennon School Ability Test ranged from .61 to .74. The Group Reading Assessment and Diagnostic Evaluation (GRA+DE; Williams, 2001) is a grouped, administered, norm-referenced test of overall reading achievement. The word meaning and listening comprehension subtests of the GRA+DE were administered as part of posttest data collection only and served as a measure of reading achievement in the areas of vocabulary and listening comprehension. On the word meaning subtest the students were required to silently read a target word and look at a set of four pictures. Students then marked the picture that best defines the meaning of the word. There were 27 items with one raw score point awarded for each correct response. The listening comprehension items required the student to listen to and understand orally presented, connected speech and to choose one of the four pictures that best corresponded to what was read by the teacher. The purpose is to measure receptive comprehension without printed cues. The Level 1 subtest included items that focused on vocabulary, grammar and inference skills. The decision to include the GRA+DE subtests was made by the researcher after the completion of pretest data collection. The total test Alpha and Split-Half reliabilities for the first-grade subtests ranged from .87 to .96. The correlation 69 between the GRA+DE total test standard scores and the California Achievement Test (CAT) was .87. The normative sample is representative of the U.S. student population. Bilingual Verbal Ability Tests (BVAT; Muñoz-Sandoval, Cummins, Alvarado, & Ruef, 1998) measure a child‘s ability to use two languages to negotiate the meaning of academic content. It consists of three subtests from the Woodcock-Johnson Tests of Achievement-Revised (Woodcock & Johnson, 1989): Picture Vocabulary, Oral Vocabulary, and Verbal Analogies. The test yields an English proficiency score and a score that indicates the language skills the child has in his or her first language. The norming sample included 5,602 subjects from over 100 different U.S. communities. Subsets of the norming sample representing populations with low percentages of occurrence in the United States were oversampled. Concurrent validity of the BVAT with the Language Assessment Scales (Duncan & De Avila, 1985) and the Woodcock Muñoz Language Survey Reading-Writing cluster (Woodcock & Muñoz-Sandoval, 1993) in kindergarten was within the range of .6 to .9. The median alternate form reliability observed across 12 grade levels was .84 in a sample of 542 bilingual participants. Transition Lessons Pre-Post Intervention Assessment, an assessment designed by the researchers in the study, was comprised of eight subtests that were highly aligned with the components of the intervention. The transition lesson assessment was used as a secondary assessment designed to capture the content and routines of the transition lessons. It is a researcher-developed assessment that lacks the necessary psychometrics of a standardized outcome measure. 
Two versions of each subtest were developed and administered pre- and postintervention. The transition lesson assessment included the 70 following subtests: (a) word reading fluency-decodable (WR-D), (b) word reading fluency-sight words (WR-S), (c) Tier-one Vocabulary Knowledge (V-Tier 1), (d) Depth of Vocabulary Knowledge (DOK), (e) Comprehension Questions, (f) Story Sequencing, and (g) Grammar Word Sort (W-Sort). A total test score was derived by combining the scores from each subtest. A brief description of each subtest is provided below and a copy of the complete test is included in Appendix B. The word reading fluency-decodable (WR-D) subtest is a list of decodable words randomly selected from the words taught in the transition lessons. Student performance was measured by how many correct words the student could read in 30 seconds. The word reading fluency-sight words (WR-S) subtest is a list of sight words taught in the transition lessons. Student performance was measured by how many words the students could read in 30 seconds. Both measures were designed based on the format of the DIBELS NWF subtests and followed similar administration procedures. The Tier-one vocabulary knowledge (V-Tier 1) subtest was developed to assess children‘s knowledge of words targeted in the transition lessons. The vocabulary knowledge subtest was based on the Peabody Picture Vocabulary Test-Third Edition (PPVT-III; Dunn & Dunn, 1981). The PPVT-III is a commonly used norm-referenced measure of receptive vocabulary in which children choose one of four pictures that corresponds to the target word given orally by the test administrator. The vocabulary knowledge subtest contained a random sample of 10 words (five on pretest and five on posttest) out of 33 targeted in the Tier 1 vocabulary portion of the intervention. The subtest required the student to choose one picture of four that represented the target word 71 provided by the test administrator. The directions for test administration and the format were consistent with the procedures in the PPVT. The test format was as follows: A plate of four pictures was presented to the child and the administrator asked, ―Point to the _____.‖ One point was awarded for each correct response. The Depth of Knowledge (DOK) subtest is a researcher-developed measure that has been used in previous research studies (e.g., Fien et al., in press; Santoro, Baker, Chard, & Howard, 2007; Santoro, Chard, Howard, & Baker, 2008) to assess student knowledge of academic language and content vocabulary. The depth of knowledge subtest in the transition lesson assessment contained 10 words (five on pretest and five on posttest) randomly selected from the 30 vocabulary words targeted in the read-aloud portion of the transition lessons. For each target word, the student was required to define the meaning of the word and to use the word in a sentence. One point was awarded for accurately defining the word and one point was awarded for accurately using the word in a sentence, for a total of two points for each word. Points are awarded if the definition and sentence provided by the child expresses full knowledge of the target word. For example, definition and sentences that include a synonymous word or phrase are considered to be accurate or an indication of knowledge of the target word. A second examiner verified the score using the tool; the overall interrater reliability was .95. 
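As a concrete illustration of the scoring-verification step described above, the sketch below shows one way a simple percent-agreement check between two scorers' item-level DOK scores (0, 1, or 2 points per word) might be computed. The function name and the example scores are hypothetical and are not drawn from the study's materials.

```python
# Hypothetical illustration: percent agreement between two raters' item-level
# scores (e.g., 0-2 points per DOK word). Not the study's actual scoring code.

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters assigned the same score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same set of items.")
    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    return agreements / len(rater_a)

# Example with made-up scores for five DOK words (2 points possible per word).
rater_1 = [2, 1, 0, 2, 1]
rater_2 = [2, 1, 1, 2, 1]
print(percent_agreement(rater_1, rater_2))  # 0.8
```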
The test items on the following two subtests were developed based on read-aloud stories created by the researcher that included academic language and vocabulary words from the transition lessons. The story was read aloud to the student prior to completing each subtest. The comprehension question (Comp) subtest presented open-ended 72 questions pertaining to the details in the story. If the questions were answered correctly, then students received two points. If the open-ended question was not answered or was answered incorrectly, it was followed with a series of three related yes/no questions. Students received one point for answering the yes/no questions correctly. The story sequencing (Sequence) subtest was designed to measure the ability to sequence events in a story after hearing the story read aloud. Students were asked to put sentences from the story in order on the sequencing mat. The examiner provided prompts to students by saying, ―Start with the first thing that happened in the story, then the next two things, then the last thing that happened. If you need help reading the sentences, I can help you.‖ Students were awarded one point for starting the sequence correctly and then one point for each sequence of two sentences and one point for the last sentence with a total of five points possible. The Grammar Word Sort (W-Sort) subtest was designed to assess student knowledge of whether a word presented in the context of a sentence was a noun, verb or adjective. The grammar word sort subtest contained 10 words (five on pretest and five on posttest) that were randomly selected from the words taught in the transition lessons. The test examiner presented a sentence card containing a highlighted word and a corresponding card containing the highlighted word. Then the examiner asked the student to determine whether the word was a noun, adjective or verb and to place the word card on the sorting mat under the correct heading. Students were awarded one point for each correctly sorted word. The grammar word sort subtest contained three practice items, and 73 examiners were instructed to discontinue the task if the student was unable to complete the practice items correctly. Data Collection The screening and pretest measures included the Standard Achievement Test (SAT-10) word reading subtest, the DIBELS NWF and ORF subtests, and the Transition Lessons Pre-Assessment. The SAT-10 was given in September 2009 as part of the larger SETR study. The DIBELS NWF and ORF subtests were given in January 2010 as part of the larger SETR study. The Transition Lesson Assessment Battery was given by trained data collectors in February 2010 prior to the beginning of the transition lesson study. The posttest measures included the Standard Achievement Test (SAT-10), the DIBELS NWF and ORF subtests, the GRA+DE listening comprehension and word meaning subtests and the Transition Lessons Post Assessment. All posttests were administered in June 2010 by a team of trained data collectors. Procedures Teacher Training The teachers were trained in the implementation of the transition lessons prior to the beginning of the project. The focus of the teacher training was in the following areas: (a) understanding the key features and design of the transition lessons, (b) maximizing instructional effectiveness through lesson pacing, and (c) learning about explicit instruction and maximizing student success. 
74 In late February, the teachers identified to implement the transition lessons in the participating schools were assembled for a day of training. The training was held at the administration office of School District A. I developed a presentation that would inform the participants of the purpose and research base for the transition lessons and guide them through the details of implementing the lessons. The content of the presentation included the following topics: (a) overview of lesson materials; (b) instructional routines (e.g., model, lead, test and corrective feedback); (c) teaching vocabulary and academic language; (d) English sound-spelling examples; (e) fidelity of implementation; and (f) lesson content and structure of the phonics and read-aloud sections. The participants had an opportunity to observe a model lesson and practice each of the components of the transition lessons. In addition, all transition lesson binders and materials were distributed at the training. Training of Data Collectors Prior to each data collection, the researcher trained a team of data collectors to administer all pre- and postassessments. Each training session included an overview of each assessment and provided an opportunity for data collectors to practice administering each of the assessments. Data collectors practiced administering each subtest until interrater agreement with the researcher reached at least .90. Interrater agreement was determined by dividing the number of items scored in agreement by the total number of items scored. In addition, the team of data collectors was trained to complete the SAT-10 and GRA+DE fidelity checklists. Critical components of the assessment administration 75 were identified and an observation checklist was developed to evaluate fidelity, as shown in Appendix C. Field reliability was also obtained for each data collector. In pairs, the data collectors completed a fidelity checklist during GRA+DE and SAT-10 test administration. A shadow-scoring procedure was used for field reliability on 20% of the DIBELS, SAT-10, GRA+DE and transition lesson assessment administrations during both data-collection waves. Interrater reliability based on percent agreement was .89 for the GRA+DE, SAT-10 and Transition Assessment measures, and .99 for DIBELS NWF and ORF. After completion of data collection for each wave, the paired data collectors verified scoring. When uncertainties in scoring arose, the scorers worked in pairs, or collectively, to determine the correct score. Then, one data collector collected all completed protocols and entered the data into an electronic spreadsheet. The author checked the accuracy of data entry for each protocol. Item entry was verified for 20% of the protocols (n = 93). Three of the 93 protocols that were verified had a single item-entry error that resulted in a raw score change of one point. A second round of verification was conducted on all data entry, as scores were compared with the database in the larger study by another research assistant associated with the project. Fidelity and Feasibility of Implementation Fidelity of implementation of the program was measured in both the experimental and control groups. Critical components of the intervention were identified 76 and an observation form was developed to evaluate implementation, as shown in Appendix D.
The first half of the observation form included a checklist of specific teacher behaviors such as delivering explicit instruction, opportunities for student practice and providing teacher feedback. A Likert scale was used to rate each teaching behavior observed during the lesson based on the degree of implementation (i.e., consistently, sometimes, rarely, never). The second half of the observation form was used to document the components of literacy (i.e., phonemic awareness, phonics, sentence reading, vocabulary, comprehension) addressed in the lesson. Amount of time (i.e., minutes) devoted to each component was recorded on the form. Detailed notes on instructional practices and activities implemented by the instructor were captured on the observation form. The researcher observed each instructor twice over the course of 12 weeks. The researcher coached teachers and instructional assistants using feedback developed from the observation tool. The coaching sessions focused on reviewing and practicing instructional procedures. In addition, independent student activities were developed by the researcher for teachers to utilize as behavior and on-task instructional tools. At the completion of the study, teachers and instructional assistants in the treatment condition completed a feasibility survey. The teachers and instructional assistants were asked to rate different aspects of implementing the transition lessons. Data were gathered on items focused on implementation ease, structure of the lessons, and alignment with other instructional programs. The survey developed to evaluate feasibility of the transition lessons can be found in Appendix E. Results for the fidelity check and feasibility survey are reported in Chapter IV. 77 Data Analysis Data were analyzed according to procedures that align with the research question to be answered. The following section describes the analysis procedure along with the research question. A gain-score analysis was employed in this study to compare the effectiveness of the two treatment conditions (i.e., transition lessons and standard school-based intervention) on scores obtained from the pretest and posttest measures of reading achievement. First, gain scores were calculated on each of the reading measures for both conditions. Then an analysis of variance (ANOVA) was conducted to test for statistically significant differences between conditions. Analysis of gain scores was used to answer the following three research questions. To determine the effect of the conditions (i.e., transition lessons vs. control condition), the researcher conducted separate ANOVAs to answer the following research questions: 1. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on word-reading and passage- reading development as measured by the SAT-10 word-reading and sentence-reading subtests, DIBELS NWF and ORF subtests? 2. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on vocabulary development as measured by the Depth of Knowledge (DOK) subtest of the transition assessment? 78 3. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on overall reading achievement as measured by the transition pre-post assessment? 
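To make the gain-score procedure described above concrete, the following sketch shows how gains might be computed and submitted to a one-way ANOVA using common Python statistics libraries. The data frame and column names are hypothetical placeholders rather than the study's actual data files.

```python
# Hypothetical sketch of the gain-score analysis: compute posttest - pretest
# gains, then test for a condition effect with a one-way ANOVA.
import pandas as pd
from scipy import stats

# Placeholder data frame; in the study each row would be a student with
# pretest/posttest scores on a given measure and a condition label.
df = pd.DataFrame({
    "condition": ["treatment", "treatment", "control", "control"],
    "pretest":   [40, 35, 47, 44],
    "posttest":  [61, 58, 68, 63],
})

df["gain"] = df["posttest"] - df["pretest"]

treatment_gains = df.loc[df["condition"] == "treatment", "gain"]
control_gains = df.loc[df["condition"] == "control", "gain"]

# One-way ANOVA on gain scores by condition.
f_stat, p_value = stats.f_oneway(treatment_gains, control_gains)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```

With only two conditions, the one-way ANOVA on gain scores is equivalent to an independent-samples t test on the same gains (F = t squared).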
It was hypothesized that the students in the treatment condition (i.e., transition lessons) would show larger gains than the students in the control condition (i.e., standard curriculum) on all outcome measures. In addition, it was hypothesized that the effect of the treatment condition would vary according to the different components of reading (i.e., word reading vs. vocabulary and comprehension). The GRA+DE word meaning and listening subtests were only administered during the posttest data-collection wave. The researchers made the decision to add these assessments to measure vocabulary and comprehension development because there were no standardized, published assessments utilized during the pretest data-collection wave for these reading domains. In addition, the SAT-10 reading comprehension subtest was administered only during the posttest data-collection wave. The following research question was answered by analyzing the data from the GRA+DE and SAT-10 subtests using an analysis of covariance (ANCOVA) to compare group means of the treatment and control conditions. The pretest scores from the Bilingual Verbal Ability Test (BVAT) were used as the covariate. 4. Do Spanish-speaking EL students who receive the transition lessons in the treatment group outperform students in the control group on vocabulary development and 79 listening comprehension as measured by the GRA+DE word meaning and listening comprehension subtests and the SAT-10 reading comprehension subtest? It was hypothesized that the students in the treatment condition (i.e., transition lessons) would outperform the students in the control condition (i.e., standard school- based curriculum) on the GRA+DE and SAT-10 outcome measures. 80 CHAPTER IV RESULTS In Chapter III, participants, independent and dependent variables, and study procedures were described. Methods for data analysis were outlined, including analysis of variance on gain scores and analysis of covariance to compare group means between treatment and control conditions. In this chapter, descriptive statistics are provided for each pretest and posttest measure in the study, and analyses and results are described in the context of study research questions. The primary data for this study included student raw scores on pretest and posttest measures of the DIBELS NWF, ORF and the Transition Assessment. In addition, standard scores on the pretest and posttest measures of SAT-10 word reading and sentence reading were collected and analyzed in this study. Gain scores were calculated for each pretest and posttest measure and used in the analysis to answer the first three research questions. As discussed in Chapter III, data were collected on three outcome measures only during the posttest phase of data collection. Standard scores on the posttest measures of the GRADE word meaning and listening comprehension subtests and SAT-10 reading comprehension measure were included in this study. The Bilingual Verbal Ability Test (BVAT) was administered as part of the pretest wave in the larger SETR study. BVAT (W) scores were reported and used to conduct the analysis. W scores are a conversion of the raw scores using the Rasch ability scale. The W scale has equal-interval measurement 81 characteristics and the interpretation advantages of Rasch-based measurement (Woodcock, 1978, 1982, as cited in Muñoz-Sandoval et al., 1998). 
The posttest standard scores on the GRADE subtest and SAT-10 subtests were used with the BVAT W scores as the covariate in the analysis to answer Research Question 4. In the first part of this chapter, descriptive statistics for the pretest and posttest measures are reported, followed by the calculated gain scores for each measure by condition. Then descriptive statistics are reported for the posttest-only measures by condition. In the next part of this chapter, results for each analysis used to answer the four research questions are reported. In the last part of this chapter, results from the fidelity of implementation observations and feasibility survey are reported. Descriptive Statistics Pretest and Posttest Test Scores Participants completed the DIBELS NWF and ORF subtests, the DOK vocabulary test and SAT-10 word reading and sentence reading subtests before the intervention (pretest) and after the intervention (posttest). From pretest to posttest, treatment and control means for all of the tests increased. However, this is also true for the standard deviations, with the exception of the SAT-10 sentence reading scores and DOK vocabulary scores for the control group. The trend of the data at pretest was that the control group had a lower average score than the treatment group. The exceptions to this were the DIBELS NWF and the DIBELS ORF scores in which the treatment group had a lower mean than the control 82 group. The largest difference between treatment and control groups occurred on the SAT-10 sentence reading measure, in which there was a 12.56 point difference in favor of the control group. The trend of the data at posttest was that the treatment group had a lower average score than the control group. The exceptions to this were the DOK vocabulary and the Transition Assessment scores in which the control group had a lower mean than the treatment group. The largest difference between treatment and control groups occurred in the DIBELS NWF test scores, in which there was a 7.08 point difference in favor of the control group. Measures With Posttest-Only Scores Students also completed the GRADE listening comprehension, GRADE word meaning, and SAT-10 comprehension tests. There was no visible trend for these tests. The means of the tests were all similar, with the largest difference being a 0.82-point difference between treatment and control for the SAT-10 comprehension test in favor of treatment group. Standard deviations were similar as well, with the exception being the GRADE word meaning scores. For this test, the standard deviation was higher for the treatment group than the control group, suggesting more variability in the treatment group scores for the GRADE word meaning test. Students also took the BVAT bilingual verbal abilities test. This test will be used as a covariate for the later analyses when using posttest-only scores. The means for the 83 BVAT bilingual verbal abilities test were very similar for the treatment and control groups. Means and standard deviations for all test scores are presented in Table 12. Preliminary Analyses Preliminary analyses of variance (ANOVA) were conducted to assess if there were differences in the pretest scores by group (control vs. experimental). In the examination of ANOVA assumptions, six Kolmogorov Smirnov tests were conducted to assess the normality of DIBELS NWF and ORF subtests, the DOK vocabulary test, SAT-10 word reading and sentence reading subtests, and the BVAT bilingual verbal abilities test. 
The results of the tests were not statistically significant, verifying the assumption of normality. Six Levene‘s tests were also conducted to assess the equality of variance. The results of the Levene‘s tests were not statistically significant, verifying the assumption of equality of variance. The results of all six ANOVAs were not statistically significant, suggesting that there were no differences in the pretest scores and the BVAT bilingual verbal abilities score by group (treatment vs. control). Results of all six of the ANOVAs are presented in Table 13. Research Question 1 Given the lack of pretest differences across measures and between groups, the gain score analysis by group on the full range of measures was subsequently conducted 84 TABLE 12. Means and Standard Deviations for All Test Scores by Group (Treatment and Control) Treatment (n = 39) Control (n = 39) Measure M SD M SD DIBELS NWF Pretest 40.00 24.71 47.03 25.37 Posttest 61.18 33.87 68.26 35.03 Gain score* 21.18 32.71 21.23 36.66 DIBELS ORF Pretest 13.41 12.71 13.67 12.18 Posttest 36.18 24.23 39.00 22.12 Gain score* 22.77 22.63 25.33 20.49 SAT-10 Word Reading Pretest 425.82 27.66 424.46 28.79 Posttest 489.85 45.83 491.56 41.17 Gain score* 64.03 41.12 67.10 45.84 SAT-10 Sentence Reading Pretest 449.46 31.03 436.90 34.18 Posttest 518.51 39.91 521.72 34.04 Gain score* 69.05 35.19 84.82 45.60 DOK Vocabulary Pretest 3.33 2.51 3.18 2.27 Posttest 5.23 2.76 4.59 2.16 Gain score* 1.90 2.86 1.41 2.41 Transition Assessment Pretest 27.90 14.07 27.13 13.31 Posttest 52.03 18.67 48.74 16.34 Gain score* 24.13 13.84 21.62 12.97 85 TABLE 12. (Continued) Treatment (n = 39) Control (n = 39) Measure M SD M SD GRADE Listening Comprehension** Pretest - - - - Posttest 13.77 2.92 13.72 2.67 Gain score - - - - GRADE Word Meaning** Pretest - - - - Posttest 21.69 5.36 22.31 3.76 Gain score - - - - SAT-10 Comprehension** Pretest - - - - Posttest 509.03 32.17 508.21 34.97 Gain score - - - - BVAT bilingual verbal abilities score 454.44 11.65 454.79 8.75 *To create gain scores the pretest score was subtracted from the posttest score. **Tests only have a posttest score; thus, gain scores were not calculated. RQ1: Are there statistically significant differences in word reading, passage reading, SAT-10 word reading, and SAT-10 sentence reading gain scores by group (treatment vs. control)? In the examination of Research Question 1, four analyses of variance (ANOVA) were conducted to assess differences in word reading, passage reading, SAT-10 word reading, and SAT-10 sentence reading gain scores by group (treatment vs. control). Gain scores were created by finding the difference in scores from pretest to posttest. In the 86 TABLE 13. ANOVAs for Pretest and BVAT Bilingual Verbal Abilities Scores by Group (Treatment vs. 
Control) Source SS MS F (1, 76) p η2 DIBELS NWF Between 962.51 962.51 1.54 .219 .020 Error 47,662.97 627.14 DIBELS ORF Between 1.28 1.28 0.01 .928 .000 Error 1,1778.10 154.98 SAT-10 Word Reading Between 36.01 36.01 0.05 .832 .001 Error 6,0573.44 797.02 SAT-10 Sentence Reading Between 3078.21 3,078.21 2.89 .093 .037 Error 8,0971.28 1,065.41 DOK Vocabulary Between 0.46 0.46 0.08 .777 .001 Error 434.41 5.72 Transition Assessment Between 11.54 11.54 0.06 .805 .001 Error 14,251.95 187.53 BVAT bilingual verbal abilities score Between 2.51 2.51 0.02 .878 .000 Error 8,059.95 106.05 106.05 87 examination of ANOVA assumptions, four Kolmogorov Smirnov tests were conducted to assess the normality of word reading, passage reading, SAT-10 word reading and SAT-10 sentence reading gain scores. The results of the tests were not statistically significant, verifying the assumption of normality. Four Levene‘s tests were also conducted to assess the equality of variance. The results of the Levene‘s tests were not statistically significant, verifying the assumption of equality of variance. Word Reading The result of the ANOVA for word reading gain scores was not statistically significant, F(1, 76) = 0.00, p = .995, suggesting no differences existed in the word reading gain scores by group (treatment vs. control). Results of the ANOVA are presented in Table 14. TABLE 14. ANOVA for Word Reading Gain Scores by Group (Treatment vs. Control) Source SS MS F (1, 76) p η2 Group Between 0.05 0.05 0.00 .995 0.00 Error 91,736.67 1,207.06 Passage Reading The result of the ANOVA for passage reading gain scores was not statistically significant, F(1, 76) = 0.28, p = .601, suggesting no differences existed in the passage 88 reading gain scores by group (treatment vs. control). Results of the ANOVA are presented in Table 15. TABLE 15. ANOVA for Passage Reading Gain Scores by Group (Treatment vs. Control) Source SS MS F (1, 76) p η2 Group Between 128.21 128.21 0.28 .601 0.00 Error 35,419.59 466.05 SAT-10 Word Reading The result of the ANOVA for SAT-10 word reading gain scores was not statistically significant, F(1, 76) = 0.10, p = .756, suggesting no differences existed in the SAT-10 word reading gain scores by group (treatment vs. control). Results of the ANOVA are presented in Table 16. TABLE 16. ANOVA for SAT-10 Word Reading Gain Scores by Group (Treatment vs. Control) Source SS MS F (1, 76) p η2 Group Between 184.62 184.62 0.10 .756 0.00 Error 144,102.56 1,896.09 SAT-10 Sentence Reading The result of the ANOVA for SAT-10 sentence reading gain scores was not statistically significant, F(1, 76) = 2.92, p = .091, suggesting no differences existed in the 89 SAT-10 sentence reading gain scores by group (treatment vs. control). Results of the ANOVA are presented in Table 17. TABLE 17. ANOVA for SAT-10 Sentence Reading Gain Scores by Group (Treatment vs. Control) Source SS MS F (1, 76) p η2 Group Between 4,849.04 4,849.04 2.92 .091 0.04 Error 126,077.64 1,658.92 Research Question 2 RQ2: Are there statistically significant differences in DOK vocabulary scores by group (treatment vs. control)? In the examination of Research Question 2, a univariate analysis of variance (ANOVA) was conducted to determine if there was a difference in DOK vocabulary gain scores by group (treatment vs. control). To test the assumptions of the ANOVA, a Kolmogorov Smirnov test was conducted to assess the normality of DOK vocabulary gain scores. The results of the tests were statistically significant, indicating a violation of the assumption of normality. 
In many cases, the ANOVA is considered a robust statistic in which assumptions can be violated with relatively minor effects (Howell, 2010). A Levene‘s test was also conducted to assess the equality of variance. The results of the Levene‘s test were not statistically significant, verifying the assumption of equality of variance. 90 The result of the ANOVA for DOK vocabulary gain scores was not statistically significant, F(1, 76) = 0.66, p = .419, suggesting that DOK vocabulary gain scores were not different by group (treatment vs. control). Results for the ANOVA are presented in Table 18. TABLE 18. ANOVA for DOK Vocabulary Gain Scores Group (Treatment vs. Control) Source SS MS F (1, 76) p η2 Group Between 4.63 4.63 0.66 .419 0.01 Error 533.03 7.01 Research Question 3 RQ3: Are there statistically significant differences in the overall reading gain scores by group (treatment vs. control)? In the examination of Research Question 3, a univariate analysis of variance (ANOVA) was conducted to determine if there was a difference in overall reading gain scores by group (treatment vs. control). In the examination of the ANOVA assumptions, a Kolmogorov Smirnov test was conducted to assess the normality of overall reading gain scores. The results of the tests were not statistically significant, verifying the assumption of normality. A Levene‘s test was also conducted to assess the equality of variance. The results of the Levene‘s test were not statistically significant, verifying the assumption of equality of variance. 91 The result of the ANOVA for overall reading gain scores was not statistically significant, F(1, 76) = 0.68, p = .411, suggesting that overall reading gain scores were not different by group (treatment vs. control). Results for the ANOVA are presented in Table 19. TABLE 19. ANOVA for Overall Reading Gain Scores Group (Treatment vs. Control) Source SS MS F (1, 76) p η2 Group Between 123.13 123.13 0.68 .411 0.01 Error 13,673.59 179.92 Research Question 4 RQ4: Are there statistically significant differences in GRADE listening comprehension scores, GRADE word meaning scores, and SAT-10 reading comprehension scores by group (treatment vs. control) after controlling for BVAT bilingual verbal abilities score? In the examination of Research Question 4, three analyses of covariance (ANCOVA) were conducted to assess if there were differences in GRADE listening comprehension scores, GRADE word meaning scores, and SAT-10 reading comprehension scores by group (treatment vs. control) after controlling for BVAT bilingual verbal abilities scores. 92 Examining the Assumptions of ANCOVA In the examination of the ANCOVA assumptions, three Kolmogorov Smirnov tests were conducted to assess the normality of GRADE listening comprehension scores, GRADE word meaning scores, and SAT-10 reading comprehension scores. The results of the tests were statistically significant for GRADE listening comprehension scores and GRADE word meaning scores, violating the assumption of normality. In many cases, the ANCOVA is considered a robust statistic in which assumptions can be violated with relatively minor effects (Howell, 2010). Three Levene‘s tests were also conducted to assess the equality of variance. The results of the Levene‘s tests were not statistically significant, verifying the assumption of equality of variance. 
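For readers who want to see the mechanics of these checks, the sketch below illustrates how Kolmogorov-Smirnov and Levene's tests, followed by an ANCOVA with a single covariate, might be run in Python. The variable names and simulated values are hypothetical placeholders; this is not the analysis code used in the study.

```python
# Hypothetical sketch: normality and variance checks, then an ANCOVA with the
# BVAT score as covariate. Column names and simulated data are placeholders.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": ["treatment"] * 39 + ["control"] * 39,
    "bvat": rng.normal(454, 10, 78),          # covariate
    "listening": rng.normal(13.7, 2.8, 78),   # posttest outcome
})

# Kolmogorov-Smirnov test of normality on the standardized outcome.
z = (df["listening"] - df["listening"].mean()) / df["listening"].std()
print(stats.kstest(z, "norm"))

# Levene's test for equality of variance across groups.
t_scores = df.loc[df["group"] == "treatment", "listening"]
c_scores = df.loc[df["group"] == "control", "listening"]
print(stats.levene(t_scores, c_scores))

# ANCOVA: outcome ~ covariate + group, reported as a Type II ANOVA table.
model = smf.ols("listening ~ bvat + C(group)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```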
Correlation analyses were conducted to assess the linear relationship between BVAT bilingual verbal abilities scores and GRADE listening comprehension scores, GRADE word meaning scores, and SAT-10 reading comprehension scores. The results of the correlations were statistically significant for GRADE listening comprehension scores and SAT-10 reading comprehension scores (see Table 20). The correlation was not TABLE 20. Correlations Between BVAT Bilingual Verbal Abilities Score and GRADE Listening Comprehension Score, GRADE Word Meaning Score, and SAT-10 Reading Comprehension Score Measure BVAT bilingual verbal abilities score GRADE listening comprehension score 0.55** GRADE word meaning score 0.21 SAT-10 reading comprehension score 0.25* *p < 0.05. **p < 0.01. 93 statistically significant for GRADE word meaning scores, suggesting no relationship existed between BVAT bilingual verbal abilities scores and GRADE word meaning scores. Thus, the assumption of linearity was violated for GRADE word meaning scores. The assumption of homogeneity of regression slopes was assessed viewing the scatterplots between BVAT scores and GRADE listening comprehension scores, GRADE word meaning scores, and SAT-10 reading comprehension scores with separate regression lines for treatment and control. For the scatterplots for BVAT scores, GRADE listening scores and SAT-10 reading comprehension scores, the separate regression lines were similar, verifying the assumption of homogeneity (see Figures 3 and 4). However, the regression lines for BVAT score and GRADE word meaning scores were different (see Figure 5), violating the assumption of homogeneity. Because of the violations in the assumption of ANCOVA, the decision was made to not include BVAT as a covariate in the analysis of GRADE word meaning because using the covariate would be misleading. Because the different regression lines indicate an interaction between the covariate and group (treatment vs. control), the analysis was excluded (Stevens, 2009). GRADE Listening Comprehension The result of the ANCOVA for GRADE listening comprehension scores was not statistically significant, F(1, 75) = 0.04, p = .845, suggesting that there were not differences in GRADE listening comprehension scores by group (treatment vs. control) after controlling for BVAT bilingual verbal abilities scores. Results of the ANCOVA are presented in Table 21. 94 FIGURE 3. Scatterplot with regression lines (treatment and control) for BVAT Bilingual Verbal Abilities score and GRADE listening comprehension score. GRADE Word Meaning The result of the ANOVA for GRADE word meaning scores was not statistically significant, F(1, 76) = 0.35, p = .559, suggesting that GRADE word meaning scores were not different by group (treatment vs. control). Results of the ANOVA are presented in Table 22. 95 FIGURE 4. Scatterplot with regression lines (treatment and control) for BVAT Bilingual Verbal Abilities score and SAT-10 reading comprehension score. SAT-10 Reading Comprehension The result of the ANCOVA for SAT-10 reading comprehension scores was not statistically significant, F(1, 75) = 0.02, p = .881, suggesting that there were no differences in SAT-10 reading comprehension scores by group (treatment vs. control) 96 FIGURE 5. Scatterplot with regression lines (treatment and control) for BVAT Bilingual Verbal Abilities score and GRADE word meaning score. TABLE 21. 
ANCOVA for GRADE Listening Comprehension Scores by Group After Controlling for BVAT Bilingual Verbal Abilities Score Source SS MS F (1, 75) p η2 BVAT bilingual verbal abilities score (covariate) 177.97 177.97 32.17 .001 0.30 Group 0.21 0.21 0.04 .845 0.00 Error 414.85 5.53 97 after controlling for BVAT bilingual verbal abilities scores. Results of the ANCOVA are presented in Table 23. TABLE 22.ANOVA for GRADE Word Meaning Scores by Group Source SS MS F (1, 76) p η2 Group Between 7.39 7.39 0.35 .599 0.01 Error 1,628.62 21.43 TABLE 23. ANCOVA for SAT-10 Reading Comprehension Scores by Group After Controlling for BVAT Bilingual Verbal Abilities Score Source SS MS F (1, 75) p η2 BVAT bilingual verbal abilities score (covariate) 5,436.00 5,436.00 5.07 .027 0.06 Group 24.25 24.25 0.02 .881 0.00 Error 80,365.33 1,071.54 Fidelity of Implementation Instructional Component Observation data on fidelity of implementation were collected in both the treatment (n = 6) and control condition (n = 6) at two time points during the project. Due to travel constraints and the distance of one of the school districts, observation data were collected from 12 of the 14 instructors in the project. Each instructor was rated on 13 items related to instruction using a 4-point Likert scale. The researcher gave each item a rating of either never, rarely, sometimes or consistently based on the instruction observed 98 during the 30-minute observation. Items on the fidelity of implementation checklist were combined to attain an overall score for the following instructional components: (a) teacher models, (b) group responses, (c) individual responses, (d) teacher feedback, (e) practice, (f) signaling, and (g) brisk pacing. First, descriptive statistics were calculated for each instructional component for both the treatment and control condition. The effect of intervention condition on instructional components was evaluated using independent observation t tests. The results of the independent t tests were not statistically significant, suggesting that fidelity of implementation mean scores were not different by group (treatment vs. control). An examination of the mean scores for each instructional component by group suggests that instructors in both conditions had high levels of fidelity of implementation. On average, each item received a rating of 2.00 (i.e., sometimes) to 3.00 (i.e., consistently) in both conditions. Results are reported in Table 24. TABLE 24. Instructional Components by Condition Instructional Component Treatment Control t-statistic (df = 10) p-value M SD M SD Teacher model 2.58 0.49 2.67 0.52 -0.29 .780 Group responses 2.75 0.42 2.67 0.41 0.35 .734 Individual responses 2.75 0.42 2.75 0.42 0.00 1.000 Feedback 2.58 0.49 2.17 0.41 1.60 .141 Practice 2.58 0.58 2.41 0.66 0.46 .654 Signaling 2.25 0.82 2.17 0.75 0.18 .858 Brisk pacing 2.45 0.66 2.67 0.75 -0.73 .484 Note. 30-minute observations were conducted. Instruction was rated on a 4-point Likert scale where 0 = never, 1 = rarely, 2 = sometimes, and 3 = consistently. M = mean, SD = standard deviation. 99 Literacy Component In addition to time spent observing fidelity of implementation, the amount of time spent on the core reading components (i.e., phonemic awareness, phonics, word work, fluency, vocabulary and comprehension) was captured during the observation. 
Furthermore, time devoted to linking elements of the English instruction to Spanish knowledge was also collected in both conditions—e.g., time in the instruction devoted to linking letter sounds in English to Spanish or linking definitions of words in English to word definitions in Spanish. The effect of intervention condition on the amount of time spent on the core components of reading instructions was evaluated using independent observation t tests. The results indicated that there was a statistically significant difference between the treatment and control condition on amount of time devoted to the different components of reading, including the transition elements. The treatment condition spent significantly more time on phonemic awareness, vocabulary and comprehension instruction than the control condition. Conversely, the control condition spent significantly more time on phonics, word work and sentence reading than the treatment condition. Furthermore, the amount of time spent on transition elements for the treatment condition (M = 2.91, SD = 0.29) and the control condition (M = 1.00, SD = 0.55) were statistically significantly different, t(10) = 7.58, p = .000. The results of the amount of time spent on the core components of reading by condition are summarized in Table 25. 100 TABLE 25. Time Spent on Core Components of Reading by Condition Core components Treatment Control t-statistic (df = 10) p-value M SD M SD Phonemic awareness 2.04 0.55 0.83 0.98 2.99 .013 Phonics 3.94 0.48 9.42 4.80 -2.78 .019 Word work 5.48 0.64 8.28 0.77 -6.86 .000 Sentence reading 2.17 0.26 4.92 1.11 -5.89 .000 Vocabulary 7.13 0.29 1.98 0.69 16.73 .000 Comprehension 8.03 0.27 4.42 1.96 4.48 .001 Transition elements 2.91 0.29 1.00 0.55 7.58 .000 Note. 30-minute observations were conducted. Time was measured in minutes. M = mean, SD = standard deviation. Feasibility Survey At the end of the project, the treatment instructors completed a short survey on the feasibility of implementing the transition lessons. The purpose of the survey was to gather information on the ease of following the teacher script and specific feedback on the different reading components covered in the lessons. Survey results were collected from 100% of the treatment instructors (n = 7). On the survey, four out of the seven questions required the instructor to respond to the question by selecting a rating from the following scale: 1= not at all, 2 = somewhat, 3 = moderately, or 4 = very. On the first question (i.e., How closely did you follow the transition lessons?), the results indicate that on average the instructors selected moderately closely as their answer to the question (M = 3.00, SD = .82). On the second question (i.e., How different is the 101 structure used in the transition lessons from other ELD instruction?), three of the seven instructors rated the transition lessons as not at all different from the instruction they were providing during other parts of the school day (M = 1.86, SD = 1.07). Conversely, one of the seven instructors indicated that the lessons were very different from other instruction during the school day. On the third question (i.e., How likely are you to continue using the lessons after the project is finished?), the results indicate that most of the instructors were moderately likely to continue using the transition lessons after project completion (M =3.43, SD = .53). 
On the fourth question (i.e., How easy are the lessons to implement?), all of the instructors except one (n = 6) responded that the lessons were somewhat easy to implement (M = 3.14, SD = .38). On the last question that utilized the Likert scale (i.e., Was the read aloud helpful for building oral language?), four of the seven instructors (57%) responded by selecting moderately useful (M = 3.43, SD = .53). The remaining three instructors selected very useful. The last two questions of the survey followed a different format. Question 6 asked the instructor to indicate the section of the lessons to which the students respond best (i.e., phonemic awareness, phonics, vocabulary or read-aloud). Four of the seven instructors (57%) indicated the read-aloud section and the remaining three instructors identified vocabulary as the section of the lessons to which students responded best. Question 7 asked which section of the lesson (i.e., phonemic awareness, phonics, vocabulary or read-aloud) would instructors skip due to time constraints. The results 102 indicated that five of the seven instructors (71%) would skip phonemic awareness and identified that students had mastered the task. Finally, the survey included a section for the instructors to provide comments. There were two noteworthy comments gleaned from the surveys. One comment, articulated across several surveys, was that instructors had difficulty completing the lessons in the time allocated for instruction. In addition, several instructors indicated that the script was ―too wordy‖ and reported that they had to make adaptations to complete the lesson in the allotted time. 103 CHAPTER V DISCUSSION In this dissertation study, I investigated the effect of an intervention designed specifically for ELs who are transitioning from learning to read in their native language in first grade to learning to read in English in second grade. ELs who were identified at risk for reading difficulties in English in the winter of first grade were randomly assigned to either the transition lessons intervention (i.e., treatment condition) or the standard school- based intervention (i.e., control condition). Students in the intervention condition received 30 minutes of daily transition lessons for 12 weeks in addition to the instruction they received in the regular classroom during the English or Spanish literacy block. Students in the control condition received the same amount of additional instruction in a different supplemental program typically used for students at risk for reading difficulties. As discussed previously, the purpose of this dissertation study was to assess the effect of the transitions lessons within the larger, national SETR project. A randomized control trial was conducted within the SETR national treatment group to test the efficacy of the transition lessons within the context of the larger, national SETR project. Given that the creation of the transition lessons was an important objective of the larger SETR project and required a substantial amount of time to develop, this dissertation study examined whether these transition lessons would have a significant impact on EL English reading and language proficiency outcomes at the end of first grade. 104 Results from this study indicated that the transition intervention did not appear to be more effective than the standard school-based intervention provided by the schools. 
Although the transition intervention treatment design significantly increased student opportunities to develop their vocabulary and comprehension skill, the increase in instructional time spent on the core components of beginning reading did not appear to have a significant effect on overall student reading outcomes when compared to the effects of an intervention that focuses more on alphabetic principle or decoding. In this chapter, a summary of results is presented and the major findings are discussed in light of current research. In addition, limitations of the study and suggestions for future research are provided. Summary of Results The first three research questions were examined employing a series of analysis of variance statistical tests conducted on student data from assessments that measure word, sentence and passage reading. It was hypothesized that the students in the treatment condition (i.e., transition lessons) would show larger gains than the students in the control condition (i.e., standard curriculum) on all word, sentence and passage reading measures. Therefore, the null hypothesis was that there would be no statistically significant differences in mean gain scores between the treatment and control condition. The results of the analyses indicated that the null hypothesis could not be rejected, because the difference between the treatment and control condition was small and could have been due to chance variability. 105 Although the difference in gain scores between the treatment conditions was not statistically significant, the pretest to posttest growth results suggest that students increased the number of correct letter sounds read per minute by 1.8 correct letter sounds and the number of correct words per minute by two words per week. The Transition Lesson Assessment was designed specifically to be utilized before and after the transition lessons implementation. The subtests in the assessment follow similar routines and include words taught in the transition lessons. The transition assessment is a researcher-developed assessment and lacks the psychometric properties of traditional standardized reading measures. However, it was designed to capture the effect of the instruction in the transition lessons that might not otherwise be captured on other standardized reading measures. Results indicate that the treatment students (i.e., those who received transition lessons) did not perform any differently on this measure than the students in the control condition (i.e., school intervention). The fourth research question was examined by conducting an analysis of covariance on the standard scores obtained from the GRA+DE listening comprehension and SAT-10 reading comprehension subtests, using the BVAT fall scores as the covariate. In addition, an analysis of variance was conducted on the standard scores obtained from the GRA+DE word meaning subtest. It was hypothesized that the students in the treatment condition (i.e., transition lessons) would outperform the students in the comparison condition (i.e., standard curriculum) on the GRA+DE and SAT-10 outcome measures. The results of the analyses indicated the null hypothesis (i.e., no statistically significant differences in mean scores between treatment and control groups on 106 vocabulary and comprehension measures) could not be rejected. The difference in group means is small, not statistically significant and could be due to chance variability. 
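The weekly growth figures cited in this summary follow directly from the mean gains reported in Table 12 if, as assumed here, the gains are averaged over the 12-week intervention period:

\[
\frac{21.2 \text{ CLS}}{12 \text{ weeks}} \approx 1.8 \text{ CLS per week},
\qquad
\frac{(22.8 + 25.3)/2 \text{ wcpm}}{12 \text{ weeks}} \approx 2.0 \text{ wcpm per week},
\]

where 21.2 is the mean NWF gain (nearly identical in both conditions) and 22.8 and 25.3 are the mean ORF gains for the treatment and control conditions, respectively.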
Lack of Statistically Significant Effects

Two main reasons have been identified that potentially explain the absence of statistically significant effects. First, although the treatment and control conditions were designed to be substantially different in instructional architecture, it appears that both the transition lessons and the standard school-based intervention employed a similar architecture grounded in explicit instruction. Second, it appears that instructors in both conditions provided instruction in the core components of beginning reading (e.g., phonological awareness, alphabetic understanding, vocabulary, fluency, and reading comprehension), including instruction in the transition elements.

Similarities in the Type of Instruction Between Conditions

The transition lessons in the treatment condition were designed based on instructional design principles. The transition lessons provided a framework for teachers to do the following: (a) explicitly model the use of learning strategies and new skills, (b) control task difficulty by scaffolding instruction, (c) provide multiple opportunities for students to respond in groups and individually, and (d) provide ongoing corrective feedback. It was hypothesized that if the treatment instructors adhered to the lessons, these elements of the teaching routines would be captured during the observations. In the control condition, the programs implemented were also designed based on similar instructional design features, and in the case of two programs, the instructional architecture was arguably the same. As described in Chapter III, the school-based intervention programs in the control condition were programs the schools had been implementing with at-risk students as part of their tiered instructional models. Two of the four programs, Fast Track Phonics and DISTAR, were research-based programs designed to target deficits in the alphabetic principle and word reading. The remaining two programs were supplemental intervention guides that aligned with the core reading program implemented in the literacy block. Other studies reported in the literature have found that intervention programs based on explicit instruction and designed for monolingual students are effective for ELs. For example, Linan-Thompson and Hickman-Davis (2002) found that low-SES second-grade Spanish-speaking children who received explicit and systematic supplemental reading instruction improved their English reading skills as much as native-English-speaking children. In a study by Gunn et al. (2005), both Hispanic ELs and non-Hispanic native English speakers (ESs) who received supplemental instruction performed better on measures of word attack, word identification, and oral reading fluency, supporting the view that ELs may respond to intervention similarly to their ES counterparts. Further support for the findings of Gunn et al. was provided by Denton et al. (2004), whose study examined the effects of two English literacy interventions on the reading progress of Spanish-speaking bilingual students. Denton et al. hypothesized that supplemental instruction found to be effective for native English readers would likewise benefit children who were learning to read English as their second language.
The results of the study indicated that intervention students outperformed comparison students on English Letter-Word Identification after adjusting for initial performance, F(1, 107) = 9.49, p < .003, with a modest effect size (d = 0.43); intervention students' performance fell in the average range, and comparison students' performance fell in the low-average range. Gains were approximately 1.5 normative standard deviations for intervention students and one standard deviation for comparison students. In the current dissertation study, results on the fidelity of implementation checklist suggest that instructors in both conditions were adhering to the same explicit instructional principles. According to the observations, instructors in both conditions provided teacher models of new material and opportunities for students to respond individually and as a group. In addition, instructors in both conditions followed student mistakes with corrective feedback and practice opportunities. The results on the fidelity observation checklist indicated that these teaching behaviors did not differ statistically between conditions.

Transition Elements

Another important implication of the fidelity observation results was the time devoted to transition elements (i.e., linking English instruction to Spanish knowledge). The transition lessons were specifically designed for students in a transitional bilingual reading program who were learning to read in Spanish in kindergarten and first grade and then transitioning to learning to read in English in second grade. The teacher script in the transition lessons included instructions for linking English phonemes to Spanish phonemes, teaching instructional language with cognates, and introducing or reinforcing academic language and story vocabulary that ELs would encounter in English texts. It was expected that time devoted to teaching these elements would be observed and documented in the treatment condition if the instructors adhered to the teacher script. It was assumed that there would be no need to collect or document time devoted to teaching these transition elements in the control condition, because the intervention programs implemented did not include explicit instruction in transition elements. While the observation data suggest that the treatment condition spent more time on transition elements (M = 2.04 min) than the control condition (M = 1.00 min), instruction on transition elements was observed in both groups. A number of explanations could account for instruction in transition elements in the control condition. First, instructors were native Spanish speakers and were responsible for teaching ELs during other parts of the school day; therefore, one could argue that they understood how to link reading components in Spanish to reading components in English (e.g., the majority of the consonants have the same letter sounds in Spanish and English). Second, many of the instructors had participated in the professional development of the larger SETR study, which focused on providing explicit instruction in early reading skills and on the link between Spanish reading instruction and English reading instruction. Consequently, instructors in the control condition were able, perhaps unwittingly, to deliver instruction that linked new English reading content to Spanish reading content. For example, it was observed that when a student did not understand a new word encountered in the text, the instructor provided the word and its definition in Spanish.
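As an aside on how an observed difference in minutes of this kind could be examined, the sketch below runs a simple two-sample comparison of observed minutes on transition elements by condition. The data values, variable names, and the choice of Welch's t-test are illustrative assumptions; they are not a reproduction of the study's observation records or analyses.

import pandas as pd
from scipy import stats

# Hypothetical observation records: minutes coded as transition elements per observed session.
obs = pd.DataFrame({
    "condition": ["treatment"] * 4 + ["control"] * 4,
    "transition_minutes": [2.5, 1.8, 2.0, 1.9, 1.2, 0.8, 1.1, 0.9],
})

treatment = obs.loc[obs["condition"] == "treatment", "transition_minutes"]
control = obs.loc[obs["condition"] == "control", "transition_minutes"]

# Welch's t-test (unequal variances) on mean observed minutes per session.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"Treatment M = {treatment.mean():.2f} min, Control M = {control.mean():.2f} min")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")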
Therefore, the fidelity of implementation results and further inspection of the interventions used in the control condition suggest that the type of instruction delivered in both conditions was similar and thus yielded similar results.

Differential Time on Core Reading Components

The second major finding was that although the transition intervention significantly increased student opportunities to develop their vocabulary and comprehension skills, this increase in instructional time did not appear to have a significant effect on overall student reading outcomes when compared to the control condition, which focused more on the alphabetic principle and decoding. For example, in this study, results from the fidelity of implementation data on instructional time devoted to the core reading components (i.e., phonemic awareness, phonics, word work, vocabulary, and comprehension) indicated that there was a statistically significant difference between conditions. In the treatment condition, 13 minutes were devoted to teaching a combination of phonemic awareness, phonics, and fluency, while in the control condition, 24 minutes were devoted to those same components. Conversely, in the treatment condition, 18 minutes were devoted to vocabulary and comprehension, while only 8 minutes were devoted to those same components in the control condition. The transition lessons were designed to briefly introduce and reinforce letter-sound correspondences, decoding words, and sentence fluency (10 minutes); the remainder of each lesson was devoted to building vocabulary knowledge, academic language, and comprehension strategies (20 minutes). Further examination of the programs implemented in the control groups and the results of the observations indicated that instructors in the control condition spent more time on decoding, sentence, and passage reading skills than on vocabulary and comprehension. The results of this study suggest that this difference in time devoted to different reading components did not affect reading growth on word-reading measures or student outcomes on vocabulary and comprehension measures. Two potential reasons could account for this nondifferential effect. First, it appears that direct, explicit instruction in phonemic awareness, decoding, and word attack skills results in improved reading growth when using assessments sensitive to capturing growth in these areas. Second, growth in vocabulary and comprehension takes longer to achieve and is more difficult to measure with typical first-grade outcome measures. For example, in the Denton et al. (2004) study, the students in the Read Well program received 10 minutes daily of explicit instruction in English decoding for 11 weeks. The differences in growth in English word reading between the tutored and nontutored students were statistically significant. However, results on comprehension measures in the same study did not yield the same differences between the treatment conditions. The authors concluded that although the students who received the Read Well intervention made gains in decoding, their automaticity and fluency were likely not sufficient to facilitate comprehension. Similarly, in the Gunn et al. (2000) study, Hispanic first-grade students who were tutored in a systematic, explicit phonics program made significant gains in decoding skills after one year of instruction. On the other hand, Gunn et al.
(2005) noted that, for both Hispanic and non-Hispanic students, significant gains in oral reading fluency, vocabulary, and comprehension were not observed until after the second year of the study.

Limitations

Several limitations of this study require elaboration. First, a potential threat to internal validity was treatment diffusion between conditions. As discussed earlier, this study was part of a larger national study designed to examine the effect of systematic teaching routines on student reading outcomes. In this study, the schools, teachers, and students were from the treatment condition of the larger study and, therefore, had access to training and instructional materials utilized in the SETR study. Although the training and instructional materials in the larger study did not include the transition lessons, the SETR templates were ostensibly based on the same theory and design principles. In addition, the programs implemented in the control condition were arguably based on the same instructional design principles (i.e., explicit instruction, student practice, and teacher feedback) and covered the same core reading components (i.e., phonemic awareness, phonics, vocabulary, and comprehension). As reported earlier, the data collected during observations of implementation indicated that the instruction in both conditions was similar. Therefore, the contrast between the instruction in the treatment and control conditions was not as large as expected. In essence, the study might have compared two different versions of the same instruction (i.e., Packaged Program A+ versus Packaged Program A-). Second, the student grouping within conditions potentially affected implementation in this study. According to the literature on intervention research, instruction is most effective when delivered to a homogeneous group of students with the goal of providing targeted instruction to meet individual needs (Briggs, Edmonds, & Twiddy, 2000; Torgesen et al., 2001; Vaughn & Linan-Thompson, 2003). Notably, in a study conducted by Briggs et al. (2000) on the effect of different grouping patterns on student outcomes, results indicated that student engagement was significantly higher when students with similar needs were grouped together for instruction. In this study, student participants were matched based on initial word-reading level and then randomly assigned to either the treatment or control condition for the purpose of equalizing the groups. As a result, the matching and random assignment procedure produced instructional groups of students with varying reading skills; that is, groups that were heterogeneous within their at-risk status. Importantly, data collected during observations suggested that the varying abilities of the students may have adversely affected the instructors' ability to manage student behavior during instruction, including the ability to provide adequate opportunities for students to respond individually. Although the researcher provided the instructors with tools for managing student behaviors, which may have mitigated the problem, it warrants noting as a possible limitation. Third, the results of this study have limited generalizability to other populations. The purpose of this study was to examine the effect of an intervention program designed to improve the reading outcomes of first-grade English-language learners in a bilingual reading program. A convenience sample of English-language learners and teachers from schools with a bilingual reading program was obtained.
Therefore, the results of the study can be generalized only to settings similar to those in this study and cannot be generalized to English-only programs. Furthermore, this study was conducted with Spanish-speaking ELs who were identified as at risk based on measures of early literacy skills; consequently, the results do not directly generalize to typically achieving ELs. Lastly, another limitation of this study is that the ELs in both conditions received the intervention instruction in the context of the larger SETR study. The analyses examined the additional 30 minutes of instruction but did not take into account the instruction delivered during the remainder of the day.

Future Research

This study sought to compare the effect of a transition intervention and a standard school-based intervention on the reading development of ELs in first grade. The results of this study suggest that the transition intervention and the standard school-based intervention had the same effect on the reading outcomes of ELs. Importantly, the results suggest that both conditions were equally effective in accelerating student progress. As previously discussed in Chapter II, a gap in achievement between ELs and White students continues to exist across grade levels in schools (Lee et al., 2007). Therefore, an important question to be answered in future research is, "What else do ELs need to accelerate their reading trajectory?" Further studies should be conducted to examine the effect of the specific transition elements on the reading growth of ELs transitioning from learning to read in Spanish to learning to read in English. In this study, both interventions (treatment and control) were comparable in terms of the teachers' delivery of instruction and the use of research-based programs, with the only intended difference being the explicit instruction in transition elements in the treatment condition. The original design resulted in a tightly controlled study at the outset, but as mentioned previously, there was incidental instruction linking English instruction with Spanish instruction in the control condition because of the context of the transition study within the larger SETR study (i.e., SETR templates and training provided to instructors on linking Spanish and English reading instruction). Future studies should seek to measure the effect of explicit instruction in transition elements such as linguistic transfer, academic language, and instructional terminology in comparison with a control condition in which the transition elements are absent. Importantly, further study should include standardized pretest and posttest measures that capture language and vocabulary development. Additionally, it would be valuable to follow the students longitudinally to determine whether instruction in these transition elements affects reading achievement in later grades.

Summary

Given the increasing number of English-language learners who are part of the educational system in the United States and the limited educational supports available, it is imperative that more research be conducted to promote the educational success of this student population. Studies on effective reading interventions with native English speakers have increased substantially in the past 10 years (Vaughn et al., 2005), while studies on effective reading interventions for ELs remain scarce. The National Literacy Panel (NLP) identified only 17 studies on instructional approaches with ELs, a count that included dissertations and technical reports.
Often, these studies lacked even minimal descriptions or explanations of the common instructional routines, including descriptions of the professional development provided to teachers to make instruction in the literacy components maximally effective for ELs (Shanahan & Beck, 2006; Vaughn, Cirino, et al., 2006). As discussed, the literature indicates that only a limited number of programs targeting the needs of English-language learners are available to schools. In particular, there are too few programs designed explicitly to close the achievement gap between Hispanics and Whites. Interventions are needed that accelerate the growth of ELs so they can catch up to their peers and maintain instructional gains. Moreover, schools are grappling with how to include ELs in a multitiered instructional framework for delivering reading instruction and monitoring student progress. The results of this study contribute to the existing research suggesting that interventions currently on the market for at-risk monolingual students are also effective with English-language learners. There is no need to wait until students have achieved a certain level of language proficiency in English to include them in small-group instruction that targets their specific reading difficulties as identified by formative assessments. Results of national assessments still indicate that ELs are performing substantially lower than non-ELs. Although the field has advanced in the identification of students at risk for reading difficulties as well as the type of instructional support students need, further research on the specific elements that would help ELs accelerate their reading performance is warranted. Likewise, further investigation into whether ELs require a different approach or an intervention program designed specifically for them is recommended. It is clear that the following question needs closer examination: "Do current programs anchored in reading research accomplish the task of accelerating ELs' reading gains?"

APPENDIX A

TRANSITION LESSON MAPS

LESSON 1 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: m, s, l, t, short a Phonemic Awareness: Segmenting: m, s, l, t, short a Phonemic Awareness: Blending: m, s, l, t, short a Phonemic Awareness: Sound Manipulation: m, s, l, t, short a Phonemic Awareness: Review Phonics: Letter names and sounds, introduction m, s, l, t, short a Phonics: Letter names and sounds, practice m, s, l, t, short a Phonics: Letter names and sounds, practice writing m, s, l, t, short a Phonics: Letter names and sounds, practice activity "Road Race" Phonics: Fluency practice with letter cards Word work: reading Sam, Sal, Al, last, mat, sat Word work: reading Sam, Sal, Al, last, mat, sat Word work: spelling Sam, Sal, Al, last, mat, sat Word work: "Road Race" Word work: Fluency Sam, Sal, Al, last, mat, sat Vocabulary: mat, sat, last Vocabulary: mat, sat, last Vocabulary: Review Vocabulary: Review Sentence reading and writing Sentence reading Sam sat. Sal sat. Al sat last. Comprehension question word intro: who Who sat last? Sentence reading Sam sat. Sal sat. Al sat last. Comprehension question word review: who Who sat last? Who sat first? Sentence reading Sam sat. Sal sat. Al sat last. Comprehension question word review: who Who sat last? Who sat first? Sentence reading: Fluency Sam sat. Sal sat. Al sat last.
Read aloud passage: Sam the Rat (fiction) Vocabulary: on, on top of, inside Academic language: Questions with ―Does.‖ Text will be read once and students listen, then teachers and students will read the text together and answer location questions. Read aloud passage: Sam the Rat (fiction) Vocabulary: on, on top of, inside Academic language: Questions with ―Does‖, and with wh (who, what, where), first, last. Text will be read once and students listen, then teachers and students will read the text together and answer questions. Read aloud passage: Pam Cooks Sap (non-fiction) Vocabulary: sap, tap, tack Academic language: title, questions with ―Does‖, and with wh (who, what, where), first, next, then, last. Text will be read twice. Students listen the first time and follow along the second time. Read aloud passage: Pam Cooks Sap (non- fiction) Vocabulary review: sap, tap, tack Academic language: title, questions with ―Does‖, and with wh (who, what, where), first, next, then, last. Text is read once. Students follow along or read with teacher. Read aloud passage: Pam Cooks Sap (non- fiction) Vocabulary review sap, tap, tack Academic language review: title, questions with wh (who, what, where), first, then, last, sequence of events. Text is read once. Students follow along or read with teacher. 120 LESSON 2 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a New: n, v, p Phonemic Awareness: Segmenting: Review: m, s, l, t, a New: n, v, p Phonemic Awareness: Blending: Review: m, s, l, t, a New: n, v, p Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, s, a New: n, v, p Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: n, v, p Phonics: Letter names and sounds, practice Review: m, s, l, t, a New: n, v, p Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a New: n, v, p Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading van, man, lap, map, Pam Word work: reading Regular: van, man, lap, map, Pam Sight: the, is Word work: spelling van, map, Pam, sat Sight: the, is Word work: ―Road Race‖ Word work: Fluency Sam, last, mat, sat, van, man, lap, map, Pam Sight: the, is Vocabulary: van, lap, man Vocabulary: van, lap, man, on Vocabulary Review: on, van Vocabulary Review van, lap, man, on Sentence reading and writing Sentence reading Sam is a man. Sam sat in the van. The map is on Sam‘s lap. Comprehension question word intro: who, what, where Sentence reading Sam is a man. Sam sat in the van. The map is on Sam‘s lap. Comprehension question word review: who, what, where Sentence reading Sam sat in the van. The map is on Sam‘s lap. Comprehension question word review: who, what, where Sentence reading: Fluency Sam is a man. Sam sat in the van. The map is on Sam‘s lap. 
Comprehension question word review: who, what, where Read aloud passage: A Kid and his Friends (fiction) Vocabulary: kid, friends, hill Academic language: Questions with ―Does,‖ title, and, with, wh (who, what, where), first, next, then, last, sequence of events, ―How do you know?,‖ because Read aloud passage: A Kid and His Friends (fiction) Vocabulary: kid, friends, hill, on top of Academic language: Questions with ―Does‖, and with wh (who, what, where), first, next, then, last, sequence of events, ―How do you know?,‖ because Read aloud passage: The Masks Vocabulary: mask, feel, glad, favorite/least favorite Academic language: author, which, how many, review Read aloud passage: The Masks Vocabulary: mask, feel, glad, favorite/least favorite Academic language: author, which, how many, review Read aloud passage: The Masks Vocabulary: mask, feel, glad, favorite/least favorite Academic language: author, which, how many, review, 121 LESSON 3 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p New: f, c /k/, z, k, d, short i Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p New: f, c /k/, z, k, d, short i Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p New: f, c /k/, z, k, d, short i Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p New: f, c /k/, z, k, d, short i Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: f, c /k/, z, k, d, short i Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p New: f, c /k/, z, k, d, short i Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p New: f, c /k/, z, k, d, short i Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: can, fit, tip, zip, kid, sit, did Word work: reading Regular: can, fit, tip, zip, kid, sit, did Sight: are, no Word work: spelling can, fit, tip, zip, kid, sit, did Sight: are, no Word work: ―Road Race‖ Word work: Fluency can, fit, tip, zip, kid, sit, did Sight: are, no Vocabulary: kid, zip, tip, fit Vocabulary: kid, zip, tip, fit Vocabulary Review: kid, zip, tip, fit Vocabulary Review kid, zip, tip, fit Sentence reading and writing Sentence reading The kid is fit. Dan and the kid can zip and tip. Dan and the kid did sit. Comprehension question word intro: who, what, where Sentence reading. The kid is fit. Dan and the kid can zip and tip. Dan and the kid did sit. Comprehension question word review: who, what, where Sentence reading The kid is fit. Dan and the kid can zip and tip. Dan and the kid did sit. Comprehension question word review: who, what, where Sentence reading: Fluency The kid is fit. Dan and the kid can zip and tip. Dan and the kid did sit. 
Comprehension question word review: who, what, where Read aloud passage: Fins (non-fiction) Vocabulary: fins, ocean, zip, thin, above Academic Language: information, question, mostly, topic, learning Read aloud passage: Fins (non- fiction) Vocabulary: Fins, ocean, zip, thin, above Academic Language: information, question, mostly, topic, learning Read aloud passage: Crabs (non-fiction) Vocabulary: crabs, jab, protect Academic Language: information, question, mostly, learning, wh questions (what), title Read aloud passage: Crabs (non-fiction) Vocabulary: crabs, jab, protect Academic Language: information, question, mostly, learning, wh questions (what), title Read aloud passage: Crabs (non-fiction) Vocabulary: crabs, jab, protect Academic Language: information, question, mostly, learning, wh questions (what), title 122 LESSON 4 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c /k/, z, d, i New: b,g, short o Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c /k/, z, d, i New: b, g, short o Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c /k/, z, d, i New: b, g, short o Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c /k/, z, d, i New: b, g, short o Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: b, g, short o Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p, f, k /k/, z, d, i New: b, g, short o Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p, f, k /k/, z, d, i New: b, g, short o Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: big, top, bag, log, bat, fog, mop Word work: reading Regular: big, top, bag, log, bat, fog, mop Sight: I, am and, a, have Word work: reading Regular: big, top, bag, log, bat, fog, mop Sight: I, am and, a, have Word work: ―Road Race‖ Word work: Fluency big, top, bag, log, bat, fog, mop Sight: I, am and, a, have Vocabulary: big, bag, top Vocabulary: big, bag, top Vocabulary Review: big, bag, top Vocabulary Review: big, bag, top Sentence reading and writing Sentence reading I am Dan. I have a top and a bat. I have a big bag. The top and the bat are in the bag. Sentence reading I have a top and a bat. Sentence reading I have a big bag. The top and the bat are in the bag. Sentence reading: Fluency I have a top and a bat. I have a big bag. The top and the bat are in the bag. 
Read aloud passage: Fog (non-fiction) Vocabulary words: fog, forms, drops, air, cloud, ground Read aloud passage: Fog (non- fiction) Vocabulary words: fog, forms, drops, air, cloud, ground Read aloud passage: Hogs (non-fiction) Vocabulary words: friendly, mammal, animals, clean, make, Read aloud passage: Hogs (non-fiction) Vocabulary words: friendly, mammal, animals, clean, make Read aloud passage: Hogs (non-fiction) Vocabulary words: fog, forms, drops, air, cloud, ground, friendly, mammal, animals, clean, make, Academic Language: Questions with ―Does‖ and ―wh,‖ ―How do you know?‖ Because, different, Introduce nouns Academic Language: Questions with ―Does‖ and ―wh,‖ ―How do you know?‖ Because, different, review nouns Academic Language: information, question, mostly, learning, wh questions (what), title, Different, review nouns Academic Language: information, question, mostly, learning, wh questions (what), title, different, review nouns Academic Language: information, question, mostly, learning, wh questions (what), title, different, review nouns 123 LESSON 5 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p, f, c, k, z, d, i, b, g, o New: x, c /s/, short e Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p, f, c, k, z, d, i, b, g, o New: x, c /s/, short e Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p, f, c, k, z, d, i, b, g, o New: x, c /s/, short e Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p, f, c, k, z, d, i, b, g, o New: x, c /s/, short e Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: x, c /s/, short e, ll Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p, f, c k, d, i, b, g, o New: x, c /s/, short e, ll Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p, f, c k, d, i, b, g, o New: x, c /s/, short e, ll Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: fox, den, box, ten, cent, bell, tell, vet, cell Word work: reading Regular: fox, den, box, ten, cent, bell, tell, vet Sight: has, he, with Word work: spelling fox, den, box, ten, cent, vet, cell Sight: has, he, with Word work: ―Road Race‖ Word work: Fluency Regular: fox, den, box, ten, cent, vet, bell, tell, cell Sight: has, he, with Vocabulary: bell, den, vet, tell, cent, cell Vocabulary: bell, den, vet, tell, cent, cell Vocabulary Review: bell, den, vet, tell, cent, cell Vocabulary Review: bell, den, vet, tell, cent, cell Sentence reading and writing Sentence reading Max is a fox. Max is in the den. He has a box with ten cents, and a cell. Sentence reading Max is in the den. He has a box with ten cents, and a cell. Sentence reading Max is a fox. He has a box with ten cents, and a cell. Sentence reading: Fluency Review Max is a fox. Max is in the den. He has a box with ten cents, and a cell. 
Read aloud passage: Bells (non-fiction) Vocabulary words: ship, ring, men, women, quickly Read aloud passage: Bells (non-fiction) Vocabulary words: ship, ring, men, women, quickly Read aloud passage: A Trip to the Vet (non-fiction) Vocabulary words: Vet, checked, shot, note, pad, drugstore Read aloud passage: A Trip to the Vet (non-fiction) Vocabulary words: Vet, checked, shot, note, pad, drugstore Read aloud passage: A Trip to the Vet (non- fiction) Vocabulary words: Vet, checked, shot, note, pad, drugstore Academic Language: title, information, question, mostly, topic, learning, because, Introduce adjectives and review nouns Academic Language: title, information, question, mostly, topic, learning, because Introduce adjectives and review nouns Academic Language: what happens, after, questions with ―Does‖ and ―wh,‖ title, author, because, and, first, then, last, sequence of events, ―How do you know?‖ Introduce adjectives and review nouns Academic Language: what happens, after, questions with ―Does‖ and ―wh,‖ title, author, because, and, first, then, last, sequence of events, ―How do you know?‖ Introduce adjectives and review nouns Academic Language: what happens, after, questions with ―Does‖ and ―wh,‖ title, author, because, and, first, then, last, sequence of events, ―How do you know?‖ Introduce adjectives and review nouns 124 LESSON 6 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll New: r, h Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll New: r, h Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll New: r, h Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll l New: r, h Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: r, h Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll New: r, h Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll New: r, h Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: rat, hen, hot, hat, pen, red Word work: reading Regular: rat, hen, hot, hat, pen, red Sight: she, but Word work: spelling rat, hen, hot, hat, pen, red Sight: she, but Word work: ―Road Race‖ Word work: Fluency rat, hen, hot, hat, pen, red Sight: she, but Vocabulary: hen, hot, hat, pen Vocabulary: hen, hot, hat, pen Vocabulary Review: hen, hot, hat, pen Vocabulary Review: hen, hot, hat, pen Sentence reading and writing Sentence reading The hen sits in a pen. It is hot, but she is not mad. She is glad. She has a red hat! Sentence reading It is hot, but she is not mad. She is glad. Sentence reading The hen sits in a pen. It is hot, but she is not mad. Sentence reading: Fluency Review The hen sits in a pen. It is hot, but she is not mad. She is glad. She has a red hat! 
Read aloud passage: A Hen (non-fiction) Vocabulary words: nest, rest, wings, feathers, beak, peck, neck Read aloud passage: A Hen (non-fiction) Vocabulary words: nest, rest, wings, feathers, beak, peck, neck Read aloud passage: The Hen, the Cat, and the Rat (fiction) Vocabulary words: around, heavy, stuck, got lost, celebrated Read aloud passage: The Hen, the Cat, and the Rat (fiction) Vocabulary words: around, heavy, stuck, got lost, celebrated Read aloud passage: The Hen, the Cat, and the Rat (fiction) Vocabulary words: around, heavy, stuck, got lost, celebrated Academic Language: title, information, question, mostly, topic, learning, Introduce verbs, review nouns and adjectives Academic Language: title, information, question, mostly, topic, learning Introduce verbs, review nouns and adjectives Academic Language: title, author, questions with ―Does‖ and ―wh,‖ first, next, then, sequence of events, because Introduce verbs, review nouns and adjectives Academic Language: title, author, questions with ―Does‖ and ―wh,‖ first, next, then, sequence of events, because Introduce verbs, review nouns and adjectives Academic Language: title, author, questions with ―Does‖ and ―wh,‖ first, next, then, sequence of events, because Introduce verbs, review nouns and adjectives 125 LESSON 7 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h New: Short u, /j/ (jump) Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h New: short u, /j/ (jump) Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h New: short u, /j/ (jump) Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h New: short u, /j/ (jump) Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: Short u, /j/ (jump) Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h New: Short u, /j/(jump) Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h New: Short u, /j/ (jump) Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: jump, sun, fun, bugs, bump, tub Word work: reading Regular: jump, sun, fun, bugs, bump, tub Sight: they, into, from Word work: spelling: jump, sun, fun, bugs, bump, tub Sight: they, into, from Word work: ―Road Race‖ Word work: Fluency jump, sun, fun, bugs, bump, tub. Sight: they, into, from Vocabulary: jump, bump, tub, fun, bugs Vocabulary: jump, bump, tub, fun, bugs Vocabulary Review: jump, bump, tub, fun, bugs Vocabulary Review: jump, bump, tub, fun, bugs Sentence reading and writing Sentence reading The cats jump in the sun. They bump into the tub. They run from the bugs. They have fun. Sentence reading The cats jump in the sun They bump into the tub. Sentence reading The cats jump in the sun. They have fun. Sentence reading: Fluency Review The cats jump in the sun. They bump into the tub. They run from the bugs. They have fun. 
Read aloud passage: A Bug Hunt (non-fiction) Vocabulary words: yesterday, hunt, hide, jump Read aloud passage: A Bug Hunt (non-fiction) vocabulary words: yesterday, hunt, hide, jump Read aloud passage: A Duck (non-fiction) vocabulary words: describe, curious, graceful, action Read aloud passage: A Duck (non-fiction) vocabulary words: describe, curious, graceful, action Read aloud passage: A Duck (non-fiction) vocabulary words: yesterday, hunt, hide, jump, describe, curious, graceful, action Academic Language: title, information, question, mostly, topic, learning Academic Language: title, information, question, mostly, topic, learning Academic Language: noun, adjectives, verbs Academic Language: noun, adjectives, verbs Academic Language: noun, adjectives, verbs 126 LESSON 8 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j New: long a Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j New: long a Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j New: long a Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j New: long a Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: long a (a_e) Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c k, d, i, b, g, o x, c /s/, e, ll r, h, u, j New: long a (a_e) Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, d, i, b, g, o x, c /s/, e, ll r, h, u, j New: long a (a_e) Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: cake, bake take, lake, make, made, mom Word work: reading Regular: cake, made, take, lake, make, mom Sight: my, friend, her Word work: spelling: cake, made, take, lake, make, mom Sight: my, friend, her Word work: ―Road Race‖ Word work: Fluency cake, made, take, lake, make, bake, mom Sight: my, friend, her Vocabulary: lake, bake, take Vocabulary: lake, bake, take Vocabulary Review: lake, bake, take Vocabulary Review: lake, bake, take Sentence reading and writing Sentence reading My mom and I bake a cake. She and I take it to her friend. Her friend is at the lake. Sentence reading She and I take it to her friend. Sentence reading She and I take it to her friend. Her friend is at the lake. Sentence reading: Fluency Review My mom and I bake a cake. She and I take it to her friend. Her friend is at the lake.. 
Read aloud passage: A Lake (non-fiction) Vocabulary words: like, picnic, chase, always Read aloud passage: A Lake (non-fiction) vocabulary words: like, picnic, chase, always Read aloud passage: Max and Tim (non-fiction) vocabulary words: drowning, brave, saved Read aloud passage: Max and Tim (non- fiction) vocabulary words: drowning, brave, saved Read aloud passage: Max and Tim (non-fiction) vocabulary words: drowning, brave, saved like, picnic, chase, always Academic Language: title, information, question, mostly, topic, learning Academic Language: title, information, question, mostly, topic, learning Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action 127 LESSON 9 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, n, v, p long a, n, v, p New: long i Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j long a, n, v, p New: long i Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p New: long i Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, long a, n, v, p New: long i Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: long i (i_e) Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p New: long i (i_e) Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p New: long i(i_e) Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: bike, ride, five, like, side, lake Word work: reading Regular: bike, ride, five, like, side, lake Sight: we, our, of Word work: spelling: bike, ride, five, like, side, lake Sight: we, our, of Word work: ―Road Race‖ Word work: Fluency bike, ride, five, like, side, lake Sight: we, our, of Vocabulary: side, ride Vocabulary: side, ride Vocabulary Review: side, ride Vocabulary Review: side, ride Sentence reading and writing Sentence reading Mike and I have five friends. We like to ride our bikes up the hill. We like to ride our bikes on the side of the lake. Sentence reading Mike and I have five friends. We like to ride our bikes up the hill. Sentence reading We like to ride our bikes up the hill. We like to ride our bikes on the side of the lake. Sentence reading: Fluency Review Mike and I have five friends. We like to ride our bikes up the hill. We like to ride our bikes on the side of the lake. 
Read aloud passage: Hikes (non-fiction) Vocabulary words: hike, rough, trails, streams Read aloud passage: Hikes (non-fiction) vocabulary words: hike, rough, trails, streams Read aloud passage: Mike and the Red Kite (non- fiction) vocabulary words: kite, sway, forgot, gust Read aloud passage: Mike and the Red Kite (non-fiction) vocabulary words: kite, sway, forgot, gust Read aloud passage: Mike and the Red Kite (non-fiction) vocabulary words: kite, sway, forgot, gust, hike, rough, trails, streams Academic Language: title, information, question, mostly, topic, learning, describe Academic Language: title, information, question, mostly, topic, learning, describe Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action 128 LESSON 10 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, n, v, p long a, n, v, p, long i New: long o (o_e), w Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j long a, n, v, p, long i New: long o (o_e), w Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i New: long o (o_e), w Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, long a, n, v, p long i New: long o (o_e), w Phonemic Awareness: Review: long o (o_e), w Phonics: Letter names and sounds, introduction New: long o (o_e), w Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i New: long o (o_e), w Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i New: long o (o_e), w Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: rose, hose, nose, mole, smells, will, wilt Word work: reading Regular: rose, hose, nose, mole, smells, will, wilt Sight: with, her (review) Word work: spelling: Regular: rose, hose, nose, mole, smells, will, wilt Sight: with, her(review) Word work: ―Road Race‖ Word work: Fluency Regular: rose, hose, nose, mole, smells, will, wilt Sight: with, her(review) Vocabulary: rose, hose, mole, smells, wilt Vocabulary: rose, hose, mole, smells, wilt Vocabulary Review: rose, hose, mole, smells, wilt Vocabulary Review: rose, hose, mole, smells, wilt Sentence reading and writing Sentence reading The mole has a red rose. The mole smells the rose with her nose. The rose smells nice. But the mole has no hose. Will the rose wilt? Sentence reading The mole has a red rose. The mole smells the rose with her nose. Sentence reading The rose smells nice. But the mole has no hose. Will the rose wilt? Sentence reading: Fluency Review The mole has a red rose. The mole smells the rose with her nose. The rose smells nice. But the mole has no hose. Will the rose wilt? 
Read aloud passage: Moles (non-fiction) Vocabulary words: through, hole, whole Read aloud passage: Moles (non-fiction) vocabulary words: through, hole, whole Read aloud passage: Clowns (non-fiction) vocabulary words: colorful, costumes, clumsy, spill Read aloud passage: Clowns (non-fiction) vocabulary words: colorful, costumes, clumsy, spill Read aloud passage: Jokes (non-fiction) vocabulary words: colorful, costumes, clumsy, spill Academic Language: title, information, question, mostly, topic, learning, describe Academic Language: title, information, question, mostly, topic, learning, describe Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action 129 LESSON 11 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, n, v, p long a, n, v, p, long i, long o, qu New: long e (ee, ea), qu Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j long a, n, v, p, long i, long o, qu New: long e (ee, ea), qu Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i long o, qu New: long e (ee, ea), qu Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, long a, n, v, p long i long o, qu New: long e (ee, ea), qu Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: long e (ee, ea), qu Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i, long o, qu New: long e (ee, ea), qu Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i long o, qu New: long e (ee, ea), qu Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: ee, ea: weak, eat, meat, beans, meal, need, sleep, queen Word work: reading Regular: ee, ea: weak, eat, meat, beans, meal, need, sleep, queen Sight: after, every Word work: spelling: Regular: ee, ea: weak, eat, meat, beans, meal, need, sleep, queen Sight: after, every Word work: ―Road Race‖ Word work: Fluency Regular: ee, ea: weak, eat, meat, beans, meal, need, sleep, queen Sight: after, every Vocabulary: weak, queen, meat, meal Vocabulary: weak, queen, meat, meal Vocabulary Review: weak, queen, meat, meal Vocabulary Review: weak, queen, meat, meal Sentence reading and writing Sentence reading The weak queen likes to eat meat. The weak queen eats beans and meat in every meal. The weak queen needs to sleep after her meals. Sentence reading The weak queen likes to eat meat. Sentence reading The weak queen eats beans and meat in every meal. The weak queen needs to sleep after her meals. Sentence reading: Fluency Review The weak queen likes to eat meat. The weak queen eats beans and meat in every meal. The weak queen needs to sleep after her meals. 
Read aloud passage: Willy the Seal (fiction) Vocabulary words: island, relax, under, over Read aloud passage: Willy the seal (fiction) vocabulary words: island, relax, under, over Read aloud passage: Seeds (non-fiction) vocabulary words: pointy, outside, special, important Read aloud passage: Seeds (non-fiction) vocabulary words: pointy, outside, special, important Read aloud passage: Seeds (non-fiction) vocabulary words: pointy, outside, special, important Academic Language: Title, questions with ―Does and Wh‖, first, then, last Academic Language: Title, questions with ―Does and Wh‖, first, then, last Academic Language: title, information, question, mostly, topic, learning, author Academic Language: title, information, question, mostly, topic, learning, author Academic Language: title, information, question, mostly, topic, learning, author 130 LESSON 12 Day 1 Day 2 Day 3 Day 4 Day 5 Phonemic Awareness: Initial Sounds: Review: m, s, l, t, a, n, v, p f, c, k, z, d, i, b, g, o, x, c /s/, e, ll, r, h, u, j, n, v, p long a, n, v, p, long i, long o, qu long e, w, New: long u (u_e) Phonemic Awareness: Segmenting: Review: m, s, l, t, a, n, v, p, f, c k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j long a, n, v, p, long i, long o, qu, long e, w New: long u (u_e) Phonemic Awareness: Blending: Review: m, s, l, t, a, n, v, p f, c, k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i long o, qu long e, w, New: long u (u_e) Phonemic Awareness: Sound Manipulation: Review: m, s, l, t, a, n, v, p f, c, k, z, d, i, b, g, o x, c /s/, e, ll, r, h, u, j, long a, n, v, p long i, long o, qu long e, w, New: long u (u_e) Phonemic Awareness: Review Phonics: Letter names and sounds, introduction New: long u (u_e) Phonics: Letter names and sounds, practice Review: m, s, l, t, a, n, v, p f, c, k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i, long o, qu, long e, w New: long u (u_e) Phonics: Letter names and sounds, practice writing Review: m, s, l, t, a, n, v, p f, c k, z, d, i, b, g, o x, c /s/, e, ll r, h, u, j, long a, n, v, p, long i, long o, qu long e, w, New: long u (u_e) Phonics: Letter names and sounds, practice activity ―Road Race‖ Phonics: Fluency practice with letter cards Word work: reading Regular: cute, mule, huge, use, sea, needs, swim Word work: reading Regular: cute, mule, huge, use, sea, needs, swim Sight: cannot, tube Word work: spelling: Regular: cute, mule, huge, use, sea, needs, swim Sight: cannot, tube Word work: ―Road Race‖ Word work: Fluency Regular: cute, mule, huge, use, sea, needs, swim Sight: cannot, tube Vocabulary: cute, mule, huge, tube, sea Vocabulary: cute, mule, huge, tube, sea Vocabulary Review: cute, mule, huge, tube, sea Vocabulary Review: cute, mule, huge, tube, sea Sentence reading and writing Sentence reading The cute mule has a huge tube. He will use the huge tube in the sea. The cute mule cannot swim. He needs the huge tube. Sentence reading The cute mule has a huge tube. He will use the huge tube in the sea. Sentence reading The cute mule cannot swim. He needs the huge tube. Sentence reading: Fluency Review The cute mule has a huge tube. He will use the huge tube in the sea. The cute mule cannot swim. He needs the huge tube. 
Read aloud passage: Rules (non-fiction) Vocabulary words: rude, kicking, enforce, manners Read aloud passage: Rules (non-fiction) vocabulary words: rude, kicking, enforce, manners Read aloud passage: My Mule Lily (fiction) vocabulary words: ranch, shallow, flu, ―little by little‖ Read aloud passage: My Mule Lily (fiction) vocabulary words: ranch, shallow, flu, ―little by little‖ Read aloud passage: My Mule Lily (fiction) vocabulary words: ranch, shallow, flu, ―little by little‖ Academic Language: Title, questions with ―Does and Wh‖, first, then, last. Academic Language: Title, questions with ―Does and Wh‖, first, then, last. Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action Academic Language: noun, adjectives, verbs, describe, action 131 APPENDIX B TRANSITION LESSONS PRE- AND POSTTEST ASSESSMENT 132 Transition Lessons Pre and Posttest Assessment Student Name:___________________________________________ Date:______________ School Name:_____________________________________________ Summary of Scores Subtest Correct Errors Word List Fluency: Decodable (60) Word List Fluency: Sight words-irregular (24) Vocabulary Knowledge (8) Depth of Vocabulary (10) Comprehension Questions (6) Sequencing (5) Sentence Pattern Word Sort (5) (117) (34) TOTAL: Notes: 133 Word List Fluency Assessment Materials: Examiner‘s Protocol Student Probe Pencil Clipboard Stopwatch Administration Directions: 1. Place the student copy in front of the student. 2. Place the examiner‘s copy in front of you. 3. Say the specific directions (top of examiner‘s probe) to the student before each administration (decodable and irregular words): 4. Say “Begin” and start your stopwatch. If the student fails to say the first word on the list after 3 seconds, tell them the word and mark it as incorrect. 5. Follow along on your copy. Put a slash (/) through words read incorrectly (see scoring procedures). 6. If a student stops or struggles with a word for 3 seconds, tell the student the word and mark it as incorrect. 7. At the end of 30 seconds, place a bracket (]) after the last word and say, “Stop.” If student finishes sight word fluency test before the 30 seconds, record time. 134 Word List Fluency: Examiner‟s Probe Decodable List Correct:______ Errors: _______ ―Please read these words out loud. If you get stuck, I will tell you the word so you can keep reading. You can stop reading when you hear me say, "stop”. Start here (point to the first word on the page and drag your finger to the right to show directionality). “Begin.” vet Al nose hose sea mule use make van rat needs bake fun bat sun fin bump tub Five like made mom hot Rose tell beans mop bell Hat bike den meat tip Side zip pen mole last Bugs kid hen jump bag Weak lake cute map sleep Take box need fox cent Lap Sam log man top Fog last Sight Word List Correct:______ Errors: _______ Time:_______ the after every her have they cannot with friend into are I we our from she of he my no tube is am a 135 Vocabulary Knowledge Materials: Examiner‘s Protocol Clipboard Pencil Administration Directions: 1. Place examiner probe on clipboard and position so that student cannot see what you record. 2. Say these specific directions to the student: “I have some pictures to show you (show all the pictures on first page) “I will say something; then I want you to put your finger on the picture of what I have said. Let‟s try one. 
135
Vocabulary Knowledge
Materials: Examiner's Protocol, Clipboard, Pencil
Administration Directions:
1. Place the examiner probe on the clipboard and position it so that the student cannot see what you record.
2. Say these specific directions to the student: "I have some pictures to show you (show all the pictures on the first page). I will say something; then I want you to put your finger on the picture of what I have said. Let's try one. Put your finger on ball."
Correct response: If the child responds correctly by pointing to the ball, say: "Good! That is ball."
Incorrect response: If the child responds incorrectly, demonstrate by pointing to the ball and say: "This is ball. Try again. Put your finger on ball." If the child then responds correctly, say: "Good! That is ball." If the child responds incorrectly, move on.
"Here are more pictures. Each time I say something, point to the best picture of what I have said. You may not be sure which picture to point to, but I want you to look carefully at all the pictures and point to the best picture of what I have said. Ready? Point to (start item word)."
Items (mark Correct with an X or record the incorrect response): picnic, fog, trail, crab, beak, sad, thin, fin
136
Vocabulary Depth of Knowledge Assessment
Materials: Examiner's Protocol, Clipboard, Pencil
Administration Directions:
1. Place the examiner probe on the clipboard and position it so that the student cannot see what you record.
2. Say these specific directions to the student: "I'm going to say some words. I want you to tell me what each word means AND use the word in a sentence. For example, if I say the word 'sad' you could say, 'Sad is when you are not happy. I was sad when my ice cream fell on the floor.' Now it's your turn. (One-second pause.) Remember to tell me what the word means AND use the word in a sentence. Tell me about the word 'ball.'"
Correct response: If the student gives a correct response, say: "Very good."
Partial response: If the student gives a definition but does not use the word in a sentence, say: "Nice job telling me what the word means, but remember I want you to also use the word in a sentence. Listen. A ball is a toy that you bounce. I threw a ball with my friends after school." If the student uses the word in a sentence but does not give a definition, say: "Nice job using the word in a sentence, but remember I want you to also tell me what the word means. Listen. A ball is a toy that you bounce. I threw a ball with my friends after school."
Incorrect response: If the student gives an incorrect response, says "I don't know," or does not respond, say: "Listen. A ball is a toy that you bounce. I threw a ball with my friends after school. If you don't know what a word means, it is OK to say, 'I don't know.' OK. Here is your first word."
3. Record the exact words the student provides in the space provided. Administer each item by saying, "Tell me about the word ______." If the student does not reply, repeat the prompt once. If the student still does not respond, mark "NR" (for "no response") on the answer sheet and go to the next word. If the child responds by saying "I don't know," write "DK" (for "don't know") on the answer sheet.
4. If a student responds by providing a definition OR using the word in a sentence for two consecutive items, say, "Remember to tell me what the word means AND use it in a sentence." This reminder may be given twice.
5. If the student gives a partial or ambiguous response, prompt by saying, "Tell me more about the word _______." This prompt should be used up to two times if the student has not provided a definition AND used the word in context.
6. If the child acts out a word (e.g., snore), prompt the child by saying, "Tell me what ___ means using words." If the child is not able to provide the definition in words, write "acted out" on the score sheet. (Scoring: this response scores 1 point.)
7. If the child begins to ramble or becomes off-task, redirect the student back to the task.
8. Continue administering the remaining words until you complete the list. Administer all words regardless of student accuracy. Encourage responses with neutral praise (e.g., "OK," "Good," "Nice job"). If the child becomes frustrated, it is OK to tell the child that he or she will not know some of the words and that this is all right.
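Each Depth of Knowledge item is worth up to two points, one for a definition and one for using the word in a sentence, with an acted-out word earning a single point. The sketch below merely restates that rubric in code form; it is illustrative, the examiner coded responses by hand, and the function and argument names are my own.

```python
# Illustrative only: the define/use rubric described above, applied to
# judgments the examiner records by hand.

def score_depth_item(gave_definition, used_in_sentence, acted_out=False):
    """Return 0-2 points for one Depth of Knowledge item."""
    if acted_out and not gave_definition:
        return 1  # per the directions, an acted-out word scores 1 point
    return int(gave_definition) + int(used_in_sentence)

# Hypothetical coding of the five pretest items (definition?, sentence?, acted out?).
responses = [
    ("on top of", True, True, False),   # definition and sentence: 2
    ("inside",    True, False, False),  # definition only: 1
    ("curious",   False, False, True),  # acted out: 1
    ("protect",   False, False, False), # no response: 0
    ("above",     False, True, False),  # sentence only: 1
]
total = sum(score_depth_item(d, u, a) for _, d, u, a in responses)
print(total)  # 5 of the 10 points possible on this subtest
```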
138
Vocabulary Depth of Knowledge: Examiner's Probe
Record the prompt and the student's response for each word; score Define (1 pt.), Use (1 pt.), and Total.
1. on top of
2. inside
3. curious
4. protect
5. above
139
Read Aloud Assessment
Materials: Examiner's protocol; stimulus story: My Birthday; sentence strips (4); sentence pattern chart (Adjectives, Nouns, Verbs); sentence cards with target words highlighted and word cards (silly, new, friends, ran, ate); pencil; clipboard
"Now I will read you a story. The title of the story is 'My Birthday.' While I read, I want you to think about what the story is mostly about. I also want you to think about what you learned." [Read story.]
Comprehension Questions
"Next, I would like you to answer some questions about the story, My Birthday."
All primary questions will be asked in an open-ended format. If a question is answered correctly, check the box next to the question and go on to the next question. If the open-ended question is not answered or is answered incorrectly, it will be followed by a series of three related yes/no questions. Do not ask the follow-up questions if the open-ended question is answered correctly. Questions may be repeated once if necessary. If a student gives a vague response, query by saying "Tell me more." This prompt may be used once for each question.
Open-ended questions are worth 2 pts each; for the yes/no follow-ups, circle Y, N, or NR for each item, and the keyed item marked (+) is worth 1 pt.
1. What was the story mostly about? [ ] Follow-ups: Was the story about friends at school? (Y N NR) Was the story about friends at a birthday party? (Y N NR) (+) (1 pt) Was the story about friends at the zoo? (Y N NR)
2. What did the friends do at the birthday party? [ ] Follow-ups: Did the friends swim in a pool? (Y N NR) Did the friends ride bikes and play games? (Y N NR) (+) (1 pt) Did the friends play baseball? (Y N NR)
3. Why did the friends run fast to the picnic tables? [ ] Follow-ups: Did the friends run fast because they were in a race? (Y N NR) Did the friends run fast because they were being chased by a bear? (Y N NR) Did the friends run fast because ants were crawling on them? (Y N NR) (+) (1 pt)
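Each comprehension question above is worth up to two points: two for a correct open-ended answer and otherwise one point, which I read as credit for correctly answering the follow-up marked (+). The sketch below is an illustrative restatement under that assumption, not a scoring program from the study.

```python
# Illustrative restatement of the comprehension scoring described above,
# assuming the follow-up marked (+) is the one that earns the single point.

def score_comprehension_question(open_ended_correct, keyed_follow_up_correct=False):
    """2 pts for a correct open-ended answer; otherwise 1 pt for the keyed yes/no item."""
    if open_ended_correct:
        return 2
    return 1 if keyed_follow_up_correct else 0

# Hypothetical student: question 1 answered open-ended, question 2 answered only
# through the keyed follow-up, question 3 missed entirely.
scores = [
    score_comprehension_question(True),
    score_comprehension_question(False, keyed_follow_up_correct=True),
    score_comprehension_question(False, keyed_follow_up_correct=False),
]
print(sum(scores), "of 6")  # 3 of 6
```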
140
Sequencing
"I am going to read the story again. I want you to think about what happened first, what happened next, and what happened at the end." [Read story.]
"Here are four sentences that tell something that happened in the story." [Show the student the sentence strips. Then read aloud, or have the student read, each sentence strip; note which option was used.]
"I want you to put these sentences in order. Start with the first thing that happened in the story, then the next two things, then the last thing that happened. If you need help reading the sentences, I can help you." [If needed, you may read the sentence strips to the student as he or she sorts the story sequence.]
After the student finishes the sort, review the story sequence by asking: What is the first thing that happened? What is the next thing that happened? Then what happened? What is the last thing that happened?
Put a check mark next to the story sequence components that were sorted in correct order. Students get 1 pt. for starting the sequence correctly, 1 pt. for each correctly ordered sequence of two sentences, and 1 pt. for the last sentence (5 points total).
Story sequence (number each box to record the student's order):
1 [ ] Friends were at the park for a birthday party.
2 [ ] The friends wore helmets while riding bikes.
3 [ ] Ants crawled on the friends while they were sitting by the pond.
4 [ ] The friends sang happy birthday and ate ice cream cake.
141
Sentence Pattern Word Sort
"Now I am going to show you some sentence cards. Each sentence has a word that is highlighted (1 noun, 2 adjectives, 2 verbs). If the highlighted word in the sentence is a noun, put the word on the 'noun' part of the chart. [Show the student; you may prompt in Spanish: sustantivo.] If the highlighted word in the sentence is an adjective, put the word on the 'adjective' part of the chart. [Show the student; you may prompt in Spanish: adjetivo.] If the highlighted word in the sentence is a verb, put the word on the 'verb' part of the chart. [Show the student; you may prompt in Spanish: verbo.]"
"Now let's try one together." (Use the following sentences as a practice item, following the directions above.) The horse likes to run and jump. Lilly is a white horse. Lilly runs with Carmen.
"Now it is your turn." (Administer the sentences from the story.)
After the student finishes the sort, review the sentence pattern chart with the student. Put a check mark next to the words that were sorted correctly. In the last two rows of the table, write how the student sorted the words.
Answer key: Adjectives: silly [ ], new [ ]; Nouns: friends [ ]; Verbs: ran [ ], ate [ ]
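The sequencing rule above awards one point for the correct first sentence, one point for each correctly ordered pair of adjacent sentences, and one point for the correct last sentence, for five points with four strips. The sketch below shows that rule as I read it; it is illustrative only, and the adjacent-pair interpretation is my own.

```python
# Illustrative reading of the five-point sequencing rule described above.

def score_sequence(student_order, correct_order):
    """Both arguments are lists of the same four sentence numbers."""
    points = 0
    if student_order[0] == correct_order[0]:
        points += 1                      # correct first sentence
    student_pairs = list(zip(student_order, student_order[1:]))
    for i in range(len(correct_order) - 1):
        # one point for each adjacent pair the student also placed adjacently, in order
        if (correct_order[i], correct_order[i + 1]) in student_pairs:
            points += 1
    if student_order[-1] == correct_order[-1]:
        points += 1                      # correct last sentence
    return points                        # maximum 1 + 3 + 1 = 5 with four strips

print(score_sequence([1, 2, 3, 4], [1, 2, 3, 4]))  # 5
print(score_sequence([2, 1, 3, 4], [1, 2, 3, 4]))  # 2 (pair 3-4 plus the correct last sentence)
```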
142
SETR Transition Lessons Assessment (Posttest)
Student Name: ___________________________ Date: ______________
School Name: ___________________________
Summary of Scores (record Correct and Errors for each subtest; maximum possible in parentheses)
Word List Fluency: Decodable (60)
Word List Fluency: Sight words, irregular (24)
Vocabulary Knowledge (8)
Depth of Vocabulary (10)
Comprehension Questions (6)
Sequencing (5)
Sentence Pattern Word Sort (5)
TOTAL: (117) (34)
Notes:
Depth of Knowledge, pretest words (in italics): _________ correct _______ errors
143
Word List Fluency Assessment
Materials and administration directions are identical to the pretest Word List Fluency Assessment above (30-second administration for each list; slash incorrect words; bracket the last word read; record the time if the student finishes the sight word list early).
144
Word List Fluency: Examiner's Probe
Decodable List (30 seconds). Correct: ______ Errors: _______
"Please read these words out loud. If you get stuck, I will tell you the word so you can keep reading. You can stop reading when you hear me say, 'stop.' Start here (point to the first word on the page and drag your finger to the right to show directionality). Begin."
swim pen sea fun bugs mop zip bag queen tell seeds take use bat jump pin sun hub hit like made mom five rose cute hat huge lap beans wilt Sam meat top side mule log take past tube lid hen hump box sleep fog bike map weak bake sox huge fox cent bell den tan red tip make mask
Sight Word List (30 seconds). Correct: ______ Errors: _______ Time: _______
tube am she he my they our a I of is friend we cannot no every into her have from the are after with
145
Vocabulary Knowledge
Materials and administration directions, including the practice item "ball," are identical to the pretest Vocabulary Knowledge assessment above.
Items (mark Correct with an X or record the incorrect response): hide, sap, angry, kid, mask, cloud, streams, mountain
146
Vocabulary Depth of Knowledge Assessment
Materials, administration directions, prompts, and scoring are identical to the pretest Vocabulary Depth of Knowledge Assessment above.
148
Vocabulary Depth of Knowledge: Examiner's Probe
Record the prompt and the student's response for each word; score Define (1 pt.), Use (1 pt.), and Total.
New words: 1. outside; 2. through; 3. heavy; 4. around; 5. brave
Second group, words repeated from the pretest: 1. on top of; 2. inside; 3. curious; 4. protect; 5. above
149
Read Aloud Assessment
Materials: Examiner's protocol; stimulus story: Yesterday; sentence strips (4); sentence pattern chart (Adjectives, Nouns, Verbs); sentence cards with target words highlighted and word cards (strong, bright, clowns, jumps, swims); pencil; clipboard
"Now I will read you a story. The title of the story is 'Yesterday.' While I read, I want you to think about what the story is mostly about. I also want you to think about what you learned." [Read story.]
Comprehension Questions
"Next, I would like you to answer some questions about the story, Yesterday."
All primary questions will be asked in an open-ended format. If a question is answered correctly, check the box next to the question and go on to the next question. If the open-ended question is not answered or is answered incorrectly, it will be followed by a series of three related yes/no questions. Do not ask the follow-up questions if the open-ended question is answered correctly. Questions may be repeated once if necessary. If a student gives a vague response, query by saying "Tell me more." This prompt may be used once for each question.
Open-ended questions are worth 2 pts each; for the yes/no follow-ups, circle Y, N, or NR for each item, and the keyed item marked (+) is worth 1 pt.
1. What was the story mostly about? [ ] Follow-ups: Was the story about two boys at the library? (Y N NR) Was the story about two boys riding bikes on a play date? (Y N NR) (+) (1 pt) Was the story about two boys at the skating rink? (Y N NR)
2. What did the boys do on the play date? [ ] Follow-ups: Did the boys ride bikes, play chase, and run from bees? (Y N NR) (+) (1 pt) Did the boys play inside with trucks? (Y N NR) Did the boys play basketball? (Y N NR)
3. Why did the boys jump in the frog pond? [ ] Follow-ups: Did the boys jump in the pond to have a swimming race? (Y N NR) Did the boys jump in the pond because they were being chased by a dog? (Y N NR) Did the boys jump in the pond to escape the swarm of bees? (Y N NR) (+) (1 pt)
150
Sequencing
"I am going to read the story again. I want you to think about what happened first, what happened next, and what happened at the end." [Read story.]
"Here are four sentences that tell something that happened in the story." [Show the student the sentence strips.]
"I want you to put these sentences in order. Start with the first thing that happened in the story, then the next two things, then the last thing that happened. If you need help reading the sentences, I can help you." [If needed, you may read the sentence strips to the student as he or she sorts the story sequence.]
After the student finishes the sort, review the story sequence by asking: What is the first thing that happened? What is the next thing that happened? Then what happened? What is the last thing that happened?
Put a check mark next to the story sequence components that were sorted in correct order. Students get 1 pt. for starting the sequence correctly, 1 pt. for each correctly ordered sequence of two sentences, and 1 pt. for the last sentence (5 points total).
Story sequence (number each bracket to record the student's order):
1 [ ] The boys played chase.
2 [ ] The boys rode bikes.
3 [ ] A candy bar melted in Steve's pocket and bees chased them.
4 [ ] The boys jumped in the frog pond to escape the bees.
151
Sentence Pattern Word Sort
"Now I am going to show you some sentence cards. Each sentence has a word that is highlighted (1 noun, 2 adjectives, 2 verbs). If the highlighted word in the sentence is a noun, put the word on the 'noun' part of the chart. [Show the student; you may prompt in Spanish: sustantivo.] If the highlighted word in the sentence is an adjective, put the word on the 'adjective' part of the chart. [Show the student; you may prompt in Spanish: adjetivo.] If the highlighted word in the sentence is a verb, put the word on the 'verb' part of the chart. [Show the student; you may prompt in Spanish: verbo.]"
"Now let's try one together." (Use the following sentences as a practice item, following the directions above.) The horse likes to run and jump. Lilly is a white horse. Lilly runs with Carmen.
"Now it is your turn." (Administer the sentences from the story.)
After the student finishes the sort, review the sentence pattern chart with the student. Put a check mark next to the words that were sorted correctly. In the last two rows of the table, write how the student sorted the words.
Answer key: Adjectives: huge [ ], shallow [ ]; Nouns: bees [ ]; Verbs: rode [ ], played [ ]
152
APPENDIX C
ADMINISTRATION FIDELITY CHECKLIST
153
Administration Fidelity Checklist
District _________________ School ________________________ Teacher ID# _____
Assessor(s) ____________________________ Reliability Observer ________________
Grade _____ Subtest(s) _________________________________ Date _____________
1. Assessment atmosphere:
a) Did students refrain from talking to one another during the test? Y N
b) Did students have plenty of space to work? Y N
c) Were students' desks cleared of unnecessary material? Y N
d) Were students arranged to reduce the possibility of looking at each other's booklets? Y N
e) Was the test session free of interruptions? Y N
f) Did the assessors set a positive tone for the test session? Y N
2. Did the assessor(s) confirm that students had their own test booklet and not someone else's? Y N
3. Did students have access to extra sharpened pencils during the test? Y N
4. Did all students have as much time as they needed to complete the test? Y N
5. Did the assessor read the directions exactly as written? Y N
6. Did the assessor confirm that all students understood the directions and the answers to sample items? Y N
7. Did the assessor supplement general directions with his or her own explanations, when necessary, without giving help on specific test questions? Y N
8. Did the assessor read all test items exactly as written, or use the specified correction procedure (i.e., "You should have marked ...")? Y N
9. Did the assessor(s) move around the room to monitor students and ensure that they were on the correct problem and marking one answer for each question? Y N
10. Did the assessors refrain from giving help on specific test questions? Y N
11. Did the assessor(s) review each test booklet to verify that all questions were answered? Y N
Notes:
Reliability score ________
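The checklist above yields a set of Y/N judgments plus a blank "Reliability score," and the form does not state how that score is computed. The sketch below is only an illustration: it assumes two plausible summaries, the proportion of items marked Y and, as one possible reading of "Reliability score," item-by-item agreement between the assessor's and the reliability observer's checklists. All names and example values are hypothetical.

```python
# Illustrative only: the form above records Y/N judgments and a blank
# "Reliability score" but does not state how that score is computed.
# Two plausible summaries are sketched here with hypothetical data.

def proportion_yes(responses):
    """responses: dict mapping checklist item -> 'Y' or 'N'."""
    yes = sum(1 for v in responses.values() if v == "Y")
    return yes / len(responses)

def percent_agreement(observer_a, observer_b):
    """Assumed reading of 'Reliability score': item-by-item agreement between raters."""
    items = observer_a.keys() & observer_b.keys()
    agree = sum(1 for k in items if observer_a[k] == observer_b[k])
    return agree / len(items)

assessor = {"1a": "Y", "1b": "Y", "2": "Y", "5": "N"}
observer = {"1a": "Y", "1b": "N", "2": "Y", "5": "N"}
print(round(proportion_yes(assessor), 2))               # 0.75
print(round(percent_agreement(assessor, observer), 2))  # 0.75
```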
155
APPENDIX D
FIDELITY OF IMPLEMENTATION CHECKLIST FOR SETR TRANSITION LESSONS
156
Fidelity of Implementation Checklist for SETR Transition Lessons
I. Teacher/school information: Site: Oregon; Teacher; School; Date; Beginning Time; Ending Time; Observer; Instruction Format: small-group intervention
II. Teacher behaviors checklist (rate each behavior Consistently, Sometimes, Rarely, or Never). The instructor:
1. Provided a complete explanation of the activity.
2. Demonstrated step-by-step how to do the task ("I do it").
3. Practiced doing the task with the group ("We do it").
4. Had the students do the task on their own ("You do it").
5. Provided 3-6 individual turns.
6. Provided turns to students that made errors.
7. Provided turns in a predictable manner.
8. Corrected errors immediately.
9. Utilized an explicit instruction approach to correct errors: I do it, We do it, You do it.
10. Elicited unison whole-group oral responses from the students.
11. Utilized suggested or similar auditory and/or visual signaling procedures.
12. Maintained a brisk pace of the lesson.
13. Monitored the students during the lesson and provided them with feedback accordingly.
Code: Consistently: these teaching behaviors were observed every time the transition materials were used. Sometimes: these teaching behaviors were observed the majority of the time when transition materials were used. Rarely: these teaching behaviors were observed less than half the time when transition materials were used. Never: these teaching behaviors were not observed when transition materials were used.
III. Transition Lesson Observed: record the minutes devoted to each component on each of Days 1-5: Phonemic Awareness (PA), Phonics, Word Work (Day 3: Word Work, spelling), Vocabulary, Sentence Reading, Read Aloud; Transition Elements ________ min.
IV. Notes:
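Section III of the form above asks observers to record the minutes devoted to each lesson component on each observed day. A small tally like the sketch below could summarize those paper records; it is illustrative only, and the component labels and example minutes are hypothetical stand-ins for the handwritten entries.

```python
# Illustrative tally of Section III above; observers recorded these minutes on
# the paper form, and the example values here are hypothetical.

COMPONENTS = ["PA", "Phonics", "Word Work", "Vocabulary",
              "Sentence Reading", "Read Aloud"]

def summarize_minutes(observations):
    """observations: dict mapping day -> dict mapping component -> minutes observed."""
    totals = {c: 0 for c in COMPONENTS}
    for day_minutes in observations.values():
        for component, minutes in day_minutes.items():
            totals[component] = totals.get(component, 0) + minutes
    totals["All components"] = sum(totals[c] for c in COMPONENTS)
    return totals

week = {
    "Day 1": {"PA": 5, "Phonics": 8, "Word Work": 6, "Vocabulary": 4,
              "Sentence Reading": 4, "Read Aloud": 3},
    "Day 2": {"PA": 4, "Phonics": 9, "Word Work": 6, "Vocabulary": 4,
              "Sentence Reading": 4, "Read Aloud": 3},
}
print(summarize_minutes(week))  # per-component totals across the observed days
```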
158
APPENDIX E
FEASIBILITY OF THE READING INTERVENTION WITH SPANISH-SPEAKING STUDENTS
159
Feasibility of the Reading Intervention With Spanish-Speaking Students: Maximizing Instructional Effectiveness in English and Spanish Using Systematic and Explicit Teaching Routines (SETR) Transition Lessons
1. How closely did you follow the transition lessons as written? (circle one) 1 = not at all closely, 2 = somewhat closely, 3 = moderately closely, 4 = very closely
2. How different is the structure used in the transition lessons from the structure in your English Language Development or English reading program? (circle one) 1 = not at all different, 2 = somewhat different, 3 = moderately different, 4 = very different
3. How likely are you to continue using the transition lessons after the project is finished? (circle one) 1 = not at all likely, 2 = somewhat likely, 3 = moderately likely, 4 = very likely
4. How easy would you say the transition lessons are to implement? (circle one) 1 = very difficult, 2 = somewhat difficult, 3 = somewhat easy, 4 = very easy
5. To which section of the lessons did the students respond better? (circle one) phonemic awareness, phonics, vocabulary, read aloud
6. If you were running out of time, which section of the lesson (phonemic awareness, phonics, vocabulary, read aloud; circle one) would you skip, and why: it was redundant, students had mastered it, or students found it boring?
7. Did you find the read aloud section of the lesson useful for developing student oral language proficiency? (circle one) not useful, somewhat useful, moderately useful, very useful
8. Please add any other comments that you think might help improve the transition lessons.
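Items 1 through 4 and item 7 of the survey above use four-point ratings. The document does not specify here how those ratings were summarized, so the sketch below simply shows one plausible way to aggregate them across responding teachers (median and count per item); the question labels and ratings are invented for illustration.

```python
# Illustrative only: one way to summarize the four-point ratings from the survey
# above across responding teachers; question labels and ratings are invented.
from statistics import median

def summarize_ratings(responses):
    """responses: dict mapping question -> list of 1-4 ratings from teachers."""
    return {q: {"median": median(r), "n": len(r)} for q, r in responses.items()}

ratings = {
    "Q1 followed lessons as written": [3, 4, 4],
    "Q3 likely to continue using":    [2, 3, 4],
    "Q4 ease of implementation":      [3, 3, 4],
}
print(summarize_ratings(ratings))
```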