MetaMetrics® is focused on improving education for learners of all ages. For over twenty years, our work has been increasingly recognized for its distinct value in differentiating instruction and personalizing learning. Our research on postsecondary reading demands, for example, informed the Common Core State Standards for college- and career-readiness.
In addition to the white papers and policy briefs we publish throughout the year, our research briefs encompass our work on a variety of educational issues, such as personalized learning platforms, text complexity, and college- and career-readiness. The research briefs present the studies and findings of our established team of psychometricians across the K-16 education spectrum.
by: Heather Koons, Ph.D., Jeff Elmore, Eleanor Sanford-Moore, Ph.D., and Alfred Jackson Stenner V
This research brief discusses the relationship between two popular text leveling systems, The Lexile® Framework for Reading and Fountas & Pinnell (F&P) reading levels. MetaMetrics examined the Lexile measures and F&P reading levels assigned to a set of early-reading texts. F&P reading levels use letter labels from A through Z+ to indicate text difficulty. The Lexile Framework assigns a numeric value (for example, 120L), ranging from below 0L for beginning-reader text to above 2000L for advanced text. A positive correlation was found between the Lexile measures and F&P reading levels for early-reading books: overall, as the F&P reading levels progress to indicate more challenge, the Lexile measures of the books also increase. The strong correlation found in this study suggests that, in general, the Lexile Framework and F&P reading levels describe a text's complexity similarly. Read the research brief to learn more about this study's methodology, analysis, and full conclusions. The brief also explores variability within leveling systems using book examples.
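The kind of correlation the brief reports can be sketched as follows. This is a minimal illustration, not the study's actual method or data: the book list is invented, and mapping F&P letters to consecutive ordinal ranks is a simplifying assumption.

```python
from statistics import mean

def fp_to_ordinal(level: str) -> int:
    """Map an F&P letter level (A..Z) to an ordinal rank: A=1, ..., Z=26."""
    return ord(level.upper()) - ord("A") + 1

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical early-reading books: (F&P level, Lexile measure)
books = [("A", -40), ("C", 60), ("F", 190), ("J", 380), ("M", 520), ("P", 640)]
levels = [fp_to_ordinal(fp) for fp, _ in books]
lexiles = [lex for _, lex in books]
r = pearson(levels, lexiles)  # positive: harder F&P levels pair with higher Lexile measures
```

Because F&P levels are ordinal labels rather than interval measures, a rank-based statistic such as Spearman's correlation (Pearson's r computed on ranks) is the more defensible choice in practice.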
by: Ian F. Hembry, Ph.D., Heather Koons, Ph.D., Robin Baker, Ph.D., Kate Pringle, M.F.A., and Eleanor E. Sanford-Moore, Ph.D.
Reading assessments designed for beginning readers often include items that require the student to match a word to a picture prompt; Istation’s Indicators of Progress and Renaissance’s Star Early Literacy assessments are two examples (Mathes, Torgesen, & Herron, 2016; Renaissance Learning, 2016). The words in these picture items typically include words found in reading materials commonly encountered by students in the earliest stages of reading. MetaMetrics has also developed picture items that can be incorporated into assessments of early reading ability and has conducted research to explore the characteristics of these picture items that may affect difficulty.
by: Jeff Elmore, Steve Lattanzio, A. Jackson Stenner, Ph.D., and Eleanor E. Sanford-Moore, Ph.D.
As part of a larger project related to modeling multiple aspects of vocabulary knowledge, word difficulty measures denominated in Lexile units were calculated in a manner similar to the development of the Lexile Framework for Reading (Stenner, Horabin, Smith, & Smith, 1988; Stenner, Burdick, Sanford, & Burdick, 2007). A Lexile word measure is an estimate of the challenge a particular word will present, on average, to a particular reader during independent reading. Lexile word measures were calculated using a corpus-based, machine-learning model that was developed using student performance data from several reading tasks. The model was then used to calculate Lexile word measures for tens of thousands of words for which student performance data was unavailable.
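The extrapolation step described above can be illustrated with a toy model. This is a sketch only: the brief's actual model is corpus-based and machine-learned from several reading tasks, whereas the words, frequencies, measures, and single log-frequency feature below are all invented for illustration.

```python
import math

# Hypothetical training data: (word, corpus frequency, observed word measure).
# Real Lexile word measures are estimated from student performance data;
# these values are invented.
observed = [("the", 22_000_000, -500), ("house", 120_000, 300),
            ("quantity", 15_000, 700), ("ubiquitous", 900, 1300)]

# One illustrative feature: log frequency (rarer words tend to be harder).
xs = [math.log(freq) for _, freq, _ in observed]
ys = [measure for _, _, measure in observed]

# Closed-form simple linear regression of measure on log frequency.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def predict_word_measure(freq: float) -> float:
    """Extrapolate a word measure for a word lacking performance data."""
    return intercept + slope * math.log(freq)
```

The fitted slope is negative (higher frequency, lower difficulty), which is the direction of the relationship the brief relies on when scaling the model up to tens of thousands of unobserved words.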
by: Steve Lattanzio, Jeff Elmore, and A. Jackson Stenner, Ph.D.
MetaMetrics has historically provided measures of text complexity for books, articles, and other texts using the Lexile Framework for Reading. A text is made up of many words, which raises the question: can a single word have a Lexile measure? Quantifying word difficulty on the Lexile scale would provide a variety of educational benefits. Example utilities could include identifying challenging words in a text and more precise word selection for both human- and machine-generated assessment items. This research brief describes recent efforts to find empirical measures of the complexity (difficulty) of words.
by: Jeff Elmore, August 2016
This report describes the development of a new type of frequency measure for words that better reflects the developmental nature of word exposure. The relationship between word frequency and word knowledge has been well documented (Brysbaert, Buchmeier, Conrad, Jacobs, Bölte, & Böhl, 2011; Rudell, 1993). Indeed, word frequency is the operational measure of semantic difficulty in the equation powering the Lexile Analyzer (Stenner, Horabin, Smith, & Smith, 1988; Stenner, Burdick, Sanford, & Burdick, 2007). However, the underlying theoretical explanation of why word frequency predicts word knowledge is exposure. Readers are exposed more often to more frequent words, and thus have greater knowledge of them (Klare, 1963). The connection between word frequency and word knowledge therefore is more meaningful if the word frequencies more accurately reflect the degree of exposure to a word for the average developing reader.
by: Gary L. Williamson, Ph.D., Eleanor Sanford-Moore, Ph.D., Lisa Bickel, July 2016
The objective of this research is to answer the question, “What mathematics must a student be capable of performing to be ready for college or a career?” To address this question, we analyzed the mathematical concepts and skills that students may encounter as they begin their postsecondary education and/or enter the workplace. The answer is predicated on two perspectives: (a) mathematical readiness for college implies being ready for instruction in the advanced mathematics courses associated with the beginning of the postsecondary educational experience; and (b) readiness for the mathematical demands of careers implies, at a minimum, sufficient mathematical ability to perform well on the mathematics content required for a high school diploma. Accordingly, we analyzed the difficulty of the mathematical skills and concepts incorporated into lessons found in mathematics textbooks commonly used in the United States. The Quantile® Framework for Mathematics provides the measurement foundation to place both student mathematics ability and the difficulty of mathematical skills and concepts on a common scale. Thus, we infer requisite student ability from the observed difficulty of the skills and concepts contained in textbook lessons. We regard mathematics ability as an individual, malleable attribute that improves with instruction and practice.
by: Jeff Elmore, Jill Fitzgerald, Ph.D., Michael Graves, Ph.D., Kimberly Bowen, Ph.D., October 2015
As part of a multi-phase research endeavor to identify the words that U.S. students are likely to encounter in grades K–12, the vocabulary contained in current, best-selling elementary grades (first through fifth grade) disciplinary textbook series (social studies, science, and mathematics) was analyzed. Word and morphological word family frequencies were tallied. Lists of words and word families by grade level and series were created and are available from the first author on request to researchers, curriculum developers, policy makers, and publishers to improve vocabulary curricula and instruction. Based on initial (Version 1) analyses, the present brief summarizes the word and family frequencies by discipline and series and elaborates selected points about elementary grade children’s exposure to words and word families in disciplinary textbooks.
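The core tallying step behind such frequency lists can be sketched briefly. Note the caveat in the comments: the study used real morphological analysis to group word families, whereas the suffix-stripping `family` function and the sample sentence here are simplified stand-ins.

```python
from collections import Counter

def family(word: str) -> str:
    """Toy stand-in for morphological family assignment: strip a common
    inflectional suffix. The actual study used real morphological analysis,
    not this simplification."""
    w = word.lower()
    for suffix in ("ing", "ed", "es", "s"):
        if w.endswith(suffix) and len(w) - len(suffix) >= 3:
            return w[: -len(suffix)]
    return w

# Invented sample text standing in for textbook content.
text = "Plants grow. A plant grows when planted in soil. Growing plants need soil."
tokens = [t.strip(".,").lower() for t in text.split()]

word_freq = Counter(tokens)                    # per-word tallies
family_freq = Counter(family(t) for t in tokens)  # per-family tallies
```

Tallying at both the word and the family level, as the brief describes, lets curriculum developers see that e.g. "plant", "plants", and "planted" together give far more exposure to the family than any single form suggests.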
by: Steve Lattanzio, August 2015
In the past, high correlations between empirical and theoretical Lexile® measures (r = 0.963) for articles from EdSphere®, a personalized learning platform, have been observed (Swartz et al., 2015). Research has been conducted to examine different methods for obtaining the empirical measures and to ensure the observed correlations between empirical and theoretical Lexile measures are accurate. The “ensemble calibration” (EC) method (Swartz et al., 2013) used theoretical values for article difficulties to arrive at an initial estimate of person abilities (using a Bayesian procedure [Swartz et al., 2013]) and then estimated empirical values for article difficulty, iterating until convergence, which occurred in a relatively small number of iterations. While the empirical measures obtained this way are thought to be valid, the final estimates were not derived independently of the theoretical values against which they were then compared. The new method, dubbed the Simple method, determines the articles' empirical Lexile measures independently of their theoretical values in a simple, unbiased, transparent, and robust manner.
by: Jeff Elmore, Kim Bowen, Ph.D., Michael Graves, Ph.D., Jill Fitzgerald, Ph.D., August 2015
As the first part of a comprehensive research endeavor to identify the words that U.S. students are likely to encounter in grades K-12, this study involved analyzing four current, best-selling, elementary core reading programs. From this analysis, frequency lists for words and word families appearing in these programs were developed. These frequency lists are available for researchers, publishers, and educators.
by: Hal Burdick, Sean T. Hanlon, Ph.D., Carl W. Swartz, Ph.D., Donald S. Burdick, Ph.D., and A. Jackson Stenner, Ph.D.
The idea behind the Lexile® Framework for Reading is simple: if we know how well a student can read and how hard a specific text is to comprehend, then we can predict how well that student will likely understand the text. When a student reads a book with a Lexile text measure that matches his or her Lexile reader measure, he or she will comprehend about 75% of the text. This 75% comprehension rate is called the “targeted reading” rate: the point at which a reader comprehends enough to understand the text but still encounters some reading challenges. This rate is based on independent reading; if the reader receives help or “scaffolding” (e.g., picture support, font size, a glossary of terms), the comprehension rate will likely increase. This research brief explores how scaffolding in the form of audio support affects the reading comprehension rate. It also examines whether the type of audio support (text-to-speech engines vs. human readers) influences reading comprehension.
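The prediction described above follows from the Rasch model underlying the framework: forecast comprehension depends on the difference between reader measure and text measure. The sketch below uses the published form of the Lexile comprehension forecast; treat the constants (a 1.1-logit offset at a reader-text match, and 225L per logit) as illustrative rather than definitive.

```python
import math

def forecast_comprehension(reader_lexile: float, text_lexile: float) -> float:
    """Forecast comprehension rate from the reader-text difference.
    Rasch-style logistic: at a match the logit is 1.1, giving ~75%;
    each 225L of difference shifts the logit by one."""
    logit = 1.1 + (reader_lexile - text_lexile) / 225
    return math.exp(logit) / (1 + math.exp(logit))

matched = forecast_comprehension(800, 800)    # ~0.75: targeted reading
stretch = forecast_comprehension(800, 1050)   # text 250L above the reader: harder
```

Under these constants, a text about 250L above the reader forecasts roughly 50% comprehension and one 250L below forecasts roughly 90%, which is why matching reader and text measures defines the 75% targeted-reading point.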
by: Gary L. Williamson, Ph.D., Todd Sandvik, Jackson Stenner V, Allen Johnson, April 2015
This research quantifies the complexity of textbooks commonly used in universities in the United Kingdom (UK) and compares the text complexity of UK university texts with texts used in the United States (US) by postsecondary educational institutions (universities, community colleges and technical colleges). Seventy texts from 10 UK universities were measured using the Lexile Analyzer. The results show that university texts used in the UK and those used in the US have similar distributions of text complexity.
by: Eleanor E. Sanford-Moore, Ph.D., Robert F. Baker, Ph.D., Allen Johnson, February 2015
The growth of English as a lingua franca has focused attention on the applicability of The Lexile Framework for Reading to the English as a Foreign Language population. This study sought to answer two questions: first, do Lexile reading items function similarly for English as a Foreign Language (EFL) and English as a Native Language (ENL) groups; and, by extension, can the Lexile Framework be used similarly with both ENL and EFL populations? Data were collected from four US-based and three internationally based large-scale assessment linking studies with a total of 210,475 examinees. Differential Item Functioning (DIF) analysis was used to examine the relationship between the score on an item and group membership while controlling for ability. Moderate to large DIF was identified, and the direction of any differences was examined to determine whether items were easier or more difficult for EFL readers.
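One standard way to run such a DIF analysis while controlling for ability is the Mantel-Haenszel common odds ratio, computed over ability strata. The brief does not say which DIF statistic the study used, so the method and all stratum counts below are illustrative assumptions.

```python
# Mantel-Haenszel common odds ratio across ability strata.
# Each stratum: (reference correct, reference wrong, focal correct, focal wrong);
# here "reference" could be ENL and "focal" EFL. Counts are invented.
strata = [
    (40, 60, 30, 70),   # low-ability stratum
    (70, 30, 55, 45),   # middle-ability stratum
    (90, 10, 80, 20),   # high-ability stratum
]

num = den = 0.0
for a, b, c, d in strata:
    n = a + b + c + d
    num += a * d / n    # reference-correct x focal-wrong
    den += b * c / n    # reference-wrong x focal-correct

alpha_mh = num / den    # > 1: item favors the reference group at matched ability
```

Stratifying by ability first is what separates genuine DIF (a group difference among equally able examinees) from a simple difference in average ability between the groups.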
by: Eleanor E. Sanford-Moore, Ph.D., Gary L. Williamson, Ph.D., Lisa Bickel, Heather Koons, Ph.D., Robert F. Baker, Ph.D., Ruth Price, November 12, 2014
This research quantifies the difficulty of mathematics lessons drawn from mathematics textbooks commonly used in the United States. It also documents the mathematical complexity of textbook lessons within and across grades. Lessons were extracted from selected textbooks used in grades K-12 in the United States and analyzed. Textbooks aligned with the Common Core State Standards for Mathematics (CCSSM) and those not aligned with CCSSM were measured using the Quantile Framework for Mathematics, and each lesson was assigned a Quantile measure to represent its mathematical difficulty. The results of this research show that the median mathematical difficulty of textbook lessons consistently increases with grade, and that within grades, lessons vary in their mathematical complexity.
by: Gary L. Williamson, Ph.D., Robert F. Baker, Ph.D., December 27, 2013
Text complexity associated with college and career preparedness has become an issue of national interest (NGA & CCSSO, 2010). Previous work (e.g., Daggett, 2003; Stenner, Sanford-Moore & Williamson, 2012; Williamson, 2008) has examined workplace texts globally or within the United States Department of Education (USED) career clusters but has not reported on texts associated with specific individual occupations. This study provides a description of the text complexity of reading materials that are considered important for accessing specific individual occupations.
by: Gary L. Williamson, Ph.D., Juee Tendulkar, Sean T. Hanlon, Carl W. Swartz, Ph.D., November 20, 2012
Educators are working aggressively to implement the Common Core State Standards (CCSS). Student use of technology is one potential key to meeting the higher reading standards proposed by the CCSS (National Education Technology Plan, 2010). Well-designed technology includes components of deliberate practice, and students benefit from these components when their day-to-day and year-to-year performance is placed on an equal-interval developmental scale. EdSphere, formerly known as Learning Oasis (Hanlon, Swartz, Stenner, Burdick, & Burdick, 2012), is a web-based application that leverages The Lexile® Framework for Reading and The Lexile® Framework for Writing to provide students with activities targeted to their abilities and to topics being taught in the classroom. The objective of this research was to ascertain whether student growth in reading in response to exposure to Learning Oasis could be determined from an external progress-monitoring measure.
Bending the Text Complexity Curve to Close the Gap (190KB, PDF)
by: Eleanor E. Sanford-Moore, Ph.D., and Gary L. Williamson, Ph.D., October 1, 2012
Prior research has identified a gap between the reading demands of high school and those of the postsecondary world, which suggests that students' exposure to complex texts should increase during the K-12 years. What should an “aspirational” or “stretch” text complexity trajectory look like?
This research bulletin presents a “stretch” text complexity trajectory that aligns with postsecondary text demands.
by: Carl W. Swartz, Ph.D., A. Jackson Stenner, Ph.D., Sean T. Hanlon, Hal Burdick, Donald S. Burdick, Ph.D., Kurt W. Kuehne, October 1, 2012
Developing expertise in any field requires immersing people in activities targeted to their abilities, with opportunities for feedback and independent practice over long periods of time. Applying these principles in the classroom, so that each student has an opportunity to develop expertise in literacy, will require technology that supports the teacher. EdSphere, formerly known as Learning Oasis, is one such technology, and ongoing research can help validate its potential to meet these goals.
by: A. Jackson Stenner, Ph.D., Eleanor E. Sanford-Moore, Ph.D., and Gary L. Williamson, Ph.D., MetaMetrics, October 1, 2012
How can we quantify college and career readiness? The Lexile Framework for Reading informs this question by measuring reading materials sampled from various postsecondary text collections, quantifying the associated text complexity, and then statistically summarizing the resulting distribution of readability measures. This approach allows us to provide a single text complexity target, situated in a band of “typical” text complexity requirements that are characteristic of postsecondary reading experiences.
The Text Complexity Continuum in Grades 1-12 (231KB, PDF)
by: Gary L. Williamson, Ph.D., Heather Koons, Ph.D., Todd Sandvik, and Eleanor E. Sanford-Moore, Ph.D., MetaMetrics, October 1, 2012
How difficult are the texts commonly used in the public schools? How does text complexity exposure increase with grade level? This brief summarizes research that quantifies grade-level text complexity across grades 1-12. This effectively documents a systematic continuum of text complexity exposure for reading education.
by: Carl W. Swartz, Ph.D., Sean T. Hanlon, A. Jackson Stenner, Ph.D., Hal Burdick, Donald S. Burdick, Ph.D., Colin Emerson, September 9, 2011
Emerging research from an array of fields suggests that experts are not born but rather develop expertise by engaging in deliberate practice over a long period of time. This deliberate practice must be targeted, intensive, distributed and self-directed, and provide real-time feedback using an objective developmental scale to measure progress. Providing more opportunities for deliberate literacy practice by increasing the time each student devotes to individualized, targeted reading and writing activities may overwhelm educators who teach in already-busy classrooms. Yet, with the Common Core State Standards for English Language Arts and writing, educators will have to focus more attention on growing students’ literacy skills. Given this increased demand on teacher time, how can students spend the time necessary to develop as readers and writers? And what instructional strategies or technology-based solutions can educators use to guide all students onto reading growth trajectories that will result in college and career readiness?
by: Carl W. Swartz, Ph.D., Sean T. Hanlon, Ph.D., A. Jackson Stenner, Ph.D., Hal Burdick, and Donald S. Burdick, Ph.D.
Learning Science and Technology, MetaMetrics
English is the unofficial technical and business language of the world. Estimates suggest that more than 1 billion people worldwide use English with varying degrees of understanding and expression. A common second language like English enables the internet to function as a digital passport, allowing those whose first language might be Russian, Arabic, Cantonese, French, Spanish, or Hindi to cross international borders and share understanding of local, national, and international events and cultures. The purpose of this study was to investigate the text complexity of online English-language newspapers sampled from around the world. The results suggest that the text complexity of online English newspapers is commensurate with the complexity of text encountered by readers in two- and four-year universities and colleges and in the workplace, and is slightly higher than the text demands of domestic newspapers. Text at this high level may prove a barrier to understanding across borders and cultures, but it also sets an implicit aspirational goal for those who desire to be educated or to work in the United States. Our goal is not to advocate lowering the text complexity of online English newspapers, but to enhance the reading ability of all English language learners who desire to access the information and knowledge contained in college and career texts.