Introduction
The 2011 Race to the Top Early Learning Challenge Grant was instrumental in introducing states to the concept of systematically collecting and using data from formative assessments of children at kindergarten entry.1 By the 2018-2019 school year, 35 states (D.C. is included as a “state”) required public schools to assess children’s learning and development within a few months of kindergarten entry, and two additional states had an optional kindergarten entry assessment (KEA) in place.2 The fact that so many states are implementing a KEA raises the question, “What exactly is a KEA?” We can define KEAs by looking at the composition of the assessments, but this approach leaves out one critical component: the intended purposes of KEAs.
Successful assessment systems have a clearly defined purpose3: to obtain information about children, teachers, and/or programs and then to examine that information at a specifically defined level (child, classroom, site, district, state, etc.). States then determine the extent to which the resulting data are used to make high-stakes decisions about children, teachers, and/or programs, but the appropriateness of using assessment data to drive specific decisions depends on the technical properties of the assessment. Without a clear purpose, careful instrument selection, and strong implementation, assessments may produce results that lack validity.
States articulate several different intended purposes for their KEAs. A review of state legislation, guidance, and websites reveals three overall patterns (see Table 1). Many states focus on the KEA as a tool to support teachers’ understanding of the skills children bring to the classroom at the start of the kindergarten year.4 Resulting KEA data may be expected to provide teachers with a general picture of their classroom, or with actionable information to develop individualized educational plans for each child.5 There is also evidence of states using KEA data to keep families informed about and engaged in their children’s progress at school. Finally, states are using KEA data to inform state-level decisions. The data may be used to examine the strengths and weaknesses of the programs children attended prior to entering kindergarten,6 to identify trends in kindergarten readiness over time,7 or to predict later school success.8 This brief explores evidence of the specific ways states are currently using KEA data in an attempt to bring greater clarity to the “why” of conducting these assessments.
KEA Data Used to Inform Instruction
Almost every state with a KEA (35 of 37, or 95%) indicates that one purpose of the assessment is to provide data to inform instruction. Most of these 35 states make only relatively general references to the use of KEA data to guide teaching. For example, North Carolina’s Department of Public Instruction states that the purpose of the KEA “is to capture the development of each child at kindergarten entry to inform instruction and education planning.”9
Only five states appear to provide more specificity about how teachers must use the data to drive instruction, in many cases focusing on reading skills and/or children in need of intervention:
- Arkansas kindergarten teachers are encouraged to use assessment results to inform future instruction and identify students in need of “remediation, intervention and/or enrichment”.10
- Colorado schools are required to use KEA results to develop a school readiness plan for each individual child.11
- Idaho’s KEA results are examined at the school level to determine whether interventions are necessary to maintain or improve children’s reading skills.12
- In Iowa, remediation must be provided to children who exhibit a reading deficiency.13
- Nevada statute requires that school districts “provide intervention services and intensive instruction to pupils who have been identified as deficient in the subject area of reading to ensure that those pupils achieve adequate proficiency in the requisite reading skills and reading comprehension skills.”14
Despite this overwhelming call to use KEA data to improve instruction, few states provide specific guidance on how to implement the goal. In fact, some research indicates that teachers find KEAs overly burdensome and lacking the content needed to successfully inform teaching.15 Despite a communicated purpose, these states may be falling short in executing the overall objective.
KEA Data Used to Inform Families
Fewer states with a KEA (23 of 37, or 62%) reference using KEA data to inform families of their child’s progress, and the depth of information provided to families varies widely, as does whether information is provided to all families. States’ approaches to using KEA data with families fall into six distinct categories:
- Resources provided to help families understand results and/or home-based activities families can use to support learning and development: Georgia, Maryland, Mississippi, Ohio, Texas, and Virginia
- Results shared with all families: Arkansas, Louisiana, Nevada, Kentucky, Pennsylvania, Washington
- Results shared only with families of children at risk: Iowa
- Determined locally whether and how data are shared with families: Illinois, Michigan, New Mexico, Tennessee
- Families included in instructional planning: Colorado, Kansas
- Statement about sharing data with families: DC, Florida, South Carolina, Utah
Specific examples from three states are useful in further drawing out the differences in the ways in which KEA data are used with families. A 2017 memo from the Nevada Department of Education provides LEAs with guidance for sharing screening information with families.16 LEAs are directed to first provide information to help families understand the screening before providing a child’s screening results.17 Once families receive this information, LEAs are encouraged to develop progress monitoring plans and to use communications with families “as an opportunity to provide families additional information and referrals to services when applicable.”18
In its Kindergarten Readiness Assessment Implementation Plan, the Mississippi Department of Education describes parent reports that LEAs can customize to provide KEA data to parents.19 Each report includes a list of suggested reading exercises, associated with the child’s scaled assessment score, for parents to use with their children. Taking a different approach, Iowa requires that KEA results be shared with families, but only for children identified as being at risk of not becoming proficient readers.20
Some states appear to take seriously the importance of including families in the KEA process, not just by going through the motions of sharing assessment results, but also by taking strides to make families active participants in their children’s educational experience. In these states, the purpose of the KEA clearly extends beyond informing teachers to informing families as well.
KEA Data Used to Inform State Policy
At least 9 of 37 states (24%) are using data from KEAs at the state level for a specifically identified purpose. In some cases, states are using the data to evaluate the performance of state-funded preschool programs, to identify opportunity gaps among kindergarten-aged children, or to target additional supports aimed at increasing program quality. The primary ways in which KEA data are being used at the state level can be organized as follows:
- Data used as an indicator of state preschool program impacts: Alabama, Michigan, Mississippi, Oregon, Utah
- Data used to measure progress towards state goals: Idaho, Washington
- Data used to identify schools in need of improvement/targeted assistance/support: Alabama, California, Delaware, Florida, Kansas, Pennsylvania, South Carolina, Oregon
Oregon is one example of a state where KEA data are used both as a measure of the efficacy of state-funded preschool programs and as a means of identifying preparedness gaps among groups of children throughout the state that may highlight equity issues across other aspects of the early care and education system. The Oregon Department of Education refers to the Oregon Kindergarten Assessment as a “consistent, statewide tool for identifying systemic opportunity gaps, determining Early Learning resource allocation to best support students in need, and measure improvement over time.”21 Annual assessment results are posted on the Department’s website, along with interpretive guidance.22
Florida stands out as a state that uses KEA data to potentially sanction poorly performing programs. With data from the state’s KEA, Florida publishes an annual evaluation of the state’s Voluntary Pre-kindergarten Program (VPK), including overall kindergarten “readiness rates” for each VPK provider.23 Readiness rates are calculated based on the number of children from each provider who meet readiness standards, and VPK providers with low readiness rates are then targeted for training and technical assistance to improve developmentally appropriate practices for preschool-aged children.24 Programs that do not improve are not permitted to continue as VPK providers.
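At its core, a readiness rate of this kind is a simple proportion of assessed children who meet a readiness standard. The following is a minimal sketch, not a reproduction of Florida’s actual scoring rules; the function name, counts, and 70 percent cut point are invented for illustration only.

```python
# Hypothetical sketch of a provider-level "readiness rate" calculation.
# All numbers and the 70% cut point below are illustrative assumptions,
# not Florida's actual thresholds or methodology.

def readiness_rate(num_ready: int, num_assessed: int) -> float:
    """Return the percent of a provider's assessed children meeting readiness standards."""
    if num_assessed == 0:
        return 0.0
    return 100.0 * num_ready / num_assessed

# Example: a hypothetical provider where 18 of 24 assessed children met the standard.
rate = readiness_rate(18, 24)              # 75.0
flagged_for_support = rate < 70.0          # illustrative cut point only
print(f"Readiness rate: {rate:.0f}%; flagged for support: {flagged_for_support}")
```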
For some states, it is more challenging to find specific evidence of how KEA data are used at the state level to drive decision-making. Twelve states (32%) either post KEA data in an online report or reference the collection of KEA data from individual programs, but do not necessarily provide examples of how data are used:
- Regular (usually annual) report published with KEA results at the state, district, and/or school level: Alaska, Colorado, Connecticut, Idaho, Iowa, Kentucky, Maryland, Ohio, Vermont, Virginia
- Programs required to submit data to state: Indiana (special education only), Louisiana
Despite some interesting and innovative uses of KEA data at the state level, many states appear to be collecting, but not necessarily using, KEA data. For these states, the “why” of conducting KEAs becomes less concrete, and some opportunities are potentially lost.
How Can States Use KEA Data to Respond to COVID-19?
As LEAs across the country make plans to reopen schools, KEAs may be able to serve an additional, critical purpose: to measure the extent of potential learning losses incurred as a result of school closures at the end of the 2019-2020 school year. How have school closures affected the incoming class of kindergarteners, and what can LEAs do to address learning deficits? Assessments conducted during the first few weeks after schools reopen will provide classroom teachers, LEAs, and state agencies with invaluable information about children’s school readiness after losing up to three months of learning during their four-year-old preschool year. It is not safe to assume that kindergarten teachers can begin the year as they would under normal circumstances. Kindergarten entry assessments are needed to gauge where children are starting from, so teachers understand how to begin instruction for the year.
That said, states with remote or partially remote fall semesters will need to find more creative ways to conduct KEAs, especially those that rely entirely on teachers’ classroom observations. States are currently considering combinations of options such as training teachers to make observations via remote educational platforms, training and engaging families in the process of recording observations, reducing the number of items observed in the KEA, and inviting families to bring their children to special testing centers where assessments can be conducted while following necessary health and safety protocols. While these changes are necessary within the context of assessments during the COVID-19 pandemic, they have important implications. Modifications to how KEAs are conducted will likely affect states’ ability to compare KEA data from the 2020-21 school year with data from prior, and likely future, school years.
Conclusions
KEAs have the potential to provide policymakers, state agencies, LEAs, teachers, and families with rich data to inform teaching practices, program quality, and necessary interventions. Across the nation, there are strong examples of states using KEA data to address each of these areas. However, not all states use KEA data to their fullest, sometimes missing opportunities to engage families in their children’s learning and development, and sometimes missing opportunities to examine data at the state level to drive overall program improvements.
But before a state can effectively use KEA data, the technical properties of the assessment need to match its purpose. While less precise instruments may be used to screen children, inform instruction, and examine data in the aggregate, multiple precise instruments are necessary for high-stakes decision making and causal inferences. The way a KEA is administered also affects how the data can be used (e.g., assessments conducted by teachers should not be used to make high-stakes decisions about those teachers). States must keep these limitations in mind when choosing KEA instruments and determining appropriate uses for resulting data.
Finally, as the 2020-2021 school year approaches, states have a new opportunity to use KEA data to craft an informed response to potential learning deficits resulting from COVID-19 school closures. However, necessary changes in the way KEAs are safely conducted during the 2020-2021 school year will also affect the way these data can be viewed and understood within the context of KEA data from other school years.
Table 1 – Use of Kindergarten Entry Assessment Data by State (PDF)
Footnotes
1. Hanover Research (December 2013). Kindergarten entry assessments: Practices and policies. https://www.hanoverresearch.com/media/Kindergarten-Entry-Assessments-Practices-and-Policies.pdf
2. Weisenfeld, G. G., Garver, K., & Hodges, K. (2020). Federal and state efforts in the implementation of kindergarten entry assessments (2011-2018). Early Education and Development. DOI: 10.1080/10409289.2020.172048.
3. Hanover Research (December 2013). Kindergarten entry assessments: Practices and policies. https://www.hanoverresearch.com/media/Kindergarten-Entry-Assessments-Practices-and-Policies.pdf
4. Little, M., Cohen-Vogel, L., Sadler, J., & Merrill, B. (2020). Moving kindergarten entry assessments from policy to practice: Evidence from North Carolina. Early Education and Development. DOI: https://doi.org/10.1080/10409289.2020.1724600
5. Daily, S. & Maxwell, K. (2018). Frequently asked questions about kindergarten entry assessments. Washington, DC: Alliance for Early Success and Child Trends. https://www.childtrends.org/wp-content/uploads/2018/11/FAQKEA_ChildTrends_November2018.pdf
6. Daily, S. & Maxwell, K. (2018). Frequently asked questions about kindergarten entry assessments. Washington, DC: Alliance for Early Success and Child Trends. https://www.childtrends.org/wp-content/uploads/2018/11/FAQKEA_ChildTrends_November2018.pdf
7. Ibid.
8. Harvey, H., & Ohle, K. (2018). What’s the purpose? Educators’ perceptions and use of a state-mandated kindergarten entry assessment. Education Policy Analysis Archives, 26(142).
9. North Carolina Public Schools: Office of Early Learning. (n.d.). Kindergarten entry assessment. Retrieved from http://www.ncpublicschools.org/earlylearning/kea/
10. Arkansas Department of Education. (n.d.). K-2 Assessment. Retrieved from http://www.arkansased.gov/divisions/learning-services/assessment/k-2-assessment
11. Colorado Department of Education. (n.d.). Assessment choices and school readiness plans. Retrieved from https://www.cde.state.co.us/schoolreadiness/assessment
12. Idaho State Department of Education. (n.d.). Idaho Reading Indicator. Retrieved from http://www.sde.idaho.gov/assessment/iri/
13. IA Code § 279.68 (through 2013). Retrieved from https://law.justia.com/codes/iowa/2013/titlevii/subtitle6/chapter279/279-68/
14. Nev. Rev. Stat. § 388.157. Retrieved from https://law.justia.com/codes/nevada/2019/chapter-388/statute-388-157/
15. Schachter, R. E., Flynn, E. E., Napoli, A. R. & Piasta, S. B. (2020). Teachers’ perspectives on year two implementation of a kindergarten readiness assessment. Early Education and Development. DOI: 10.1080/10409289.2020.172048
16. Canavero, S. (2017, November 17). Kindergarten entry assessment: Brigance Screen I. II [Memorandum]. Carson City, NV: State of Nevada Department of Education. Retrieved on March 23, 2019 from http://www.doe.nv.gov/uploadedFiles/ndedoenvgov/content/News__Media/Guidance_Memos/2017/FY18GuidanceMemo17-31_KEAGM.pdf
17. Ibid.
18. Ibid.
19. Mississippi Department of Education. (2014). Kindergarten readiness assessment: Implementation plan [PDF document]. Retrieved on March 24, 2019 from https://www.mdek12.org/sites/default/files/Offices/MDE/OA/OSA/Kindergarten-Readiness-Assessment-Implementation-Plan-2014.07.01_5.pdf
20. Iowa Department of Education. (n.d.) Early literacy implementation (ELI). Retrieved on March 28, 2019 from https://educateiowa.gov/early-literacy-implementation
21. Oregon Department of Education. (n.d.a.). Kindergarten assessment. Retrieved on March 24, 2019 from https://www.oregon.gov/ode/educator-resources/assessment/Pages/Kindergarten-Assessment.aspx
22. Ibid.
23. Office of Early Learning. (n.d.). Office of Early Learning annual report 2017-2018: Setting the standards [PDF document]. Retrieved from http://www.floridaearlylearning.com/Content/Uploads/floridaearlylearning.com/files/2017-2018%20Annual%20Report_ADA.pdf
24. Ibid.