The Once-in-a-Generation Opportunity: What States and Districts Can Do Now to Learn From American Rescue Plan ESSER Interventions

Heather Boughton
Ohio Department of Education

Jessica de Barros
American Institutes for Research/CALDER

Dan Goldhaber
American Institutes for Research/CALDER
University of Washington/CEDR

Sydney Payne
American Institutes for Research/CALDER

Nathaniel Schwartz
Brown University/Annenberg Institute for School Reform

CALDER Policy Brief No. 27-0921

The Need and Opportunity

We have today a once-in-a-generation moment of unprecedented need, unprecedented support, and unprecedented opportunity. The need is obvious: The COVID-19 pandemic has disrupted schools across the country, negatively impacting student learning, especially for students of color and students experiencing poverty. While there is some debate about the appropriateness of the term “learning loss,” recently released reports on standardized test achievement make clear that students are behind their pre-pandemic peers in math and reading (Kuhfeld et al., 2020; Kuhfeld et al., 2021; Dorn et al., 2021).[1] These students need our support; absent successful COVID catch-up activities, they are likely to face future obstacles to success in college and the workforce.

Enter Elementary and Secondary School Emergency Relief (ESSER) funds, the unprecedented support. Over the course of one year – from March 2020 to March 2021 – the United States Congress enacted three laws providing $190 billion in federal education funding to states and school districts. The largest of these, the American Rescue Plan (ARP), provides $122 billion to states, $110 billion of which must go to districts. Of that $110 billion, 20 percent (or $22 billion) is specifically designated for addressing “learning loss” due to pandemic-related schooling impacts, using “evidence-based interventions” (U.S. Department of Education, 2021).

The opportunity is to learn from the unprecedented federal investment. It appears that school districts are planning to spend ARP ESSER funds[2] in a wide array of ways, including additional instructional time, tutoring, expanded learning opportunities, social-emotional supports, technology, facilities, and teacher supports (Dusseault & Pillow, 2021). Many ARP ESSER investments seem conceptually sound, but here’s the problem: a long history of education research shows that often, conceptually sound ways of investing in youth do not pan out as intended. However, if districts and states collect the right information, we can learn what is working and what isn’t. This information could in turn inform decision-making over the short term, providing leaders with the ability to make mid-course corrections while the window for ESSER spending is still open, and providing the sector with valuable knowledge that would inform educational investments over the long term.

In this brief, we review aspects of the ARP and argue that the Act strongly implies, or even requires, that districts invest smartly. Smart investments, in turn, require some data collection, not only so we know how ARP funds were spent but also so we can learn about their efficacy. Unfortunately, we also issue a warning: our analysis of state ARP ESSER plans shows ambiguity about what districts will be required to do, which could lead to inconsistent views among districts about what is required and inconsistent approaches to monitoring impact.

Goals of the ARP and How We Will Know If They Are Met

The goals of ARP educational investments, which must be obligated by September 2024 and spent by January 2025, are to “help safely reopen and sustain the safe operation of schools and address the impact of the coronavirus pandemic on the Nation’s students.” Of the 20 percent of district funds designated to address learning loss, districts must focus on the “disproportionate impact of COVID-19 on underrepresented student subgroups” (U.S. Department of Education, 2021).[3]

This idea that recovery funds must focus directly on the students who have faced the greatest challenges during the pandemic is one of the ARP’s key tenets. Addressing the academic needs of those students most impacted by the pandemic will not be easy, but there is strong evidence that programs providing small-group, supplemental support for students can make a significant difference for students who have fallen behind their peers.[4] And states and districts have no shortage of resources to turn to for ideas on programmatic approaches, from summer learning, to extended day programs, attendance incentives, tutoring, and social-emotional supports.[5]

At the same time, a host of previous reforms, ranging from the supplemental services provision in the No Child Left Behind Act to the strategies built into Response to Intervention, have aimed to provide targeted academic supports. Unfortunately, many of the interventions showed mixed and often disappointing results (Reback et al., 2009; Heinrich et al., 2010; Balu et al., 2015).  

The success of the $22 billion ARP ESSER investment dedicated to learning loss depends a great deal on how well the ARP’s individual support provisions are implemented at the district level. But with nearly 14,000 districts making individual choices around this work, we as a nation must harness this effort to learn faster about which approaches are most successful.

What Do State Plans Show About Supporting Individual Students and Measuring Impact?

State ARP ESSER plans were due to the U.S. Department of Education on June 9, 2021. As of our analysis (August 2021), 44 states had submitted plans, 28 of which had been federally approved (OESE, 2021). In our analysis of the plans, we looked for evidence that states will implement individualized student interventions to target students needing additional supports to get back on track, and evidence that states will monitor and collect data on ESSER-funded interventions to measure their impact.

It should be noted that state ARP ESSER plans – essentially compliance documents – are one data point that does not represent all efforts across each state. It is possible states and districts have more robust plans for individualized interventions and progress monitoring. For this analysis, we used state ARP ESSER plans because they are a common data point available at this time across states.

Supporting Individual Students

To assess the extent to which states are requiring interventions that target individual students based on their needs, we reviewed the plans and searched for terms such as: “target/targeting/targeted”, “prioritize”, “individual”, “subgroup”, “identify/identified”, and “student level”. Based on these criteria, we found that all plans mention targeting districts and/or student groups most impacted by the pandemic, but stating that groups are targeted for recovery does not make it clear that there is a plan to focus on individual student needs. In particular, when we take a parent/family perspective and ask, “Would you know if your own child would have access to a specific COVID recovery program?”, the answer is less clear from state plans. Only about half detail how students will be identified for interventions, their learning goals, and/or how interventions will be matched to student learning goals.
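The screening described above is, at its core, a keyword scan across plan documents. A minimal sketch of that approach might look like the following (the regular-expression stems and the sample excerpt are illustrative assumptions, not the authors' actual analysis code):

```python
import re

# Stems approximating the brief's targeting terms: "target/targeting/targeted",
# "prioritize", "individual", "subgroup", "identify/identified", "student level".
TARGETING_TERMS = [
    r"target(?:ing|ed)?",
    r"prioritiz\w*",
    r"individual\w*",
    r"subgroup",
    r"identif(?:y|ied)",
    r"student[- ]level",
]
PATTERN = re.compile("|".join(TARGETING_TERMS), re.IGNORECASE)

def flag_plan(plan_text: str) -> list[str]:
    """Return the distinct targeting terms found in a plan's text."""
    return sorted({m.group(0).lower() for m in PATTERN.finditer(plan_text)})

# Hypothetical excerpt in the style of the state plans reviewed above.
excerpt = ("Students will be identified for math tutoring and "
           "prioritized based on student-level assessment scores.")
print(flag_plan(excerpt))  # → ['identified', 'prioritized', 'student-level']
```

As the brief notes, a hit on these terms shows only that a plan *mentions* targeting; judging whether individual students would actually be matched to interventions still required reading the surrounding text.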

Examples of state plans that show evidence of plans to implement individualized student interventions include:

New Mexico – Students struggling with disengagement and/or chronic absenteeism will be locally identified and provided with a personal academic coach/counselor.

Connecticut – Personnel will go directly to homes to engage with the students most impacted, as identified by disengagement and chronic absenteeism in the prior year.

New Hampshire – Educators will be trained to adapt to varying student needs. The state is also using $6 million of its ARP ESSER dollars to fund learning pods, an individualized instruction model that offers small-group, multi-age, trauma-sensitive instruction to students needing additional support.

Oklahoma – Students will be targeted for math tutoring based on state mathematics assessment scores.

Tennessee – Will provide high-dosage tutoring focused on students in a grade span who fall below the “Meets Standards” level on their standardized tests, and will support those students for three years.

Measuring Impact

We used a similar approach to identify how states plan to track the impact of ARP ESSER funds. Specifically, we reviewed section G., “Monitoring and Measuring Progress” of each plan and performed searches throughout the documents for terms such as “evaluation/evaluate”, “track”, “monitor”, “outcome”, “impact”, “report”, “program data”, and “participation”. In doing so, we looked for specific language that demonstrated a state will monitor both how ARP ESSER funds are spent and the impact of those funds. This includes stating a requirement that districts monitor ARP ESSER program progress and impact, or language in the plans such as, “monitor direct implementation”, “monitor use of funds and effectiveness”, or “track and evaluate efficacy of initiatives”.[6]

With these criteria, we found that most state plans include general language indicating they will collect the data required for ARP ESSER grant reporting, but just over half of states explicitly describe concrete plans for how they will collect data on the impact of ARP ESSER funded programs. Examples of plans describing both how they will track funds and their impact include:

Hawaii – Will reserve funds for a three-year study to track incoming sixth graders and “assess the impact of the strategies and interventions implemented on students’ academic, social, emotional, and behavioral performance.”

Ohio – Will collect statewide data that supports resource prioritization and collect “program data” which tracks student participation in ARP ESSER funded programs.

Tennessee – Will collect statewide information on the “interventions and activities associated with ARP ESSER-funded initiatives.” To measure the impact of high-dosage tutoring, for example, Tennessee will collect and analyze data on many factors including “data on the LEA student prioritization process, student baseline, progress, and regular interim assessments.”

More generally, states such as South Dakota will require LEAs to submit annual performance reports on the “use of ARP ESSER funds that will detail the outcomes achieved based on the uses of funding.”

We have some idea about how districts plan to spend ARP ESSER funds from a survey administered by the School Superintendents Association (AASA, 2021). While useful for providing a broad understanding of intended investments, this level of information does not allow us to learn much about the efficacy of spending choices. To determine the impact of ARP investments on individual student learning goals, there are more specific questions states and districts could ask up front to maximize learning from ARP ESSER interventions.

What States and Districts Can Do to Learn From ARP ESSER Interventions

States generally have limited ability to directly support students and families; further, states cannot require districts to spend local federal funds in specific ways. However, states do have levers they can use to increase the likelihood that districts will be equipped to individualize supports, all of which could be funded through state activity funds. These include, but are not limited to:

  • Building Capacity for Local Data Use: Local-level data is more timely and detailed than state data; yet many districts struggle with the capacity to analyze and use their data resources effectively to target individual students’ needs. States can use their funds to increase districts’ capacity to identify and address students’ individual needs by, for example, creating opportunities to enhance local data systems, provide professional development, or establish regional data supports.
  • Contributing to a Culture of Learning and Continuous Improvement: Ultimately, states and districts have an incredible responsibility to help students recover from the pandemic and achieve their full potential, and should be held accountable for this responsibility. At the same time, it is likely that some interventions and other ARP ESSER-funded activities along the way will not work. States can create opportunities that focus on the importance of continuous improvement, while equipping districts to engage in that work well.
  • Encouraging Student, Family, and Community Engagement: Fundamental to the idea of offering individualized support is student and family engagement, without which districts will have an incomplete picture of the interventions that are needed and will work. At the same time that states encourage evidence-based academic interventions and supports, they should help equip districts to foster authentic engagement opportunities that can help shape the use of their ESSER funds.
  • Peer-to-Peer Networking: Because states have cross-district data, they often have greater insight into which districts are struggling with similar challenges. States can use their data to connect districts facing similar challenges, creating opportunities for those districts to share knowledge, experiences, and resources.
  • Professional Development: States can create statewide professional development opportunities focused on specific interventions or, more generally, the use of evidence-based strategies.
  • Using Cross-Sector State-Level Data: States generally have extensive data resources that they can use to help districts understand where individual students are in their learning and identify at-risk students. States can enhance their reporting tools so that districts have access to actionable information about their students (e.g., Early Warning Systems, individual-level assessment reports tied to state curriculum standards).
  • Grant Opportunities Requiring Individualized Supports: States may use state activity funds to create grant programs that target a high-need area (e.g., chronic absenteeism, math) and require districts to implement targeted interventions as part of the grant program.

For these levers to result in learning acceleration, states must strategically apply them based on districts’ differentiated needs, and in a way that is responsive to the questions district leaders are asking (Conaway, 2021). States should approach support in the spirit of partnership rather than compliance (Henrick et al., 2017).

Three Essential Questions

In order to learn from ARP ESSER interventions and make course-corrections as we work toward COVID recovery, it is necessary to know both the kinds of interventions being utilized and who is receiving them. In particular, we argue that states should collect three types of information:

What COVID recovery interventions are districts using and what are their key features?

As noted above, districts are engaging in a number of different initiatives designed to help students academically. To state the obvious, it is necessary to know what interventions districts are using in order to evaluate whether they are working. And there is a real role for states here. Large districts are likely to have both large samples of students enrolled in interventions and the capacity to evaluate them. Smaller and rural districts are likely to need the help of states, given that the samples of students engaging in recovery initiatives in individual districts will, in many cases, be too small to reach firm conclusions about an intervention’s efficacy. Rural districts may also use different intervention approaches; for example, they may maximize acceleration during the school day due to longer school transportation routes and less reliable internet connectivity at home. Thus, there is a need to look across small and rural districts to assess findings for districts engaging in similar initiatives.
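The small-sample problem above can be made concrete with a back-of-the-envelope calculation: the precision of an estimated average effect improves roughly with the square root of the number of participating students, so pooling similar districts sharply narrows the uncertainty. The numbers below are purely illustrative assumptions (effects measured in test-score standard deviation units):

```python
import math

def se_of_mean(sd: float, n: int) -> float:
    """Standard error of a mean effect estimate from n students."""
    return sd / math.sqrt(n)

sd = 1.0
one_district = se_of_mean(sd, 40)     # one rural district's tutoring cohort
pooled = se_of_mean(sd, 40 * 25)      # 25 similar districts pooled by the state
print(round(one_district, 3), round(pooled, 3))  # → 0.158 0.032
```

With a standard error of roughly 0.16 standard deviations, a single small district could not distinguish even a fairly large tutoring effect from noise; the pooled estimate could.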

But beyond the high-level question of what initiatives districts are using, it will be important to understand some of the key features of the initiatives. Factors such as whether programs take place during or outside of school hours, whether the adults supporting students have trusted relationships with them, how adults are trained, and whether the program is conducted virtually or in-person may result in different impacts.

Which students are targeted for COVID recovery efforts?

Some initiatives, such as curricular reforms, HVAC upgrades, or teacher retention bonuses, could lead to across-the-board changes in student achievement. On the other hand, initiatives targeted to particular grades, subjects, or student groups should have differentiated impacts on the targeted areas. For example, an effort to identify and support students with high absenteeism would impact attendance rates, whereas a program to identify gaps in early literacy and mathematics would impact student performance in those content areas. We need to know which students are receiving extra support in which areas in order to know where to look for gains as a result of programs.

Which students are actually participating in and regularly attending COVID recovery initiatives?

That districts have programs designed to help students does not necessarily mean that all or even most students are getting extra help. Understanding whether students are actually participating in recovery efforts, and why, is crucial for interpreting evidence about the efficacy of initiatives. For example, an out-of-school-time tutoring program implemented in different districts serving similar student populations with similar learning needs could have significantly different effects. If a high percentage of students participate in District A and a small percentage participate in District B, the programs will have different effects on the districts’ overall performance. Moreover, we might find that for the students who participated, the program had a smaller impact in District A than in District B. In thinking about how to draw conclusions and make adjustments in the out-years, each district might reach different conclusions and pursue different course corrections – District A examining the nature of the intervention, and District B expanding recruitment.
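The District A/District B comparison comes down to simple arithmetic: a program's district-wide effect is roughly the per-participant effect weighted by the participation rate, so the same intervention can look very different depending on uptake. A sketch with purely illustrative numbers:

```python
def district_wide_gain(per_participant_effect: float,
                       participation_rate: float) -> float:
    """Approximate district-wide average gain from a targeted program."""
    return per_participant_effect * participation_rate

# District A: broad uptake, smaller per-participant effect (~0.08 SD overall).
gain_a = district_wide_gain(0.10, 0.80)
# District B: narrow uptake, larger per-participant effect (~0.04 SD overall).
gain_b = district_wide_gain(0.25, 0.15)
print(round(gain_a, 4), round(gain_b, 4))
```

Here District A moves overall achievement more despite a weaker per-participant effect, which is exactly why participation data is needed to interpret district-level results.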

All of the questions above go beyond the level of detail currently required by the U.S. Department of Education for ESSER reporting. Still, states and districts would be wise to put systems in place to answer such questions over the next four years. Planning up front how to communicate why ESSER funds are needed, how they are used, and how students are benefiting is imperative for our nation’s collective learning – and students’ future success.

A Short Window of Opportunity to Get this Right

With loose federal requirements for monitoring the student impacts of ESSER, states and districts will need to assume the lead role in ensuring ESSER funds change the trajectory of students’ lives. While some states indicate they have already collected district plans, many are still in the process of doing so, with the majority of states setting deadlines of late August.

That this is an unprecedented amount of federal funding to be spent over just four years cannot be overstated; understandably, state and district leaders were and will continue to be forced to make compromises about the amount of data that is realistic to collect and report. Leading school systems through COVID-19 is political and complex (Sawchuk, 2021). Weighing the urgent needs of today against what we will need to know in the next three to five years is a balancing act. As states consider their role in this balancing act, it may be especially important to consider the following:

  • A state’s role in supporting its districts will be enhanced if it authentically engages with districts as partners in this work. Districts can help states understand what supports are most needed, can co-design programs and resources to ensure their success, and can help states find the line that balances the need to learn with districts’ many priorities.
  • How states use the data they collect for this purpose is critically important. Data can be used proactively throughout the next five years to help ensure that the COVID recovery interventions employed actually do work. States need to create an environment in which districts can see data and evaluation as supportive tools.
  • The needs students face today may not be the same as what they face next year or the year after. States and districts should be prepared to assess needs regularly and update their recovery plans as needed; states will need to give districts the opportunity to course-correct based on what they learn.
  • The pandemic exacerbated existing needs. The resources we put in place to use data and evaluation to learn what is working for students during recovery are the same resources that were needed pre-pandemic and the same resources that will be valuable post-pandemic. States and districts should design the investments they make with ESSER funds with a long-term view of creating an educational system that is stronger and more equitable than it was before the pandemic.

Without focused intervention supporting the students most impacted, COVID will alter life outcomes for generations, widening existing disparities in educational opportunity, earnings, and wealth based on race and income (Dorn et al., 2021). The magnitudes of learning loss identified so far suggest that COVID academic recovery will necessarily be a multi-year process (especially since schooling this year, too, looks like it might be interrupted). Importantly, ESSER funding allows for this, since the federal money can be spent over four years.

If we fast-forward five years, what will be the takeaways of these monumental ESSER investments? How will what we learn shape education policy and finance discourse? Will ESSER effectively make the case for permanently increasing the federal investment in education – started in 1965 with the purpose of closing opportunity gaps – or will the funding pendulum swing, with policymakers and taxpayers dusting off their hands at a job deemed complete? More importantly, what will we learn about the effectiveness of the robust learning interventions states and districts are trying out, and how will those lessons shape future programming? We urge district leaders to keep these questions top of mind as they design, implement, and adjust learning acceleration efforts.


[1] A 2020 NWEA study of fall 2020 Measures of Academic Progress (MAP) test scores from students in grades 3-8 found students scored similarly to same-grade 2019 students in reading and lower in math. Although most students made growth on average, growth was higher for reading than math compared with growth in prior years. A 2021 NWEA study of test scores from 2.1 million Black, Indigenous, and people of color (BIPOC) students in grades 3-8 found that math achievement as measured by percentile rank was much lower in 2020 than 2019, and reading achievement was slightly lower during the same time period. While all BIPOC student groups made between 60% and 70% of the pre-pandemic gains for their groups on average, typical gains were lower for male BIPOC students and for BIPOC students attending high-poverty schools. A 2021 McKinsey study of iReady test scores from 1.6 million elementary school students found students scored 10 points lower than 2020 same-grade students in math and nine points lower in reading.

[2] The focus of this brief is on ARP ESSER, although some of the opinions apply more broadly to all three ESSER funds. When referring specifically to the ARP, the term ARP ESSER is used. When applicable to ESSER funds more generally, the term ESSER is used.

[3] Underrepresented subgroups are defined as “each major racial and ethnic group, children from low-income families, children with disabilities, English learners, gender, migrant students, students experiencing homelessness and children and youth in foster care.” (U.S. Department of Education, 2021)

[4] See (Robinson et al., 2021)

[5] See resources from the Council of Chief State School Officers Coalition to Advance Future Student Success, Council of the Great City Schools, EdResearch for Recovery, Education Resource Strategies, Education Trust, FutureEd, Turnaround for Children, and the U.S. Department of Education Return to School Roadmap.

[6] In this analysis, we did not consider states as fulfilling this requirement if the plan stated that they do not yet have the capacity for this kind of data collection, even if it stated that they were working on building this capacity.
