Washington State Institute for Public Policy

"Nudge" attendance program

Pre-K to 12 Education
  Literature review updated September 2020.

"Nudge" attendance programs are designed to reduce the school-to-parent information gap by sending routine messages to the student's parent or guardian. Typically, at least one parent/guardian receives timed messages about their students' absences, missing assignments, or class performance. The timing and frequency of the intervention vary across studies and outcomes. Schools using this intervention method most commonly inform parents/guardians when their student misses at least one day of school, one class assignment, or falls below a pre-determined performance threshold. Messages are sent weekly, bi-weekly, or monthly.

This analysis is restricted to studies that 1) are based in the United States, 2) use text messages to communicate with the parent/guardian, and 3) focus on middle and high school students. On average, students were 15 years old at the start of the intervention. Treatment lasted seven months.
 

Meta-analysis is a statistical method to combine the results from separate studies on a program, policy, or topic to estimate its effect on an outcome. WSIPP systematically evaluates all credible evaluations we can locate on each topic. The outcomes measured are the program impacts reported in the research literature (for example, impacts on crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.
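
For intuition, the table below reports a random effects model; one common way to fit such a model is the DerSimonian-Laird estimator, sketched here with made-up study results. This is a minimal sketch of the general technique, not WSIPP's production procedure, and the inputs are hypothetical rather than the studies in this review.

```python
import math

def random_effects_meta(effect_sizes, standard_errors):
    """Pool per-study effect sizes with a DerSimonian-Laird random-effects model."""
    k = len(effect_sizes)
    w = [1.0 / se**2 for se in standard_errors]           # inverse-variance (fixed-effect) weights
    fixed = sum(wi * es for wi, es in zip(w, effect_sizes)) / sum(w)
    q = sum(wi * (es - fixed)**2 for wi, es in zip(w, effect_sizes))  # Cochran's Q heterogeneity
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0   # between-study variance
    w_re = [1.0 / (se**2 + tau2) for se in standard_errors]
    pooled = sum(wi * es for wi, es in zip(w_re, effect_sizes)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    z = abs(pooled / se_pooled)
    p = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))   # two-sided normal p-value
    return pooled, se_pooled, p

# Hypothetical inputs: three studies of the same outcome (not the studies in this review).
print(random_effects_meta([0.08, 0.15, 0.11], [0.06, 0.09, 0.07]))
```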

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases. See Estimating Program Effects Using Effect Sizes for additional information on how we estimate effect sizes.
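
As a concrete example, one common effect size metric is the standardized mean difference (Cohen's d), which divides the treatment-control difference in an outcome by the pooled standard deviation. The numbers below are hypothetical and are not drawn from the studies in this review; WSIPP's own conventions are described in Estimating Program Effects Using Effect Sizes.

```python
import math

def standardized_mean_difference(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Cohen's d: treatment-control difference divided by the pooled standard deviation."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / sd_pooled

# Hypothetical numbers: treated students attend 171 of 180 school days vs. 168 for controls.
es = standardized_mean_difference(171, 168, 12, 12, 285, 284)
print(round(es, 3))  # positive ES, so the outcome (attendance) increases
```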

The effect size may be adjusted from the unadjusted effect size estimated in the meta-analysis. Historically, WSIPP adjusted effect sizes for some programs based on the methodological characteristics of the underlying studies. For programs reviewed in 2024 or later, we do not make additional adjustments, and we use the unadjusted effect size whenever we run a benefit-cost analysis.

Research shows that the magnitude of effects may change over time. Where that is the case, we estimate outcome-based adjustments, which we apply between the first time the ES is estimated and the second time the ES is estimated. More details about these adjustments can be found in our Technical Documentation.
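
WSIPP's actual adjustment procedure is described in its Technical Documentation; purely as an illustrative sketch, the function below linearly interpolates an effect size between the age at which it is first estimated and the age at which it is estimated a second time. The linear rule and all numbers are assumptions for this sketch, not WSIPP's documented method.

```python
def adjusted_es(age, age_first, es_first, age_second, es_second):
    """Illustrative only: linearly interpolate an effect size between the ages at which
    it was first and second estimated. The linear rule is an assumption for this sketch,
    not WSIPP's documented procedure."""
    if age <= age_first:
        return es_first
    if age >= age_second:
        return es_second
    frac = (age - age_first) / (age_second - age_first)
    return es_first + frac * (es_second - es_first)

# Hypothetical example: an ES of 0.12 estimated at age 15 and 0.06 when re-estimated at age 18.
print(round(adjusted_es(16, 15, 0.12, 18, 0.06), 3))  # 0.1
```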

Meta-Analysis of Program Effects
| Outcomes measured | No. of effect sizes | Treatment N | ES | SE | Age | Unadjusted ES (random effects model) | p-value |
|---|---|---|---|---|---|---|---|
| - | 1 | 569 | 0.099 | 0.059 | 15 | 0.099 | 0.095 |
| - | 1 | 569 | 0.124 | 0.059 | 15 | 0.124 | 0.036 |
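
Assuming the reported p-values correspond to a two-sided z-test of the ES against zero (an assumption about the table, not something the page states), the snippet below roughly reproduces them from the ES and SE columns; small differences reflect rounding in the reported values.

```python
import math

def two_sided_p(es, se):
    """Two-sided p-value for a z-test of an effect size against zero."""
    z = abs(es / se)
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# ES and SE values from the table above.
for es, se in [(0.099, 0.059), (0.124, 0.059)]:
    print(f"ES = {es}, SE = {se}, p ~ {two_sided_p(es, se):.3f}")
```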

Citations Used in the Meta-Analysis

Bergman, P.L.S., & Chan, E.W. (2017). Leveraging technology to engage parents at scale: Evidence from a randomized controlled trial (CESifo Working Paper No. 6493). Munich: Center for Economic Studies and Ifo Institute (CESifo).