Washington State Institute for Public Policy

Drug court

Juvenile Justice
Benefit-cost methods last updated December 2018.  Literature review updated July 2014.
In therapeutic drug courts, youth with substance-abuse issues typically enter into a contract with the court and agree to comply with treatment and supervision requirements. While each drug court is unique, these therapeutic courts share similar characteristics. Drug courts typically involve a team of stakeholders (e.g., youth, guardian, judge, treatment provider, case manager, and probation officer). Components of the drug court model include treatment; judicial monitoring; random drug testing; incentives, rewards, and sanctions; and progressive stages (less monitoring with compliance). Drug courts can be pre- or post-adjudication models and the length of the program may vary from 6 to 12 months.
The estimates shown are present-value, life-cycle benefits and costs. All dollars are expressed in the base year chosen for this analysis (2017). The chance that the benefits exceed the costs is derived from a Monte Carlo risk analysis. The details of this analysis, as well as the economic discount rates and other relevant parameters, are described in our Technical Documentation.
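The two calculations named above can be sketched in a few lines. The discount rate, distributions, and parameters below are hypothetical placeholders for illustration, not WSIPP's actual model inputs, which are described in the Technical Documentation.

```python
import random

# Present value of a stream of future cash flows at a constant discount rate.
def present_value(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Monte Carlo risk analysis: draw total benefits and costs from assumed
# distributions and report the share of draws in which benefits exceed costs.
# The normal distributions and all parameters here are illustrative only.
def chance_benefits_exceed_costs(mean_benefit, sd_benefit,
                                 mean_cost, sd_cost, n_draws=100_000):
    wins = sum(
        random.gauss(mean_benefit, sd_benefit) > random.gauss(mean_cost, sd_cost)
        for _ in range(n_draws)
    )
    return wins / n_draws

# $1,000 per year in years 1-3, at an assumed 3.5% discount rate.
print(round(present_value([0, 1000, 1000, 1000], rate=0.035), 2))
print(chance_benefits_exceed_costs(5616, 8000, 3325, 300))
```

Because benefits and costs are drawn many times from their uncertainty distributions, the reported "chance" is simply the fraction of simulated futures in which the program pays off.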
Benefit-Cost Summary Statistics Per Participant
Benefits to:
    Taxpayers                 ($664)
    Participants              ($463)
    Others                      $151
    Indirect                 ($4,640)
Total benefits               ($5,616)
Net program cost             ($3,325)
Benefits minus costs         ($8,941)
Benefit-to-cost ratio         ($1.69)
Chance the program will produce benefits greater than the costs: 41%
1 In addition to the outcomes measured in the meta-analysis table, WSIPP measures benefits and costs estimated from other outcomes associated with those reported in the evaluation literature. For example, empirical research demonstrates that high school graduation leads to reduced crime. These associated measures provide a more complete picture of the detailed costs and benefits of the program.

2 “Others” includes benefits to people other than taxpayers and participants. Depending on the program, it could include reductions in crime victimization, the economic benefits from a more educated workforce, and the benefits from employer-paid health insurance.

3 “Indirect benefits” includes estimates of the net changes in the value of a statistical life and net changes in the deadweight costs of taxation.
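The summary statistics above are internally consistent and can be reproduced from the component figures (parenthesized amounts in the table are negative):

```python
# Per-participant benefit components from the summary table; parentheses
# in the source table denote negative dollar amounts.
benefits = {"taxpayers": -664, "participants": -463,
            "others": 151, "indirect": -4640}
net_program_cost = 3325  # present value, 2017 dollars

total_benefits = sum(benefits.values())
benefits_minus_costs = total_benefits - net_program_cost
bc_ratio = total_benefits / net_program_cost

print(total_benefits)        # -5616
print(benefits_minus_costs)  # -8941
print(round(bc_ratio, 2))    # -1.69
```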
Detailed Monetary Benefit Estimates Per Participant
Benefits from changes to:1    Taxpayers   Participants   Others2   Indirect3   Total
Crime $670 $0 $1,429 $335 $2,434
Labor market earnings associated with illicit drug abuse or dependence ($48) ($106) $0 $0 ($155)
Health care associated with illicit drug abuse or dependence ($1,242) ($218) ($1,303) ($616) ($3,378)
Labor market earnings associated with problem alcohol use $105 $232 $0 $0 $337
Property loss associated with problem alcohol use $0 $1 $2 $0 $3
Health care associated with problem alcohol use $22 $3 $24 $11 $59
Mortality associated with illicit drugs ($171) ($376) $0 ($2,706) ($3,253)
Mortality associated with problem alcohol $0 $1 $0 $6 $7
Adjustment for deadweight cost of program $0 $0 $0 ($1,670) ($1,670)
Totals ($664) ($463) $151 ($4,640) ($5,616)
Detailed Annual Cost Estimates Per Participant
Annual cost (year dollars):
    Program costs        $2,645  (2004 dollars)
    Comparison costs         $0  (2004 dollars)
Summary:
    Present value of net program costs (in 2017 dollars)   ($3,325)
    Cost range (+ or -)                                        10%
The per-participant costs, based on 12 months of service, are from Anspach, D.F., Ferguson, A.S., & Phillips, L.L. (2003). Evaluation of Maine's statewide juvenile drug treatment court program. Augusta, ME: University of Southern Maine.
The figures shown are estimates of the costs to implement programs in Washington. The comparison group costs reflect either no treatment or treatment as usual, depending on how effect sizes were calculated in the meta-analysis. The cost range reported above reflects potential variation or uncertainty in the cost estimate; more detail can be found in our Technical Documentation.
Estimated Cumulative Net Benefits Over Time (Non-Discounted Dollars)
The graph above illustrates the estimated cumulative net benefits per participant for the first fifty years beyond the initial investment in the program. We present these cash flows in non-discounted dollars to simplify identification of the “break-even” point from a budgeting perspective. If the dollars are negative (bars below the $0 line), the cumulative benefits do not yet outweigh the cost of the program. The program breaks even when the dollars reach $0; at that point, the total benefits to participants, taxpayers, and others equal the cost of the program. If the dollars are above $0, the benefits of the program exceed the initial investment.
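The break-even logic can be sketched as follows. The cash-flow stream here is hypothetical (for this particular program, cumulative net benefits remain negative, so the graph never crosses $0):

```python
# Find the first year in which cumulative non-discounted net benefits
# reach $0, given an up-front program cost and an annual benefit stream.
def break_even_year(program_cost, annual_benefits):
    cumulative = -program_cost
    for year, benefit in enumerate(annual_benefits, start=1):
        cumulative += benefit
        if cumulative >= 0:
            return year
    return None  # never breaks even within the horizon

# Hypothetical example: a $3,325 cost recouped at $250/year over 50 years.
print(break_even_year(3325, [250] * 50))  # 14
```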

^WSIPP’s benefit-cost model does not monetize this outcome.

Meta-analysis is a statistical method to combine the results from separate studies on a program, policy, or topic in order to estimate its effect on an outcome. WSIPP systematically reviews all credible evaluations we can locate on each topic. The outcomes measured are the types of program impacts that were measured in the research literature (for example, crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases.
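One common effect-size metric is the standardized mean difference (Cohen's d). WSIPP's exact computations are in its Technical Documentation; the samples below are made up for illustration:

```python
import statistics

# Standardized mean difference (Cohen's d) between a treatment and a
# control sample, using the pooled standard deviation.
def cohens_d(treatment, control):
    nt, nc = len(treatment), len(control)
    pooled_var = ((nt - 1) * statistics.variance(treatment)
                  + (nc - 1) * statistics.variance(control)) / (nt + nc - 2)
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_var ** 0.5

# If the treated group scores lower on a harmful outcome, d is negative.
print(round(cohens_d([1, 2, 3, 4], [2, 3, 4, 5]), 3))  # -0.775
```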

Adjusted effect sizes are used to calculate the benefits from our benefit-cost model. WSIPP may adjust effect sizes based on methodological characteristics of the study. For example, we may adjust effect sizes when a study has a weak research design or when the program developer is involved in the research. The magnitude of these adjustments varies depending on the topic area.

WSIPP may also adjust the second ES measurement. Research shows that the magnitude of some effect sizes decreases over time. For those effect sizes, we estimate outcome-based adjustments, which we apply between the first and second times the ES is estimated. We also report the unadjusted effect size to show the effect sizes before any adjustments have been made. More details about these adjustments can be found in our Technical Documentation.
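A hypothetical sketch of such a time-based adjustment: linearly interpolate the adjusted ES between the first and second measurement ages. WSIPP's actual adjustment formulas are in its Technical Documentation; this is only an illustration of the idea.

```python
# Linearly interpolate an effect size between its first measurement
# (es1 at age1) and second measurement (es2 at age2). Hypothetical
# stand-in for an outcome-based time-decay adjustment.
def es_at_age(age, es1, age1, es2, age2):
    if age <= age1:
        return es1
    if age >= age2:
        return es2
    frac = (age - age1) / (age2 - age1)
    return es1 + frac * (es2 - es1)

# Using the illicit drug use disorder row (0.274 at age 18, 0.000 at 21):
print(es_at_age(19.5, 0.274, 18, 0.000, 21))  # 0.137
```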

Meta-Analysis of Program Effects
Adjusted effect sizes (ES) and standard errors (SE) used in the benefit-cost analysis are shown for the first and second times the ES is estimated, along with the unadjusted effect size from a random effects model.

Outcomes measured            Treatment age   No. of effect sizes   Treatment N   First ES (SE), age     Second ES (SE), age    Unadjusted ES (p-value)
Alcohol use^                 16              1                     31            -0.079 (0.250), 18     n/a                    -0.079 (0.751)
Cannabis use^                16              1                     31            -0.144 (0.250), 18     n/a                    -0.144 (0.564)
Crime                        16              11                    2210          -0.037 (0.053), 18     -0.037 (0.053), 28     -0.103 (0.308)
Illicit drug use disorder    16              2                     145            0.274 (0.280), 18      0.000 (0.187), 21      0.509 (0.341)
Problem alcohol use          16              1                     31            -0.015 (0.250), 18     -0.015 (0.250), 18     -0.015 (0.951)
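A random effects model pools study-level effect sizes while allowing for variation between studies. A minimal DerSimonian-Laird sketch, using made-up study estimates rather than the studies in this meta-analysis:

```python
# DerSimonian-Laird random-effects meta-analysis: inverse-variance pooling
# with an estimate of between-study variance (tau^2).
def random_effects_pool(effects, ses):
    w = [1 / se ** 2 for se in ses]
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se_pooled = (1 / sum(w_re)) ** 0.5
    return pooled, se_pooled

# Three hypothetical studies with equal standard errors:
es, se = random_effects_pool([-0.10, -0.20, 0.00], [0.10, 0.10, 0.10])
print(round(es, 3), round(se, 3))  # -0.1 0.058
```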

Citations Used in the Meta-Analysis

Anspach, D.F., & Ferguson, A.S. (2005). Part II: Outcome evaluation of Maine’s statewide juvenile drug treatment court program. Augusta, ME: Maine State Office of Substance Abuse.

Byrnes, E.C., & Hickert, A.O. (2004). Process and outcome evaluation of the third district juvenile drug court in Dona Ana County, New Mexico. Annapolis, MD: Glacier Consulting.

Carey, S.M. (2004). Clackamas County Juvenile Drug Court outcome evaluation: Final report. Portland, OR: NPC Research.

Gilmore, A.S., Rodriguez, N., & Webb, V.J. (2005). Substance abuse and drug courts: The role of social bonds in juvenile drug courts. Youth Violence and Juvenile Justice, 3(4), 287-315.

Henggeler, S.W., Halliday-Boykins, C.A., Cunningham, P.B., Randall, J., Shapiro, S.B., & Chapman, J.E. (2006). Juvenile drug court: Enhancing outcomes by integrating evidence-based treatments. Journal of Consulting and Clinical Psychology, 74(1), 42-54.

Kralstein, D. (2008). Evaluation of the Suffolk County Juvenile Treatment Court: Process and impact findings. New York, NY: Center for Court Innovation.

Latessa, E.J., & University of Cincinnati. (2013). Outcome and process evaluation of juvenile drug courts. Cincinnati, OH: Center for Criminal Justice Research, University of Cincinnati, School of Criminal Justice.

Latessa, E.J., Shaffer, D.K., & Lowenkamp C. (2002). Outcome evaluation of Ohio’s drug court efforts: Final report. Cincinnati, OH: University of Cincinnati, Center for Criminal Justice Research, Division of Criminal Justice.

LeGrice, L.N. (2004). Effectiveness of juvenile drug court on reducing delinquency. Dissertation Abstracts International, 64(12), 4626A.

O'Connell, J.P., Nestlerode, E., & Miller, M.L. (1999). Evaluation of the Delaware juvenile drug court diversion program. Dover: State of Delaware Executive Department, Statistical Analysis Center.

Parsons, B.V., & Byrnes, E.C. (n.d.). Byrne evaluation partnership program: Final report. Salt Lake City: University of Utah, Social Research Institute.

Sullivan, C.J., Blair, L., Latessa, E., & Sullivan, C.C. (2014). Juvenile drug courts and recidivism: Results from a multisite outcome study. Justice Quarterly. Advance online publication. doi:10.1080/07418825.2014.908937