Washington State Institute for Public Policy

Case management for unemployment insurance claimants

Workforce Development
Benefit-cost methods last updated December 2023. Literature review updated November 2015.
Case managers work with Unemployment Insurance (UI) claimants in individual or group sessions to provide counseling, job search assistance, or job retention services through orientations, assessments, interviews, or telephone calls. Case managers usually provide referrals to child care subsidies, transportation assistance, and other support services. They may also refer clients to education and training, particularly if job searches are unsuccessful. Case management may end when clients find employment or continue with post-employment support services. UI programs usually provide these services to eligible dislocated workers; services typically last anywhere from one week to three months.
 
For an overview of WSIPP's Benefit-Cost Model, please see this guide. The estimates shown are present-value, life-cycle benefits and costs. All dollars are expressed in the base year chosen for this analysis (2022). The chance that the benefits exceed the costs is derived from a Monte Carlo risk analysis. The details of this analysis, as well as the economic discount rates and other relevant parameters, are described in our Technical Documentation.
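As a rough illustration of the Monte Carlo step, the sketch below draws total benefits and costs from assumed distributions and counts how often net benefits are positive. The normal distributions, spreads, and draw count are illustrative placeholders, not WSIPP's model inputs (those are documented in the Technical Documentation).

    import random

    # Hedged sketch of a Monte Carlo risk analysis: draw total benefits and program
    # costs from assumed distributions and count how often net benefits are positive.
    # The normal distributions and spreads below are illustrative placeholders chosen
    # so the sketch lands near the reported 69%; they are not WSIPP's model inputs.
    random.seed(1)

    N_DRAWS = 100_000
    MEAN_BENEFITS, SD_BENEFITS = 3294.0, 6200.0   # per participant, 2022 dollars (SD assumed)
    MEAN_COST, SD_COST = 215.0, 160.0             # spread loosely based on the +/- 75% cost range

    favorable = sum(
        1
        for _ in range(N_DRAWS)
        if random.gauss(MEAN_BENEFITS, SD_BENEFITS) - random.gauss(MEAN_COST, SD_COST) > 0
    )
    print(f"Chance benefits exceed costs: {favorable / N_DRAWS:.0%}")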
Benefit-Cost Summary Statistics Per Participant

Benefits to:
  Taxpayers: $1,014
  Participants: $2,388
  Others: $0
  Indirect: ($108)
Total benefits: $3,294
Net program cost: ($215)
Benefits minus costs: $3,079

Benefit-to-cost ratio: $15.31
Chance the program will produce benefits greater than the costs: 69%
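A minimal check of the arithmetic behind these summary statistics, using the rounded per-participant figures above (the published ratio of $15.31 differs slightly, presumably because WSIPP computes it from unrounded values):

    # Reconcile the summary statistics from the rounded per-participant figures above.
    taxpayers, participants, others, indirect = 1014, 2388, 0, -108
    total_benefits = taxpayers + participants + others + indirect        # 3294
    net_program_cost = -215

    print("Benefits minus costs:", total_benefits + net_program_cost)    # 3079
    print("Benefit-to-cost ratio:", round(total_benefits / -net_program_cost, 2))  # 15.32 vs. published 15.31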

Meta-analysis is a statistical method to combine the results from separate studies on a program, policy, or topic in order to estimate its effect on an outcome. WSIPP systematically evaluates all credible evaluations we can locate on each topic. The outcomes measured are the types of program impacts that were measured in the research literature (for example, crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.
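For readers unfamiliar with the mechanics, the sketch below shows a generic inverse-variance random-effects pooling calculation (DerSimonian-Laird). The effect sizes and standard errors are made up, and WSIPP's exact procedure is described in its Technical Documentation.

    # Generic random-effects meta-analysis (DerSimonian-Laird), shown only to
    # illustrate how effect sizes from separate studies can be pooled. The effect
    # sizes and standard errors below are made up; WSIPP's exact procedure is
    # described in its Technical Documentation.
    def random_effects_pool(effect_sizes, std_errors):
        w = [1 / se**2 for se in std_errors]                       # inverse-variance (fixed-effect) weights
        fe_mean = sum(wi * es for wi, es in zip(w, effect_sizes)) / sum(w)
        q = sum(wi * (es - fe_mean)**2 for wi, es in zip(w, effect_sizes))
        c = sum(w) - sum(wi**2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(effect_sizes) - 1)) / c)         # between-study variance
        w_re = [1 / (se**2 + tau2) for se in std_errors]           # random-effects weights
        pooled = sum(wi * es for wi, es in zip(w_re, effect_sizes)) / sum(w_re)
        return pooled, (1 / sum(w_re)) ** 0.5                      # pooled ES and its standard error

    print(random_effects_pool([0.05, 0.01, 0.03], [0.02, 0.01, 0.015]))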

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases. See Estimating Program Effects Using Effect Sizes for additional information.
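A minimal illustration of one common standardized effect size, the standardized mean difference (Cohen's d), with hypothetical group means, standard deviations, and sample sizes:

    # Illustrative standardized mean difference (Cohen's d): the difference in group
    # means divided by the pooled standard deviation. All inputs are hypothetical.
    def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
        pooled_var = ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
        return (mean_t - mean_c) / pooled_var**0.5

    # Treatment group outcome slightly higher than the comparison group -> small positive ES.
    print(round(cohens_d(520, 300, 1000, 505, 310, 1000), 3))   # ~0.049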

Adjusted effect sizes are used to calculate the benefits in our benefit-cost model. WSIPP may adjust effect sizes based on the methodological characteristics of a study. For example, we may adjust effect sizes when a study has a weak research design or when the program developer is involved in the research. The magnitude of these adjustments varies depending on the topic area.

WSIPP may also adjust the second ES measurement. Research shows that the magnitude of some effect sizes decreases over time. For those effect sizes, we estimate outcome-based adjustments, which we apply between the first and second time the ES is estimated. We also report the unadjusted effect size to show the effect sizes before any adjustments have been made. More details about these adjustments can be found in our Technical Documentation.
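The sketch below shows one way such adjustments could be applied: multiplying the unadjusted effect size by study-quality factors, then by a persistence factor between the first and second measurement. All multipliers are illustrative placeholders; WSIPP's actual adjustments vary by topic area and are described in its Technical Documentation.

    # Sketch of applying adjustments: scale the unadjusted ES by study-quality
    # multipliers, then by a persistence factor between the first and second
    # measurement. All multipliers are illustrative placeholders, not WSIPP's values.
    unadjusted_es = 0.036
    design_multiplier = 1.0      # would be < 1.0 for a weak research design (assumed)
    developer_multiplier = 1.0   # would be < 1.0 if the developer evaluated the program (assumed)
    persistence = 0.0            # fraction of the effect assumed to remain at the second measurement

    adjusted_first = unadjusted_es * design_multiplier * developer_multiplier
    adjusted_second = adjusted_first * persistence
    print(adjusted_first, adjusted_second)   # 0.036 then 0.000, mirroring the table below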

Meta-Analysis of Program Effects
Adjusted effect sizes (ES) and standard errors (SE) are those used in the benefit-cost analysis; the unadjusted effect size comes from a random effects model.

Outcome 1: treatment age 39; 11 effect sizes; treatment N 102,201
  First time ES is estimated: ES 0.036, SE 0.015, at age 42
  Second time ES is estimated: ES 0.000, SE 0.014, at age 43
  Unadjusted ES: 0.036, p-value 0.019

Outcome 2: treatment age 39; 13 effect sizes; treatment N 209,702
  First time ES is estimated: ES -0.002, SE 0.007, at age 42
  Second time ES is estimated: ES 0.000, SE 0.014, at age 43
  Unadjusted ES: -0.002, p-value 0.820

*The effect size for this outcome indicates percentage change, not a standardized mean difference effect size.
(1) In addition to the outcomes measured in the meta-analysis table, WSIPP measures benefits and costs estimated from other outcomes associated with those reported in the evaluation literature. For example, empirical research demonstrates that high school graduation leads to reduced crime. These associated measures provide a more complete picture of the detailed costs and benefits of the program.

(2) "Others" includes benefits to people other than taxpayers and participants. Depending on the program, it could include reductions in crime victimization, the economic benefits from a more educated workforce, and the benefits from employer-paid health insurance.

(3) "Indirect benefits" includes estimates of the net changes in the value of a statistical life and net changes in the deadweight costs of taxation.
Detailed Monetary Benefit Estimates Per Participant
Affected outcome: Earnings; resulting benefit: labor market earnings (1)
  Taxpayers: $1,014; Participants: $2,388; Others (2): $0; Indirect (3): $0; Total: $3,401
Affected outcome: Program cost; resulting benefit: adjustment for deadweight cost of program
  Taxpayers: $0; Participants: $0; Others: $0; Indirect: ($108); Total: ($108)
Totals
  Taxpayers: $1,014; Participants: $2,388; Others: $0; Indirect: ($108); Total: $3,294
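A rough reconciliation of the earnings row, treating the taxpayer share as the tax-like portion of the earnings gain; this reading, and the implied ~30% share, are inferred from the published figures rather than taken from WSIPP's documentation.

    # Rough reconciliation of the earnings row. The taxpayer share is read here as
    # the tax-like portion of the earnings gain; the ~30% figure is implied by the
    # published numbers, not taken from WSIPP's documentation.
    total_earnings_benefit = 1014 + 2388            # $3,402 (table shows $3,401 due to rounding)
    print(f"Implied tax-like share: {1014 / total_earnings_benefit:.1%}")   # ~29.8%

    # Total benefits across rows and perspectives match the Totals row:
    print(1014 + 2388 + 0 + (-108))                 # 3294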
Detailed Annual Cost Estimates Per Participant
Annual cost (year dollars):
  Program costs: $180 (2014 dollars)
  Comparison costs: $0 (2014 dollars)
Summary:
  Present value of net program costs (in 2022 dollars): ($215)
  Cost range (+ or -): 75%
Case management services typically last between one week and three months. We estimated the average annual cost of treatment per participant using data from studies in our meta-analysis that report cost estimates (Black et al., 2003; Decker et al., 2000; Michaelides et al., 2012). Costs vary by study but may include central administration, staff salaries, staff benefits, recruitment, assessment services, job placement and retention services, short-term training provided by staff, transportation, and medical treatments.
The figures shown are estimates of the costs to implement programs in Washington. The comparison group costs reflect either no treatment or treatment as usual, depending on how effect sizes were calculated in the meta-analysis. The cost range reported above reflects potential variation or uncertainty in the cost estimate; more detail can be found in our Technical Documentation.
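A minimal sketch of how a cost reported in 2014 dollars could be converted to a present value in base-year 2022 dollars: inflate to 2022 dollars, then discount any out-year costs. The inflation factor and discount rate below are assumptions, not WSIPP's parameters.

    # Sketch of converting a cost reported in 2014 dollars into a present value in
    # base-year (2022) dollars: inflate to 2022 dollars, then discount any out-year
    # costs. The inflation factor and discount rate are assumptions, not WSIPP's
    # parameters.
    annual_cost_2014 = 180.0
    inflation_2014_to_2022 = 1.20    # assumed price-index ratio
    discount_rate = 0.035            # assumed real discount rate
    years_of_cost = 1                # services last roughly one week to three months

    cost_2022 = annual_cost_2014 * inflation_2014_to_2022
    present_value = sum(cost_2022 / (1 + discount_rate)**t for t in range(years_of_cost))
    print(round(present_value, 2))   # about $216 under these assumptions; the report shows ($215)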
Charts: Benefits Minus Costs; Benefits by Perspective; Taxpayer Benefits by Source of Value.
Benefits Minus Costs Over Time (Cumulative Discounted Dollars)
The graph above illustrates the estimated cumulative net benefits per participant for the first fifty years beyond the initial investment in the program. We present these cash flows in discounted dollars. If the dollars are negative (bars below the $0 line), the cumulative benefits do not yet outweigh the cost of the program up to that point in time. The program breaks even when the dollars reach $0. At this point, the total benefits to participants, taxpayers, and others are equal to the cost of the program. If the dollars are above $0, the benefits of the program exceed the initial investment.
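A minimal sketch of the cumulative discounted cash-flow logic behind such a graph: an up-front program cost followed by a stream of annual benefits, discounted and accumulated until the running total crosses $0. The $80-per-year benefit stream and 3.5% discount rate are illustrative assumptions, not figures from this analysis.

    # Sketch of the cumulative discounted cash-flow logic behind the graph: an
    # up-front cost followed by a stream of annual benefits, each discounted to
    # present value and accumulated until the running total crosses $0. The
    # $80-per-year stream and 3.5% rate are illustrative assumptions.
    discount_rate = 0.035
    cash_flows = [-215.0] + [80.0] * 50     # year 0 cost, then hypothetical annual benefits

    cumulative = 0.0
    for year, cf in enumerate(cash_flows):
        cumulative += cf / (1 + discount_rate)**year
        if year > 0 and cumulative >= 0:
            print(f"Breaks even in year {year} (cumulative net benefit ${cumulative:,.0f})")
            break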

Citations Used in the Meta-Analysis

Benus, J.M., Poe-Yamagata, E., Wang, Y., & Blass, E. (2008). Reemployment and Eligibility Assessment (REA) study: FY 2005 Initiative. Columbia, MD: IMPAQ International.

Black, D.A., Smith, J.A., Berger, M.C., & Noel, B.J. (2003). Is the threat of reemployment services more effective than the services themselves? Evidence from random assignment in the UI System. American Economic Review, 93(4), 1313-1327.

Decker, P.T., Olsen, R.B., Freeman, L., & Klepinger, D.H. (2000). Assisting unemployment insurance claimants: The long-term impacts of the Job Search Assistance Demonstration. Washington, DC: U.S. Department of Labor, Employment and Training Administration, Unemployment Insurance Service.

Dickinson, K.P., Kreutzer, S.D., & Decker, P.T. (1997). Evaluation of Worker Profiling and Reemployment Services Systems: Report to Congress. Menlo Park, CA: Social Policy Research Associates.

Dickinson, K.P., Decker, P.T., Kreutzer, S.D., Heinberg, J.D., & Nicholson, W. (2002). Evaluation of WPRS systems. In R.W. Eberts, C.J. O'Leary, & S.A. Wandner (Eds.), Targeting Employment Services (pp. 69-90). Kalamazoo, MI: W.E. Upjohn Institute.

Johnson, T.R., & Klepinger, D.H. (1991). Evaluation of the impacts of the Washington Alternative Work Search Experiment: Final report. Washington, DC: U.S. Department of Labor, Employment and Training Administration, Unemployment Insurance Service.

Michaelides, M., Poe-Yamagata, E., Benus, J., & Tirumalasetti, D. (2012). Impact of the Reemployment and Eligibility Assessment (REA) Initiative in Nevada. Washington, DC: U.S. Department of Labor, Employment and Training Administration.

Poe-Yamagata, E., Benus, J., Bill, N., Carrington, H., Michaelides, M., & Shen, T. (2011). Impact of the Reemployment and Eligibility Assessment (REA) Initiative. Washington, DC: U.S. Department of Labor, Employment and Training Administration.