Washington State Institute for Public Policy

Partners for Change Outcome Management System (PCOMS) for youth

Children's Mental Health: Other
  Literature review updated July 2019.
Partners for Change Outcome Management System (PCOMS) is a tool for using systematic client feedback to facilitate routine outcome monitoring and encourage therapeutic conversations about the client's experience of therapy. Brief, youth-appropriate feedback measures are completed by clients at the beginning and/or end of each therapy session; the measures' brevity is intended to allow their routine, expected integration into therapy. This analysis includes studies that use PCOMS with child or adolescent samples. In the included study, children in the intervention group received PCOMS as an adjunct to play-based therapy in a school counseling context, for children with a range of presenting concerns or diagnoses, including anxiety, depression, attention difficulties, and externalizing behavior problems. Youth in the comparison group received play-based therapy in a school counseling context, without PCOMS.
 

Meta-analysis is a statistical method for combining the results of separate studies on a program, policy, or topic in order to estimate its effect on an outcome. WSIPP systematically reviews all credible evaluations we can locate on each topic. The outcomes measured are the types of program impacts that were measured in the research literature (for example, crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.
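WSIPP's full meta-analytic model is described in its Technical Documentation. As a minimal illustration of the core idea only — combining study estimates weighted by their precision — a fixed-effect inverse-variance pool (with hypothetical study values) might look like this:

```python
from math import sqrt

def pool_fixed_effect(effect_sizes, standard_errors):
    """Inverse-variance (fixed-effect) pooling: weight each study
    by 1/SE^2, so more precise studies count for more."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled_es = sum(w * es for w, es in zip(weights, effect_sizes)) / sum(weights)
    pooled_se = sqrt(1.0 / sum(weights))
    return pooled_es, pooled_se

# Hypothetical effect sizes and standard errors from two studies
# (illustrative values only, not taken from this review).
es, se = pool_fixed_effect([0.2, 0.4], [0.1, 0.2])
```

A random-effects model, as reported in the table below, additionally allows the true effect to vary across studies; with a single included study the two approaches coincide.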

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases. See Estimating Program Effects Using Effect Sizes for additional information.
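WSIPP's exact effect-size formulas are given in its Technical Documentation; one common instance of this standard metric is the standardized mean difference (Cohen's d), sketched here with hypothetical outcome scores:

```python
from math import sqrt

def cohens_d(treatment, control):
    """Standardized mean difference: (mean_t - mean_c) / pooled SD."""
    n_t, n_c = len(treatment), len(control)
    mean_t = sum(treatment) / n_t
    mean_c = sum(control) / n_c
    var_t = sum((x - mean_t) ** 2 for x in treatment) / (n_t - 1)
    var_c = sum((x - mean_c) ** 2 for x in control) / (n_c - 1)
    pooled_sd = sqrt(((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical symptom scores (illustrative): lower = fewer symptoms,
# so a negative d means the treatment group fared better.
d = cohens_d([3, 4, 5], [5, 6, 7])
```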

Adjusted effect sizes are used to calculate the benefits from our benefit cost model. WSIPP may adjust effect sizes based on methodological characteristics of the study. For example, we may adjust effect sizes when a study has a weak research design or when the program developer is involved in the research. The magnitude of these adjustments varies depending on the topic area.

WSIPP may also adjust the second ES measurement. Research shows that the magnitude of some effect sizes decreases over time. For those effect sizes, we estimate outcome-based adjustments, which we apply between the first and second times the ES is estimated. We also report the unadjusted effect size to show the effect size before any adjustments have been made. More details about these adjustments can be found in our Technical Documentation.

Meta-Analysis of Program Effects
| Outcomes measured | No. of effect sizes | Treatment N | Adjusted ES | Adjusted SE | Age | Unadjusted ES (random effects) | p-value |
|---|---|---|---|---|---|---|---|
| 9 | 1 | 18 | -0.600 | 0.339 | 9 | -0.600 | 0.077 |
| 9 | 1 | 18 | -0.184 | 0.333 | 9 | -0.184 | 0.579 |
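The reported p-values can be reproduced (up to rounding) from the unadjusted ES and SE with a two-sided normal z-test; a quick consistency check:

```python
from math import erfc, sqrt

def two_sided_p(es, se):
    """Two-sided p-value for a normal z-test of ES against zero:
    p = 2 * (1 - Phi(|ES/SE|)) = erfc(|ES/SE| / sqrt(2))."""
    z = es / se
    return erfc(abs(z) / sqrt(2))

p1 = two_sided_p(-0.600, 0.339)  # ~0.077, matching the first row
p2 = two_sided_p(-0.184, 0.333)  # ~0.58 (table reports 0.579; ES/SE are rounded)
```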

Citations Used in the Meta-Analysis

Cooper, M., Duncan, B., Golden, S., & Toth, K. (2019). Systematic client feedback in therapy for children with psychological difficulties: Pilot cluster randomized controlled trial. Counselling Psychology Quarterly.