Czech version of the Outcome Rating Scale: Selected psychometric properties
Keywords: confirmatory factor analysis, measurement invariance, Outcome Rating Scale, ORS
Objectives. The Outcome Rating Scale (ORS) is an ultra-brief self-report scale designed to measure change during psychotherapy. The goal of this study was to test (a) the factor structure of the ORS, (b) the measurement invariance between a clinical and a non-clinical sample, between pre-therapy and post-therapy assessment (within the clinical sample), and between online and paper-and-pencil forms of administration (within the non-clinical sample), (c) concurrent validity with other outcome measures, and (d) sensitivity to therapeutic change.
Sample and settings. The study included N = 256 patients, N = 210 non-clinical respondents, and N = 89 students. Patients completed the ORS before and after psychotherapy.
Statistical analysis. The factor structure and measurement invariance were tested using confirmatory factor analysis. Concurrent validity and test-retest reliability were assessed using correlational analysis. Sensitivity to change was assessed using the Reliable Change Index and pre-post effect size.
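The two change indices named above follow standard definitions. As a minimal sketch (the function names and the specific variants used are assumptions, not taken from the paper itself): the Jacobson-Truax Reliable Change Index divides the observed pre-post difference by the standard error of the difference between two measurements, and the pre-post effect size divides the mean change by the pre-treatment standard deviation.

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax RCI: observed change divided by the standard
    error of the difference between two measurements.
    |RCI| > 1.96 is conventionally taken as reliable change."""
    se_measurement = sd_pre * math.sqrt(1 - reliability)
    s_diff = math.sqrt(2) * se_measurement
    return (post - pre) / s_diff

def pre_post_effect_size(mean_pre, mean_post, sd_pre):
    """Within-group pre-post effect size, standardized by the
    pre-treatment SD (one common convention; the paper's exact
    standardizer may differ)."""
    return (mean_post - mean_pre) / sd_pre
```

For example, with a pre-score of 10, a post-score of 20, a pre-treatment SD of 5, and a reliability of .80, the RCI is about 3.16, well above the 1.96 cutoff for reliable change.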
Results. The unidimensional structure was supported. The best-fitting model was a partially tau-equivalent model with the loadings of the first and fourth items fixed to the same value. While only metric invariance was demonstrated between the clinical and non-clinical samples, the ORS showed scalar invariance between pre- and post-therapy assessment and strict invariance between the paper-and-pencil and online forms of administration. Internal consistency and concurrent validity were satisfactory, and sensitivity to therapeutic change was adequate. Furthermore, both internal consistency and sensitivity to change increased when the total score was computed as a weighted sum of items.
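The weighted-sum scoring mentioned above can be sketched as follows. The weights here are purely hypothetical placeholders; in practice they would be the standardized factor loadings estimated in the paper's CFA, which are not reported in this abstract.

```python
def weighted_total(items, weights):
    """Total score as a weighted sum of item responses, e.g. weighting
    each ORS item by its CFA factor loading instead of summing items
    with equal (unit) weights."""
    if len(items) != len(weights):
        raise ValueError("items and weights must have the same length")
    return sum(item * w for item, w in zip(items, weights))

# Hypothetical example: four ORS item scores and illustrative weights
# (NOT the loadings estimated in the study).
scores = [6.2, 5.8, 7.1, 6.5]
loadings = [0.85, 0.80, 0.75, 0.90]
total = weighted_total(scores, loadings)
```

Items with stronger loadings on the latent factor contribute more to the total, which is why a weighted composite can yield higher internal consistency than an unweighted sum.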
Study limitation. The samples were not representative.
Copyright (c) 2021 Dana Seryjová Juhová, Tomáš Řiháček, Hynek Cígler, Eva Dubovská, Martin Saic, Martin Černý, Jan Dufek, Scott D. Miller
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.