Matta, M., Keller-Margulis, M., Mercer, S. (2022). Cost analysis and cost-effectiveness of hand-scored and automated approaches to writing screening. JOURNAL OF SCHOOL PSYCHOLOGY, 92, 80-95 [10.1016/j.jsp.2022.03.003].

Cost analysis and cost-effectiveness of hand-scored and automated approaches to writing screening

Matta, M.; Keller-Margulis, M.; Mercer, S.
2022

Abstract

Although researchers have investigated the technical adequacy and usability of written-expression curriculum-based measures (WE-CBM), the economic implications of different scoring approaches have largely been ignored. The absence of such knowledge can undermine the effective allocation of resources and lead to the adoption of suboptimal measures for identifying students at risk for poor writing outcomes. Therefore, we used the Ingredients Method to compare the implementation costs and cost-effectiveness of hand-calculated and automated scoring approaches. Analyses were conducted on secondary data from a study that evaluated the predictive validity and diagnostic accuracy of quantitative approaches for scoring WE-CBM samples. Findings showed that automated approaches offered more economical solutions than hand-calculated methods; among automated scores, the effect was strongest when the free writeAlizer R package was employed, whereas among hand-calculated scores, simpler WE-CBM metrics were less costly than more complex metrics. Sensitivity analyses confirmed the relative advantage of automated scores when the number of classrooms, students, and assessment occasions per school year increased; again, writeAlizer was less sensitive to changes in the ingredients than the other approaches. Finally, visualization of the cost-effectiveness ratio illustrated that writeAlizer offered the optimal balance between implementation costs and diagnostic accuracy, followed by complex hand-calculated metrics and a proprietary automated program. Implications for the use of hand-calculated and automated scores for universal screening of written expression with elementary students are discussed.
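For readers unfamiliar with the Ingredients Method referenced in the abstract, the sketch below illustrates in Python how an ingredient-level cost tally and a cost-effectiveness ratio are typically computed (total cost as the sum of quantity × unit price over all ingredients; the ratio as cost divided by an effectiveness index). This is a minimal illustrative sketch only: the ingredient names, prices, and effectiveness value are hypothetical placeholders, not figures reported in the study.

```python
# Minimal sketch of an Ingredients Method cost tally and a cost-effectiveness
# ratio (CER), assuming the standard formulation:
#   total cost = sum over ingredients of (quantity * unit price)
#   CER        = total cost / effectiveness index
# All values below are hypothetical placeholders, not data from the study.

ingredients = {
    # ingredient: (quantity, unit price in USD)
    "scorer time (hours)": (10.0, 35.0),
    "scorer training (hours)": (2.0, 35.0),
    "scoring software license": (1.0, 0.0),   # e.g., a free R package
    "printed writing prompts": (500.0, 0.05),
}

# Total implementation cost: sum of quantity x unit price over all ingredients.
total_cost = sum(qty * price for qty, price in ingredients.values())

# Effectiveness could be a diagnostic-accuracy index (e.g., AUC); 0.80 is a
# placeholder value.
effectiveness = 0.80

# Cost-effectiveness ratio: dollars spent per unit of effectiveness.
cer = total_cost / effectiveness

print(f"Total cost: ${total_cost:.2f}")
print(f"Cost-effectiveness ratio: {cer:.2f} per unit of effectiveness")
```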
Journal article - Scientific article
Automated text evaluation; Cost analysis; Cost effectiveness; Curriculum-based measurement; Universal screening; Written expression;
Language: English
Date: 28 Mar 2022
Year: 2022
Volume: 92
First page: 80
Last page: 95
Rights: reserved
Files in this item:
File: matta-2022-Journal of School Psychology-Vor.pdf (repository managers only; a copy can be requested)
Attachment type: Publisher's Version (Version of Record, VoR)
License: All rights reserved
Size: 654.49 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/505579
Citations
  • Scopus 2
  • Web of Science 2