Cazzaniga, S., De Ponti, L., Baratelli, G., Francione, S., La Vecchia, C., Di Landro, A., et al. (2023). Agreement on classification of clinical photographs of pigmentary lesions: Exercise after a training course with young dermatologists. Dermatology Reports, 15(1), 1-4. https://doi.org/10.4081/dr.2022.9500
Agreement on classification of clinical photographs of pigmentary lesions: Exercise after a training course with young dermatologists
Carugno, Andrea;
2023
Abstract
Smartphone apps may help promote the early diagnosis of melanoma, but the reliability of specialist judgment on lesions should first be assessed. Here, we evaluated the agreement of 6 young dermatologists after specific training. Clinical judgment was evaluated during 2 online sessions, 1 month apart, on a series of 45 pigmentary lesions. Lesions were classified as highly suspicious, suspicious, non-suspicious, or not assessable. Cohen's and Fleiss' kappa were used to calculate intra- and inter-rater agreement. The overall intra-rater agreement was 0.42 (95% confidence interval, CI: 0.33-0.50), ranging from 0.12 to 0.59 across individual raters. The inter-rater agreement during the first phase was 0.29 (95% CI: 0.24-0.34). When considering the agreement for each category of judgment, kappa varied from 0.19 for not assessable to 0.48 for highly suspicious lesions. Similar results were obtained in the second exercise. The study showed less than satisfactory agreement among young dermatologists. Our data point to the need to improve the reliability of clinical diagnoses of melanoma, especially when assessing small lesions and when dealing with thin melanomas at a population level.

File | Description | Attachment type | Access | License | Size | Format
---|---|---|---|---|---|---
Cazzaniga-2022-Dermatol Reports-VoR.pdf | Article | Publisher's Version (Version of Record, VoR) | Open access | Creative Commons | 762.64 kB | Adobe PDF
Cazzaniga-2023-Dermatol Reports-AAM.pdf | Article | Author's Accepted Manuscript, AAM (post-print) | Open access | Creative Commons | 643.48 kB | Adobe PDF
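The kappa statistics described in the abstract can be reproduced with a short pure-Python sketch. This is a minimal illustration of the two formulas, not the authors' analysis code; the category names follow the abstract, but the toy ratings are invented for demonstration:

```python
from collections import Counter

# Judgment categories used in the study (from the abstract).
CATEGORIES = ["highly suspicious", "suspicious", "non-suspicious", "not assessable"]

def cohen_kappa(r1, r2):
    """Cohen's kappa for two sets of labels on the same items
    (e.g. one rater's judgments in session 1 vs. session 2)."""
    assert len(r1) == len(r2)
    n = len(r1)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rating set's marginal frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

def fleiss_kappa(table):
    """Fleiss' kappa for multiple raters, given an items-by-categories
    count table (each row sums to the same number of raters)."""
    n_items = len(table)
    n_raters = sum(table[0])
    total = n_items * n_raters
    # Marginal proportion of each category over all ratings.
    p_j = [sum(row[j] for row in table) / total for j in range(len(table[0]))]
    # Mean per-item agreement.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in table
    ) / n_items
    p_e = sum(p * p for p in p_j)  # chance agreement
    return (p_bar - p_e) / (1 - p_e)

# Toy example (invented data): one rater's judgments in two sessions.
session_1 = ["suspicious", "non-suspicious", "highly suspicious", "non-suspicious"]
session_2 = ["suspicious", "not assessable", "highly suspicious", "non-suspicious"]
print(cohen_kappa(session_1, session_2))  # ≈ 0.67
```

For real analyses, equivalent implementations exist in common statistics libraries (e.g. `sklearn.metrics.cohen_kappa_score` and `statsmodels.stats.inter_rater.fleiss_kappa`); the hand-rolled versions above are only meant to make the two chance-correction formulas explicit.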
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.