Inter-rater Reliability of Examiners in the Hong Kong College of Radiologists’ Palliative Medicine Oral Examination


R Chow, L Zhang, IS Soong, OWK Mang, LCY Lui, KH Wong, SWK Siu, SH Lo, KK Yuen, YSH Yau, KY Wong, C Leung, SY Wong, R Ngan, E Chow, R Yeung

Hong Kong J Radiol 2017;20:232-6

DOI: 10.12809/hkjr1716804

Objective: To analyse the inter-rater reliability of scores in the Palliative Medicine Oral Examination among examiners, among observers, and between examiners and observers.
Methods: The Palliative Medicine Subspecialty Board aims to train oncology specialists in palliative medicine through a 4-year accreditation programme. At the end of the programme, trainees undergo a Board Examination involving subjective ratings by examiners. Each candidate rotated through two panels during the 1-day examination; one panel covered the written dissertation and questions pertaining to symptom management (viva 1), and the other covered psychosocial issues (viva 2) and ethics (viva 3). A total of 10 candidates were evaluated on four occasions (the dissertation and vivas 1 to 3) using a 10-point scale by six examiners and four observers, along with one external examiner. The intraclass correlation coefficient (ICC) was calculated to determine inter-rater reliability (concordance) among examiners, among observers, and between examiners and observers. ICC values were classified as poor (≤0.20), fair (0.21-0.40), moderate (0.41-0.60), good (0.61-0.80), and very good (0.81-1.00).
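
The abstract does not state which ICC form was used. As a minimal illustration only, the Python sketch below computes ICC(2,1), the two-way random-effects, absolute-agreement, single-rater form of Shrout and Fleiss, from a complete candidates-by-raters score matrix and maps the result onto the bands quoted above; the example data, the assumed ICC variant, and the helper names icc_2_1 and classify_icc are illustrative assumptions, not taken from the study.

import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-rater form
    (Shrout & Fleiss). `ratings` is an (n_targets x k_raters) array with no
    missing values. The ICC variant is an assumption for illustration."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # one mean per candidate (target)
    col_means = x.mean(axis=0)   # one mean per rater

    # Two-way ANOVA sums of squares and mean squares
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((x - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

def classify_icc(icc):
    """Map an ICC value onto the bands quoted in the abstract."""
    if icc <= 0.20:
        return "poor"
    if icc <= 0.40:
        return "fair"
    if icc <= 0.60:
        return "moderate"
    if icc <= 0.80:
        return "good"
    return "very good"

# Illustrative (randomly generated) data only: 10 candidates rated by
# 6 examiners on a 10-point scale.
rng = np.random.default_rng(0)
scores = rng.integers(5, 11, size=(10, 6))
icc = icc_2_1(scores)
print(f"ICC(2,1) = {icc:.2f} ({classify_icc(icc)})")

With real examination data, the same routine would be run separately for the examiners, for the observers, and for the pooled examiner and observer ratings at each station.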

Results: Among examiners, concordance was overall good across the different stations. Among observers, concordance was fair to very good across the different stations. Between examiners and observers, concordance was fair to moderate at two stations. Across all stations, concordance between examiners and observers was good.

Conclusion: Inter-rater reliability was good in the Board Examination administered by the Palliative Medicine Subspecialty Board of the Hong Kong College of Radiologists. The examination is a reliable means of accrediting practitioners for subspecialty certification.


Authors’ affiliations:

R Chow, L Zhang, E Chow: Sunnybrook Health Sciences Centre, University of Toronto, Toronto, ON, Canada

IS Soong, R Yeung: Department of Clinical Oncology, Pamela Youde Nethersole Eastern Hospital, Chai Wan, Hong Kong

OWK Mang, KH Wong, C Leung, R Ngan: Department of Clinical Oncology, Queen Elizabeth Hospital, Jordan, Hong Kong

LCY Lui, KY Wong: Department of Clinical Oncology, Princess Margaret Hospital, Lai Chi Kok, Hong Kong

SWK Siu, KK Yuen: Department of Clinical Oncology, Queen Mary Hospital, Pokfulam, Hong Kong

SH Lo, SY Wong: Department of Clinical Oncology, Tuen Mun Hospital, Tuen Mun, Hong Kong

YSH Yau: Department of Clinical Oncology, Prince of Wales Hospital, Shatin, Hong Kong

 

Chinese Abstract

 

Inter-rater Reliability of Examiners in the Hong Kong College of Radiologists’ Palliative Medicine Oral Examination

R Chow、張麗瑩、宋崧、孟偉剛、呂卓如、黃錦洪、蕭偉君、魯勝雄、袁國強、邱秀嫻、黃家仁、梁偉濂、王韶如、顏繼昌、E Chow、楊美雲

 

Objective: To analyse the inter-rater reliability of scores in the Palliative Medicine Oral Examination among examiners, among observers, and between examiners and observers.

Methods: The Palliative Medicine Subspecialty Board aims to train oncology specialists in palliative medicine through a 4-year accreditation programme. At the end of the programme, trainees sit an examination in which they are rated subjectively by examiners. Each candidate rotated through two panels during the 1-day examination: one panel covered the written dissertation and symptom management (viva 1), and the other covered psychosocial issues (viva 2) and ethics (viva 3). A total of 10 candidates were evaluated by six examiners and four observers, along with one external examiner. The intraclass correlation coefficient (ICC) was used to assess inter-rater reliability (concordance) among examiners, among observers, and between examiners and observers. ICC values were classified as poor (≤0.20), fair (0.21-0.40), moderate (0.41-0.60), good (0.61-0.80), and very good (0.81-1.00).

Results: Among examiners, concordance was overall good across the different stations. Among observers, concordance was fair to very good across the different stations. Between examiners and observers, concordance was fair to moderate at two stations. Across all stations, concordance between examiners and observers was good.

Conclusion: Inter-rater reliability of the Palliative Medicine Oral Examination administered by the Palliative Medicine Subspecialty Board of the Hong Kong College of Radiologists was good. The examination can reliably accredit practitioners for subspecialty certification.