Objective
Intra-rater agreement in observation and diagnostic decision making is of great importance in the diagnosis of any disease. A radiologist reading ovarian cyst ultrasound images must assign each case to a category, so the ability to distinguish categories correctly is a central concern. In this study, we evaluated radiologists' ability to distinguish the ordered categories of ovarian cyst disease (benign, borderline, and malignant) on ultrasonography. To do this, we measured the intra-rater agreement of each radiologist with the weighted kappa coefficient, and then used the "square scores association model" to evaluate their ability to distinguish the severity of ovarian cyst disease.
Methods
In this analytical cross-sectional study, two radiologists and three radiology residents assessed the ultrasound images of 40 patients separately and independently on two occasions, one week apart. The patients were selected from those referred to Mirza Koockk Khan Hospital in January 2012. All ultrasound examinations were performed by an expert radiologist on a single apparatus.
Results
Because the radiologists showed superior distinguishing ability, their data were analyzed with the "square scores association model": their mean weighted kappa coefficient was 0.81 and their intra-rater agreement was 0.99. The residents' results were weaker, so their data were analyzed with the "agreement plus square scores association model": their mean weighted kappa coefficient was 0.65 and their intra-rater agreement was 0.97.
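As an illustrative sketch only (not the authors' analysis code), the quadratic-weighted kappa coefficient for two readings of ordered categories can be computed as below. The ratings shown are hypothetical and do not come from the study data; categories are coded 0 = benign, 1 = borderline, 2 = malignant.

```python
from collections import Counter

def weighted_kappa(r1, r2, n_cat=3):
    """Quadratic-weighted Cohen's kappa for two ratings of the same cases
    on ordered categories coded 0..n_cat-1."""
    n = len(r1)
    obs = Counter(zip(r1, r2))   # observed joint counts
    m1, m2 = Counter(r1), Counter(r2)  # marginal counts per reading
    denom = (n_cat - 1) ** 2
    # observed and chance-expected weighted disagreement,
    # with quadratic disagreement weights w_ij = (i - j)^2 / (k - 1)^2
    d_obs = sum(((i - j) ** 2 / denom) * obs[(i, j)] / n
                for i in range(n_cat) for j in range(n_cat))
    d_exp = sum(((i - j) ** 2 / denom) * (m1[i] / n) * (m2[j] / n)
                for i in range(n_cat) for j in range(n_cat))
    return 1 - d_obs / d_exp

# hypothetical first and second readings of 10 cysts by one rater
first  = [0, 0, 1, 1, 2, 2, 0, 1, 2, 0]
second = [0, 0, 1, 2, 2, 2, 0, 1, 2, 1]
print(round(weighted_kappa(first, second), 3))  # -> 0.859
```

Quadratic weights penalize a benign-vs-malignant disagreement more heavily than a one-step benign-vs-borderline disagreement, which is why weighted kappa is the natural agreement measure for ordered diagnostic categories.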
Conclusion
Although the radiologists performed better than the residents, all raters showed appropriate distinguishing ability and intra-rater agreement in diagnosing and categorizing ovarian cyst disease. Distinguishing the benign category from the borderline category was more difficult than distinguishing the malignant category from the borderline category, and here too the radiologists achieved better results than the residents.
Keywords
Ovarian cyst, ultrasonography, reliability