Objective: To determine if the quality of medical imaging reports differs significantly between radiologists and nonradiologists.
Study Design: A retrospective nonblinded review of randomly selected chest and long bone x-ray reports by orthopedists and primary care physicians compared with randomly selected imaging reports generated by radiologists.
Methods: We randomly selected 1 report from each of 50 high self-referring physicians privileged by 2 metropolitan New York area health plans for both bone and joint studies and chest x-rays, for a total of 200 reports (50 bone and joint x-rays from each plan and 50 chest x-rays from each plan). We compared them with 50 randomly selected radiologist-generated reports. The reports were evaluated for quality based on the American College of Radiology's Guideline for Communication: Diagnostic Radiology. The data were analyzed by the 2-sample t test between proportions at the 95% confidence interval.
Results: Radiologists consistently provided higher-quality medical imaging reports than nonradiologists.
Conclusions: To improve imaging service quality, all providers should be held to the same standards for reporting and communication of results.
(Am J Manag Care. 2005;11:781-785)
Twenty-first century radiology requires rapid and accurate communication of imaging results. The American College of Radiology's Guideline for Communication: Diagnostic Radiology states that every imaging study should have an "official interpretation."1 The College further defines "official interpretation" as the written report that becomes part of a patient's permanent medical record. The American Medical Association (AMA) supports this position, stating in the CPT 2004 Physician's Current Procedural Terminology: "A written report, signed by the interpreting physician, should be an integral part of a radiologic procedure or interpretation."2 According to Karen Zupko and Associates Inc3 (a practice management consulting company), the Centers for Medicare and Medicaid Services (CMS) applies section 15023 of the Medicare Carriers Manual to outpatient radiology services. This regulation requires a written report for payment of professional services. It also distinguishes between an "interpretation and report" and a "review," notation, or comment about the study.4 Finally, the National Committee for Quality Assurance (NCQA) has general guidelines for keeping good medical records.
In 2002, the Consensus Workgroup on Health Information Capture and Report Generation published their findings and recommendations regarding improvement in healthcare documentation and communication. They identified inadequate, illegible documentation and limited access to the medical record as contributing to medical errors.5 The group also found that despite the fact that "accurate, accessible, and shareable health information is a well-accepted prerequisite of good health care," inadequacies in documentation practices, accessibility, and shareability are accepted in the United States. They added that these substandard practices compromised "patient safety, public safety, continuity of patient care, healthcare economics, clinical research, and outcomes analysis." They also noted that lack of uniformity of medical records and the use of free text made finding important information difficult and time consuming. The inconsistent terminology used made it difficult to understand the information once located.
During our work in radiology benefits management, we found that the availability of plain film reports was inconsistent when performed at the treating physician's office (self-referral) but consistent when performed at a radiology site. We decided to determine the extent of this problem.
Nonradiologist Case Selection
We evaluated the imaging reports of internists and orthopedists privileged by 2 metropolitan New York area health plans to perform chest and long bone and joint x-rays in their offices. We searched our database and identified the 50 physicians who self-referred the most chest and bone x-rays for each plan. We then randomly selected 1 case for each physician and requested a copy of the examination report for both bone and joint studies and chest x-rays, for a total of 200 reports (50 bone and joint x-rays from each plan and 50 chest x-rays from each plan). After reviewing the responses, we noted that radiologists actually generated a few reports, even though the study had been performed in the treating physician's office. We eliminated these reports from our study.
Radiologist Case Selection
Additionally, we randomly selected 50 radiologist-generated reports from those on file in our Quality Management department. These included 22 chest x-rays, 21 bone and joint films, and 7 abdominal films, upper gastrointestinal series, or barium enema examinations.
What Makes a Good Report?
We determined the required elements for a radiology report (Table 1) from the ACR Guideline for Communication: Diagnostic Radiology, making a few appropriate modifications for this study. The ACR guideline encourages inclusion of the following 4 demographic elements:
Of these factors, we included only the latter 2 elements. Other elements included in the ACR guideline, but not evaluated in our study, are:
A board-certified radiologist with more than 25 years' experience interpreting plain films then compared the submitted materials to the required elements. The reviewer was not blinded to the specialty of the interpreting physicians. We analyzed the data using a 2-sample t test between proportions at the 95% confidence level (P < .05).
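The paper names only a "2-sample test between proportions at the 95% confidence level" without spelling out the computation. A minimal sketch, assuming the standard pooled two-proportion z-test was the method used, applied here to the report-availability counts reported in the Results (all 50 radiologist cases had a report vs 74 of 147 nonradiologist cases):

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Pooled two-sample z-test for equality of proportions.

    x1/n1: successes and total in group 1; x2/n2: group 2.
    Returns (z statistic, two-sided p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Report availability: radiologists 50/50 vs nonradiologists 74/147 (Table 2)
z, p = two_proportion_z_test(50, 50, 74, 147)
print(round(z, 2), p < 0.05)
```

At these counts the difference clears the P < .05 threshold by a wide margin, consistent with the significance the authors report.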
Reports for Analysis
An imaging report accompanied all 50 cases of the radiologists. Of 200 requests for reports from the nonradiologist group, we received 163 responses, for a response rate of 81.5%. Of the 163 responses, a radiologist actually read 16, either at a radiology facility or in a nonradiologist's office. We excluded these 16 cases from the study, resulting in 147 cases for review. Of these, 56 (38.1%) had no report as described by the American College of Radiology, although several did include a notation or comment about the study in office notes or in a letter to the referring physician (these were included in the analysis). Another 17 (11.6%) sent a report that was dated after our request for the report or was addressed in a manner indicating that the report might have been generated in response to our request. These 17 cases were retained in the data analysis because we could not accurately determine whether the report existed prior to our request. Therefore, up to 49.7% of the cases may not have had an appropriate report. Seventy-four (50.3%) sent a real report (Table 2). Nonradiologists failed to use a unique identifier 70% of the time. Handwritten notes or reports were often illegible, resulting in useless documentation. No handwritten notes were found in the radiologists' reports.
How the Results Measure Up
Radiologists' reports were consistently better than those of nonradiologists. A statistically significant difference was noted between the radiologists' reports and those of the nonradiologists for the following categories of demographic data: use of a unique identifier number, patient's date of birth, patient's sex, and interpreting physician's name and signature. For clinical information, the data differed significantly between the radiologists' reports and those of the nonradiologists for the description of the examination, views taken, description of findings, and presence of an impression or conclusion (Table 3, Figure).
Radiologists did not include the indications for the examination as often as did nonradiologists (46% vs 58%). This information may have been omitted because the requesting physician may have provided the radiologist with inadequate information. When we stratified the nonradiologists' results according to internists or family practitioners and orthopedists, only 14 of 68 (21%) chest x-ray reports from internists or family practitioners included a reason for the examination, compared with 67 of 79 (86%) bone and joint x-ray reports from orthopedists, mostly because the orthopedists' reports were frequently part of a letter to the referring physician.
We designed this study to evaluate the adequacy of medical record documentation, not the films' technical quality or interpretation accuracy. This distinction is important because self-referring physicians perform significantly more x-rays than radiologist-referring physicians, and the absence of a good medical record can negatively impact patient care. In 1998, Spettell and colleagues6 reported that in nonhospital settings nonradiologists performed approximately 67% of chest and spine films and between 78% and 86% of bone and joint films.
Healthcare documentation has come under scrutiny in the last few years. In 2002, a report from the Consensus Workgroup on Health Information Capture and Report Generation indicated that the accuracy of and accessibility to healthcare information were compromised in the United States.5 The report stated that patient safety, public safety, continuity of patient care, healthcare economics, and clinical research and outcomes analyses were all adversely affected by poor documentation and record keeping. This group strongly advocates adoption of electronic medical records that are uniformly structured and designed to be searchable.
Inadequate Imaging Reports Compromise Care
Documentation of an imaging examination is considered to be an important part of a patient's medical record.7 The American College of Radiology (ACR) has described guidelines for structuring good imaging reports, while the NCQA has general guidelines for medical records maintenance. Both organizations agree that each page of the medical record should include the patient's name, date of service, interpreting physician's name, and the physician's signature (this could be a unique electronic identifier). The Consensus Workgroup on Health Information Capture and Report Generation also recommends the use of a unique patient identifier. A unique identifier allows healthcare providers to differentiate patients with the same name and date of birth.
In our study we found myriad documentation errors among records from nonradiologists. The absence of a report, or only a brief comment about an x-ray in a chart note or in a letter to a referring physician (which happened in up to 49.7% of the nonradiologist cases in the current study), can lead to inadequate communication between healthcare providers. This communication breakdown may result in repeat examinations, which increase costs, expose patients to unnecessary radiation, and may potentially delay patient care. Failure to describe the findings or to include an appropriate impression or differential diagnosis limits the examination's value to other providers. The use of excessively short notations in a record or letter often makes locating or interpreting the results difficult when another provider, healthcare facility, or health plan requests information about a patient.
In 2000, Moskowitz and associates8 reported similar observations in their review of nonradiologists' imaging reports. In their study, 62% of the offices evaluated did not issue a formal radiology report. At many of these sites "a note was made in the chart cryptically stating that a radiograph was either positive or negative." Another study of chest x-rays performed by Pennsylvania's Blue Shield found 21 of 98 reports to be incomplete.9
The current study also demonstrated that although radiologists' reports more consistently complied with national recommendations, there is room for improvement, especially in documenting the reason for the examination.
In addition to these issues concerning good patient care, legal issues exist regarding medical record documentation and the matter of reimbursement. As stated previously, the CMS indicates that all imaging studies should include a written report, and the AMA further suggests that the report be signed by the interpreting physician. Radiology billing is unique in that the reimbursement is divided into professional (identified by modifier -26) and technical (identified by modifier -TC) components or is billed as global (no modifiers; comprises the professional plus technical components). Examinations performed in an office setting are usually billed globally. The professional relative-value units, or work, that determine the professional reimbursement include not only image interpretation but also report preparation. Failure to provide a report is not only poor patient care and insufficient record keeping in the event of a lawsuit, but can also lead to claims of fraudulent billing for services that are incompletely provided. Because reimbursement is the same for chest or bone x-rays regardless of the specialty of the interpreting physician, reporting standards should also be the same.
One limitation of this study is that we obtained the radiologists' reports from our Quality Management division, rather than randomly requesting them. However, in the course of our normal procedures, radiologists have consistently provided us with a written report.
In the interest of better patient care and improved communication between healthcare providers, imaging service providers should be required to produce legible x-ray reports that are easily identified. This requirement should be part of a health plan's privileging program. Additionally, we recommend that health plans define the report's essential elements, including patient identification requirements. Furthermore, we suggest that the American College of Radiology's guideline for communication be adopted and that all imaging service providers be held to this national standard.
From CareCore National, LLC, Wappingers Falls, NY.
Address correspondence to: Shelley Nan Weiner, MD, FACR, Medical Director, CareCore National, LLC, 169 Myers Corners Road, Wappingers Falls, NY 12590. E-mail: email@example.com.
1. Hauser JB, Mintzer R, et al, for the Guidelines and Standards Committee of the General and Pediatric Radiology Commission. ACR practice guideline for communication: diagnostic radiology. In: Practice Guidelines & Technical Standards. Reston, Va: American College of Radiology; 2004:5-7. Available at: http://www.acr.org/s_acr/bin.asp?CID=541&DID=12196&DOC=FILE.PDF. Accessed September 18, 2005.
2. American Medical Association. CPT 2004 Physician's Current Procedural Terminology. Chicago, Ill: American Medical Association; 2004:208.
3. LeGrand M, Maley M. The orthopaedic coding coach. June 2002. Available at: http://www.karenzupko.com/Resources/coding/xrays.htm. Accessed December 13, 2004.
4. Centers for Medicare & Medicaid Services. Fee schedule for physicians' service. In: Medicare Carriers Manual. Part 3, chapter XV, section 15023:73. Baltimore, Md: Centers for Medicare & Medicaid Services; last modified September 16, 2004. Available at: http://www.cms.hhs.gov/manuals/14_car/3b15000.asp. Accessed December 14, 2004.
5. Waegemann CP, Tessier C, Barbash A, et al, for the Consensus Workgroup on Health Information Capture and Report Generation. Healthcare documentation: a report on information capture and report generation. Boston, Mass: Medical Records Institute; June 2002. Available at: http://www.medrecinst.com/pages/libArticle.asp?id=39. Accessed December 1, 2004.
6. Spettell CM, Levin DC, Rao VM, Sunshine JH, Bansal S. Practice patterns of radiologists and nonradiologists: nationwide Medicare data on the performance of chest and skeletal radiography and abdominal and pelvic sonography. AJR Am J Roentgenol. 1998;171:3-5.
7. Blue Cross Blue Shield of Georgia. Medical records standards. Available at: https://provider.bcbsga.com/provider/credentialing/medical_records.html. Accessed December 1, 2004.
8. Moskowitz H, Sunshine J, Grossman D, Adams L, Gelinas L. The effect of imaging guidelines on the number and quality of outpatient radiographic examinations. AJR Am J Roentgenol. 2000;175:9-15.
9. Kouri BE, Parsons RG, Alpert HR. Physician self-referral for diagnostic imaging: review of the empiric literature. AJR Am J Roentgenol. 2002;179:843-850.