Author | Affiliation |
Jeff Riddell, MD | University of California, San Francisco-Fresno, Department of Emergency Medicine, Fresno, California |
Stuart Swadron, MD | University of Southern California, Keck School of Medicine, Los Angeles, California |
DOI: 10.5811/westjem.2014.10.23864
In Reply:
We thank the authors of the letter for their insightful comments.
There were 98 patients with bedside US evidence of hydronephrosis and 11 patients with evidence of a stone. Only one patient with US evidence of a stone had no hydronephrosis. The total number of patients with emergency department (ED) bedside US evidence of hydronephrosis or stone was therefore 99 (98 + 1). This corrected number is consistent with Table 4.
The value for "bedside US evidence of stone" in Table 1 should also be 99. The "Overall positive finding (hydronephrosis or stone)" column in Table 2 should be 99, not 103. This changes the overall sensitivity to 79.2% (95% CI, 70.8-85.7%), rather than the 82.4% published originally, which is consistent with the previously reported sensitivities cited in our paper.
Table 2. Sensitivity of ultrasound in all patients.
N=125 | Ultrasound (US) hydronephrosis | US stone | Overall positive finding (hydronephrosis or stone) |
ED bedside US evidence | 98 | 11 | 99 |
Sensitivity | 78.4% | 8.8% | 79.2% |
95% CI | 70.0-85.1% | 4.7-15.6% | 70.8-85.7% |
ED, emergency department
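To make the corrected arithmetic easy to verify, the sketch below recomputes the Table 2 sensitivities and 95% confidence intervals from the counts above. It assumes the denominator of 125 shown in the table and Clopper-Pearson exact intervals; the interval method used in the original analysis is not specified, so the limits may differ slightly from those published.

```python
# Minimal sketch: recompute the corrected Table 2 sensitivities and 95% CIs.
# Assumptions (not stated explicitly above): denominator N = 125 and
# Clopper-Pearson exact confidence intervals.
from scipy.stats import binomtest

N = 125
counts = {
    "US hydronephrosis": 98,
    "US stone": 11,
    "Overall positive (hydronephrosis or stone)": 99,  # 98 + 1 stone-only patient
}

for finding, k in counts.items():
    sensitivity = k / N
    ci = binomtest(k, N).proportion_ci(confidence_level=0.95)  # exact method
    print(f"{finding}: {sensitivity:.1%} (95% CI {ci.low:.1%}-{ci.high:.1%})")
```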
The emergency physicians performing the ultrasounds were not formally blinded to the computed tomography (CT) results. However, it is common practice in our emergency department to perform the bedside ultrasound prior to ordering a CT. Though it is possible that a resident went back and performed an US after viewing the CT result, this would be unlikely in a busy ED.
Testing of inter-rater agreement is one of the methodologic standards in emergency medicine chart reviews.1 Our reviewers re-abstracted a sample of charts, blinded to the information obtained by the first reviewer. There were no discrepancies.
The lack of testing for interobserver variability of the US examination is a limitation. If there were significant interobserver variability, it could have biased the results of the study. There is little in the existing renal ultrasound literature regarding interobserver variability. In one study of urologists, interobserver agreement was excellent for the grading of hydronephrosis by conventional sonography (κ = 0.82; p<0.001).2 Goertz and Lotterman studied ED resident and attending physicians performing US and found very good interobserver agreement between the degree of hydronephrosis as determined by the performing emergency physician and that determined on quality assurance review (κ = 0.847; 95% confidence interval, 0.777-0.918).3 A study published in September 2014 showed a difference in sensitivity of renal ultrasound performed by emergency medicine residents and fellowship-trained emergency physicians for the detection of hydronephrosis; the authors did not report a kappa statistic for interobserver agreement.4
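For readers less familiar with the kappa statistics cited above, the brief sketch below illustrates how Cohen's kappa quantifies chance-corrected agreement between two readers grading hydronephrosis. The paired ratings are entirely hypothetical and are not data from our study or from the cited studies.

```python
# Illustrative sketch of Cohen's kappa for interobserver agreement on
# hydronephrosis grade. Ratings below are HYPOTHETICAL and serve only to
# show how the statistic is computed.
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired gradings (none, mild, moderate, severe) by two readers
reader_a = ["none", "mild", "mild", "moderate", "severe", "none", "mild", "moderate"]
reader_b = ["none", "mild", "moderate", "moderate", "severe", "none", "mild", "mild"]

kappa = cohen_kappa_score(reader_a, reader_b)
print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance-level agreement
```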
US examinations were performed in the ED with a SonoSite MicroMaxx ultrasound machine with a C60e 2- to 5-MHz curvilinear or P17 1- to 5-MHz phased array ultrasound probe (SonoSite, Bothell, Wash). The CT stone examinations were performed on a single-source 64-detector CT scanner (Aquilion CFX; Toshiba, Tustin, Calif), using the following parameters: 120 kVp, 100-500 mAs (using dose modulation depending on the size of the patient), gantry revolution speed of 0.5 seconds, pitch factor of 0.844, beam collimation of 64 x 0.5 mm, variable field of view (depending on the size of the patient), and standard body kernel. These data were reconstructed into 3-mm-thick sections in the transverse, coronal, and sagittal planes.
Sensitivity was 100% for stones ≥ 6mm when combined with hematuria. Of the 60 patients with stones ≥ 6mm, 7 had 3 or more stones. Put another way, 7 of the 8 cases with 3 or more stones had a stone ≥ 6mm.
We thank the authors for their comments and hope this additional explanation helps readers place this retrospective study in its proper context. It was our hope that it would spur further prospective studies. Many of our questions have since been addressed with the publication of the initial results of the STONE trial, a prospective, multicenter study of ED patients with suspected renal colic.5
Footnotes
Address for Correspondence: Jeff Riddell, MD. University of California, San Francisco-Fresno, 155 N Fresno St., Fresno, CA 93701. Email: jriddell@fresno.ucsf.edu.
Conflicts of Interest: By the WestJEM article submission agreement, all authors are required to disclose all affiliations, funding sources and financial or management relationships that could be perceived as potential sources of bias. The authors disclosed none.
REFERENCES
1. Gilbert EH, Lowenstein SR, Koziol-McLain J, et al. Chart reviews in emergency medicine research: where are the methods? Ann Emerg Med. 1996;27:305-308.
2. Rud O, Moersler J, Peter J, et al. Prospective evaluation of interobserver variability of the hydronephrosis index and the renal resistive index as sonographic examination methods for the evaluation of acute hydronephrosis. BJU Int. 2012;110(8 Pt B):E350-6.
3. Goertz JK, Lotterman S. Can the degree of hydronephrosis on ultrasound predict kidney stone size? Am J Emerg Med. 2010;28(7):813-6.
4. Herbst MK, Rosenberg G, Daniels B, et al. Effect of provider experience on clinician-performed ultrasonography for hydronephrosis in patients with suspected renal colic. Ann Emerg Med. 2014;64(3):269-76.
5. Smith-Bindman R, Aubin C, Bailitz J, et al. Ultrasonography versus computed tomography for suspected nephrolithiasis. N Engl J Med. 2014;371(12):1100-10.