8/17/2019 Blado Reject Analysis SCAR 2002
Is Reject Analysis Necessary after Converting to Computed Radiography?
Article in Journal of Digital Imaging · February 2002
Impact Factor: 1.19 · DOI: 10.1007/s10278-002-5028-7 · Source: PubMed
Title: Is Reject Analysis Necessary after Converting to Computed Radiography?
Primary Author: Rosemary Honea A.R.R.T., A.R.D.M.S.,
Secondary Author: Maria Elissa Blado
Secondary Author: Yinlin Ma
Affiliation: Edward L. Singleton Diagnostic Imaging Services
Texas Children's Hospital
Address: Edward L. Singleton Diagnostic Imaging Services
Texas Children's Hospital
6621 Fannin MC 2-2521
Houston, Texas 77030-2399
Phone: (832) 824-5563 voice
(832) 825-5370 facsimile
Internet Addresses: [email protected] (Rosemary Honea)
[email protected] (Maria Elissa Blado)
[email protected] (Yinlin Ma)
Topic 1. Modality Image Acquisition (Detectors, Imaging Physics, Quality Assurance)
ABSTRACT
Reject Analysis is an accepted standard of practice for Quality Assurance (QA) in conventional
radiology. The need for Reject Analysis has been challenged by the introduction of Computed
Radiography (CR), both because of low reported reject rates and because criteria for improperly
exposed images were lacking. Most CR systems include Quality Control (QC) workstations that are
capable of modifying the appearance of images before release, and also of deleting bad images
before they can be analyzed. Texas Children's Hospital has been practicing Computed Radiography
since October 1995, and now conducts essentially filmless imaging operations using a large-scale
Picture Archiving and Communications System (PACS) with fourteen CR units. In our hospital, the
QC workstation is a key element in our CR QC operation; however, the extensive software tools of
the workstation are of limited use in avoiding repeated examinations. Neither the QC Workstation
nor the PACS itself is designed to support Reject Analysis, so our task was to design a system
that accommodates identification, isolation, and archiving of repeated examinations using our
electronic imaging systems. We had already developed transcription codes for our radiologists'
examination critiques, so we adopted these as codes for rejected images. The technologist at the
QC workstation appends the critique code to the patient demographic information, modifies other
fields to indicate that the image is a reject, and archives as usual. Modified routing tables
prevent the release of rejected images but ensure they are available for review. Our frequency of
and reasons for repeated examinations are comparable to other reports of Reject Analysis in the
literature; the most frequent cause of a repeated examination is mis-positioning. Developing the
method for capturing repeats, collecting the data, and analyzing it is only one half of the
battle. To achieve an improvement in services, it is necessary to feed back the results to
management and staff and to implement training as indicated. It is our intent to share our results
with PACS and CR vendors in the hope that they will incorporate mechanisms for Reject Analysis
into the design of their systems.
INTRODUCTION
Reject Analysis (RA), a.k.a. Repeat Analysis, is an accepted standard practice for Quality
Assurance (QA) in conventional radiography. Analysis of rejected images yields information
about the efficiency of the department and is the basis for Quality Control (QC) and the education
of the individual technologist.1 While no one would question the value of performing Reject
Analysis in a conventional radiology department, the advent of Computed Radiography (CR) has
prompted some to challenge its relevance to electronic radiology operations. This skepticism
developed partly because early adopters reported extremely low reject rates using CR. The usual
appearance of the CR image differs somewhat from a conventional image, and its contrast and
density are automatically adjusted to improve appearance, so criteria for improperly exposed
images were slow to be recognized by practitioners. CR systems almost universally include QC
workstations that have the capability of modifying images before release, as well as of deleting
unacceptable images. Even after more than a decade of widespread clinical practice of CR, systems
to support Reject Analysis are absent from standalone CR systems as well as from those
incorporated into large-scale Picture Archiving and Communications Systems (PACS). Our challenge
was to utilize our electronic image acquisition and distribution systems to develop a system for
identifying, capturing, isolating, and archiving rejected images.
MATERIALS AND METHODS
Texas Children's Hospital operates a large-scale Agfa (Agfa Medical Systems, Ridgefield Park,
NJ) IMPAX Version 3.5 PACS. This system includes fourteen CR units with all Agfa Diagnostic
Center (ADC) production versions represented, namely the ADC70, ADC Compact, and ADC
Solo. Patient demographics and examination information are retrieved from the IDXRad Version
9.7 Radiology Information System (RIS) and supplied to the ADC by Identification Stations
augmented by bar code scanners, as described previously.2 DICOM Modality Worklist
Management has been tested but is currently not practical for clinical operations. The Diagnostic
Imaging Service performed 141,321 examinations in calendar year 2001 (IDXRad), of which
93,386 were CR (Oracle version 7.0), with an average of 1.72 images per examination.
Virtually all primary interpretation is conducted using softcopy review stations: routine
printing and archiving of hardcopy images ceased on October 31, 2000.
The Processing Station. The ADC image is transmitted to one of eleven Processing Stations (PS,
a.k.a. VIPS) before being released for distribution in the PACS system. This QC workstation is a
key component in our imaging operation: a technologist inspects each image at the PS and
determines whether it was appropriately acquired and properly identified. If all views are
complete and acceptable, they are transmitted to the PACS system, the examination is “completed”
in the RIS, and the patient is released. In the event of errors, the PS has some sophisticated
features for modifying the image. As shown in Table I, some of these features are useful in
recovering images that would otherwise be rejected, such as by annotating the image when the
Left/Right marker is obscured, correcting incorrect demographic information, or reprocessing an
image with the appropriate examination menu selection. Technologists are discouraged from
making drastic adjustments to the image at the workstation3. The PS display is not designed for
primary interpretation: the image is down-sampled for display and the monitor luminance is not
strictly controlled. Technologists are trained to recognize anatomy, not pathology, and are as
likely to obscure important clinical features as to enhance them, especially when the display does
not match the appearance on diagnostic workstations. The operator of the PS also has the ability to
make bad CR images disappear without a trace.
Reasons for Repeated Examinations. Table II shows reasons why an examination might need to
be repeated in a conventional department.4 Each of these can occur with CR, although the
categories of "under-exposure," "over-exposure," and "lost film" require further elaboration. Even
though the CR system acts to adjust the density of the image to compensate for inappropriate
radiographic technique, a CR image that is under-exposed will appear "grainy," and a CR image
that is over-exposed subjects the patient to more radiation dose than necessary for the
examination.5 In conventional radiography, the exposure level is evident from the optical density
(OD) of the film. Because the density of a CR image is adjustable, the exposure level is instead
revealed by numerical analysis of pixel values in the digital image. An acceptable range of values
was established for this exposure indicator, called "lgM," the "logarithm of the Median of the
grayscale histogram." The range of acceptable values allows a factor of two under- or
over-exposure around the target value. Contrary to claims by some PACS proponents, electronic
images can also be lost, either by operator deletion or by equipment malfunctions resulting in
corrupt image data files that cannot be transferred.
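The acceptance test described above can be sketched in a few lines. This is a simplified illustration, not Agfa's implementation: lgM is computed here as the base-10 logarithm of the median pixel value, and the target value is a made-up example rather than a calibration constant. A factor of two in exposure corresponds to a window of plus or minus log10(2) around the target.

```python
import math

LOG10_2 = math.log10(2)  # ~0.301: a factor of two in exposure

def lgm_from_pixels(pixel_values):
    """Simplified lgM: log10 of the median of the grayscale histogram."""
    ordered = sorted(pixel_values)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2 else
              (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    return math.log10(median)

def exposure_acceptable(lgm, target):
    """Accept exposures within a factor of two of the target value."""
    return abs(lgm - target) <= LOG10_2

# A properly exposed image sits near the target; one exposed at four
# times the target dose falls outside the acceptance window.
target = lgm_from_pixels([900, 1000, 1100])
print(exposure_acceptable(lgm_from_pixels([950, 1000, 1050]), target))   # True
print(exposure_acceptable(lgm_from_pixels([3800, 4000, 4200]), target))  # False
```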
Reject Codes. We previously reported a system of dictation codes for documenting radiologist
examination critiques.6 Technologist team leaders used these dictation codes, shown in Table III,
to classify the reason for repeating an image. The appropriate critique code, a delimiter (/), and
the responsible technologist's identification number are inserted before the contents of the
Patient Name field at the PS.
Segregating Rejected Images. A rejected image sent to PACS from the PS would normally join
the diagnostic images in the patient's examination folder. A procedure was developed to modify
specific fields to indicate that the examination is a reject: the text string "NONE" is inserted in
front of the contents of the Patient Name, Medical Record Number, and Accession Number fields.
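The two labeling steps above (critique code and technologist ID into the Patient Name, then the "NONE" prefix on the validated fields) can be sketched as follows. A plain dictionary stands in for the demographic fields edited at the Processing Station; the field names and sample values are illustrative, not the Agfa PS interface.

```python
def label_reject(demographics, critique_code, tech_id):
    """Return a copy of the demographic fields labeled as a reject."""
    tagged = dict(demographics)
    # Step 1: critique code, a "/" delimiter, and the responsible
    # technologist's ID are inserted before the Patient Name contents.
    tagged["PatientName"] = f"{critique_code}/{tech_id} {tagged['PatientName']}"
    # Step 2: "NONE" is prepended to the fields the RIS interface
    # validates, so the image fails validation and is sequestered.
    for field in ("PatientName", "MedicalRecordNumber", "AccessionNumber"):
        tagged[field] = "NONE" + tagged[field]
    return tagged

exam = {"PatientName": "DOE^JANE",
        "MedicalRecordNumber": "123456",
        "AccessionNumber": "A789"}
# Code 22 (mis-positioned) rejected by hypothetical technologist 11:
print(label_reject(exam, critique_code=22, tech_id=11))
```

Because the modified accession number no longer matches any RIS order, the image fails validation on arrival in PACS, which is exactly the behavior the procedure relies on.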
When these modified images are sent to PACS, they fail validation by the RIS interface and are
sequestered from the view of PACS users.
Releasing the Rejected Images for Archiving and Review. A PACS Analyst with
Administrative privileges retrieves the sequestered rejected images, modifies the Examination
Description field by inserting the text string "NONE-", and forces the image into the public area.
(It is unfortunate that the PS does not allow the user to edit the Examination Description field.
If this field were editable, the technologist would be able to prepend the text "NONE" to its
contents; once the image was archived, it would route automatically to its destinations, with no
need for manual modification by the PACS Analyst.)
At this point the rejected image is disseminated according to rules established in the IMPAX
Routing Tables (Table IV). To avoid widespread dissemination of rejected images to clients
throughout the PACS network, the Routing Tables were extensively modified to send rejects only
to the Archive Servers, where they are automatically recorded on Magneto-Optical Disk
(MOD) and tape media.7 Modifying Routing Tables that were appropriate for clinical
imaging operations was a major effort, and warrants further explanation in the discussion that
follows.
Before reject images were sent from the acquisition station, the specialty field in the Routing
Pattern table would indicate "Don't Care" (it does not matter what specialty the image is coming
from that station), as shown in Table IV. A new specialty called NONE was created, and the routing
tables assign this specialty to any examination procedure whose description contains the text
"NONE". Unfortunately, the routing pattern of our PACS system does not allow exclusive routing by
specialty. It allows a specific specialty to be routed to a certain destination; for example, route
ONLY FLUORO cases to a specific review station. But the design of our routing
tables does not allow all specialties EXCEPT a specific specialty (in our case, the NONE specialty)
to route to a destination, unless we create an entry for EACH specialty.
As an example, in Table IV the CR modality has 14 different specialties. An entry for each
specialty was created to route to the NICU review station (patient location 'NEO'). The NONE
specialty was not included, so that the NICU physician cannot view the rejected images.
If there is more than one patient location for an area, e.g., ER and EMC for the Emergency
Room, then 14 entries must be created for EACH patient location, in this case
a total of 28 entries for just one destination, as demonstrated in Table V.
The growth of the routing tables required to accommodate the rejected images in our archive
servers was multiplicative, dependent on the count of each criterion: specialty, referring
physician, and patient location. The more specialties there are, the more entries must be created
in the routing table for each referring physician and/or patient location.
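The growth described above is easy to model: because the Routing Pattern cannot express "every specialty EXCEPT NONE," each destination that must be shielded from rejects needs one entry per remaining specialty at each of its patient locations. A minimal sketch, using the counts from the text:

```python
def entries_needed(non_none_specialties, patient_locations):
    """Routing-table entries required to shield one destination,
    given the number of specialties excluding NONE and the number of
    patient locations feeding that destination."""
    return non_none_specialties * patient_locations

# 14 CR specialties (NONE excluded): the NICU review station, with one
# patient location ('NEO'), needs 14 entries; the Emergency Room, with
# two locations (ER and EMC), needs 28 entries for a single destination.
print(entries_needed(14, 1))  # 14
print(entries_needed(14, 2))  # 28
```

Multiplying again by the number of destinations and referring-physician rules gives the table growth the text describes.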
Analysis of Rejected Images. Once a system was in place for documenting and preserving
rejected images, tools were needed to interrogate the image database and collect meaningful data
for Performance Improvement of imaging operations.
A script using Structured Query Language (SQL) queries was written to query the IMPAX image
database and generate a report of all the archived NONE files in a month. This monthly report
is then imported into an Access database, which includes the following fields:
date of examination
time of examination
modality
technologist number
accession number
examination procedure
number of images
and reject code.
All the data for these fields are retrieved from the IMPAX image database.
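The monthly NONE report might look like the sketch below. The table and column names here are hypothetical stand-ins for the IMPAX image database schema (which we cannot reproduce); an in-memory SQLite database substitutes for the production Oracle instance purely to make the query runnable.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical schema mirroring the fields listed above.
conn.execute("""CREATE TABLE archived_image (
    exam_date TEXT, exam_time TEXT, modality TEXT, tech_number TEXT,
    accession_number TEXT, exam_procedure TEXT, num_images INTEGER,
    reject_code TEXT)""")
conn.executemany(
    "INSERT INTO archived_image VALUES (?,?,?,?,?,?,?,?)",
    [("2001-08-02", "09:15", "CR", "23", "NONEA100", "NONE-CHEST", 1, "22"),
     ("2001-08-07", "14:40", "CR", "11", "NONEA101", "NONE-ABDOMEN", 2, "23"),
     ("2001-08-09", "11:05", "CR", "23", "A102", "CHEST", 2, None)])

# Select one month's NONE records for import into the Access database.
rows = conn.execute(
    """SELECT exam_date, exam_time, modality, tech_number,
              accession_number, exam_procedure, num_images, reject_code
       FROM archived_image
       WHERE accession_number LIKE 'NONE%'
         AND exam_date BETWEEN '2001-08-01' AND '2001-08-31'
       ORDER BY exam_date""").fetchall()
print(len(rows))  # two reject records in the sample month
```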
RESULTS AND DISCUSSION
Various statistical reports were generated from the NONE Access database. These include:
the number of rejected CR images for the year 2001 and its percentage of the total number of
images in the archive (Table VI);
the number of rejected CR images broken down by reject code (Table VII);
the number of rejected CR images per shift and its percentage of the total number of images in
the archive per shift (Table VIII);
the number of rejected CR images by technologist number (Table IX);
and the number of rejected CR images by examination description (Table X).
Table VI shows the number of rejected CR images for the year 2001 and their percentage of the
total number of images archived in each month. The data yield an overall CR reject rate of 4.07%
for the year 2001. The average change between monthly rates is 0.53%, with a maximum of 1.13%
between August and September. Figure 1 is the chart representation of Table VI.
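Table VI's summary figures can be re-derived directly from its monthly counts; this quick check reproduces the overall 2001 reject rate and the largest month-to-month change reported above.

```python
# Monthly counts from Table VI (Jan through Dec 2001).
rejects = [559, 617, 541, 573, 456, 493, 446, 458, 622, 611, 542, 613]
archived = [13028, 12931, 13388, 12966, 13336, 12074,
            12520, 12842, 13235, 15206, 14428, 14667]

monthly_pct = [100 * r / a for r, a in zip(rejects, archived)]
overall_pct = 100 * sum(rejects) / sum(archived)
max_change = max(abs(b - a) for a, b in zip(monthly_pct, monthly_pct[1:]))

print(round(overall_pct, 2))  # 4.07
print(round(max_change, 2))   # 1.13 (between August and September)
```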
Table VII reports the number of rejected CR images broken down by reject reason code for the
year 2001. The codes listed are based on Table III, the Radiologist Examination Critique List;
codes for OTHER (code number 45) and NOT INDICATED (code number 46) were added to this
list. According to these data, the most common reason for rejecting an image is mis-positioning,
at 62% of the total number of rejects. Inadequate inspiration comes in second at 8.73% and not
enough contrast at 6.74%, while 5.7% of the rejects were not labeled with a reject code. Two
separate studies conducted at other institutions within the past 5 years also reported positioning
as the top reason for their repeated examinations, one at 57.19%4 and the other at 46.9%.6
Table VIII is a sample report of the number of rejects by shift; Figure 2 charts the percentages
from Table VIII. These reports show that the weekend shift consistently had the highest number of
rejects throughout the year.
Table IX shows a portion of the report listing technologist numbers and the number of rejected
images each archived over a period of time, further broken down by reject code. The report
consistently shows MISPOSITIONED as the most common reason for rejecting an image among
the technologists. The number of images rejected by a technologist will be compared with the
number of images in the examinations that technologist actually performed during the same
period; the latter piece of information still needs to be entered in the NONE database in order to
compute this ratio.
Under-reporting of rejected images, toleration of unacceptable images, and inconsistent
adherence to the procedure for sending rejected images to the archive could all contribute to
inaccuracy in the statistics. At the processing station, the technologist may send an image
directly to the trashcan; once that trashcan is emptied, the image is deleted for good. This is
one example of under-reporting of rejected images.
Table X shows that CHEST examinations have the highest number of repeats at 51.66%, while
ABDOMEN procedures are at 9.97%; 9.69% of the rejects were not given an examination
description by the technologists (NOT INDICATED).
The list in Table III has been modified by the area supervisors to meet the technologists' need to
be more specific and descriptive with their reasons for rejecting images. A number of the reasons
were broken down in more detail to cover the various scenarios for doing repeats; this modified
reject code list will also help the technologists be consistent. Other reasons for repeats that
have been discovered are:
'double exposure' (as opposed to 'duplicate images'),
'wrong marker' (as opposed to 'no marker'),
'patient mis-positioned' and 'cassette mis-positioned',
equipment malfunction,
high / low lgM (instead of too much contrast or not enough contrast),
breaking down 'artifacts' into 'patient', 'cassette', or 'equipment',
radiologist request to reject,
and test images.
The categories of 'Availability', 'Identification', 'Appropriateness of Exam', and 'Diagnostic'
have been modified by omitting reasons that are not used for rejecting an image. Table XI is the
modified reject code list that will be used beginning this year.
There is more work to be done. A couple of pieces of information may be added to the Access
database of NONE files. The first is the area where the examination was performed
(e.g., portable, main radiology/fluoroscopy, outpatient), which will be useful for the
supervisors of those areas. The second is the lgM value of every rejected CR image, which can be
extracted from its DICOM header; from this, the distribution of lgM values can be reported and
analyzed. Automation of data transfer from the IMPAX database to the Access database of NONE
files will also be explored. The PACS team will continue to extract the data into the NONE Access
database, from which the management of Diagnostic Imaging may generate statistical reports for
their own use. The procedure for labeling the rejected images with 'NONE' will also be clarified,
for consistency and to eliminate the possibility of any more records being recorded as
'NOT INDICATED'. Changes are expected as we evolve through this electronic process of
accommodating rejected images.
These preliminary statistical reports were presented to Diagnostic Imaging's management and staff,
and a number of them have viewed the rejected images. These reports, as a source for Quality
Improvement (QI), will be analyzed to investigate the causes of rejects and find ways to eliminate
them. Retraining of staff and other corrective actions may have to be implemented and
documented.
The next focus of our department's QI efforts in Reject Analysis is expanding this process to
other areas and modalities that involve some form of radiation, such as Computed Tomography (CT),
Nuclear Medicine (NM), and the scout images of Fluoroscopy examinations (not all fluoroscopy
images are archived). How the technologists of these areas will send their reject images will
have to be reviewed and documented. This includes determining where in their imaging chain
we accommodate the reject analyses. Their data will be placed in the same database as the reject
data for CR. Reports will also be generated and shared with the supervisors of these areas.
Of course, all these accommodations for reject analysis have been based on the current software
versions of the acquisition stations, processing stations, and the PACS archive; they may not
work on the next version of IMPAX or of the processing stations. As we upgrade each system, we
will have to review the new versions to assure that we can continue the procedure we have
developed for reject analysis.
We also intend to share our results with our vendors. They may be able to assist us in
accommodating reject images electronically, hopefully through a user-friendly mechanism, if not
in the current versions then perhaps in future versions of their systems. Vendors play a vital
role in the world of reject analysis.8 A few suggestions: allow specific fields to be edited to
include 'NONE-' labels, route rejected images automatically, and prevent deletion of such images
at the processing or quality control station.
The Diagnostic Imaging service formed a team to lower the reject rate. This team consists of all
Team Leaders from Nuclear Medicine, Ultrasound, Cat Scan, Portable X-ray, Outpatient,
Magnetic Resonance (MR), Main Radiology, and the PACS team. The lists of rejects (NONE
files) included discarded images from each modality. Team leaders were surprised to find that
some MR Technologists routinely acquire additional series rather than following the appropriate
clinical protocol. When these Technologists determined which Radiologist was to interpret the
examination, they had the PACS Analyst split away the images in excess of that individual
Radiologist's protocol. The Team Leader commented that this practice contributed to extended
patient scan times for a service already suffering from a substantial
backlog, as well as wasting PACS Analysts' time and PACS archive space. This finding reinforced
the idea that reject analysis is valuable even for modalities that do not involve ionizing radiation.
CONCLUSIONS
Reject Analysis must be conducted routinely regardless of whether conventional film/screen
radiography or CR is used. Attention to the sources and frequency of rejects can dramatically
improve routine image quality and provide a basis for in-service training of the individual
technologist, resulting in better patient care. The results of our reject analysis led our
department to modify our entry training program for new Technologists to emphasize averting the
most common mis-positioning errors; this is also included in evaluating job competency at 90 days
post employment. The efforts of the PACS Analysts in compiling reject reports from the image
database are wasted unless administrators are willing to implement methods of addressing the
causes of rejects. Team leaders are also key to this method: they are the ones who assure that
rejects are properly reported, and they are the ones who determine whether an individual
technologist or a group of technologists needs additional training.
The purpose, methodology, and importance of reject analysis must be emphasized to PACS
vendors so they can incorporate it in their software.
REFERENCES
1. Peer, S., Peer, R., Walcher, M., Pohl, M., and Jaschke, W.: Comparative reject analysis in
conventional film-screen and digital storage phosphor radiography. Eur. Radiol. 9, 1693-1696
(1999).
2. Shook, K.A., O'Neall, D., and Honea, R.: Challenges in the integration of PACS and RIS
databases. Journal of Digital Imaging Vol. 11 No. 3 Suppl 1 (August) 1998: pp. 75-79.
3. Willis, C.E., Parker, B.R., Orand, M., and Wagner, M.L.: Challenges for pediatric radiology
using computed radiography. Journal of Digital Imaging Vol. 11 No. 3 Suppl 1 (August) 1998:
pp. 156-158.
4. Willis, C.E., Mercier, J., and Patel, M.: Modification of conventional quality assurance
procedures to accommodate computed radiography. 13th Conference on Computer
Applications in Radiology. Denver, Colorado. June 7, 1996. pp. 275-281.
5. Willis, C.E.: Computed radiographic imaging and artifacts. Chapter 7 in Filmless Radiology.
New York: Springer-Verlag. pp. 137-154. 1999.
6. Willis, C.E.: Computed Radiography: QA/QC. In Practical Digital Imaging and PACS.
Medical Physics Monograph No. 28. Madison: Medical Physics Publishing. pp. 157-175. 1999.
7. Willis, C.E., McCluggage, C.W., Orand, M.R., and Parker, B.R.: Puncture Proof Picture
Archiving and Communications Systems. Journal of Digital Imaging Vol. 14 No. 2 Suppl 1
(June) 2001: pp. 66-71.
8. Barnes, Eric: In digital radiology, QA means never having to say you're sorry.
September 19, 2000. http://www.auntminnie.com/index.asp?sec=sea&sub=res
Table I. Agfa Processing Station Features
Demographic editing
Image modifications:
- Reorientation
- Annotation
- Window and level
- Collimation
- Exposure field mask or removal
- Examination menu selection
- Measuring distance
- Invert
- Orientation change
- Sensitometry curve selection
- Image processing (MUSICA: MultiScale Image Contrast Amplification)
Table II. Reasons for Repeated Examinations in a Conventional Department
Artifacts
Mis-positioning
Over-collimation
Patient motion
Double exposure
Inadequate inspiration
Overexposed - too dark
Underexposed - too light
Marker missing or wrong
Wrong examination
Wrong patient
Film lost in processor
Table III: Radiologist Examination Critique List
RADIOLOGIST EXAM CRITIQUE
Media Comment Category Fault Specification Dictation Code
F =Film Availability Current Exam Not Local 1
S =Soft Copy Not on System or Cache 2
Prior Exam Not Local 3
Not on System 4
Number of Images Missing Images 5
Duplicate Images 6
Image Sequence Wrong Sequence 7
Combined Exam 8
Identification Patient Identification Wrong MRN 9
Wrong Patient 10
Wrong Name 11
Wrong DOB 12
Exam Information Wrong Accession Number 13
Wrong Exam Procedure 14
Annotation Incorrect Orientation 15
Improper Placement of Marker 16
No Marker 17
Appropriateness of Exam Wrong Diagnosis 18
Wrong Exam Performed 19
Inappropriate Exam 20
No History Provided 21
Technical Mis-positioned 22
Inadequate Inspiration 23
Motion 24
Collimation Not Enough 25
Too Much 26
No Collimation 27
Shielding Image Artifacts (Holding) 28
Inappropriate Shielding 29
No Shielding 30
Quality Density Too Dark 31
Too Light 32
Blurred Monitor 33
Image 34
Contrast Too Much Contrast 35
Not Enough Contrast 36
Noisy 37
Image Size Magnified 38
Minified 39
Artifact (Note: Please Specify) 40
Diagnostic Repeat: Non Diagnostic 41
Save For Teaching File 42
Save for Green Dot 43
Table IV: Routing Pattern, not allowing the routing of a rejected image to the user, but only
to the archive servers.
Table V: An entry for each specialty with each patient location or referring physician for
each destination is created on the routing table, giving an exponential growth to the table.
Table VI: Number of Rejected Images reported for the year 2001. Modality: Computed Radiography (CR).
Month:                        Jan   Feb   Mar   Apr   May   Jun   Jul   Aug   Sept  Oct   Nov   Dec   | TOTAL  | AVERAGE
# of NONE Images:             559   617   541   573   456   493   446   458   622   611   542   613   | 6531   | 544
# of Total Images in Archive: 13028 12931 13388 12966 13336 12074 12520 12842 13235 15206 14428 14667 | 160621 | 13385
%:                            4.29  4.77  4.04  4.42  3.42  4.08  3.56  3.57  4.70  4.02  3.76  4.18  | 4.07   | 4.07
Figure 1: Monthly CR rejects for the year 2001 versus the total number of images archived.
[Chart: "Monthly Distribution of Rejects" - % of rejected images to total archived images by
month, Jan through Dec; values as in the % row of Table VI.]
Table VII: Number of Images per Reject Code for 2001.
Sum of Reject Reason (CR, By Image)
Reject Code | Reject Description | Number of Images | Percentage of Total # of Rejects (%)
22 | MISPOSITIONED | 4035 | 61.78
23 | INADEQUATE INSPIRATION | 570 | 8.73
36 | CONTRAST--NOT ENOUGH CONTRAST | 440 | 6.74
46 | NOT INDICATED | 372 | 5.70
40 | ARTIFACT | 286 | 4.38
35 | CONTRAST--TOO MUCH CONTRAST | 139 | 2.13
24 | MOTION | 131 | 2.01
41 | REPEAT: NON DIAGNOSTIC | 83 | 1.27
6 | # OF IMAGES--DUPLICATE IMAGES | 74 | 1.13
32 | DENSITY--TOO LIGHT | 69 | 1.06
16 | ANNOTATION--IMPROPER PLACEMENT | 75 | 1.15
17 | ANNOTATION--NO MARKER | 54 | 0.83
34 | BLURRED--IMAGE | 28 | 0.43
26 | COLLIMATION--TOO MUCH | 23 | 0.35
19 | WRONG EXAM PERFORMED | 24 | 0.37
28 | SHIELDING--IMAGE ARTIFACTS | 21 | 0.32
31 | DENSITY--TOO DARK | 13 | 0.20
15 | ANNOTATION--INCORRECT ORIENTATION | 13 | 0.20
33 | BLURRED--MONITOR | 11 | 0.17
5 | # OF IMAGES--MISSING IMAGES | 10 | 0.15
14 | EXAM INFORMATION--WRONG EXAM PROCEDURE | 8 | 0.12
29 | SHIELDING--INAPPROPRIATE SHIELDING | 7 | 0.11
10 | PATIENT ID--WRONG PATIENT | 7 | 0.11
43 | DUPLICATE | 7 | 0.11
21 | NO HISTORY PROVIDED | 5 | 0.08
25 | COLLIMATION--NOT ENOUGH | 5 | 0.08
11 | PATIENT ID--WRONG NAME | 4 | 0.06
45 | OTHER | 3 | 0.05
20 | INAPPROPRIATE EXAM | 3 | 0.05
2 | CURRENT EXAM NOT ON SYSTEM OR CACHE | 2 | 0.03
7 | IMAGE SEQUENCE--WRONG SEQUENCE | 2 | 0.03
13 | EXAM INFORMATION--WRONG ACCESSION NUMBER | 2 | 0.03
38 | IMAGE SIZE--MAGNIFIED | 1 | 0.02
8 | COMBINED EXAM | 1 | 0.02
42 | SAVE FOR TEACHING FILE | 1 | 0.02
27 | COLLIMATION--NO COLLIMATION | 1 | 0.02
4 | PRIOR EXAM NOT ON SYSTEM | 1 | 0.02
TOTAL | | 6531 |
Table VIII: CR Rejects compared with # of Archived Images by SHIFT for the year 2001.
(Columns: Jan Feb March April May June July Aug Sept Oct Nov Dec | TOTAL | AVERAGE)
Shift 1 (7 am - 3 pm)
# of NONE Images:                  227  274  175  195  199  205  161  177  224  240  233  269  | 2579  | 215
Total # of Images in PACS Archive: 5254 4986 5174 5122 5425 5014 4908 5494 4833 6178 5662 5298 | 63348 | 5279
% (NONE vs Total Images):          4.32 5.50 3.38 3.81 3.67 4.09 3.28 3.22 4.63 3.88 4.12 5.08 | 4.07
Shift 2 (3 - 11 pm)
# of NONE Images:                  142  143  142  180  102  107  96   137  188  165  136  151  | 1689  | 141
Total # of Images in PACS Archive: 4021 4012 3974 3857 4004 3351 3724 3816 4144 4638 4180 4109 | 47830 | 3986
% (NONE vs Total Images):          3.53 3.56 3.57 4.67 2.55 3.19 2.58 3.59 4.54 3.56 3.25 3.67 | 3.53
Shift 3 (11 pm - 7 am)
# of NONE Images:                  48   52   90   73   50   63   55   55   63   64   61   56   | 730   | 61
Total # of Images in PACS Archive: 1732 1754 1809 1724 1824 1537 1608 1515 1607 1991 2039 1936 | 21076 | 1756
% (NONE vs Total Images):          2.77 2.96 4.98 4.23 2.74 4.10 3.42 3.63 3.92 3.21 2.99 2.89 | 3.46
Weekends
# of NONE Images:                  142  148  134  125  105  118  134  89   147  142  112  137  | 1533  | 128
Total # of Images in PACS Archive: 2021 2179 2431 2263 2083 2172 2280 2017 2651 2399 2539 3324 | 28359 | 2363
% (NONE vs Total Images):          7.03 6.79 5.51 5.52 5.04 5.43 5.88 4.41 5.55 5.92 4.41 4.12 | 5.41
TOTAL
# of NONE Images:                  559  617  541  573  456  493  446  458  622  611  542  613  | 6531  | 544
Total # of Images in PACS Archive: 13028 12931 13388 12966 13336 12074 12520 12842 13235 15206 14420 14667 | 160613
Figure 2: CR Rejected Images by Shift for 2001.
[Chart: "% of CR Rejects by SHIFTs" - % of CR rejects over total archived images by shift
(SHIFT 1, SHIFT 2, SHIFT 3, WEEKEND), Jan through Dec; values as in the % rows of Table VIII.]
Table IX: A portion of the report showing the Number of Rejects broken down by
Technologist Number and Reject Codes.
Technologist Number | Reject Code | Number of Studies | Number of Images
6  |  6 |   4 |   4
11 | 22 |   3 |   3
   | 40 |   1 |   1
12 |  6 |   4 |   4
   | 17 |   3 |   3
   | 22 |  15 |  15
   | 23 |   1 |   1
   | 35 |   1 |   1
   | 40 |   1 |   1
16 | 16 |   1 |   1
21 | 34 |   1 |   1
22 | 22 |   7 |   7
23 |  6 |   1 |   1
   |  7 |   2 |   2
   | 16 |   1 |   1
   | 22 | 101 | 102
   | 23 |  18 |  18
   | 29 |   1 |   1
   | 31 |   2 |   2
   | 40 |   9 |   9
   | 41 |   2 |   2
Table X: CR Rejects by Exam Description
Exam Description | Number of Rejected Images | %
Chest | 3374 | 51.66
Abdomen | 651 | 9.97
NOT INDICATED | 633 | 9.69
Spine | 496 | 7.59
Upper_Ext | 426 | 6.52
Head | 412 | 6.31
Lower_Ext | 397 | 6.08
Pelvis | 106 | 1.62
Body | 18 | 0.28
Abdomen/KUB | 7 | 0.11
Renal | 6 | 0.09
Bone | 3 | 0.05
Neck | 2 | 0.03
TOTAL | 6531 |
Table XI: Reject Code List
REJECT CODE LIST
Comment Category Fault Specification Dictation Code
Images Number of Images Missing Images 1
Duplicate Images 2
Image Sequence Wrong Sequence 3
Combined Exam 4
Identification Patient Identification Wrong Patient 5
Exam Information Wrong Exam Procedure 6
Annotation Incorrect Orientation 7
Improper Placement of Marker 8
No Marker 9
Wrong Marker 10
Appropriateness of Exam Wrong Exam Performed 11
Technical Mis-positioned Patient 12
Cassette 13
Inadequate Inspiration 14
Motion 15
Collimation Not Enough 16
Too Much 17
No Collimation 18
Shielding Image Artifacts (Holding) 19
Inappropriate Shielding 20
No Shielding 21
Double Exposure 22
Equipment Malfunction 23
Quality Density Too Dark 24
Too Light 25
Blurred Monitor 26
Image 27
Contrast High LGM 28
Low LGM 29
Noisy 30
Image Size Magnified 31
Minified 32
Artifact Patient 33
Cassette 34
Equipment 35
Diagnostic Radiologist's Request to Reject 36
Save for Green Dot/Teaching File 37
Test 38
Other 39