
Safety device studies


JOURNAL LITERATURE – FULL ABSTRACT – 1987 to August 2005

This list of journal articles duplicates the document “Abstract Listing” but includes each article's abstract. The articles pertain to the review, selection, testing, or evaluation of safer needle devices. This list is not meant to be definitive or exhaustive; as new devices are released, new articles can be expected to be published.

1.

Comparison of a needleless system with conventional heparin locks.

Adams KS, Zehrer CL, Thomas W.

Department of Nursing, University of Minnesota Hospital and Clinic, Minneapolis.

American Journal of Infection Control 1993 Oct;21(5):263-9

BACKGROUND: Despite the improvements in needle disposal systems, needlesticks to health care workers continue to occur at unacceptably high rates. Needleless systems have been shown to reduce the risk of needlesticks.

METHODS: This pilot study examined the safety of such a system for patients by comparing the rates of intravenous infection-related indicators between a conventional heparin lock and a needleless system. Patients (n = 97) were categorized on the basis of the duration of intravenous placement into 24-, 48-, and 72-hour groups. Within each group, half of the patients received conventional heparin locks and half received the needleless system. Intravenous infection-related indicators included catheter tip culture, adaptor fluid culture, intravenous site erythema, induration and tenderness, and elevated oral temperature.

RESULTS: Prevalence of one or more indicators was 48% for the conventional and 40% for the needleless system, a difference that was not statistically significant.

CONCLUSIONS: The needleless system appeared to pose no greater risk of infection to patients and nurses preferred it for its reduced risk of potential needlesticks.

2.

Guarded fistula needle reduces needlestick injuries in hemodialysis

Adams T, McCleary J, Caldero K.

Sacred Heart Medical Center, Spokane, Wash., USA.

Nephrol News Issues 2002 May;16(6):66-70, 72

This information is provided by the Premier Safety Institute to assist in the review of sharps injury prevention research—and is not meant to be all-inclusive. For a detailed analysis of the research, refer to the original article. Please contact the Premier Safety Institute at [email protected] if you believe an important article is missing.


Use of large-gauge, hollow-bore, arteriovenous fistula needles (AVFNs) and high-pressure accesses are unique factors inherent to the hemodialysis (HD) setting. The dialysis patient population has a higher incidence of hepatitis C (HCV) than the general population (8.4% compared to 1.8%) and the incidence of Human Immunodeficiency Virus (HIV) has increased tenfold from 1985 to 2000. HD health care workers (HCWs) are twice as likely to sustain a high-risk needlestick injury (NSI) as HCWs in all other settings. All of these factors leave HD HCWs at a high risk of exposure to bloodborne pathogens (BBPs). Although published data on NSI reduction with guarded AVFNs is lacking, many HD facilities have rushed to implement guarded AVFNs to comply with Occupational Safety and Health Administration's (OSHA) newly revised Bloodborne Pathogens Standard (29 CFR 1910.1030). For this study, we evaluated the effectiveness of one design of AVFN guard (MasterGuard Anti-Stick Needle Protector, Medisystems Corporation) by comparing its NSI rate to that of unguarded AVFNs. The unguarded AVFN injury rate was 8.58 NSIs per 100,000 unguarded AVFNs (in 81,534 cannulations) compared to zero NSIs per 100,000 guarded AVFNs (in 54,044 cannulations). The guarded AVFN showed a statistically significant NSI reduction compared to the unguarded AVFN (p < 0.029). This study demonstrates that using a guarded AVFN will help reduce HCWs' risk of exposure to BBPs in the dialysis setting.
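
As a back-of-the-envelope check of the rates reported above (the injury counts here are inferred from the published rates and denominators, not taken from the paper), the arithmetic can be sketched as:

```python
# Sketch: needlestick-injury (NSI) rates per 100,000 cannulations, using
# counts inferred from the abstract's reported rates (an assumption).
unguarded_cannulations = 81_534
guarded_cannulations = 54_044

# 8.58 NSIs per 100,000 unguarded cannulations implies about 7 injuries.
unguarded_injuries = round(8.58 / 100_000 * unguarded_cannulations)
guarded_injuries = 0  # zero NSIs reported with the guarded design

def rate_per_100k(injuries: int, cannulations: int) -> float:
    """NSIs per 100,000 cannulations."""
    return injuries / cannulations * 100_000

print(unguarded_injuries)                                     # 7
print(round(rate_per_100k(unguarded_injuries, unguarded_cannulations), 2))
print(rate_per_100k(guarded_injuries, guarded_cannulations))  # 0.0
```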

3.

A comprehensive approach to percutaneous injury prevention during phlebotomy: results of a multicenter study, 1993-1995.

Alvarado-Ramy F, Beltrami EM, Short LJ, Srivastava PU, Henry K, Mendelson M, Gerberding JL, Delclos GL, Campbell S, Solomon R, Fahrner R, Culver DH, Bell D, Cardo DM, Chamberland ME.

Division of Healthcare Quality Promotion, National Center for Infectious Diseases, Centers for Disease Control and Prevention, Atlanta, Georgia 30333, USA.

Infect Control Hosp Epidemiol 2003 Feb;24(2):97-104

OBJECTIVE: To examine a comprehensive approach for preventing percutaneous injuries associated with phlebotomy procedures.

DESIGN AND SETTING: From 1993 through 1995, personnel at 10 university-affiliated hospitals enhanced surveillance and assessed underreporting of percutaneous injuries; selected, implemented, and evaluated the efficacy of phlebotomy devices with safety features (i.e., engineered sharps injury prevention devices [ESIPDs]); and assessed healthcare worker satisfaction with ESIPDs. Investigators also evaluated the preventability of a subset of percutaneous injuries and conducted an audit of sharps disposal containers to quantify activation rates for devices with safety features.


RESULTS: The three selected phlebotomy devices with safety features reduced percutaneous injury rates compared with conventional devices. Activation rates varied according to ease of use, healthcare worker preference for ESIPDs, perceived "patient adverse events," and device-specific training.

CONCLUSIONS: Device-specific features and healthcare worker training and involvement in the selection of ESIPDs affect the activation rates for ESIPDs and therefore their efficacy. The implementation of ESIPDs is a useful measure in a comprehensive program to reduce percutaneous injuries associated with phlebotomy procedures.

4.

Prevention of needle-stick injury. Efficacy of a safeguarded intravenous cannula

Asai T, Matsumoto S, Matsumoto H, Yamamoto K, Shingu K.

Department of Anaesthesiology, Kansai Medical University, Osaka, Japan.

Anaesthesia 1999 Mar;54(3):258-61

One possible method of reducing the incidence of needle-stick injury is to use needles with safeguard mechanisms. The needle of the Insyte AutoGuard intravenous cannula can be retracted into the safety barrel. One hundred patients were randomly allocated to receive either an 18-gauge conventional Insyte intravenous cannula (group C) or the AutoGuard cannula (group AG) to assess the ease of use and efficacy of the AutoGuard device. It was possible to insert the cannula into the vein within two attempts in all patients; there was no significant difference between the two groups with respect to ease of insertion. No problems, such as inadvertent withdrawal of the needle, occurred during insertion in any patient. Handling the withdrawn needle was judged significantly safer in group AG than in group C (p < 0.001). Blood contamination often occurred where a withdrawn needle was placed in group C, whereas no blood stain was detected in any case in group AG (p < 0.001). The AutoGuard cannula provides safer handling of a withdrawn needle without reducing its ease of insertion.

5.

Differences in Percutaneous Injury Patterns in a Multi-Hospital System

Babcock HM, Fraser V.

Infect Control Hosp Epidemiol 2003 Oct;24(10):731-736

OBJECTIVE: Determine differences in patterns of percutaneous injuries (PIs) in different types of hospitals.

DESIGN: Case series of injuries occurring from 1997 to 2001.


SETTING: Large Midwestern healthcare system with a consolidated occupational health database from 9 hospitals, including rural and urban, community and teaching (1 pediatric, 1 adult) facilities, ranging from 113 to 1,400 beds.

PARTICIPANTS: Healthcare workers injured between 1997 and 2001.

RESULTS: Annual injury rates for all hospitals decreased during the study period from 21 to 16.5/100 beds (chi-square for trend = 22.7; P = .0001). Average annual injury rates were higher at larger hospitals (22.5 vs 9.5 PIs/100 beds; P = .0001). Among small hospitals, rural hospitals had higher rates than did urban hospitals (14.87 vs 8.02 PIs/100 beds; P = .0143). At small hospitals, an increased proportion of injuries occurred in the emergency department (13.7% vs 8.6%; P = .0004), operating room (32.3% vs 25.4%; P = .0002), and ICU (12.3% vs 9.4%; P = .0225), compared with large hospitals. Rural hospitals had higher injury rates in the radiology department (7.7% vs 2%; P = .0015) versus urban hospitals. Injuries at the teaching hospitals occurred more commonly on the wards (28.8% vs 24%; P = .0021) and in ICUs (11.4% vs 7.8%; P = .0006) than at community hospitals. Injuries involving butterfly needles were more common at pediatric versus adult hospitals (15.8% vs 6.5%; P = .0001). The prevalence of source patients infected with HIV and hepatitis C was higher at large hospitals.

CONCLUSIONS: Significant differences exist in injury rates and patterns among different types of hospitals. These data can be used to target intervention strategies.

6.

Evaluation of a needle-free intravenous access system

Beason R, Bourguignon J, Fowler D, Gardner C.

J Intraven Nursing 1992 Jan-Feb;15(1):11-6

Needle-stick injuries are one of the most severe hazards faced by nurses today. The most physically and emotionally devastating type of injury is from a needle contaminated with human immunodeficiency virus (HIV), but far more likely to occur is infection with other blood-borne pathogens, especially hepatitis B. The daily threat of needle-stick injuries adds yet another dimension of concern to the stresses inherent in working in a health care facility. Health care workers at greatest risk are those who manipulate needles and draw blood samples on a regular basis. With this concern in mind, a study was launched to evaluate a needle-free I.V. access system with respect to the following research objectives: 1) to assess the prevention/reduction of needle-stick risks and injuries; 2) to identify associated reduction in expenses; 3) to evaluate product implementation and ease of use; and 4) to assess nursing satisfaction levels. This article describes the methodology used and the results of the study.

7.


The use of a surgical assist device to reduce glove perforations in post delivery vaginal repair: a randomized controlled trial

Bebbington MW, Treissman MJ.

Department of Obstetrics and Gynecology, British Columbia Women's Hospital, University of British Columbia.

Am J Obstet Gynecol 1996 Oct;175(4 Pt 1):862-6

OBJECTIVE: Our purpose was to evaluate the effectiveness of a surgical assist device, SutureMate, in decreasing glove perforations during post delivery vaginal repair.

STUDY DESIGN: This was a prospective randomized trial. After delivery, surgeons who needed to perform vaginal repair were randomized to use the surgical assist device or to perform the repair in the usual fashion. After the repair, gloves were collected and the operator was asked to complete a standardized data form that was submitted with the gloves. The gloves were tested for perforations within 24 hours by the Food and Drug Administration-approved hydrosufflation technique. Comparisons were made with chi(2) statistics with p < 0.01 taken as being statistically significant with the use of a Bonferroni adjustment for multiple comparisons.

RESULTS: A total of 476 glove sets were evaluated. The use of the surgical assist device significantly reduced the overall glove perforation rate from 28.3% in the control arm to 8.4% in the study arm (p = 0.0001). Rates of perforation varied with level of training and expertise but fell in all groups that used the device. Family physicians had the highest perforation rate in the control arm and benefited most from the device. A total of 76% of perforations were located in the thumb, index, and second fingers of the nondominant hand. Perforations were recognized in only 16% of the glove sets. The level of satisfaction with the device was mixed, but overall 50% of operators indicated that they were either satisfied or very satisfied with the device.

CONCLUSION: The rate of glove perforation in post delivery vaginal repair is high. The surgical assist device significantly reduced the rate of glove perforations.

8.

Evaluation of Blunt Suture Needles in Preventing Percutaneous Injuries Among Health-Care Workers During Gynecologic Surgical Procedures - New York City, March 1993 - June 1994

Centers for Disease Control and Prevention. (1997). Evaluation of Blunt Suture Needles in Preventing Percutaneous Injuries Among Health-Care Workers During Gynecologic Surgical Procedures - New York City, March 1993 - June 1994. Morbidity and Mortality Weekly Report, 46(2), 25-28.

9.


Evaluation of Safety Devices for Preventing Percutaneous Injuries Among Health-Care Workers During Phlebotomy Procedures-Minneapolis-St. Paul, New York City, and San Francisco, 1993-1995

Centers for Disease Control and Prevention. (1997). Evaluation of Safety Devices for Preventing Percutaneous Injuries Among Health-Care Workers During Phlebotomy Procedures-Minneapolis-St. Paul, New York City, and San Francisco, 1993-1995. Morbidity and Mortality Weekly Report, 46(2), 21-25.

10.

Selection of needlestick prevention devices: a conceptual framework for approaching product evaluation.

Chiarello LA.

Infection Control Program, New York State Department of Health, Albany, NY 12237, USA.

American Journal of Infection Control 1995 Dec;23(6):386-95

Needlestick injuries have been associated with blood-borne disease transmission to health care workers. A demand for a safer work environment has contributed to a proliferation of "safety" products. The selection and evaluation of these devices differs from traditional product evaluation in that it considers not only effectiveness in patient care but also health care worker safety and cost-effectiveness in terms of prevention gained. In addition, multiple devices associated with injuries and choices between passive, active, and accessory safety options require that institutions establish priorities for focusing intervention efforts. Selection of products must involve the primary users. Unless new devices are found acceptable for patient care, health care workers are likely to reject them, despite any apparent safety advantages. Five project steps help define a systematic approach for this process: (1) creation of a multidisciplinary team, (2) defining prevention priorities on the basis of collection and analysis of an institution's injury data, (3) development of design and performance criteria for product selection according to needs for patient care and health care worker safety, (4) planning and implementing an evaluation of products in clinical settings, and (5) analyzing product performance and cost-effectiveness to choose the product. Several methodologic issues raise questions for future research in the area of product evaluation, including the selection of study populations, methods of product distribution and data collection, and influence of institutional culture. In addition, there is a need to develop product-specific design and performance criteria by which evaluation teams can measure various technologies under consideration. Standardization of the product evaluation process for needlestick prevention technology should lead to the collection of information that can be compared across institutions. Infection control professionals have an important opportunity to assume a leadership role in this process.

11.


Needlestick injuries to nurses, in context

Clarke SP, Sloane DM, Aiken LH.

Center for Health Outcomes and Policy Research, School of Nursing, University of Pennsylvania, USA.

LDI Issue Brief 2002 Sep;8(1):1-4

Injuries with used needles and other "sharps" put health care workers at risk for serious bloodborne infections, such as HIV and hepatitis B and C. To some extent, this risk can be lessened through safer techniques (such as not recapping needles) and safer devices (such as needleless and self-sheathing equipment). But these injuries occur within a context (often a hospital unit) with organizational features that may themselves contribute to an increased or decreased risk.

This Issue Brief summarizes a series of studies that investigate whether workplace aspects of the hospital (such as staffing levels, and organizational structure and climate) affect the risk of needlestick injuries to nurses.

12.

The promise of novel technology for the prevention of intravascular device-related bloodstream infection. I. Pathogenesis and short-term devices

Crnich CJ, Maki DG.

Section of Infectious Diseases, Department of Medicine, University of Wisconsin Medical School, Madison, WI, USA.

Clinical Infectious Diseases 2002 May 1;34(9):1232-42

Intravascular devices (IVDs) are widely used for vascular access but are associated with substantial risk of development of IVD-related bloodstream infection (BSI). The development of novel technologies, which are based on an understanding of pathogenesis, promises a quantum reduction in IVD-related infections in an era of growing nursing shortages. Infections of short-term IVDs (that is, those in place <10 days), including peripheral venous catheters, noncuffed and nontunneled central venous catheters (CVCs), and arterial catheters, derive mainly from microorganisms colonizing the skin around the insertion site, which most often gain access extraluminally. More-effective cutaneous antiseptics, such as chlorhexidine, a chlorhexidine-impregnated sponge dressing, CVCs with an anti-infective coating, anti-infective CVC hubs, and novel needleless connectors, have all been shown to reduce the risk of IVD-related BSI in prospective randomized trials. The challenge for the future will be to identify new preventative technologies and to begin to adapt more widely those technologies already shown to be efficacious and cost-effective.

13.

The promise of novel technology for the prevention of intravascular device-related bloodstream infection. II. Long-term devices

Crnich CJ, Maki DG.

Section of Infectious Diseases, Department of Medicine, University of Wisconsin Medical School, Madison, WI, USA.

Clinical Infectious Diseases 2002 May 15;34(10):1362-8

Intravascular devices (IVDs) are widely used for vascular access but are associated with a substantial risk of IVD-related bloodstream infection (BSI). The development of novel technologies based on our understanding of pathogenesis promises a quantum reduction in IVD-related infections in an era of growing nursing shortage. Infections of long-term IVDs (most are in place for > or =10 days), including cuffed and tunneled central venous catheters (CVCs), implanted subcutaneous central venous ports, and peripherally inserted central catheters (PICCs), are primarily due to microorganisms that gain access to the catheter hub and lumen. Novel securement devices and antibiotic lock solutions have been shown to reduce the risk of IVD-related BSI in prospective randomized trials. The challenge for the future will be to identify new preventative technologies and to begin to more-widely adapt those technologies that have already been shown to be efficacious and cost effective.

14.

Accidental needlesticks in the phlebotomy service of the Department of Laboratory Medicine and Pathology at Mayo Clinic Rochester

Dale JC, Pruett SK, Maker MD.

Department of Laboratory Medicine and Pathology, Mayo Clinic Rochester, MN 55905, USA.

Mayo Clin Proc 1998 Jul;73(7):611-5

OBJECTIVE: To determine the change in accidental needlestick rates in the Phlebotomy Service at Mayo Clinic Rochester and to identify safety practices implemented from 1983 through 1996.

MATERIAL AND METHODS: We retrospectively reviewed yearly Phlebotomy Service accidental needlestick rates from 1983 through 1996. Interviews were conducted with representatives of the Infection Control Committee and the management team for the Phlebotomy Service, and minutes of meetings of these two groups were reviewed to identify implemented safety improvements that may have had an effect on accidental needlestick exposures.

RESULTS: Accidental needlestick exposures in the Phlebotomy Service declined from a high of 1.5/10,000 venipunctures to 0.2/10,000 venipunctures. Several safety improvements were made during that time, including the implementation of a one-handed recapping block, change to single-use evacuated tube holders, increased number and improved locations of disposal containers for needles, implementation of resheathing needles and retractable capillary puncture devices, discontinuation of the practice of changing needles before inoculation of blood culture bottles, increased emphasis on safety for new and experienced phlebotomists, and improved exposure reporting tools.

CONCLUSION: We believe that the decrease in our accidental needlestick exposure rate is correlated with the changes in education, practices, and products that we have implemented.

15.

Percutaneous Injury, Blood Exposure, and Adherence to Standard Precautions: Are Hospital-Based Health Care Providers Still at Risk?

Doebbeling BN, Vaughn TE, McCoy KD, Beekmann SE, Woolson RF, Ferguson KJ, Torner JC.

Clinical Infectious Diseases 2003 Oct;37:1006-13

To examine factors associated with blood exposure and percutaneous injury among health care workers, we assessed occupational risk factors, compliance with standard precautions, frequency of exposure, and reporting in a stratified random sample of 5123 physicians, nurses, and medical technologists working in Iowa community hospitals. Of these, 3223 (63%) participated. Mean rates of handwashing (32%-54%), avoiding needle recapping (29%-70%), and underreporting sharps injuries (22%-62%) varied by occupation (P<.01). Logistic regression was used to estimate the adjusted odds of percutaneous injury (aOR[injury]), which increased 2%-3% for each sharp handled in a typical week. The overall aOR[injury] for never recapping needles was 0.74 (95% CI, 0.60-0.91). Any recent blood contact, a measure of inconsistent use of barrier precautions, had an overall aOR[injury] of 1.57 (95% CI, 1.32-1.86); among physicians, the aOR[injury] was 2.18 (95% CI, 1.34-3.54). Adherence to standard precautions was found to be suboptimal. Underreporting was found to be common. Percutaneous injury and mucocutaneous blood exposure are related to frequency of sharps handling and inversely related to routine standard-precaution compliance. New strategies for preventing exposures, training, and monitoring adherence are needed.

16.

Effect of bedside needle disposal units on needle recapping frequency and needlestick injury


Edmond M, Khakoo R, McTaggart B, Solomon R.

Section of Infectious Diseases, West Virginia University Medical Center, Morgantown 26506.

Infect Control Hosp Epidemiol 1988 Mar;9(3):114-6

Needle recapping has been shown to be one of the leading causes of needlestick injuries. Frequency of recapping has not been reported. This study was designed to determine the frequency of needle recapping by nursing personnel and the effect of bedside needle disposal units on the frequency of recapping and needlesticks. Seventy-four nurses carrying out 312 activities involving use of needles were observed. The subjects were not aware of the nature of the study. The recapping frequency was 93.9%. The study was repeated after educational programs and following installation of a hospital-wide bedside needle disposal system. Fifty-three nurses performing 151 activities with needles were observed. Frequency of recapping was 94%. There was no significant difference in the rate of recapping or needlestick injuries after installation of the new needle disposal system. Educational programs regarding recapping, a very common practice, may be ineffective. Alternate methods for preventing needlesticks may be necessary.

17.

Predictive value of surveillance skin and hub cultures in central venous catheter sepsis

Fan ST, Teoh-Chan CH, Lau KF, Chu KW, Kwan AK, Wong KK.

Government Surgical Unit, Queen Mary Hospital, Hong Kong.

J Hosp Infect. 1988 Oct;12(3):191-8

In a prospective study of septic complications of central venous catheters used for total parenteral nutrition, daily surveillance catheter hub cultures and twice weekly skin cultures at the catheter entry site were evaluated for their predictive value for catheter sepsis, i.e. bacteraemia with an identical species as that recovered from the catheter tip, or catheters which grew greater than or equal to 15 cfus by a semiquantitative method and/or greater than or equal to 10(3) cfus by a quantitative method. Of 142 catheters studied, 29 were identified to have catheter sepsis. For these the sensitivity of the surveillance hub culture was 34.5% and the sensitivity of the skin culture was 37.9%. When either the hub or the skin culture result was considered as an indication of catheter sepsis, the sensitivity increased to 79.3%. The positive and negative predictive values of the combined result were 44.2% and 93.3%, respectively. This study suggests that simultaneous hub and skin cultures are required for a satisfactory surveillance.
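
The sensitivity and predictive values quoted above can be reproduced from a 2x2 table; the counts below are a reconstruction back-calculated from the reported percentages (29 septic catheters among 142), not figures taken from the paper:

```python
# Reconstructed 2x2 table for the combined hub-or-skin culture result
# (counts back-calculated from the abstract's percentages; an assumption).
TP, FP = 23, 29   # culture-positive: with / without catheter sepsis
FN, TN = 6, 84    # culture-negative: with / without catheter sepsis

sensitivity = TP / (TP + FN)   # fraction of septic catheters detected
ppv = TP / (TP + FP)           # positive predictive value
npv = TN / (TN + FN)           # negative predictive value

print(round(sensitivity * 100, 1))   # 79.3, matching the abstract
print(round(ppv * 100, 1))           # 44.2
print(round(npv * 100, 1))           # 93.3
```

Note that TP + FN = 29 septic catheters and the four cells sum to 142, consistent with the study totals.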

18.


Sharps-Related Injuries in Health Care Workers: A Case-Crossover Study

Fisman DN, Harris AD, Sorock GS, Mittleman MA.

The American Journal of Medicine 2003 Jun;114:688-694

The current hospital environment presents employees with various challenges, including worker fatigue, rushing, and distraction. The authors of this study postulated that such brief, transient factors might increase the risk of sharps-related injuries in health care workers. A case-crossover study design, in which each case served as its own control, was used to identify and quantify transient factors that might increase the risk of such injuries. Between February 2000 and October 2001, 139 health care workers employed at the University of Maryland Medical Center in Baltimore, Maryland, or at Beth Israel Deaconess Medical Center in Boston, Massachusetts, who reported sharps-related injuries to employee health services were recruited. Subjects were assessed for rushing, distraction, anger, fatigue, performance of a task in an emergency situation, and teaching. Most injuries occurred with hollow-bore devices. Definite or suspected exposure to HIV, hepatitis B virus, or hepatitis C virus was reported by approximately half of the subjects. A minority of subjects incurred their injuries in a continuous-risk environment, such as an operating room or procedure suite. An increased risk of sharps-related injuries was associated with rushing, anger, distraction, and multiple passes. A trend toward increased risk was seen when subjects were fatigued, working with an uncooperative patient, or working as part of a team that was short staffed, and among surgeons working in a noisy operating room environment. No change in the risk of sharps-related injuries was found in association with teaching, performing highly complex operative procedures, or working in a bloody operative field. The authors conclude that interventions to minimize these factors should be explored by those who wish to prevent sharps-related injuries in the health care work place.

19.

The impact of multifocused interventions on sharps injury rates at an acute-care hospital

Gershon RR, Pearse L, Grimes M, Flanagan PA, Vlahov D.

Department of Environmental Health Sciences, the Johns Hopkins University School of Public Health, Baltimore, Maryland 21205, USA.

Infect Control Hosp Epidemiol 1999 Dec;20(12):806-11

OBJECTIVE: To determine the impact of a multifocused interventional program on sharps injury rates.

DESIGN: Sharps injury data were collected prospectively over a 9-year period (1990-1998). Pre- and postinterventional rates were compared after the implementation of sharps injury prevention interventions, which consisted of administrative, work-practice, and engineering controls (i.e., the introduction of an anti-needlestick intravenous catheter and a new sharps disposal system).

SETTING: Sharps injury data were collected from healthcare workers employed by a mid-sized, acute-care community hospital.

RESULTS: Preinterventional annual sharps injury incidence rates decreased significantly from 82 sharps injuries/1,000 worked full-time-equivalent employees (WFTE) to 24 sharps injuries/1,000 WFTE employees postintervention (P<.0001), representing a 70% decline in incidence rate overall. Over the course of the study, the incidence rate for sharps injuries related to intravenous lines declined by 93%, hollow-bore needlesticks decreased by 75%, and non-hollow-bore injuries decreased by 25%.

CONCLUSION: The implementation of a multifocused interventional program led to a significant and sustained decrease in the overall rate of sharps injuries in hospital-based healthcare workers.
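The pre/post comparison reported above reduces to simple arithmetic; a quick check of the headline figures, using the rates taken from the abstract:

```python
# Recompute the pre/post intervention decline reported by Gershon et al.
# Rates are sharps injuries per 1,000 worked full-time-equivalent employees.
pre_rate = 82.0   # preintervention
post_rate = 24.0  # postintervention

rate_ratio = post_rate / pre_rate
percent_decline = (pre_rate - post_rate) / pre_rate * 100

print(f"rate ratio:      {rate_ratio:.2f}")       # 0.29
print(f"percent decline: {percent_decline:.0f}%")  # ~71%, reported as 70% overall
```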

20.

Sharps-Related Injuries in California Healthcare Facilities: Pilot Study Results From the Sharps Injury Surveillance Registry

Gillen M, McNary J, Lewis J, Davis M, Boyd A, Schuller M, Curran C, Young CA, Cone J.

Infect Control Hosp Epidemiol 2003;24:113-121

BACKGROUND AND OBJECTIVES: In 1998, the California Department of Health Services invited all healthcare facilities in California (n = 2,532) to participate in a statewide, voluntary sharps injury surveillance project. The objectives were to determine whether a low-cost sharps registry could be established and maintained, and to evaluate the circumstances surrounding sharps injuries in California.

RESULTS: Approximately 450 facilities responded and reported a total of 1,940 sharps-related injuries from January 1998 through January 2000. Injuries occurred in a variety of healthcare workers (80 different job titles). Nurses sustained the highest number of injuries (n = 658). In hospital settings (n = 1,780), approximately 20% of the injuries were associated with drawing venous blood, injections, or assisting with a procedure such as suturing. As expected, injuries were caused by tasks conventionally related to specific job classifications. The overall results approximate those reported by the Centers for Disease Control and Prevention’s National Surveillance System for Health Care Workers and the University of Virginia’s Exposure Prevention Information Network.

CONCLUSION: These data further support findings from previous studies documenting the complex and persistent nature of sharps-related injuries in healthcare workers. In the future, mandated reporting using standardized forms and consistent application of decision rules would facilitate a more thorough analysis of injury events.

21.

Sharps injury reduction using Sharpsmart--a reusable sharps management system

Grimmond T, Rings T, Taylor C, Creech R, Kampen R, Kable W, Mead P, Mackie P, Pandur R.

The Daniels Corporation International Ltd, Dandenong, Australia 3175.

J Hosp Infect. 2003 Jul;54(3):232-8

Sharps containers are associated with 11-13% of total sharps injuries (SI) yet have received little attention as a means of SI reduction. A newly developed reusable sharps containment system (Sharpsmart) was trialed in eight hospitals in three countries. The system was associated with an 86.8% reduction of container-related SI (CRSI) (P=0.012), a 25.7% reduction in non-CRSI (P=0.003), and a 32.6% reduction in total SI (P=0.002) compared with historical data. The study concludes that the Sharpsmart system is an effective engineered control in reducing SI.

22.

A five-year study of needlestick injuries: significant reduction associated with communication, education, and convenient placement of sharps containers

Haiduven DJ, DeMaio TM, Stevens DA.

Infection Control Department, Santa Clara Valley Medical Center, San Jose, CA 95128

Infect Control Hosp Epidemiol 1992 May;13(5):265-71

OBJECTIVE: To decrease the numbers of needlesticks among healthcare workers.

DESIGN: All reported needlestick injuries at Santa Clara Valley Medical Center, San Jose, California, were reviewed, analyzed, and tabulated by the infection control department yearly from 1986 to 1990.

SETTING: A 588-bed county teaching hospital in San Jose, California, affiliated with Stanford University.

PARTICIPANTS: All employees of Santa Clara Valley Medical Center who reported needlestick injuries on injury report forms.


INTERVENTIONS: From April to December 1987, more needle disposal containers were added to as many patient care areas and as close to the area of use as possible. Results of 1986, 1988, 1989, and 1990 analyses were communicated yearly to all personnel, extensive educational programs were conducted in 1987 and 1988, and educational efforts continued in 1989 and 1990.

RESULTS: In 1986, there were 259 needlestick injuries at our institution, 22% (32) from recapping. After needle disposal containers were added to all patient care areas, needlestick injuries for 1988 totaled 143, a 45% decrease in the total needlestick injuries and a 53% decrease in recapping injuries. Communication of results to all areas of the hospital and educational activities were started in 1987 and continued through the next 3 years. In 1989, there were 135 needlestick injuries, a decrease of 6% from 1988; recapping injuries decreased 40% from 1988. In 1990, there were 104 needlestick injuries, a 23% decrease since 1989, and a 33% decrease in recapping injuries. The total number of needlestick injuries from 1986 to 1990 decreased by 60%, and those injuries from recapping decreased by 81% to 89%.

CONCLUSIONS: We have continued to monitor needlestick injuries, communicate findings to all personnel, and include needlestick prevention in educational programs. We contend that more convenient placement of needle disposal containers, communication of findings, and education do decrease needlestick injuries in healthcare workers.
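The year-over-year declines quoted in the RESULTS section can be recomputed directly from the injury counts:

```python
# Year-over-year changes in total needlestick injuries at Santa Clara
# Valley Medical Center (counts taken from the abstract; no count is
# given for 1987, the intervention year).
counts = {1986: 259, 1988: 143, 1989: 135, 1990: 104}

years = sorted(counts)
for prev, curr in zip(years, years[1:]):
    pct = (counts[prev] - counts[curr]) / counts[prev] * 100
    print(f"{prev} -> {curr}: {pct:.0f}% decrease")

overall = (counts[1986] - counts[1990]) / counts[1986] * 100
print(f"1986 -> 1990 overall: {overall:.0f}% decrease")
```

This reproduces the 45%, 6%, and 23% step-wise decreases and the 60% overall decline reported above.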

23.

Percutaneous injury analysis: consistent categorization, effective reduction methods, and future strategies

Haiduven DJ, Phillips ES, Clemons KV, Stevens DA.

Santa Clara Valley Medical Center, San Jose, California 95128, USA.

Infect Control Hosp Epidemiol 1995 Oct;16(10):582-9

OBJECTIVE: To report the results of an 8-year analysis of percutaneous injuries (PI), to describe interventions to decrease these injuries, and to discuss future prevention strategies.

DESIGN: Using consistent methods, 881 percutaneous injury reports were reviewed, categorized, and analyzed from 1986 through 1993.

SETTING: A 620-bed acute-care county teaching hospital located in San Jose, California, that is affiliated with Stanford University Medical School, Palo Alto, California.

PARTICIPANTS: Employees of Santa Clara Valley Medical Center who reported percutaneous injuries from 1986 through 1993.


INTERVENTIONS: Placement of needle disposal containers in all patient care areas, 1987; education, 1987 to present; communication of percutaneous injury analyses to all departments, 1988 to present; and safety product evaluation and purchases, 1991 to present.

RESULTS: The total number of PI decreased by 65% (P = .0007) from 1986 through 1993. Recapping injuries decreased from 1986 through 1993 by 88% (P < .0002); interventions that included convenient placement of needle disposal containers and consistent annual education may have contributed to this decrease. Injuries from manipulating intravenous lines or heparin locks decreased in 1992 (P < .03) after purchase of a needleless system for intravenous lines. Injuries from improper disposal or from abrupt patient movement did not decrease significantly over the 8-year period.

CONCLUSIONS: This institution has conducted percutaneous injury analysis for 8 years, utilizing consistent reviewers and categorization methods. Successful interventions have reduced recapping injuries, injuries from manipulating intravenous lines/heparin locks, and the overall numbers of PI. The categories of "Improper Disposal" and "Patient Moved Abruptly" present challenges for future reductions, as well as the recently identified problem of staff not using available safety devices or using them improperly.

24.

Outbreak of bloodstream infections temporally associated with a new needleless IV infusion system

Hall KK, Geffers C, Giannetta E, Flanagan H, Farr BM.

Fourteenth Annual Scientific Meeting; Society for Healthcare Epidemiology of America, 2004 SHEA

Background: In June 2002, this hospital experienced a 61% increase in the primary nosocomial bloodstream infection (BSI) rate (2.2/1,000 patient-days from 1/02-5/02 versus 3.5/1,000 patient-days from 6/02-12/02; RR=1.6067, 95% CI=1.28-2.03, p=0.00003). This increase temporally coincided with hospital-wide implementation in late 5/02 of a new needleless IV infusion system (NIIS). Initial investigation revealed that the outbreak included an increase in BSIs caused by common skin organisms (CSOs) and other more pathogenic organisms. Retrospective investigation was performed to determine the rates of contaminated blood cultures and true BSIs before and after onset of the outbreak.

Methods: All positive blood cultures drawn percutaneously or from a central venous catheter (CVC) from 10/01 to 9/03 were reviewed. A culture was classified as contaminated if a CSO was isolated from only one of two or more sets of blood cultures obtained from different sites within a 5-hour time-period. Blood cultures positive for CSOs that did not have a comparator blood culture set were excluded from analysis. Blood cultures positive for non-CSOs were included in the analysis regardless of whether there was a comparator blood culture set. CSOs included the following: coagulase-negative staphylococci, Micrococcus species, Propionibacterium species, viridans streptococci, Corynebacterium species, and Bacillus species. Data were obtained from clinical microbiology records.

Results: From 10/01 to 9/03, 1,580 (5.3%) of 29,270 percutaneous (P) cultures grew CSOs, and 646 (6.3%) of 10,297 CVC-drawn cultures grew CSOs. The contamination rate for CVC cultures increased from 1.7% to 6.5% (RR 3.74, 95% CI 2.84-4.91, p<10^-7). The rate of true positive CVC cultures also increased from 6.1% to 11.6% (RR 1.90, 95% CI 1.63-2.20, p<10^-7). True positive CVC cultures that grew CSOs increased from 0.85% to 2.4% (RR 2.86, 95% CI 1.91-4.27, p=10^-7), whereas true positive CVC cultures that grew non-CSOs increased from 5.2% to 9.5% (RR 1.79, 95% CI 1.52-2.11, p<10^-7).

Conclusions: Coincident with implementation of a new NIIS to reduce the risk of contaminated needlesticks among HCWs and the beginning of a BSI outbreak among patients, there was a significant 3.7-fold increase in the contamination rate of CVC-drawn cultures. Additionally, an increase in true BSIs occurred, with infections from CSOs increasing 2.9-fold and infections from non-CSOs increasing 1.8-fold. These data suggest an increased CVC contamination rate coincident with the introduction of a NIIS whose design may make it more apt to become colonized and infected.
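The relative risks in this abstract were computed from unrounded data; taking ratios of the rounded percentages it quotes reproduces them approximately:

```python
# Approximate the reported relative risks from the rounded before/after
# percentages in the Hall et al. abstract. Exact culture counts per period
# are not given, so these ratios differ slightly from the published RRs,
# which were computed from unrounded data.
changes = {
    "CVC contamination rate":      (1.7, 6.5),   # reported RR 3.74
    "true positive CVC cultures":  (6.1, 11.6),  # reported RR 1.90
    "true positive, CSOs":         (0.85, 2.4),  # reported RR 2.86
    "true positive, non-CSOs":     (5.2, 9.5),   # reported RR 1.79
}

for label, (before, after) in changes.items():
    print(f"{label}: approx RR = {after / before:.2f}")
```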

25.

A critical review of the literature on sharps injuries: epidemiology, management of exposures and prevention

Hanrahan A, Reutter L.

Capital Health Authority, Edmonton, Alberta, Canada.

J Adv Nurs 1997 Jan;25(1):144-54

This article reviews the literature related to the epidemiology, prevention and management of sharps injuries in health care workers, particularly nurses, and the subsequent risk of harm. The studies are reviewed chronologically, beginning with the efforts to reduce sharps injuries by changing behaviours, followed by the introduction of barriers to protect the caregiver, and finally, the engineering of safer products. Initial efforts to prevent sharps injuries focused on placing rigid disposal containers at the site where sharps were used and instructing health care workers to refrain from the practice of recapping. When these interventions were shown to alter the type, but not the overall number, of sharps injuries, alternative measures were sought. This search intensified with the increasing evidence of the small, but measurable, risk of the transmission of human immunodeficiency virus from sharps injuries. The current knowledge of the factors related to sharps injuries has been collected primarily through retrospective surveillance rather than prospective research. This surveillance has been conducted primarily in hospital settings and has focused on the type of sharp and the purpose for which it was used.


Research is now needed to elucidate the organizational and behavioural factors leading to sharps injury both within the hospital as well as other health care settings. The implications for nursing practice are discussed.

26.

Device-specific risk of needlestick injury in Italian health care workers.

Ippolito G, De Carli G, Puro V, Petrosillo N, Arici C, Bertucci R, Bianciardi L, Bonazzi L, Cestrone A, Daglio M, Perna M, Pietrobon F, Jagger J.

Lazzaro Spallanzani Hospital for Infectious Diseases, Rome, Italy.

JAMA 1994 Aug 24-31;272(8):607-10

OBJECTIVES--To identify the types of medical devices causing needlestick injuries among Italian health care workers, to document the device-specific injury rates and time trends for different hollow-bore needles, and to compare injury rates from these devices with those reported in the United States. DESIGN--Longitudinal survey.

SETTINGS--Twelve Italian acute care public hospitals. METHODS--Data were obtained from a multihospital surveillance database on the number of total injuries reported in each device category. Hospitals provided the corresponding number of devices used annually for each needle type.

MAIN OUTCOME MEASURE--Number of needlestick injuries by type of hollow-bore needle per 100,000 devices used per year. RESULTS--A total of 2524 injuries from hollow-bore needles were reported. Disposable syringes/hypodermic needles accounted for 59.3% of injuries, followed by winged steel needles (33.1%), intravenous catheter stylets (5.4%), and vacuum-tube phlebotomy needles (2.2%). Intravenous catheter stylets had the highest needlestick injury rate (15.7/100,000 devices used), and disposable syringes had the lowest needlestick injury rate (3.8/100,000). In contrast to the other devices, the injury rate from winged steel needles increased from 6.2 per 100,000 in 1990 to 13.9 per 100,000 in 1992.

CONCLUSIONS--The device-specific needlestick injury rates in Italy are similar to those reported in the United States, suggesting similar exposure experience in the two countries. However, in contrast to the United States, needleless intravenous access is standard practice in Italy and thus eliminates one potential risk to Italian health workers. Implementation of safer equipment, such as shielded or retracting needles, and continuing training programs are needed to further reduce the hazards that health care workers face.

27.

Inadequate standard for glove puncture resistance: allows production of gloves with limited puncture resistance.


Jackson EM, Wenger MD, Neal JG, Thacker JG, Edlich RF

The Department of Plastic Surgery, University of Virginia School of Medicine, Charlottesville 22908, USA.

The Journal of Emergency Medicine 1998 May-Jun;16(3):461-5

The National Fire Protection Association has developed standards for glove puncture resistance using a metal puncture probe. Biomechanical performance studies have demonstrated that glove puncture resistance to the probe is significantly greater than that of the hypodermic needle, suggesting that these standards have no clinical relevance. These standards give a false sense of security to health care personnel and sanction the production and use of gloves that give inadequate protection. The result is potentially harmful for medical personnel.

28.

Rates of needle-stick injury caused by various devices in a university hospital.

Jagger J, Hunt EH, Brand-Elnaggar J, Pearson RD.

Department of Neurosurgery, University of Virginia, Charlottesville 22908.

New England Journal of Medicine 1988 Aug 4;319(5):284-8

We identified characteristics of devices that caused needle-stick injuries in a university hospital over a 10-month period. Hospital employees who reported needle sticks were interviewed about the types of devices causing injury and the circumstances of the injuries. Of 326 injuries studied, disposable syringes accounted for 35 percent, intravenous tubing and needle assemblies for 26 percent, prefilled cartridge syringes for 12 percent, winged steel-needle intravenous sets for 7 percent, phlebotomy needles for 5 percent, intravenous catheter stylets for 2 percent, and other devices for 13 percent. When the data were corrected for the number of each type of device purchased, disposable syringes had the lowest rate of needle sticks (6.9 per 100,000 syringes purchased). Devices that required disassembly had rates of injury of up to 5.3 times the rate for disposable syringes. One third of the injuries were related to recapping. Competing hazards were often cited as reasons for recapping. They included the risk of disassembling a device with an uncapped, contaminated needle and the difficulty of safely carrying several uncapped items to a disposal box in a single trip. New designs could provide safer methods for covering contaminated needles. Devices should be designed so that the worker's hands remain behind the needle as it is covered, the needle should be covered before disassembly of the device, and the needle should remain covered after disposal. Such improvements could reduce the incentives for recapping needles and lower the risk of needle-stick injuries among health care workers.
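The denominator-adjusted rate used in this study is injuries per 100,000 devices purchased. A small sketch with back-calculated, illustrative figures: the abstract gives only the 35% share of 326 injuries and the 6.9/100,000 syringe rate, so the purchase count below is chosen to be consistent with those numbers and is not taken from the article.

```python
# Denominator-adjusted injury rate as used by Jagger et al.:
# needle-stick injuries per 100,000 devices purchased.
def injury_rate(injuries: float, devices_purchased: int) -> float:
    """Return injuries per 100,000 devices purchased."""
    return injuries / devices_purchased * 100_000

# Illustrative back-calculation: 35% of the 326 injuries (about 114)
# involved disposable syringes; a purchase count of ~1.65 million is
# consistent with the reported 6.9/100,000 rate. The purchase count is
# NOT stated in the article.
syringe_injuries = 0.35 * 326  # about 114
print(f"{injury_rate(syringe_injuries, 1_650_000):.1f} per 100,000")
```

Correcting for purchase volume is what reverses the raw ranking: syringes cause the most injuries in absolute terms yet have the lowest per-device rate.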

29.


Sharp object injuries in the hospital: causes and strategies for prevention.

Jagger J, Hunt EH, Pearson RD.

Employee Health Department, University of Virginia, Charlottesville 22908

American Journal of Infection Control 1990 Aug;18(4):227-31

We identified characteristics of items causing sharp object injuries in hospital personnel during a 10-month interval. Sharp objects were defined as items that were not hollow-bore needles that cause lacerations or puncture wounds. Workers reporting sharp object injuries were interviewed to determine what items caused injury and the circumstances of their injuries. Of 89 incidents, 51% were surgical instrument injuries, 19% were lancet injuries, 16% were glass injuries, and 15% were caused by other sharp items. A frequent feature of sharp objects causing injuries was the necessity of disengaging a disposable sharp item from a reusable holder. The application of manual force to fragile glass items also caused many injuries. Opportunities for safer product design and improved materials are discussed to reduce this common occupational hazard.

30.

Zero-Stik-Safety Syringe: an automatic safety syringe

Jeanes A.

University Hospital Lewisham.

Br J Nurs 1999 Apr 22-May 12;8(8):530-1, 534-5

Needlestick injury (NSI) is an important although rare cause of the transmission of blood-borne viruses to healthcare staff. Many NSIs are avoidable. The use of universal precautions and careful sharps disposal are key factors in the prevention of injury. The availability of safety devices is increasing as it is recognized that action is necessary to prevent NSI. Devices that are easy to use and require no extra effort on the part of the healthcare worker are preferred. There is evidence that safety devices decrease the rate of NSI. The Zero-Stik Safety Syringe, manufactured by New Medical Technology, automatically retracts the needle once the injection is complete. It is simple to use and requires no formal training.

31.

Needleless Valve Ports May be Associated with a High Rate of Catheter-Related Bloodstream Infection

Karchmer TB, Cook EM, Palavecino E, Ohl CA, Sherertz RJ.


Fifteenth Annual Scientific Meeting; Society for Healthcare Epidemiology of America, 2005 SHEA

Keywords: BLOODSTREAM INFECTIONS, NOSOCOMIAL INFECTION

Central venous catheters (CVC) are increasingly used as a source for routine blood draws; in our hospital ≥ 50% of ICU blood specimens are drawn from CVCs. In recent years CVC access is obtained through needleless ports. First generation needleless ports have a split septum design, while current second generation ports have a more complex "valve" design. There have been reports of increased catheter-related bloodstream infection (CRBSI) rates due to first generation needleless ports that subsequently decreased after education on proper access technique. Recent reports have suggested an increased risk of CRBSI associated with valved ports. Our hospital began using such a valved port in September 2003, which preceded a rise in intensive care unit (ICU) CRBSI rates from 6.3 to 8.5 per 1000 catheter-days (p = 0.02; 95% CI, 1.03-1.74).

OBJECTIVE: To determine if the needleless valve system used at our hospital was contributing to an increase in CRBSI.

METHODS: Quantitative cultures of blood from ICU patients (pts) were drawn from CVCs through a 2nd generation needleless valve port (Clearlink IV Delivery System, Baxter, Inc., Deerfield, IL) from 12/12/04 - 1/31/05. Blood was obtained from the initial syringe pull back (normally discarded) of the daily morning blood draw and placed into Wampole Isolator® tubes at the bedside. ICU nursing was not informed about the aim of the study. A one-time, open-ended survey of nurses was used to determine the access method from the device. CRBSI was defined by CDC criteria and measured by ICPs. Study protocol was approved by the institutional review board.

RESULTS: 226 “discarded” blood samples were sent for culture from 83 pts. The rate of positive cultures was 17% (39/226) and varied by type of ICU (table). 12% (3/26) of pts with a positive sample had a CRBSI with the same organism. The median number of colony forming units was 0.3 per ml with a mean of 9 and range of 0.1 to > 100. Identified organisms include 25 coagulase-negative staphylococci, 5 yeast, 2 S. aureus, 2 Serratia spp, 2 Enterococcus spp, 1 S. maltophilia, 1 Acinetobacter spp, 1 diphtheroids. 33% (13/39) of organisms would be considered pathogens if isolated from blood. Nursing practice survey revealed that 31% of nurses did not disinfect the needleless valve port prior to accessing the system.

CONCLUSIONS: The introduction of a new valved needleless device appears to correspond to an increase in our rate of CRBSI. This increase is likely due to both true and pseudo-bacteremia related to the device design or its use. Monitoring of CRBSI rates and a follow-up culture study will be necessary after nursing education on proper port disinfection and utilization.

Positive Cultures by Intensive Care Unit

Unit    Positive cultures    Total samples    Percent positive
4A      10                   58               17%
4B       4                   36               11%
4C       3                   37                8%
5A       6                   12               50%
5B       3                   12               25%
5C      13                   71               18%
Total   39                   226              17%
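The percent column of the ICU table above can be verified directly from its counts:

```python
# Recompute the percent-positive column of the ICU culture table
# from its per-unit counts (positive cultures, total samples).
rows = {"4A": (10, 58), "4B": (4, 36), "4C": (3, 37),
        "5A": (6, 12), "5B": (3, 12), "5C": (13, 71)}

total_pos = sum(p for p, _ in rows.values())
total_n = sum(n for _, n in rows.values())

for unit, (pos, n) in rows.items():
    print(f"{unit}: {pos}/{n} = {pos / n:.0%}")
print(f"Total: {total_pos}/{total_n} = {total_pos / total_n:.0%}")
```

The per-unit counts sum to the reported totals (39 positive of 226 samples, 17%).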

32.

Assessing blunt cannulae as replacements for hypodermic needles during intravenous therapy: safety and utility.

Kempen PM.

Department of Anesthesiology, Louisiana State University Medical Center, Shreveport, USA.

Infection Control and Hospital Epidemiology 1997 Mar;18(3):169-74.

OBJECTIVE: Recently, blunt 18-gauge (ga) metal cannulae have become nationally commercially available as safety products. The ability of these blunt cannulae to prevent needlestick injury and to enable direct access of all standard latex ports and vial membranes, thus eliminating hypodermic needles entirely from the intravenous (i.v.) drug administration process, is assessed.

DESIGN AND SETTING: In the laboratory setting, the needlestick injury potential of small-bore blunt cannulae versus hypodermic needles was studied using blinded and randomized methods. Insertion force requirements were studied for cannulae and needles. Metal 18-ga blunt cannulae were inserted into four brands of standard Y-ports and vial stoppers to assess postpuncture integrity and force requirements.

RESULTS: Needlestick injury did not occur using small-bore blunt cannulae (P < .001; n = 51). Metal 18-ga cannulae passed into prepierced standard Y-ports as easily as hypodermic needles and without loss of Y-port integrity. Insertion of metal 18-ga cannulae without prior port puncture was possible, but was associated with substantial coring and loss of integrity of the port seal, except for IVAC brand ports (P < .03).

CONCLUSIONS: Metal 18-ga cannulae can be inserted through virtually all intact standard rubber vial membranes or standard Y-ports to allow safe IV access. A single prepuncture of any standard latex membrane allows economical blunt metal cannula access as efficiently as expensive pre-slit membranes, without loss of membrane integrity.

33.

Effect of changing needle disposal systems on needle puncture injuries


Krasinski K, LaCouture R, Holzman RS.

Infect Control 1987 Feb;8(2):59-62

Accidental needle puncture injuries continue to pose a hazard to hospital workers. In order to reduce the number of such injuries in our hospital, needle disposal procedures were revised to discourage recapping and prevent bending or clipping of needles before discard. Collapsible cardboard boxes were replaced with impervious containers. An educational program accompanied these changes. We compared reports of needlestick injuries before and after the change of procedure, for three parallel 9-month periods. During the 27-month study, injuries occurred during administration of medication (22%), or recapping of used needles (16%), from needles protruding through (10%) or out of the "mouth" (9%) of the container, from needles left in the patient's environment (10%), or those left on procedure trays (7%). Seven percent were the result of being stuck by someone else, usually in the operating room. The mechanism of injury for 19% was not described. Altering the disposal procedures did not change the number or anatomic site of injuries, nor the risk of injury among the various job categories. A reduction in the rate of sticks from needles protruding through the container (1.3 vs 0.3/mo, p ≤ 0.005) was the only difference observed. Changing the needle receptacle changed the type but not the overall number of injuries. The education program had little effect on the number and types of injuries. These data point to the need for developing innovative approaches for eliciting changes in behavior of health care personnel.

34.

Application of cost-effectiveness methodology to the consideration of needlestick-prevention technology.

Laufer FN, Chiarello LA.

Bureau of Health Economics, New York State Department of Health, Albany 12237.

Am J Infect Control 1994 Apr;22(2):75-82

Data from the study of needlestick-prevention devices in 10 New York State hospitals enabled application of cost-effectiveness analysis techniques for determining relative benefits of various safety interventions. This article introduces to infection control practitioners several economic concepts related to cost-effectiveness methodology and provides two examples of how they may be applied for decision-making purposes. A critical aspect of the analysis described is the determination of a base cost of needlestick injury. By applying decision analysis to experience-based data aggregated from participating institutions, base expected cost of needlestick injury was determined to be $363.
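The article's base expected cost of $363 comes from a decision analysis; the general calculation is an expected value over outcome branches. The branch probabilities and costs below are invented for illustration and are not the article's inputs:

```python
# Sketch of the expected-cost calculation underlying a decision analysis
# of needlestick injury: weight each outcome branch's cost by its
# probability and sum. The figures below are HYPOTHETICAL illustrations;
# the article reports only the aggregate result ($363).
branches = [
    # (probability of branch, cost of branch in dollars)
    (0.90, 250),   # baseline testing and counseling only
    (0.08, 1200),  # post-exposure prophylaxis
    (0.02, 2000),  # extended follow-up
]

expected_cost = sum(p * c for p, c in branches)
print(f"expected cost per needlestick: ${expected_cost:.2f}")
```

Such a base cost is then compared against a safety device's per-injury-averted cost to rank interventions by cost-effectiveness.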

35.

This information is provided by the Premier Safety Institute to assist in the review of sharps injury prevention research—and is not meant to be all-inclusive. For a detailed analysis of the research, refer to the original article. Please contact the Premier Safety Institute at [email protected] if you believe an important article is missing.


The effectiveness of a needleless intravenous connection system: an assessment by injury rate and user satisfaction.

Lawrence LW, Delclos GL, Felknor SA, Johnson PC, Frankowski RF, Cooper SP, Davidson A.

Department of Medical Technology, Louisiana State University, New Orleans 70112-2262, USA.

Infect Control Hosp Epidemiol 1997 Mar;18(3):175-82

OBJECTIVE: To assess the impact of a needleless intravenous (i.v.) connection system on the rate of reported intravenous-connection-related (IVCR) percutaneous injuries, and to assess user satisfaction, frequency of use, and barriers to use.

DESIGN: A pre-post intervention design, with injury incidence rates compared 3 years before and 1 year after hospitalwide device implementation, and a cross-sectional descriptive user satisfaction survey.

SETTING: Two tertiary-care teaching hospitals, one general and one pediatric, located in a large metropolitan medical center.

OUTCOME VARIABLE: All IVCR percutaneous injuries reported to the employee health services at both hospitals during the years from 1989 to 1991 and 1993.

STUDY POPULATION: Survey participants were selected randomly from licensed nursing employees at both hospitals.

INTERVENTION: i.v. connection system consisting of blunt plastic cannulas and compressed latex injection sites.

RESULTS: After device implementation, the IVCR injury rate was reduced 62.4% (rate ratio [RR], 0.38; 95% confidence interval [CI95], 0.27-0.53) at the general hospital and 70.2% (RR, 0.30; CI95, 0.17-0.53) at the pediatric hospital. After adjusting for the reduction in injury rate due to factors other than device implementation, the IVCR injury rate was reduced 54.5% (adjusted RR, 0.46; CI95, 0.32-0.65) at the general hospital and 57.2% (adjusted RR, 0.43; CI95, 0.24-0.78) at the pediatric hospital. Approximately 94% of survey respondents (n = 478, response rate = 51%) were satisfied with the device and recommended continued use. However, needles still were being used for activities that could have been performed with the needleless system because of compatibility, accessibility, and other technical problems related to the device.
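Editorial note: reductions of this kind are reported as rate ratios (RR) with 95% confidence intervals computed on the log scale, the standard approach for comparing two Poisson rates. A minimal sketch of that computation; the injury counts and person-time below are hypothetical, not this study's raw data:

```python
import math

def rate_ratio_ci(events_a, persontime_a, events_b, persontime_b, z=1.96):
    """Rate ratio (a vs b) with an approximate 95% CI on the log scale."""
    rr = (events_a / persontime_a) / (events_b / persontime_b)
    # Standard error of log(RR) for two independent Poisson counts
    se = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 30 injuries after vs. 80 before device
# implementation, over equal observation periods.
rr, lo, hi = rate_ratio_ci(30, 1000.0, 80, 1000.0)
print(f"RR={rr:.2f} (CI95 {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0, as in both hospitals above, indicates a statistically significant reduction.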

CONCLUSIONS: The device was effective in reducing the rate of reported IVCR percutaneous injuries and users were satisfied with the device, but barriers to universal use were identified.

36.


Randomized prospective study of the impact of three needleless intravenous systems on needlestick injury rates.

L'Ecuyer PB, Schwab EO, Iademarco E, Barr N, Aton EA, Fraser VJ

Division of Infectious Diseases, Washington University School of Medicine, St Louis, MO 63110-1093, USA.

Infect Control Hosp Epidemiol 1996 Dec;17(12):803-8

OBJECTIVE: To determine the impact of three needleless intravenous systems on needlestick injury rates.

DESIGN: Randomized controlled trial.

SETTING: 1,000-bed tertiary-care Midwestern hospital.

PARTICIPANTS: Nursing personnel from general medical, general surgical, and intensive-care units.

INTERVENTIONS: From June 1992 through March 1994, a metal blunt cannula (MBC), two-way valve (2-way), and plastic blunt cannula (PBC) were introduced into three study areas, and needlestick injury rates were compared to three control areas using traditional needled devices.

RESULTS: 24 and 29 needlestick injuries were reported in study and control areas. Intravenous-therapy-related injuries comprised 45.8% and 57.1% of injuries in each area. Thirty-seven percent and 20.7% of study and control area needlestick injuries were considered to pose a high risk of bloodborne infection. The 2-way group had similar rates of total and intravenous-related needlestick injuries compared to control groups. The PBC group had lower rates of total and intravenous-related needlestick injuries per 1,000 patient-days (rate ratios [RR], 0.32 and 0.24; 95% confidence intervals [CI95], 0.12-0.81 and 0.09-0.61; P = .02 and P = .003, respectively) and per 1,000 productive hours worked (RR, 0.11 and 0.08; CI95, 0.01-0.92 and 0.01-0.69; P = .03 and P = .005, respectively) compared to controls.

CONCLUSIONS: Needlestick injuries continued in study areas despite the introduction of needleless devices, and risks of bloodborne pathogen transmission were similar to control areas. The PBC device group noted lower rates of needlestick injuries compared to controls, but there were problems with product acceptance, correct product use, and continued traditional device use in study areas. Low needlestick injury rates make interpretations difficult. Further studies of safety devices are needed and should attempt greater control of worker behavior to aid interpretation.

37.

Effect of educational programs, rigid sharps containers, and universal precautions on reported needlestick injuries in healthcare workers


Linnemann CC Jr, Cannon C, DeRonde M, Lanphear B.

Infection Control Department, University of Cincinnati Hospital, Ohio.

Infect Control Hosp Epidemiol 1991 Apr;12(4):214-9

OBJECTIVE: To evaluate the effect of infection control programs on reported needlestick injuries in a general hospital.

DESIGN: Surveillance of all reported needlestick injuries at the University of Cincinnati Hospital was maintained by the infection control department for five years, from 1985 through 1989. Data on individual workers were collected, tabulated on a monthly basis, and reviewed continually to monitor trends in injuries. During this time, the effects of each of three new infection control programs on reported injuries were evaluated sequentially.

SETTING: A 700-bed general hospital that serves as the main teaching hospital of the University of Cincinnati.

PARTICIPANTS: All employees of University Hospital who reported to personnel health for management of needlestick injuries.

INTERVENTIONS: In 1986, an educational program to prevent injuries was initiated and continued throughout the surveillance period. In 1987, rigid sharps disposal containers were placed in all hospital rooms. In 1988, universal precautions were introduced with an intensive inservice.

RESULTS: Surveillance identified 1,602 needlestick injuries (320/year) or 104/1,000/year. After the educational program began, reported injuries increased rather than decreased, and this was attributed to increased reporting. Subsequently, after installation of the new disposal containers, reported injuries returned to the levels seen prior to the educational program, but recapping injuries showed a significant decrease from 63/year to 30, or 20/1,000/year to 10. This decrease was observed in nurses but not in other healthcare workers. After universal precautions were instituted, total injuries increased slightly, but recapping injuries remained at 50% of the levels reported prior to the use of rigid sharps disposal containers.

CONCLUSIONS: The three infection control programs failed to produce a major reduction in reported needlestick injuries, except for a decrease in recapping injuries associated with the placement of rigid sharps disposal containers in all patient rooms. These observations indicate that new approaches are needed to reduce needlestick injuries.

38.

A multicenter study of costs and nursing impact of cartridge-needle units.


Llewellyn J, Giese R, Nosek LJ, Lager JD, Turco SJ, Goodell J, Coleman J, McQuone MJ, Collier PA, Minard DA, et al.

Nursing Economics 1994 Jul-Aug;12(4):208-14

A multicenter study of the overall costs and impact on medication administration of sterile cartridge-needle units (Carpuject) and conventional disposable syringes with multiple- and single-dose vials was conducted in six hospitals in the Voluntary Hospitals of America (VHA) system. Data collection involved a time and motion phase measuring procurement, preparation, and disposal of both systems and a 6-month prospective global cost analysis of component costs, waste disposal, nursing time, needlestick injuries, and subsequent treatment. Nonacquisition costs were substantially lower with the use of the cartridge-needle units. When all costs are considered, sterile cartridge-needle units are a cost-effective alternative to conventional disposable syringes and multidose vials.

39.

Epidemiology of hospital sharps injuries: a 14-year prospective study in the pre-AIDS and AIDS eras

McCormick RD, Meisch MG, Ircink FG, Maki DG.

Am J Med. 1991 Sep 16;91(3B):301S-307S

The world pandemic of acquired immunodeficiency syndrome (AIDS) has focused enormous attention on the problem of accidental sharps injuries sustained by health care workers (HCWs) and the risk of occupationally acquired infection by human immunodeficiency virus (HIV). At the 1980 Conference, we reported a 4-year epidemiologic study (1975-1979) of sharps injuries in HCWs at our hospital. Using the same reporting system and analyses, we now report the epidemiology of sharps injuries in our center during the current AIDS era (1987-1988) and assess trends over the 14-year period. Despite greatly increased institutional efforts to prevent sharps injuries, the annual incidence has increased more than threefold (60.4 to 187.0/1,000 HCWs), reflecting better reporting and increased exposure. Reported injuries by house officers have increased ninefold. Adjusting for inflation, the direct costs of sharps injuries have increased sevenfold ($5,354 to $37,271/year). Environmental service HCWs (305.8 sharps injuries per 1,000 employees) now have the highest incidence in our center, followed by nursing personnel (196.5/1,000) and laboratory personnel (169.9/1,000), but as in 1975-1979, two thirds of all injuries occur in nursing personnel. Although phlebotomy team members have a very low risk per procedure (1/26,871 draws), their annual incidence is extraordinarily high, 407.0/1,000. Injuries continue to occur mainly during disposal of waste, linen, or used procedure trays (19.7% of all injuries), administration of parenteral injections or infusion therapy (15.7%), surgery (16.0%), blood drawing (13.3%), or recapping of used needles (10.1%). Making disposal units available at every bedside has reduced injuries from needle disposal two-fold since 1975-1979. With consistent application of a stringent postexposure protocol, and wide acceptance of the hepatitis B vaccine, we have had no sharps injury-related infections during the past 3 years.


These data indicate the increasing risk, complexity and cost of sharps injuries in HCWs and the need for more innovative--ideally, technology-based--approaches to prevention. Certain groups of HCWs are at very high risk. Comprehensive postexposure protocols that are uniformly applied can provide substantial protection to exposed HCWs.

40.

Epidemiology of needlestick injuries in house officers

McGeer A, Simor AE, Low DE.

J Infect Dis. 1990 Oct;162(4):961-4

Eighty-eight medical students, interns, and residents were surveyed to study the epidemiology of their percutaneous exposures to blood. Respondents described 159 injuries in 221 person-years (py) of exposure in hospital wards and 213 injuries in 166 py of exposure in operating rooms. Nearly all injuries (greater than 98%) were needlesticks; less than 5% were reported to occupational health services. Rates of ward-related injury were highest for students (0.97/py) and decreased during training. Most injuries were due to recapping of used needles. In contrast to ward-related injury, rates of operating room-related injury were relatively low for nonsurgical students and interns (0.3/py), higher for surgical students (1.36/py), and stable over surgical residency training (mean, 5.4/py). Virtually all surgical injuries occurred during suturing. Further research into mechanisms of needlestick injuries and product design for their prevention are needed.

41.

Evaluation of a Safety Resheathable Winged Steel Needle for Prevention of Percutaneous Injuries Associated With Intravascular-Access Procedures Among Healthcare Workers

Mendelson M, Lin-Chen BY, Solomon R, Bailey E, Kogan G, Godbold J.

Infection Control and Hospital Epidemiology 2003;24:105-112

OBJECTIVE: To compare the percutaneous injury rate associated with a standard versus a safety resheathable winged steel (butterfly) needle.

DESIGN: Before-after trial of winged steel needle injuries during a 33-month period (19-month baseline, 3-month training, and 11-month study intervention), followed by a 31-month poststudy period.

SETTING: A 1,190-bed acute care referral hospital with inpatient and outpatient services in New York City.


PARTICIPANTS: All healthcare workers performing intravascular-access procedures with winged steel needles.

INTERVENTION: Safety resheathable winged steel needle.

RESULTS: The injury rate associated with winged steel needles declined from 13.41 to 6.41 per 100,000 (relative risk [RR], 0.48; 95% confidence interval [CI95], 0.31 to 0.73) following implementation of the safety device. Injuries occurring during or after disposal were reduced most substantially (RR, 0.15; CI95, 0.06 to 0.43). Safety winged steel needle injuries occurred most often before activation of the safety mechanism was appropriate (39%); 32% were due to the user choosing not to activate the device, 21% occurred during activation, and 4% were due to improper activation. Preference for the safety winged steel needle over the standard device was 63%. The safety feature was activated in 83% of the samples examined during audits of disposal containers. Following completion of the study, the safety winged steel needle injury rate (7.29 per 100,000) did not differ significantly from the winged steel needle injury rate during the study period.

CONCLUSION: Implementation of a safety resheathable winged steel needle substantially reduced injuries among healthcare workers performing vascular-access procedures. The residual risk of injury associated with this device can be reduced further with increased compliance with proper activation procedures.

42.

Study of Introcan Safety™ IV Catheter (IVC; B Braun Medical Inc) for the prevention of percutaneous injuries (PIs) in healthcare workers (HCWs)

Mendelson MH, Lin-Chen BY, Finkelstein-Blond LE, Kogan MS, Hollinger MD

Thirteenth Annual Scientific Meeting; Society for Healthcare Epidemiology of America, 2003 SHEA – Abstract.

Background: PIs due to hollow-bore needles for IV access are responsible for transmission of blood-borne pathogens to HCWs. Although safety IVCs have been demonstrated to reduce injuries, replacement of non-safety IVCs is not complete and PIs continue to occur with non-safety IV stylets.

Objectives: Evaluate Introcan Safety ™ IVCs for reduction in IV stylet PIs.


Methods: A before-and-after trial of IVC stylet injuries over 44 months (36 months baseline, 2 months training/pilot, and 6 months study) in selected areas (ORs, post-anesthesia care units, and the neonatal and pediatric ICUs [NICU, PICU]). Although a safety IVC was used in the remainder of the hospital, HCWs in these areas had chosen not to use previously available safety IVCs because of the size of the safety mechanism and flashback visibility. A safety IVC (Introcan Safety™) was used for intravenous and arterial line insertions during this evaluation.

Results: During the baseline period (1/99-12/01, 36 months) there were 13 non-safety stylet injuries and 255,900 IVCs utilized (injury rate [IR], 5.08 per 100,000). Six injuries were sustained by OR staff and 7 by ICU (4 NICU, 3 PICU) staff; 5 occurred during use, 6 after use and before disposal, and 3 during or after disposal. During the study period there were no safety IVC stylet injuries (IR, 0 per 100,000; p=0.07). One injury occurred with the Introcan Safety™ IVC during the post-study period due to failure to withdraw the stylet from the IVC following an unsuccessful insertion. A product evaluation survey completed by 59 HCWs (23 RNs, 36 MDs) showed 63% completely comfortable with the study IVC by the 10th insertion, 49% felt it was easy or very easy to use, and 90% noted that a change in technique was necessary.

Conclusions: Although a variety of safety IVCs are currently available, applicability for either intravenous or intra-arterial insertions and ease of use during insertions in certain patient populations may be device- and patient-type specific. There was a trend toward significance in reducing injuries with the Introcan Safety™. Withdrawal of the stylet from the safety IVC is necessary for protection against PIs (Editorial note: with this particular device).

43.

Evaluation of a Safety IV Catheter (IVC) (Becton Dickinson, INSYTE™ AUTOGUARD™): Final Report

Mendelson M, Lin-Chen BY, Finkelstein-Blond L, Bailey E, Kogan G.

The Mount Sinai Medical Center, New York, NY

Eleventh Annual Scientific Meeting; Society for Healthcare Epidemiology of America, 2001 SHEA

A safety IVC (Becton Dickinson, Insyte™ Autoguard™) was evaluated at a 1,100-bed university-affiliated medical center to determine efficacy in reducing needlestick injuries (NIs). The NI rate during a baseline Period I (non-safety; 6/93-8/96, 39 months) was compared to the study Period II (2/99-7/00, 18 months). The study period included a two-month training (2-3/99) and a three-month pilot (4-6/99). Protectiv® Plus Catheter (Johnson and Johnson) was evaluated during the interim time between Periods I and II. NI data were analyzed utilizing the CDC NaSH database. Two sharps disposal surveys were performed to assess usage and activation rates in 6/99 and 7/00, and two product evaluation surveys were conducted in 12/99 and 7/00. A 95% reduction in IV stylet-related NIs was demonstrated comparing the baseline Period I NI rate of 6.6/100,000 IV stylets (56 injuries/848,958 stylets) to the study Period II NI rate of 0.3/100,000 IV stylets (1 injury/331,516 safety IV stylets) (p<0.001). The Period II NI occurred while the stylet was being withdrawn from the patient and the healthcare worker (HCW) failed to activate the safety mechanism. Comparing the two sharps disposal surveys, the activation rate was 91% in the 2nd survey vs. 85% in the 1st. The 2nd product evaluation showed 98% of HCWs answering that the safety IV catheter was very easy/easy to use, compared to 78% in the 1st; 95% (2nd) vs. 76% (1st) felt there was either no change or only a slight change in technique needed to use this safety device. During both surveys, 99% of HCWs answered that the safety IV catheter provided effective protection against needlesticks. In conclusion, the Insyte Autoguard resulted in a marked and significant reduction in IV stylet-related injuries during the study period. Although this safety IVC requires activation by the user, the simplicity of the activation process promotes user compliance and, therefore, reduction in injuries. Because IV stylet-related injuries are high risk (hollow-bore needle, inserted directly into a vein or artery), usage of this safety device should result in decreased blood-borne pathogen transmission to HCWs.

44.

Study of a needleless intermittent intravenous-access system for peripheral infusions: analysis of staff, patient, and institutional outcomes.

Mendelson MH, Short LJ, Schechter CB, Meyers BR, Rodriguez M, Cohen S, Lozada J, DeCambre M, Hirschman SZ.

Department of Medicine, Mount Sinai Hospital, New York, NY, USA.

Infect Control Hosp Epidemiol 1998 Jun;19(6):401-6

OBJECTIVE: To assess the effect on staff- and patient-related complications of a needleless intermittent intravenous access system with a reflux valve for peripheral infusions.

DESIGN: A 6-month cross-over clinical trial (phase I, 13 weeks; phase II, 12 weeks) of a needleless intermittent intravenous access system (NL; study device) compared to a conventional heparin-lock system (CHL, control device) was performed during 1991 on 16 medical and surgical units. A random selection of patients was assessed for local intravenous-site complications; all patients were assessed for the development of nosocomial bacteremia and device-related complications. Staff were assessed for percutaneous injuries and participated in completion of product evaluations. A cost analysis of the study compared to the control device was performed.

SETTING: A 1,100-bed, teaching, referral medical center.

PATIENTS AND STAFF PARTICIPANTS: 594 patients during 602 patient admissions, comprising a random sample of all patients with a study or control device inserted within a previous 24-hour period on study and control units, were assessed for local complications. The 16 units included adult inpatient general medicine, surgical, and subspecialty units. Pediatrics, obstetrics-gynecology, and intensive-care units were excluded. All patients on study and control units were assessed for development of nosocomial bacteremia and device-related complications. All staff who utilized, manipulated, or may have been exposed to sharps on study and control units were assessed for percutaneous injuries. Nursing staff completed product evaluations.

INTERVENTION: The study device, a needleless intermittent intravenous access system with a reflux valve, was compared to the control device, a conventional heparin lock, for peripheral infusions.

RESULTS: During the study, 35 percutaneous injuries were reported. Eight injuries were CHL-related; no NL-related injuries were reported (P=.007). An evaluation of 602 patient admissions, 1,134 intermittent access devices, and 2,268 observed indwelling device days demonstrated more pain at the insertion site for CHL than NL; however, no differences in objective signs of phlebitis were noted. Of 773 episodes of positive blood cultures on study and control units, 6 (0.8%) were device-related (assessed by blinded investigator), with no difference between NL and CHL. Complications, including difficulty with infusion (P<.001) and disconnection of intravenous tubing from device (P<.001), were reported more frequently with CHL than with NL. Of nursing staff responding to a product evaluation survey, 95.2% preferred the study over control device. The projected annual incremental cost to our institution for hospitalwide implementation of NL for intermittent access for peripheral infusions was estimated at $82,845, or $230 per 1,000 patient days.
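Editorial note: the per-1,000-patient-day figure is simply the annual incremental cost divided by annual patient-days, rescaled. As a quick arithmetic check (the patient-day denominator of roughly 360,000/year is inferred for illustration, not stated in the abstract):

```python
# Convert an annual incremental cost into cost per 1,000 patient-days.
# The annual patient-day denominator below is inferred/hypothetical.

def cost_per_1000_patient_days(annual_cost, annual_patient_days):
    return annual_cost / annual_patient_days * 1000

# With ~360,000 patient-days/year, $82,845/year comes to about $230
# per 1,000 patient-days, matching the abstract's reported figure.
print(round(cost_per_1000_patient_days(82_845, 360_000)))
```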

CONCLUSIONS: A needleless intermittent intravenous access system with a reflux valve for peripheral infusions is effective in reducing percutaneous injuries to staff and is not associated with an increase in either insertion-site complications or nosocomial bacteremia. Institutions should consider these data, available institutional resources, and institution-specific data regarding the frequency and risk of intermittent access-device-related injuries and other types of sharps injuries in their staff when selecting the above or other safety devices.

45.

Initial worker evaluation of a new safety syringe.

Mulherin S, Rickman LS, Jackson MM.

Epidemiology Unit, UCSD Medical Center 8951, 92103, USA.

Infect Control Hosp Epidemiol 1996 Sep;17(9):593-4

A prospective evaluation of a new safety syringe requiring a one-step activation was carried out at the University of California, San Diego Medical Center. Only 59.5% of 390 syringes were activated, and user acceptance and satisfaction were unfavorable. The development of safety devices should incorporate passive activation and take end-user satisfaction into consideration.

46.

Preventing Needlestick Injuries in Health Care Settings.


National Institute for Occupational Safety and Health (NIOSH). Preventing Needlestick Injuries in Health Care Settings. NIOSH Alert Publication No. 2000-108, November 1999. Website: www.cdc.gov/niosh/2000-108.htm

47.

The benefits and limitations of needle protectors and needleless intravenous systems

Orenstein R.

HIV/AIDS Program, Hunter Holmes McGuire Veterans Affairs Medical Center, Richmond, Virginia, USA

J Intraven Nurs 1999 May-Jun;22(3):122-8

Needleless and needle protector intravenous systems have taken the place of 80% of needles used in i.v. therapy. Although these new systems are marketed as safe, many have not been widely tested and are not fail-safe. Each of the needleless i.v. systems and needle protector systems has limitations and potential benefit when applied in the appropriate circumstances. The benefits and limitations of these devices in today's healthcare market are discussed.

48.

Do protective devices prevent needlestick injuries among health care workers?

Orenstein R, Reynolds L, Karabaic M, Lamb A, Markowitz SM, Wong ES.

Division of Hospital Epidemiology, Medical College of Virginia, Richmond, USA

American Journal of Infection Control 1995 Dec;23(6):344-51

OBJECTIVES: To determine the effectiveness and direct costs of two protective devices, a shielded 3-ml safety syringe (Safety-Lok; Becton Dickinson and Co., Becton Dickinson Division, Franklin Lakes, N.J.) and the components of a needleless IV system (InterLink; Baxter Healthcare Corp., Deerfield, Ill.), in preventing needlestick injuries to health care workers.

DESIGN: Twelve-month prospective, controlled, before-and-after trial with a standardized questionnaire to monitor needlestick injury rates.

SETTING: Six hospital inpatient units, consisting of three medical units, two surgical units (all of which were similar in patient census, acuity, and frequency of needlesticks), and a surgical-trauma intensive care unit, at a 900-bed urban university medical center.

PARTICIPANTS: All nursing personnel, including registered nurses, licensed practical nurses, nursing aides, and students, as well as medical teams consisting of an attending physician, resident physician, interns, and medical students on the study units.


INTERVENTION: After a 6-month prospective surveillance period, the protective devices were randomly introduced to four of the chosen study units and to the surgical-trauma intensive care unit.

RESULTS: Forty-seven needlesticks were reported throughout the entire study period, 33 in the 6 months before and 14 in the 6 months after the introduction of the protective devices. Nursing staff members who were using hollow-bore needles and manipulating intravenous lines accounted for the greatest number of needlestick injuries in the pre-intervention period. The overall rate of needlestick injury was reduced by 61%, from 0.785 to 0.303 needlestick injuries per 1000 health care worker-days after the introduction of the protective devices (relative risk = 1.958; 95% confidence interval, 1.012 to 3.790; p = 0.046). Needlestick injury rates associated with intravenous line manipulation, procedures with 3 ml syringes, and sharps disposal were reduced by 50%; however, reductions in these subcategories were not statistically significant. No seroconversions to HIV-1 or hepatitis B virus seropositivity occurred among those with needlestick injuries. The direct cost for each needlestick prevented was $789.

CONCLUSIONS: Despite an overall reduction in needlestick injury rates, no statistically significant reductions could be directly attributed to the protective devices. These devices are associated with a significant increase in cost compared with conventional devices. Further studies must be concurrently controlled to establish the effectiveness of these devices.

49.

Device-specific sharps injury and usage rates: An analysis by hospital department

Patel N, Tignor GH.

American Journal of Infection Control 1997;25:77-84

BACKGROUND: Whether universal precautions training has reduced percutaneous sharps injuries is questioned. Prevention programs directed to specific problem areas are required to further reduce injury. Our purpose was to identify target areas.

METHODS: Device-specific sharps injury rates per 100,000 devices purchased were determined by department at Yale New Haven Hospital (1993 to 1994). Usage per full-time equivalent was calculated by department. Rates were modeled using Poisson regression.
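Editorial note: the device-specific rates in this study are injury counts divided by devices purchased, scaled to 100,000 devices; the Poisson regression used to model those rates is beyond this sketch. A minimal illustration with hypothetical counts (not the Yale New Haven data):

```python
# Device-specific sharps-injury rates per 100,000 devices purchased.
# Counts and denominators are HYPOTHETICAL, for illustration only.

def injury_rate_per_100k(injuries, devices_purchased):
    return 100_000 * injuries / devices_purchased

devices = {
    # device: (injuries reported, devices purchased) -- invented figures
    "butterfly needle": (12, 90_000),
    "IV catheter":      (9,  13_300),
    "lancet":           (4,  200_000),
}

for name, (inj, n) in devices.items():
    print(f"{name}: {injury_rate_per_100k(inj, n):.1f} per 100,000")
```

Note how a device used in small numbers (here, the hypothetical IV catheter figures) can still show a high per-device rate, the pattern the study reports for pediatric IV catheters.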

RESULTS: Three epidemiologic patterns resulted: (1) injury rates were independent of usage (butterfly needles); (2) injury rates varied directly with usage (lancets); (3) injury rates varied inversely with usage (intravenous catheters, sutures, and scalpels). Device-specific usage and injury rates varied by department. Devices used infrequently (9/full-time equivalent) but under difficult circumstances, such as intravenous catheters in pediatric patients, were associated with high injury rates (67.7/100,000). Devices that are sometimes disassembled, such as blood collecting tubes, caused significantly more injury in departments where health care professionals work under time constraints, such as the emergency department and nursing. Unconventional use of devices (Luer-Lok syringes and scalpels) resulted in higher rates of injury (nursing and laboratories). Building services staff also appeared to be at risk for injury.

CONCLUSIONS: With device-specific injury and usage rates by department, injury prevention programs can now focus on specific devices and departments.
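The denominator construction described in this abstract (injuries per 100,000 devices purchased) can be sketched as follows; the purchase count is hypothetical, chosen only so the result lands near the 67.7/100,000 pediatric intravenous-catheter figure quoted above.

```python
# Sketch: device-specific sharps injury rate per 100,000 devices purchased,
# the denominator used in the study above. Counts here are hypothetical
# illustrations, not data from the article.

def injury_rate(injuries: int, devices_purchased: int) -> float:
    """Sharps injuries per 100,000 devices purchased."""
    return injuries / devices_purchased * 100_000

# Hypothetical: 2 injuries over 2,954 pediatric IV catheters purchased lands
# at the 67.7 per 100,000 figure quoted in the abstract.
peds_iv_rate = injury_rate(2, 2954)
print(f"{peds_iv_rate:.1f} injuries per 100,000 devices")  # 67.7
```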

50.

Building Better Programs to Prevent Transmission of Blood-Borne Pathogens to Healthcare Personnel: Progress in the Workplace, But Still No End in Sight

Pegues, D.

Division of Infectious Disease, Department of Internal Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California.

Infect Control Hosp Epidemiol 2003 Oct;24(10):719-21.

With the increasing prevalence of blood-borne pathogens and drug-resistant HIV among source patients, building a comprehensive program to prevent occupational transmission requires greater resources and state-of-the-art clinical expertise. Occupational exposure to blood or body fluids harboring or potentially harboring blood-borne pathogens is a medical emergency, but as many as 70% of percutaneous injuries have gone unreported. All healthcare organizations should not only periodically train all healthcare personnel in standard precautions and the proper and consistent use of safety devices, but also reemphasize the critical importance of reporting occupational exposures. Administrative barriers that delay postexposure management must be minimized.

Improving and standardizing electronic percutaneous injury registries, such as that used by the occupational health clinics in the study by Babcock and Fraser, can increase the reliability and timeliness of reported percutaneous injury rates. Use of an appropriate denominator (eg, 100 patient beds or 1,000 patient-days) can assist healthcare facilities to trend and benchmark percutaneous injury rates and to better assess the impact of safety devices and training and education programs. Although more than half of U.S. healthcare personnel work in non-hospital settings, there are only limited published data on percutaneous injuries in these settings. More data are also needed on the safety, tolerability, and effectiveness of post-exposure prophylaxis regimens, especially for the management of exposures to source patients with antiretroviral-resistant HIV. As resistance testing of the source virus at the time of an exposure remains impractical, additional epidemiologic studies should help to refine clinical markers for source HIV drug resistance. Despite substantial progress, the recent challenge by the Centers for Disease Control and Prevention to eliminate occupational needlestick injuries among healthcare workers remains to be met.

51.

Evaluating sharps safety devices: meeting OSHA's intent. Occupational Safety and Health Administration.

Pugliese G, Germanson TP, Bartley J, Luca J, Lamerato L, Cox J, Jagger J.

Premier Safety Institute, Premier Inc, Oak Brook, IL 60523, USA.

Infect Control Hosp Epidemiol 2001 Jul;22(7):456-8

The Occupational Safety and Health Administration (OSHA) revised the Bloodborne Pathogen Standard and, on July 17, 2001, began enforcing the use of appropriate and effective sharps devices with engineered sharps-injury protection. OSHA requires employers to maintain a sharps-injury log that records, among other items, the type and brand of contaminated sharps device involved in each injury. Federal OSHA does not require needlestick injury rates to be calculated by brand or type of device. A sufficient sample size to show a valid comparison of safety devices, based on injury rates, is rarely feasible in a single facility outside of a formal research trial. Thus, calculations of injury rates should not be used by employers for product evaluations to compare the effectiveness of safety devices. This article provides examples of sample-size requirements for statistically valid comparisons, ranging from 100,000 to 4.5 million of each device, depending on study design and expected reductions in needlestick injury rates.
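A minimal sketch of why single-facility comparisons are rarely feasible, using the standard two-proportion sample-size approximation; the per-device injury probabilities are hypothetical (a few injuries per 100,000 devices), not values from the article.

```python
import math

# Sketch: devices needed per group to compare two device-specific needlestick
# injury rates with the standard two-proportion formula (normal approximation,
# two-sided alpha = 0.05, power = 0.80). Injury probabilities are hypothetical.

Z_ALPHA = 1.96  # two-sided alpha = 0.05
Z_BETA = 0.84   # power = 0.80

def n_per_group(p1: float, p2: float) -> int:
    """Devices per group to distinguish injury probabilities p1 and p2."""
    numerator = (Z_ALPHA + Z_BETA) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a halving of risk from 4 to 2 injuries per 100,000 devices:
n = n_per_group(4e-5, 2e-5)
print(f"{n:,} devices per group")  # on the order of a million devices
```

With rarer injuries or smaller expected reductions the required count grows further, consistent with the range of 100,000 to 4.5 million devices cited in the abstract.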

52.

How to select and evaluate new products on the market.

Quebbeman EJ, Short LJ.

Department of Surgery, Medical College of Wisconsin, Milwaukee, USA

Surgical Clinics of North America 1995 Dec;75(6):1159-65

New devices and products often promise to protect health-care workers and patients from transmission of viral infections. These need to be evaluated carefully for efficacy, applicability, and cost in an objective, structured manner.

53.

An effective educational program to reduce the frequency of needle recapping

Ribner BS.

Department of Medicine, Duke University Medical Center, Durham, North Carolina.

Infect Control Hosp Epidemiol 1990 Dec;11(12):635-8

We developed an educational program that reported the rate of needle recapping to healthcare workers, in conjunction with emphasis on appropriate disposal procedures. Over 12 months, the rate of recapping needles used for venipuncture and for percutaneous medication injections fell from 61% to 16% (p < .0001). Over the same period, the recapping of needles used primarily for intravenous (IV) administration fell from 44% to 33% (p = .03). Re-evaluation of the rate of recapping eight months later showed a continuation of these lowered rates. Needlestick injuries were too few in number during the study period to detect any change accompanying the decreased recapping rate. We conclude that programs that report back to employees their rate of recapping can significantly reduce this activity in the disposal of needles used for venipuncture and for percutaneous medication injections. While such reporting may reduce the rate of recapping of needles used for IV administration, the effect is not nearly so marked. Modifications in design remain the most promising approach to preventing needlestick injuries from recapping needles used for IV administration.

54.

Needle stick injury. Reducing the risk.

Rice JJ, McCabe JP, McManus F

Orthopaedic Department, Cappagh Orthopaedic Hospital, Finglas, Dublin, Eire.

International Orthopaedics 1996;20(3):132-3

The incidence of penetrating skin wounds and needle penetration of gloves during operation was studied in orthopaedic surgeons. Significant hand wounds were found in 11% of surgeons before operations. Glove penetration during closure of the deep tissues occurred in 16% of outer gloves and 6% of inner gloves when standard needle points were used. The surgeon sustained a needle-stick injury in 6% of this group. When a needle with a protective point was used, there were no glove perforations. This simple precaution reduces the risk of transmission of blood-borne disease during operation.

55.

Predictors of nurses' acceptance of an intravenous catheter safety device

Rivers DL, Aday LA, Frankowski RF, Felknor S, White D, Nichols B.

Nurs Res 2003 Jul-Aug;52(4):249-55

BACKGROUND: It is important to determine the factors that predict whether nurses accept and use a new intravenous (IV) safety device because there are approximately 800,000 needlesticks per year with the risk of contracting a life-threatening bloodborne disease such as HIV or hepatitis C.

OBJECTIVES: To determine the predictors of nurses' acceptance of the Protectiv Plus IV catheter safety needle device at a teaching hospital in Texas.

METHOD: A one-time cross-sectional survey of nurses (N = 742) was conducted using a 34-item questionnaire. A framework was developed identifying organizational and individual predictors of acceptance. The three principal dimensions of acceptance were (a) satisfaction with the device, (b) extent to which the device is always used, and (c) nurse recommendations over other safety devices. Measurements included developing summary subscales for the variables of safety climate and acceptance. Descriptive statistics and multiple linear and logistic regression models were computed.

RESULTS: The findings showed widespread acceptance of the device. Nurses who had adequate training and a positive institutional safety climate were more accepting (p ≤ .001). Also, nurses who had worked at the hospital for a shorter period were more likely to accept the device (p ≤ .001). Nurses who felt that the safety climate was positive and who had used the device for at least 6 months were more likely to use the device (p ≤ .001).

DISCUSSION: To achieve maximum success in implementing IV safety programs, high quality training and an atmosphere of caring about nurse safety are required.

56.

Evaluation of interventions to prevent needlestick injuries in health care occupations

Rogers B, Goodno L.

School of Public Health, Occupational Health Nursing, University of North Carolina, Chapel Hill, North Carolina 27955-7400, USA

Am J Prev Med 2000 May;18(4 Suppl):90-8

OBJECTIVE: The objective of this study was to evaluate interventions that reduce or prevent needlestick injuries in health care occupations.

METHODS: Cochrane Collaboration search strategies to locate studies that evaluated interventions to reduce needlestick injuries in health care occupations were used. Studies were selected if they met the following criteria: (1) interventions were evaluated in the defined population; (2) interventions were randomized, with a comparison group(s); (3) outcomes were objectively measured and had interpretable data. Eleven studies met inclusion criteria. The main outcomes of interest were changes in the number of glove or skin perforations and changes in amount of skin contamination.

RESULTS: Three studies found a decrease in glove or skin perforations when double gloves or combinations of gloves were used by surgeons and their assistants. One study found an increase in glove perforations but a decrease in hand contamination. Three studies evaluated the effectiveness of specialized needles in reducing needlestick injuries during surgical wound closure, with decreases in glove or skin perforations reported. Protective devices were evaluated in three studies, and significant reductions in glove perforations were found with the use of a needleless intravenous system and a surgical assist device. One study evaluated a "no-touch" technique used by surgeons during wound closure and found a significant decrease in the number of glove perforations compared with the traditional "hand-in" method of closure.

CONCLUSIONS: Few randomized controlled trials have been employed to evaluate the effectiveness of interventions to reduce needlestick injuries in health care occupations. The majority of the studies evaluated interventions during surgical procedures, rather than during patient care on nursing units, probably because the latter is more difficult to observe.

57.

Impact of safety devices for preventing percutaneous injuries related to phlebotomy procedures in health care workers

Rogues AM, Verdun-Esquer C, Buisson-Valles I, Laville MF, Lasheras A, Sarrat A, Beaudelle H, Brochard P, Gachie JP.

Service d'Hygiene Hospitaliere, Batiment PQR, Groupe Hospitalier Pellegrin CHU de Bordeaux, Place Amelie Raba-Leon, 33076 Bordeaux Cedex, France

Am J Infect Control. 2004 Dec;32(8):441-4.

BACKGROUND: Use of protective devices has become a common intervention to decrease sharps injuries in hospitals; however, few studies have examined the results of implementing the different protective devices available.

OBJECTIVE: To determine the effectiveness of 2 protective devices in preventing needlestick injuries to health care workers.

METHODS: Sharps injury data were collected over a 7-year period (1993-1999) in a 3600-bed tertiary care university hospital in France. Pre- and postinterventional rates were compared after the implementation of 2 safety devices for preventing percutaneous injuries (PIs) related to phlebotomy procedures.

RESULTS: From 1993 to 1999, an overall decrease in needlestick-related injuries was noted. Since 1996, the incidence of phlebotomy-related PIs has decreased significantly. Phlebotomy procedures accounted for 19.4% of all percutaneous injuries in the preintervention period and 12% in the postintervention period (RR, 0.62; 95% CI, 0.51-0.72; P < .001). The needlestick-related injury incidence rate decreased significantly after the implementation of the 2 safety devices, representing a 48% decline in incidence rate overall.
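The relative-risk calculation reported above can be outlined as follows; the counts are hypothetical, chosen so the point estimate matches the quoted RR of 0.62, so the confidence interval will not match the paper's.

```python
import math

# Sketch: relative risk (post vs. pre) with a log-scale 95% CI. The counts are
# hypothetical, chosen so the point estimate matches the RR of 0.62 quoted in
# the abstract; the interval will not match the paper's, whose denominators
# are not given.

def relative_risk(a: int, n1: int, b: int, n2: int):
    """RR = (a/n1) / (b/n2), with a 95% CI from the log-RR standard error."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = rr * math.exp(-1.96 * se)
    hi = rr * math.exp(1.96 * se)
    return rr, lo, hi

# Hypothetical: 60 phlebotomy-related PIs per 500 post vs. 97 per 500 pre
rr, lo, hi = relative_risk(60, 500, 97, 500)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 0.62
```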

CONCLUSIONS: The implementation of these safety devices apparently contributed to a significant decrease in percutaneous injuries related to phlebotomy procedures, but they constitute only part of a strategy that includes education of health care workers and collection of appropriate data to allow analysis of residual percutaneous injuries.

58.

Needleless intravenous systems: A review

Russo PL, Harrington GA, & Spelman DW.

American Journal of Infection Control 1999;27(5):431-43

BACKGROUND: Needleless intravenous devices have now been implemented by many institutions worldwide. A rationale for their use has been a reduction in the number of needlestick injuries.

OBJECTIVE: The aim of this review is to outline the possible benefits and dangers of needleless intravenous systems.

REVIEW: Many early reports demonstrate a reduction in needlestick injuries after the implementation of a needleless intravenous device; however, not all such reductions are directly attributable to the device itself. Furthermore, good evidence suggests that needlestick accidents prevented by needleless intravenous devices pose little threat to health care workers. Finally, increasing reports associate bacteremias with the use of needleless intravenous devices. Early reports described devices used in the home care setting; however, recent reports are from acute health care settings, including intensive care units.

CONCLUSION: Ongoing critical review of the benefits, risks, and costs of needleless intravenous devices is required.

59.

Risk of needle stick and sharp object injuries among medical students.

Shen C, Jagger J, Pearson RD

Department of Pediatrics at the University of Virginia, Charlottesville 22908.

American Journal of Infection Control 1999 Oct;27(5):435-7

BACKGROUND: Much is known about sharp object and needle stick injuries among employee health care workers, but relatively little attention has been directed to exposures among medical students.

METHOD: The frequency and mechanisms of needle stick and sharp object injuries were determined retrospectively by surveying students in their fourth year of medical school. Students were questioned about the number of percutaneous injuries that they had sustained during their clinical years. Descriptive information was collected on their most recent injury.

RESULTS: Of 137 students in the class, 106 (77%) responded. Thirty-five (33%) of the students who responded sustained one or more injuries; 24 (69%) were injured while on a surgical service, and 60% of the injuries occurred in an operating room. Suturing was the procedure most frequently associated with injury. In 34% of cases, the injury was caused by a needle or device being used by another person. The most frequent site of injury was the hand (97%). Ninety-four percent of students were wearing gloves at the time of the injury. None of the injuries was associated with recapping needles. Only 43% of students reported their injuries to proper authorities.

CONCLUSION: Medical students frequently sustain needle stick and sharp object injuries during their clinical training. Concerted efforts are needed to protect them.

60.

Evaluation and implementation of a needleless intravenous system: making needlesticks a needless problem.

Skolnick R, LaRocca J, Barba D, Paicius L.

Department of Nursing, Olive View Medical Center, Sylmar, CA.

American Journal of Infection Control 1993 Feb;21(1):39-41

A needleless intravenous (IV) system with blunt plastic cannulas and specially designed injection sites was introduced at Olive View Medical Center to reduce needlestick injuries, particularly IV-related needlesticks. IV-related needlestick injuries decreased 72% during the first 8 months of use, costs were reduced $1.85 for a typical IV piggyback administration set-up by revising the IV piggyback procedure, and a staff survey revealed satisfaction with the new system.

61.

Safety-Engineered Device Implementation: Does It Introduce Bias in Percutaneous Injury Reporting?

Sohn S, Eagan J, & Sepkowitz KA.

Infection Control and Hospital Epidemiology 2004 Jul;25:543-7

OBJECTIVE: To examine whether implementation of safety-engineered devices in 2001 had an effect on rates of percutaneous injury (PI) reported by HCWs.

DESIGN: Before-and-after intervention trial comparing 3-year preintervention (1998-2001) and 2-year postintervention (2001-2002) periods. PI data from anonymous, self-administered surveys were prospectively entered into CDC NaSH software.

SETTING: A 427-bed, tertiary care hospital in Manhattan.

PARTICIPANTS: HCWs who attended state-mandated training sessions and completed the survey (1,132 preintervention; 821 postintervention).

INTERVENTION: Implementation of a “safer-needle system” composed of various safety-engineered devices for needle-safe IV delivery-insertion, blood collection, and intramuscular-subcutaneous injection.

RESULTS: Preintervention, the overall annual rate of PIs self-reported on the survey was 36.5 per 100 respondents, compared with 13.9 per 100 respondents postintervention (P < .01). The annual rate of formally reported PIs decreased from 8.3 to 3.1 per 100 respondents (P < .01). Report rates varied by occupational group (P ≤ .02). The overall reporting rate did not change between study periods (22.7% to 22.3%), although reporting improved among nurses (23.6% to 44.4%, P = .03) and worsened among building services staff (90.5% to 50%, P = .03). HCWs with greater numbers of PIs self-reported on the survey were less likely to formally report injuries (P < .01). The two most common reasons for nonreport (ie, thought injury was low risk or believed patient was low risk for blood-borne disease) did not vary from preintervention to postintervention.

CONCLUSIONS: Safety-engineered device implementation decreases rates of PIs formally reported and self-reported on the survey. However, this intervention, with concomitant intensive education, had varying effects on reporting behavior by occupation and minimal effect on overall reporting rates.

62.

Effect of Implementing Safety-Engineered Devices on Percutaneous Injury Epidemiology

Sohn S, Eagan J, Sepkowitz KA, & Zuccotti G.

Infection Control and Hospital Epidemiology 2004 Jul;25:536-42

OBJECTIVE: To assess the effect of implementing safety-engineered devices on percutaneous injury epidemiology, specifically on percutaneous injuries associated with a higher risk of blood-borne pathogen exposure.

DESIGN: Before-and-after intervention trial comparing 3-year preintervention (1998-2000) and 1-year postintervention (2001-2002) periods. Percutaneous injury data have been entered prospectively into CDC NaSH software since 1998.

SETTING: A 427-bed, tertiary care hospital in Manhattan.

PARTICIPANTS: All employees who reported percutaneous injuries during the study period.

INTERVENTION: A “safer-needle system,” composed of a variety of safety-engineered devices to allow for needle-safe IV delivery, blood collection, IV insertion, and intramuscular and subcutaneous injection, was implemented in February 2001.

RESULTS: The mean annual incidence of percutaneous injuries decreased from 34.08 per 1,000 full-time equivalent employees preintervention to 14.25 postintervention (P < .001). Reductions in the average monthly number of percutaneous injuries resulting from both low-risk (P < .01) and high-risk (P was not significant) activities were observed. Nurses experienced the greatest decrease (74.5%, P < .001), followed by ancillary staff (61.5%, P = .03). Significant rate reductions were observed for the following activities: manipulating patients or sharps (83.5%, P < .001), collisions or contact with sharps (73.0%, P = .01), disposal-related injuries (21.41%, P = .001), and catheter insertions (88.2%, P < .001). Injury rates involving hollow-bore needles also decreased (70.6%, P < .001).

CONCLUSIONS: The implementation of safety-engineered devices reduced percutaneous injury rates across occupations, activities, times of injury, and devices. Moreover, intervention impact was observed when stratified by risk for blood-borne pathogen transmission.

63.

A Peripheral Catheter Indwell Study Comparing Phlebitis Rates Between Two Different Catheter Materials

Stebor A, Liao J.

Tampa, Florida

Ethicon Endo-Surgery, Inc. Published in Vascular Access

PURPOSE: A comparative, randomized clinical study of the risk factors for infusion-related phlebitis was conducted over three months in a community hospital. The main purpose was to compare marketed intravenous (I.V.) catheters made of FEP polymer [fluorinated ethylene propylene (PROTECTIV™ I.V. Catheter Safety System)] and OCRILON™ polymer [OCR polyurethane (PROTECTIV™ PLUS I.V. Catheter Safety System)].

DESIGN AND PROCEDURE: A total of 1014 I.V. catheters were studied in hospitalized adults without granulocytopenia who received a peripheral I.V. catheter. I.V. team (I.V.T.) nurses followed a randomization chart for the type of catheter that they inserted. Selected I.V.T. nurses scored insertion sites according to a 4-point phlebitis scale each day that the I.V. dwelled and then for three days post-infusion if subjects remained hospitalized.

METHOD OF ANALYSIS: Survival analysis techniques were the primary methods of analysis. The time to the development of phlebitis was the major outcome variable. In step one of the analysis, the Kaplan-Meier method was used to determine the proportion of catheters without phlebitis (phlebitis-free). In step two of the analysis, the Cox proportional hazards model was used to assess the major factors affecting the time to the development of phlebitis.

RESULTS: Overall, 53 catheters were discontinued because of phlebitis. The proportion of catheters without phlebitis was consistently higher for the OCRILON polymer group than for the FEP polymer group up to 72 hours after insertion using the Kaplan-Meier method. More OCRILON polymer catheters were phlebitis-free at the end of each day than FEP polymer catheters. Further, the data suggested that there were more phlebitis-free OCRILON polymer catheters in place at 72 hours after insertion than there were phlebitis-free FEP polymer catheters in place at 48 hours after insertion. Interaction terms between all of the variables in the final model were examined. Only the interaction term for erythromycin and catheter material was significant. Subjects with OCRILON polymer catheters who were receiving erythromycin intravenously had a greater risk of developing phlebitis than subjects with FEP polymer catheters who were receiving erythromycin. This interaction phenomenon was consistent in the Cox models and the Kaplan-Meier estimate.

DISCUSSION: The results of this clinical study add to the growing evidence that catheter material is one variable that can affect the development of infusion-related phlebitis (Gaukroger, Roberts, & Manners, 1988; McKee, Shell, Warren, & Campbell, 1989; Maki & Ringer, 1991). OCRILON polymer catheter material was compared with FEP polymer catheter material in this study. Catheters made of OCRILON polymer reduced the risk of phlebitis by nearly 60% (OCRILON polymer: FEP polymer, relative risk = 0.41; p = 0.007). There was a greater risk for phlebitis in subjects receiving erythromycin with OCRILON polymer catheters intravenously (number of catheters = 6) than in subjects receiving erythromycin with FEP polymer catheters (number of catheters = 11). Because of the small number of catheters in the erythromycin groups, a conclusive statement regarding the interaction between erythromycin, catheter material, and the development of phlebitis cannot be made. This phenomenon merits further investigation with a larger number of catheters in each group.

CONCLUSION: In summary, catheter material is one variable that can affect the development of infusion-related phlebitis. OCRILON polymer catheter material was found to have a smaller risk of phlebitis (with highly significant p-value; p = 0.007) than the FEP polymer catheter material. I.V. catheters made of OCRILON polymer dwelled longer phlebitis-free than the catheters made of FEP polymer.
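Step one of the analysis described in this abstract, the Kaplan-Meier estimate of the phlebitis-free proportion, can be sketched without a statistics library; the dwell times and phlebitis outcomes below are hypothetical.

```python
# Sketch: a hand-rolled Kaplan-Meier estimator of the phlebitis-free
# proportion over catheter dwell time. Dwell times and outcomes below are
# hypothetical; event = 1 means phlebitis was observed, event = 0 means the
# catheter was censored (removed for another reason or subject discharged).

def kaplan_meier(durations, events):
    """Return {event_time: S(t)} where S(t) = prod(1 - d_i / n_i),
    rounded to 4 decimals for readability."""
    survival, s = {}, 1.0
    for t in sorted({d for d, e in zip(durations, events) if e == 1}):
        n_at_risk = sum(1 for d in durations if d >= t)
        d_events = sum(1 for d, e in zip(durations, events) if d == t and e == 1)
        s *= 1 - d_events / n_at_risk
        survival[t] = round(s, 4)
    return survival

days = [1, 2, 2, 3, 4]       # catheter dwell time in days (hypothetical)
phlebitis = [1, 1, 0, 1, 0]  # 1 = phlebitis, 0 = censored
print(kaplan_meier(days, phlebitis))  # {1: 0.8, 2: 0.6, 3: 0.3}
```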

64.

Quantifying and reducing the risk of bloodborne pathogen exposure.

Stringer B, Infante-Rivard C, Hanley J.

Department of Epidemiology and Biostatistics, University of Western Ontario, London, Ontario.

AORN J 2001 Jun;73(6):1135-40, 1142-3, 1145-6; quiz 1147-8, 1151-4

The risk of becoming infected with bloodborne pathogens (e.g., hepatitis B, hepatitis C, HIV) during surgery is real. The degree of risk for perioperative personnel is related to factors that include participating in large numbers of surgical procedures each year; the nature of perioperative work (e.g., use of different types of sharp instruments); exposure to large amounts of blood and body fluids; the prevalence of bloodborne pathogens in the surgical population; the variation in different organisms' ability to be transmitted; the existence of vaccines and the level of vaccination; the availability of postexposure treatment; and the consequences of acquiring the disease. Controlling risks to perioperative personnel can be accomplished by using the Occupational Safety and Health Administration's three methods of control: redesigning surgical equipment and procedures, changing work practices, and enhancing the personal protective equipment of perioperative personnel.

65.

Risk of cross-patient infection with clinical use of a needleless injector device

Suria H, Van Enk R, Gordon R, & Mattano Jr. LA.

American Journal of Infection Control 1999;27(5):444-5

BACKGROUND: Needleless injection devices use multiple-dose vials for the administration of local anesthetics to patients. There is a theoretic risk of iatrogenic infection associated with use of these devices.

METHODS: This study used in vitro models to investigate the potential for transferring microbial pathogens among patients by using the Syrijet (Keystone Industries, Inc, Cherry Hill, NJ). Staphylococcus aureus and coagulase-negative staphylococci were used to determine whether patient skin flora could contaminate the instrument internal canal by postejection reverse flow and whether the staphylococci could survive on the ejection surface, in the internal canal, or in the anesthetic vial.

RESULTS: The ejection surface was contaminated by firing the device while it was in contact with a contaminated surface. Postejection reverse flow drew contaminants into the device, and increased with ejection volume. Reverse flow did not reach the multidose vial, and staphylococci did not grow in the commercial anesthetic solution typically administered with the device. Surface, but not internal, contamination could be removed by swabbing with disinfectant.

CONCLUSION: Although autoclaving is the only way to ensure sterilization of this device, frequent cleaning of the ejection surface during clinical use minimizes the risk of cross-patient bacterial transfer.

66.

Occupational blood and body fluids exposures in health care workers: four-year surveillance from the Northern France network

Tarantola A, Golliot F, Astagneau P, Fleury L, Brucker G, Bouvet E.

CCLIN Paris-Nord Blood and Body Fluids (BBF) Exposure Surveillance Taskforce; CCLIN Paris-Nord, Institut Biomedical des Cordeliers, 15-21 rue de l'Ecole de Medecine, 75006 Paris, France.

American Journal of Infection Control 2003 Oct;31(6):357-63

The risk of accidental blood and body fluid (BBF) exposure is a daily concern for health care workers throughout the world, and various strategies have been introduced during the past decade to help reduce that risk. To assess the impact of multifocal reduction strategies introduced in hospitals affiliated with the Northern France network, we recently examined data from 4 years of BBF-exposure reports filed by network employees. A total of 7,649 BBF exposures were reported by health care workers to occupational medicine departments in 61 hospitals. Nurses and nursing students accounted for 4,587 (60%) of exposures, followed by nurses' aides and clinicians. Most (77.6%) of the reports were related to needlestick injury (NSI). In addition, we examined BBF exposure trends over time by analyzing data from 18 hospitals (29.5%) with data available for the time period of 1995 to 1998. These were assessed in nurses, who have the highest and most consistent reporting rate. We noted that the BBF-exposure incidence rate for all BBF exposures in nurses decreased from 10.8 to 7.7 per 100 nurses per year between 1995 and 1998 (P <.001), whereas the NSI rate decreased from 8.9 per 100 nurses per year in 1995 to 6.3 in 1998 (P <.001). The percentage of NSIs that resulted from noncompliance with universal precautions also decreased significantly (P =.04). Widespread improvements in procedures and engineering controls were implemented in the Northern France network before and during the study period. Significant reductions were observed in reports of BBF exposures and NSIs, particularly in nurses. These findings are similar to those in other countries and reflect the overall improvement in the management of occupational risk of BBF in health care workers.

67.

Experimental study on the safety of a new connecting device

Trautmann M, Moosbauer S, Schmitz FJ, Lepper PM.

Institute of Hospital Hygiene, Klinikum Stuttgart, Stuttgart, Germany

American Journal of Infection Control 2004 Aug;32(5):296-300

BACKGROUND: The tested device is a new connecting tool for infusion systems that has been designed to replace conventional single-use stopcocks. Because outbreaks of bloodstream infections have been observed during the use of similar connectors in the United States, we examined the microbiological safety of the connecting device after artificial contamination in the laboratory setting and during routine clinical use.

METHODS: In the first part of the study, the new device was tested in 3 types of in vitro experiments. In the second part of the study, surgical intensive care patients had their entry ports capped with novel devices (n=27) or with conventional stopcocks (n=32), and samples of infusion fluids and swabs from entry ports were taken after completion of infusion periods.

RESULTS: The new device did not perpetuate bacterial contaminations in spite of high artificial inocula in the in vitro experiments. Microbial contamination rates after 96 hours of infusion therapy for the novel connecting tool versus conventional stopcock groups were as follows: swabs from 3-way ports, 6/129 versus 1/111; rest fluid from infusion lines, 0/20 versus 1/22; rest fluid from infusion bottles, 2/196 versus 2/208; rest fluid from perfusor syringes, 7/180 versus 6/142 (all differences not significant).

CONCLUSION: The novel connecting device was microbiologically safe and did not increase microbial contamination rates of intravenous infusion systems.

68.

Evaluation of potential reduction in blood and body fluid exposures by use of alternative instruments

Waclawski ER.

Occupational Health Service, NHS Argyll and Clyde, Dykebar Hospital, Grahamston Road, Paisley PA2 7DE, UK.

Occupational Medicine (London) 2004 Dec;54(8):567-9

BACKGROUND: Injuries from needlestick, sharps injuries and splashes lead to exposure to blood and body fluids with the potential for transmission of blood-borne viruses.

AIMS: To identify alternative instruments, which if used would improve worker safety.

METHODS: Retrospective review of 161 injuries with identification of safer alternative products for instruments that caused injury. The proportion of injuries that could be prevented was calculated [with 95% confidence intervals (CI)].

RESULTS: The average rate of injury was 7.8/1000 employees per annum (95% CI, 6.8-9.4/1000). In the 2 years the highest rates of injury occurred in pre-registration house officers (164/1000; 95% CI, 64-264/1000), phlebotomists (154/1000; 95% CI, 15-291/1000) and senior house officers (45/1000; 95% CI, 13-77/1000). An upper estimate of 65% (95% CI, 58-72%) of incidents would have been preventable with a change to alternative devices.

CONCLUSIONS: Change to the use of intrinsically safer instrumentation has the potential to prevent injury to healthcare workers.

69.

Prevalence of Blood-Borne Pathogens in an Urban, University-Based General Surgical Practice

Weiss ES, Makary MA, Wang T, Syin D, Pronovost PJ, Chang D, Cornwell EE 3rd.

Department of Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland; Department of Health Policy and Management, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland; and the Johns Hopkins Patient Safety and Quality Research Group, Johns Hopkins Medical Institutions, Baltimore, Maryland

Annals of Surgery 2005 May;241(5):803-809

OBJECTIVE: To measure the current prevalence of blood-borne pathogens in an urban, university-based, general surgical practice.

SUMMARY BACKGROUND DATA: Human immunodeficiency virus (HIV), hepatitis B, and hepatitis C represent significant occupational hazards to the surgeon. While the incidence of these blood-borne pathogens is increasing in the general population, little is known about the current prevalence of these exposures among patients presenting for surgery.

METHODS: We studied 709 consecutive operative cases (July 2003 to June 2004) in a university practice that provides all inpatient, emergency department, and outpatient consultative general surgical services. Trauma cases and bedside procedures were excluded. Data collected included HIV, hepatitis B and C test results, type of operation, age, sex, and history of intravenous drug use.

RESULTS: Testing for blood-borne pathogens was performed in 53% (N = 373) of 709 patients based on abnormal liver function tests, neutropenia, history of IV drug use, or patient request. Thirty-eight percent of all operations (142/373) were found to involve a blood-borne pathogen when tested: HIV (26%), hepatitis B (4%), hepatitis C (35%), and coinfection with HIV and hepatitis C (17%). Forty-seven percent of men tested positive for at least 1 blood-borne pathogen. Seventy-three different types of operations were performed, ranging from Whipple procedures to amputations. Soft-tissue abscess procedures 48% (34/71) and lymph node biopsies 67% (10/15) (P < 0.01) were most often associated with blood-borne pathogens. Infections were more common among men (P < 0.01), patients 41 to 50 years of age (P < 0.01), and patients with a history of intravenous drug use (P < 0.01).

CONCLUSIONS: HIV and hepatitis C infections are common in an urban university general surgical practice, while hepatitis B is less common. In addition, certain operations are associated with significantly increased exposure rates. Given the high incidence of these infections, strategies such as sharpless surgical techniques should be evaluated and implemented to protect surgeons from blood-borne pathogens.

70.

Hollow-bore needlestick injuries in a tertiary teaching hospital: epidemiology, education and engineering

Whitby RM, McLaws ML.

Medical Journal of Australia 2002 Oct 21;177(8):418-22

OBJECTIVE: To describe the frequency, cause and potential cost of prevention of hollow-bore dirty needlestick injury (NSI) sustained by healthcare workers.

DESIGN AND PARTICIPANTS: Ten-year prospective surveillance study, 1990-1999, with triennial anonymous questionnaire surveys of nursing staff.

SETTING: 800-bed university tertiary referral hospital in Brisbane, Australia.

MAIN OUTCOME MEASURES: Rates and circumstances of NSI in medical, nursing and non-clinical staff; knowledge of NSI consequences in nurses; and minimum costs of safety devices.

RESULTS: Between 1990 and 1999, there was a significant increase (P < 0.001) in the trend of the reported rate of NSI. Of the 1836 "dirty" NSIs reported, most were sustained by nursing (66.2%) and medical (16.8%) staff, with 62.7% sustained before disposal. Hollow-bore injuries from hypodermic needles (83.3%) and winged butterfly needles (9.8%) were over-represented. Knowledge among nursing staff of some of the risks and outcomes of NSI improved over the decade. A trend of increasing reported injury rates in this group was detected (chi-square = 9.89; df = 9; P = 0.0016). The estimated cost of consumables only, associated with the introduction of self-retracting safety syringes with concomitant elimination of butterfly needles, where practicable, would be about $365 000 per year.

CONCLUSION: More than one NSI occurs for every two days of hospital operation. Introduction of self-retracting safety syringes and elimination of butterfly needles should reduce the current hollow-bore NSI by more than 70% and almost halve the total incidence of NSI.

71.

Needlestick injury: impact of a recapping device and an associated education program

Whitby M, Stead P, Najman JM.

Department of Infectious Diseases, Princess Alexandra Hospital, Brisbane, Australia.

Infection Control and Hospital Epidemiology 1991 Apr;12(4):220-5

OBJECTIVE: To determine the impact of the introduction of a plastic shield-shaped device (Needleguard, Biosafe, Auckland, New Zealand) and education program designed to allow safer recapping, on recorded rates of needlestick injury.

DESIGN: A before-after trial with a two-year duration of follow-up.

SETTING: Tertiary referral hospital.

PARTICIPANTS: Nursing and other hospital personnel.

RESULTS: Prospectively collected baseline data, together with the results of an anonymous questionnaire of 25% of the hospital nursing staff, defined a reported needlestick injury rate of 6.9 per hundred full-time nursing staff per year. In the pre-intervention period, there were 6.7 needlestick injuries per 100 nursing staff members per year reported. This increased to 15.4 (p less than .0001) needlestick injuries per 100 nursing staff members per year after the intervention. An anonymous survey undertaken at both time periods suggests that the apparent increase in officially reported needlestick injuries is due to an increase in the willingness of nurses to now report previously unreported needlestick injuries.

CONCLUSIONS: The impact of the safety device and education program was the more accurate reporting of needlestick injuries; many nursing staff continued to resheath needles contrary to hospital policy. Many staff simply did not use the newly designed safety device. Approaches to improving compliance with such safety devices are considered.

72.

A follow-up evaluation to a needle-free i.v. system.

Wolfrum J.

Nursing Management 1994 Dec;25(12):33-5

In 1989, one hospital noted a high rate of puncture wounds among healthcare workers, many of which were considered preventable. Beginning in 1990, strategies were developed in conjunction with the hospital's Blood and Body Fluid Exposure Task Force. In 1991, the hospital instituted a needle-free system in addition to employee and product educational programs. The study's results show a significant decrease in the number of injuries.

73.

Needlestick/sharps injuries among vocational school nursing students in southern Taiwan

Ya-Hui Yang, Ming-Tsang Wu, Chi-Kung Ho, Hung-Yi Chuang, Limei Chen, Chun-Yuh Yang, Hsiu-Yuan Huang, Trong-Neng Wu

Kaohsiung, Taiwan.

American Journal of Infection Control 2004 Dec;32(8)

BACKGROUND: Although most needlestick/sharps injuries (NSIs/SIs) research focuses on health care workers (HCWs), students in hospital internships are also at risk. Investigations that examined NSIs/SIs in student populations generally studied medical rather than nursing students (NSs). In 1999, approximately 17,000 Taiwanese nursing graduates were exposed to the hazard of NSIs/SIs. We examined the frequency and mechanism of NSIs/SIs among vocational school NSs in southern Taiwan.

METHODS: Between July and December of 1999, within 1 week after the NSs completed their internship training, one of the researchers, who was a teacher in this vocational school, asked them to fill out questionnaires.

RESULTS: Five hundred twenty-seven of 550 (92.6%) questionnaires were considered valid. Two hundred sixty-four of 527 (50.1%) responders sustained one or more NSIs/SIs. Ninety-six of 527 (18.2%) responders suffered contaminated NSIs/SIs. The average number of NSIs/SIs per student was 8.0 times/year (4.9 times/student/year for NSIs and 3.1 times/student/year for SIs). NSIs/SIs rates for NSs in 10-week and 4-week internships were significantly different (P=.039): 53.3% versus 43.7%, respectively. The NSIs/SIs frequencies were influenced by length of internship: 7.3 times/student/year in 10-week internship and 11.7 times/student/year in 4-week internship. Logistic regression analysis indicated that length of internship rotation was statistically significant with respect to contaminated NSIs/SIs (OR=1.682; 95% CI: 1.005-2.81; P=.048).

CONCLUSIONS: The NSIs/SIs frequencies of NSs were higher than those for HCWs. We found that frequency of NSIs/SIs for vocational school NSs is above average. Whether the young age of these NSs put them at greater risk for NSIs/SIs warrants further inquiry.

74.

Efficacy and cost-effectiveness of a needleless intravenous access system.

Yassi A, McGill ML, Khokhar JB.

Department of Occupational and Environmental Medicine, Health Sciences Center, Winnipeg, Manitoba, Canada.

American Journal of Infection Control 1995 Apr;23(2):57-64

BACKGROUND: Needlestick injury has been identified as a major cause of exposure to blood and body fluids. The heparin-lock intermittent intravenous procedure was implicated in the largest number of needlestick-related exposures (26%) at this 1100-bed tertiary care hospital, and replacement of this system was imperative. Cost concerns, however, necessitated that replacement products not increase overall hospital costs.

METHODS: A needleless intravenous access system (Interlink i.v. Access System; Baxter Healthcare Corp., Parenterals Division, Deerfield, Ill.) was introduced. Effectiveness and cost-benefit of this system were analyzed by comparing needlestick injuries and their associated costs, as well as costs of relevant products and procedures, for the year before introduction of the new product with those for 1 year after implementation of the new system.

RESULTS: During the study period, the needleless access system was 78.7% effective in reducing intravenous line-related needlestick injuries. There was an overall reduction of 43.4% in total needlestick injuries from all procedures and events. The incremental cost to this hospital ranged from a 5.3% additional cost to a 5.7% savings, without even considering the less quantifiable benefits associated with avoidance of needlestick injury, time saved by using this product, and decreased infection rate.

CONCLUSION: When used as intended, this system was extremely effective in reducing intravenous line-related needlestick injuries, and the system does pay for itself.

75.

Prevention of catheter-related bloodstream infection in critically ill patients using a disinfectable, needle-free connector: a randomized controlled trial

Yebenes JC, Vidaur L, Serra-Prat M, Sirvent JM, Batlle J, Motje M, Bonet A, Palomar M.

Intensive Care Unit, Hospital Universitari de Girona Dr. Josep Trueta, Girona, Spain.

American Journal of Infection Control 2004 Aug;32(5):291-5

OBJECTIVE: The aim of this study was to assess the efficacy of a disinfectable, needle-free connector in the prophylaxis of catheter-related bloodstream infection.

METHODS: A randomized controlled trial was performed in a polyvalent intensive care unit. Patients who needed multilumen central venous catheters were randomly assigned to a study or a control group. All catheters were inserted and manipulated according to the Centers for Disease Control and Prevention (CDC) recommendations. Study group patients were equipped with catheters with disinfectable, needle-free connectors whereas control group patients were equipped with catheters with 3-way stopcocks. Two peripheral blood cultures and a semiquantitative culture of the catheter tip were performed on removal of the catheter.

RESULTS: The study included 243 patients, with a total of 278 central venous catheters. The catheters' mean insertion duration was 9.9 days. Both groups were comparable regarding patient and catheter characteristics. Incidence rate of catheter-related bloodstream infection was 0.7 per 1000 days of catheter use in the study group, compared with 5.0 per 1000 days of catheter use in the control group (P=.03).

CONCLUSIONS: Adding a disinfectable, needle-free connector to the CDC recommendations reduces the incidence of catheter-related bloodstream infection in critically ill patients with central venous catheters.

76.

Impact of a shielded safety syringe on needlestick injuries among healthcare workers.

Younger B, Hunt EH, Robinson C, McLemore C.

Infection Control, Pacific Presbyterian Medical Center, San Francisco, California.

Infection Control and Hospital Epidemiology 1992 Jun;13(6):349-53

OBJECTIVES: Evaluate the impact of a shielded 3 cc safety syringe on needlestick injuries among healthcare workers.

DESIGN: Surveillance study.

SETTING: Three medical centers.

RESULTS: The total number of needlesticks from all sources rose from 134 during the baseline period to 140 during the study phase. However, the overall rate of needlesticks involving 3 cc syringes decreased from 14/100,000 inventory units to 2/100,000, and the frequency declined substantially at each of the participating medical centers.

CONCLUSIONS: Although the total number of needlesticks from all sources did not decline, the shielded safety syringe was associated with a marked reduction in needlesticks involving 3 cc syringes at each of the participating medical centers.

77.

Effect of a comprehensive program to reduce needlestick injuries

Zafar AB, Butler RC, Podgorny JM, Mennonna PA, Gaydos LA, Sandiford JA.

Arlington Hospital, VA 22205, USA.

Infection Control and Hospital Epidemiology 1997 Oct;18(10):712-5

The Arlington Hospital Needlestick Injury (NSI) Prevention Program was created to protect healthcare workers from NSI and to assess the effectiveness of our interventions. Interventions included revising NSI policy and procedures. The average NSI rate dropped from 109 to 43 per year after the interventions, over a period of 4 years.
