
A survey of environmental safety issues at 20 medical clinics.

Introduction

Measuring performance in the health care industry using documented inspections against ambulatory care standards, guidelines, and regulations serves to identify patient care and environmental safety issues in preparation for third-party inspections by accreditation bodies or by local, state, and federal regulators (Springhouse Corporation, 1998). The accreditation standards of The Joint Commission (TJC) (TJC, 2009a) and the Accreditation Association for Ambulatory Health Care (AAAHC) (AAAHC, 2004) represent a consensus on outstanding patient care, emergency preparedness, life safety, and compliance with environment-of-care issues such as hazardous waste regulations. As a result of the debut of these standards more than 30 years ago and increasing regulatory pressure from local, state, and federal agencies, measuring clinic safety performance with inspection checklists has become standard practice in the health care environment (AAAHC, 2004; TJC, 2009a; Tulane University Office of Environmental Health and Safety, 2009). The Joint Commission, formed in 1951, predates most federal health care regulations, and we consider it to have been a driving force in their development; one example is the Bloodborne Pathogens Standard, which took effect in 1992 (Bloodborne Pathogens, 2008; TJC, 2009b). Although the trend of using inspection checklists has not been widely documented in research journals, our research team has observed it over the last 15 years at various institutions within a major medical center.

The AAAHC offers a number of publications on its Web site (AAAHC, 2004), including the "AAAHC Physical Environment Checklist for Ambulatory Surgery Centers," accreditation handbooks, and self-assessment manuals, whereas TJC offers their standards in both electronic and paper formats (TJC, 2009a). Since the goal of our survey was not accreditation but rather education and improved compliance with institutional, local, state, and federal guidelines and regulations, we used a checklist based on our collaborative experience in the clinical and research environment at several research, education, and health care institutions, as well as regulatory guidelines and other sources. The survey checklist was similar to the "Satellite Clinic Quarterly Inspection" form used by Tulane University's Office of Environmental Health and Safety (Tulane University Office of Environmental Health and Safety, 2009) and a number of other more general institutional health and safety checklists and resources (Charney & Schirmer, 1990; County of San Mateo, California, 2008; Emery & Savely, 1997; Gershon et al., 1995; Gershon et al., 2000; Kavaler & Spiegel, 1997; North Carolina State University, 2009; Stanford School of Medicine, 2009; Stanford University Environmental Health and Safety, 2009; Tai, 2001; U.S. Department of Agriculture Animal and Plant Health Inspection Service, 1995; University of North Carolina, Asheville, 2009; Wisconsin Automobile and Truck Dealers Association, 1998).

Survey Rationale

Implementation of a routine clinic safety inspection program generally results in increased compliance with institutional, local, state, and federal guidelines and regulations (Emery & Savely, 1997). "Periodic walk-through inspections" are recommended by the National Institute for Occupational Safety and Health as part of the "checklist for developing a hospital safety and health program" in Guidelines for Protecting the Safety and Health of Health Care Workers (U.S. Department of Health and Human Services, 1988) and are considered a common method of limiting "property, liability, and personnel exposures" in the risk-management arena (Monagle, 1985). Pressure from accreditation bodies such as TJC or AAAHC makes documentation "a must" for organizations seeking or maintaining accreditation. Organizations that document internally usually do not make the results of their safety inspections readily available, and we were not able to locate any journal publications on this topic. Although our patient care, research, and academic institution has a long-term goal of attaining accreditation, this project stemmed from an expansion of the clinical and research laboratory safety inspection program into the health care clinic arena to increase compliance and create stronger linkages between environmental safety and clinic personnel.

Methods

Study Population Criteria and Sources

Eight clinics were initially inspected at a centralized location (onsite) as part of the expansion of the routine safety inspection program. Later, Institutional Review Board (IRB) approval was obtained to collect information at 12 additional clinics located away from the main site (offsite) and to compare the data from the offsite clinics with the data collected onsite. The 20 clinics included in our study were the only clinics known to be affiliated with the institution at the time of the survey.

Essential Elements of a Clinic Safety Program and Safety Checklist Content

Clinic safety programs pivot around numerous local, state, and federal guidelines and regulations; a brief list of the major guidelines and regulations that served as references for the inspection checklist for this study is included in Table 1. In addition to the references in Table 1, institutional guidelines and policies were used as a basis for the inspection questions. The 90 inspection questions (Table 2) cover nine areas of interest: emergency preparedness (42 questions), infection control (28 questions), waste/infection control (10 questions), patient safety (four questions), general safety (two questions), and one question each for fire safety, medication management, physical safety, and quality control. The categorization of the questions was somewhat arbitrary, and some of the questions categorized as emergency preparedness could also have been classified as fire safety.

Possible answers included yes, no, or not applicable (N/A). Yes answers were the desired outcomes and indicated that the safety practice was being followed, that the participant was knowledgeable regarding the question asked, or that the participant knew where the relevant safety equipment was located. If a participant gave a one-word answer of yes to a question, the researchers followed up with sufficient questions to ensure that the individual was truly aware of the desired information. Only when the participant responded with sufficient information was the answer checked yes. For example, "Are you aware of the Safe Medical Devices Act of 1990?" was occasionally answered yes by a participant, in which case a follow-up statement such as, "Please tell us what you know about it," was added to ensure knowledge. No was checked when the participant appeared uninformed about the question asked, either immediately or upon follow-up questioning. In addition to the answers to these questions, information was gathered regarding who participated in the survey, the survey date, and the total number of rooms inspected.

Although the inspection checklist was not composed of and did not request any type of sensitive or confidential information, responses were set up to be received and handled confidentially. The data were recorded by hand during the interview and observation phases of each inspection and later entered into a centralized database. The data received from the participants were subsequently made anonymous prior to statistical analysis and are reported in aggregate so as not to identify individual clinics.

Clinic Contact

Using an IRB-approved script, initial contact with the 12 offsite clinics was made by phone or, in rare cases, in person. Appointments were made for a convenient time when few patients would be present in order to facilitate discussion with clinic staff regarding safety practices and procedures. After an initial introduction and the process of informed consent had been completed, the clinic staff representative was interviewed using inspection checklist questions 1-36 (Table 2). Afterwards, a walk-through of the clinic was performed using the remaining questions 37-90 as a guide. Following the inspection, participant compliance with Bloodborne Pathogens Standard training requirements was checked using the institutional safety training database. All 20 clinics contacted agreed to participate in the study.

Statistical Methods

Answers to all 90 inspection questions (yes, no, or N/A) were summarized by clinic using frequencies and percentages. The difference between onsite and offsite clinics for each of the 90 questions was assessed using Fisher's exact test and the permutation resampling method. This method adjusts for the multiple comparisons embedded in the data and provides adjusted p-values that are more robust than the p-values generated from conducting a hypothesis test on each individual question. In addition, the Wilcoxon rank-sum test was performed to assess differences between the onsite and offsite clinics in terms of the number of yes answers overall and by safety knowledge category. Participant compliance with Bloodborne Pathogens Standard training requirements was analyzed using Fisher's exact test. SAS/STAT (SAS Institute) and S-Plus (TIBCO Software) were used for the statistical analyses. For clarity, additional statistical methods are described below.
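A minimal sketch of this approach, assuming responses are coded 1 = yes and 0 = no/N/A in a clinics-by-questions array, is shown below: per-question Fisher's exact tests combined with a single-step permutation (min-p) adjustment. The analyses reported here were actually run in SAS/STAT and S-Plus, so this Python sketch is illustrative rather than the exact procedure used.

```python
# Illustrative sketch (not the SAS/S-Plus code actually used): per-question
# Fisher's exact tests with a permutation-resampling (min-p) adjustment for
# multiple comparisons. The array layout and variable names are assumptions.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)

def raw_pvalues(answers, is_offsite):
    """answers: (n_clinics, n_questions) array with 1 = yes, 0 = no or N/A."""
    pvals = []
    for q in range(answers.shape[1]):
        yes_off = int(answers[is_offsite, q].sum())
        yes_on = int(answers[~is_offsite, q].sum())
        table = [[yes_off, int(is_offsite.sum()) - yes_off],
                 [yes_on, int((~is_offsite).sum()) - yes_on]]
        _, p = fisher_exact(table)          # two-sided, unadjusted p-value
        pvals.append(p)
    return np.array(pvals)

def adjusted_pvalues(answers, is_offsite, n_perm=10_000):
    """Single-step min-p adjustment: a question's adjusted p-value is the
    fraction of label permutations whose smallest raw p-value is at least
    as extreme as that question's observed raw p-value."""
    observed = raw_pvalues(answers, is_offsite)
    min_p = np.empty(n_perm)
    for b in range(n_perm):
        permuted = rng.permutation(is_offsite)   # shuffle onsite/offsite labels
        min_p[b] = raw_pvalues(answers, permuted).min()
    return np.array([(min_p <= p).mean() for p in observed])

# Example with simulated data: 8 onsite + 12 offsite clinics, 90 questions.
answers = rng.integers(0, 2, size=(20, 90))
is_offsite = np.array([False] * 8 + [True] * 12)
print(adjusted_pvalues(answers, is_offsite, n_perm=200)[:5])
```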

Results

Percentages of yes, no, and N/A responses to the 90 safety inspection questions were calculated for each of the 20 clinics, as shown in Table 3. Among the 20 sites, Offsite 4 had the highest number of yes answers, answering yes to 65 of the 90 questions (72.2%). Offsite 2 and Onsite 4 each answered yes to only 42 of the 90 questions (46.7%), the lowest percentage of yes answers among the 20 clinics.
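The per-clinic summaries in Table 3 are simple frequency tallies; a minimal sketch of that bookkeeping is given below, assuming each clinic's 90 responses are stored as "yes"/"no"/"n/a" strings (an assumed coding, not the study database schema).

```python
# Hypothetical tally of one clinic's 90 answers into the counts and
# percentages reported in Table 3; the string coding is an assumption.
from collections import Counter

def summarize_clinic(answers):
    counts = Counter(answers)                    # answers: list of 90 strings
    total = len(answers)
    return {k: (n, round(100 * n / total, 1)) for k, n in counts.items()}

# e.g., a clinic answering yes to 65 of 90 questions scores 65/90 = 72.2% yes
```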

To examine whether any differences existed between onsite and offsite clinics in the responses to each of the 90 safety questions, Fisher's exact test was initially used without adjustment for multiple comparisons. The data were then adjusted for multiple comparisons using Fisher's exact test with permutation resampling, which is more robust and conservative. The results suggest that even though Fisher's exact test without adjustment identified the answers to 13 questions as significantly different between the onsite and offsite clinics (p < .05), reanalysis adjusting for multiple comparisons revealed that the answers to only two questions (#2 and #10) were statistically significant (p < .05). A closer look at the answers reveals slightly better performance by the offsite clinics on these two questions. Question 2, "aware of how to triage patients with communicable diseases?", was answered yes 83% of the time at the offsite clinics, but only one of the eight onsite clinics (12.5%) answered yes (p = .05).

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]

Question 10, "aware of what to do if emergency power does not come on?", was answered yes 92% of the time at the 12 offsite clinics, whereas only 25% of the eight onsite clinics answered yes (p = .04).
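For illustration, the unadjusted test behind a single question reduces to a 2x2 table. The sketch below assumes question 10 was answered yes by 11 of the 12 offsite and two of the eight onsite clinics (counts inferred from the percentages above); its unadjusted p-value will differ from the permutation-adjusted p = .04 reported here.

```python
# Hedged illustration for question #10: counts (11/12 offsite, 2/8 onsite)
# are inferred from the 92% and 25% figures above and are assumptions.
# This is the unadjusted Fisher p; the text reports the adjusted value.
from scipy.stats import fisher_exact

table = [[11, 1],   # offsite clinics: yes, no
         [2, 6]]    # onsite clinics:  yes, no
odds_ratio, p_unadjusted = fisher_exact(table, alternative="two-sided")
print(f"unadjusted two-sided Fisher exact p = {p_unadjusted:.4f}")
```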

In addition, a Wilcoxon rank-sum test was conducted to compare overall knowledge, as assessed by the number of yes answers to all 90 questions, between the eight onsite clinics and the 12 offsite clinics. Among the eight onsite clinics, the median number of yes answers was 54, with a range of 42 to 60; among the 12 offsite clinics, the median was 57, with a range of 42 to 65. This difference was not significant (p = .26; Wilcoxon rank-sum test; Figure 1).
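This overall comparison is a two-sample rank test on the per-clinic yes totals from Table 3; a minimal sketch using SciPy's normal-approximation Wilcoxon rank-sum test is shown below (it may not reproduce the reported p = .26 to the last digit).

```python
# Wilcoxon rank-sum comparison of total yes answers per clinic, using the
# per-clinic yes counts listed in Table 3. scipy.stats.ranksums uses a
# normal approximation, so its p-value may differ slightly from p = .26.
from scipy.stats import ranksums

onsite_yes = [57, 51, 59, 42, 58, 60, 49, 51]
offsite_yes = [58, 42, 56, 65, 47, 61, 56, 64, 57, 62, 60, 53]
statistic, p_value = ranksums(onsite_yes, offsite_yes)
print(f"Wilcoxon rank-sum p = {p_value:.2f}")
```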

As a secondary analysis, we examined knowledge level by category in the onsite and offsite clinics; for example, we compared the number of yes answers to the 42 questions related to emergency preparedness among the eight onsite clinics with that among the 12 offsite clinics. Because of the limited number of questions in the fire safety, general safety, medication management, patient safety, physical safety, and quality control categories, this analysis was conducted only for the emergency preparedness (42 questions), infection control (28 questions), and waste/infection control (10 questions) categories.

The results suggested that a marginally significant difference existed between the onsite and offsite clinics in terms of safety knowledge in infection control and waste/infection control, with p-values of .07 and .10, respectively. In each case, the offsite clinics tended to be more knowledgeable than the onsite clinics. For the infection control questions, the median percentage of yes answers was 17% for onsite and 20% for offsite clinics (Figure 2); for waste/infection control, the medians were 6% and 8%, respectively (Figure 3). For emergency preparedness, no significant difference existed between onsite and offsite clinics (p = .51; Figure 4).

Finally, we identified the questions that all clinics, onsite and offsite, answered extremely well or extremely poorly. Nine questions were answered yes by fewer than 20% of all clinics (Table 4). All of the clinics (100%) answered yes to the 12 safety questions listed in Table 5. In addition, the questions regarding whether crash cart medications were kept locked (question #49) and whether the medication preparation area was clean and organized (question #65) were both answered yes by 14 of the 20 clinics (70%); for the other six clinics, these questions were not applicable.

Compliance With Bloodborne Pathogens Standard Training Requirements

Personnel compliance with Bloodborne Pathogens Standard training requirements (Bloodborne Pathogens, 2008) was evaluated by reviewing training records in the institutional safety training database. Overall, 28% of the 29 personnel located at the 20 clinics were in compliance with initial and annual refresher training requirements. Four of the 13 personnel (31%) stationed at onsite clinics were in compliance, while only four of the 16 personnel (25%) located at offsite clinics had completed the required training. This difference, however, was not statistically significant (p = 1.00; Fisher's exact test).

Discussion

We originally assumed that clinics located offsite might be less knowledgeable regarding safety issues due to their relative remoteness and greater independence. The analyses revealed, however, that overall no significant difference existed between the onsite and offsite clinics with regard to the number of yes answers to the 90 questions and compliance with Bloodborne Pathogens Standard training requirements.

[FIGURE 3 OMITTED]

[FIGURE 4 OMITTED]

In fact, the offsite locations performed better than the onsite clinics on two questions and in the infection control and waste/infection control categories. We believe this occurred because the offsite clinic locations are required to operate more independently and because the onsite clinics had been under the umbrella of a local hospital until a year prior to this study. Because of that strong hospital affiliation, the onsite clinics had grown accustomed to a high level of hospital environmental health and safety service and were less self-reliant than the offsite locations, which by design had started out, and remained, largely independent of central oversight. No significant difference existed between onsite and offsite clinics for the emergency preparedness category.

One potential limitation of the study was the likelihood that participants answered yes to safety questions in order to appear more knowledgeable and compliant. The survey instrument was designed to compensate for this in part by asking multiple questions to assess knowledge and by following up one-word answers with additional questions sufficient to determine whether the participant was knowledgeable (yes) or not (no). We have no reason to suspect, however, that this tendency would be greater at the onsite or the offsite clinics and thereby bias the differences observed. This study was used to enhance both new employee and clinic-specific training courses, as well as to develop new training courses, such as "Hand Hygiene."

Conclusion

Areas for future improvement in safety knowledge and performance were identified through this survey and include the questions listed in Table 4, which were answered yes by fewer than 20% of all clinics. The inspection provided a forum for the immediate correction of a number of the safety issues identified and created an informal training opportunity. For example, a no answer could be used to stimulate discussion and present information in a nonconfrontational manner. In addition, participant safety concerns were identified in a number of instances, and through the enhanced relationship provided by the inspection, the investigators were able to help address these concerns. By correcting the safety issues noted and providing both onsite informal training and referrals to more formal training classes, this patient care, research, and education institution was able to increase knowledge of and compliance with local, state, and federal guidelines and regulations.

Acknowledgements: The authors would like to acknowledge Becky Luke for her assistance in completing this project.

References

Accreditation Association for Ambulatory Health Care, Inc. (2004). Accreditation handbook for ambulatory healthcare. Wilmette, IL: Author.

Bloodborne Pathogens (Occupational Safety & Health Administration), 29 C.F.R. 1910.1030 (2008).

California Emergency Medical Services Authority. (2006). Hospital incident command system guidebook. Retrieved April 23, 2010, from http://www.emsa.ca.gov/hics/

Charney, W., & Schirmer, J. (1990). Essentials of modern hospital safety. Chelsea, MI: Lewis Publishers, Inc.

City of Houston. (2003). Houston Fire Department standards and codes. Retrieved June 6, 2010, from http://library.municode.com/

Clean Air Act (U.S. Environmental Protection Agency), 42 U.S.C. § 7401 (1990).

Clean Water Act (U.S. Environmental Protection Agency), 33 U.S.C. § 1251 (1990).

County of San Mateo, California. (2008). Safety hazard inspection guide. Retrieved April 28, 2009, from http://www.co.sanmateo.ca.us/Attachments/HR/Files/Risk%20Management/Safety/Safety%20Hazard%20Inspection%20Checklist_2008.doc

Emery, R., & Savely, S.M. (1997). Soliciting employee concerns during routine safety inspections. Professional Safety, 42(7), 36-38.

Gershon, R.R.M., Karkashian, C.D., Grosch, J.W., Murphy, L.R., Escamilla-Cejudo, A., Flanagan, P.A., Bernacki, E., Kasting, C., & Martin, L. (2000). Hospital safety climate and its relationship with safe work practices and workplace exposure incidents. American Journal of Infection Control, 28(3), 211-221.

Gershon, R.R.M., Vlahov, D., Felknor, S.A., Vesley, D., Johnson, P.C., Delclos, G.L., & Murphy, L.R. (1995). Compliance with universal precautions among health care workers at three regional hospitals. American Journal of Infection Control, 23(4), 225-236.

Hazard Communication (Occupational Safety & Health Administration), 29 C.F.R. 1910.1200 (1996).

Hazardous Waste Operations and Emergency Response (Occupational Safety & Health Administration), 29 C.F.R. 1910.120 (2006).

International Code Council. (2009). International fire code. Washington, DC: ICC Publishing.

The Joint Commission. (2009a). Comprehensive accreditation manual for ambulatory care. Oakbrook Terrace, IL: Joint Commission Resources.

The Joint Commission. (2009b). A journey through the history of The Joint Commission. Retrieved April 23, 2010, from http://www.jointcommission.org/AboutUs/joint_commission_history.htm

Kavaler, F., & Spiegel, A.D. (1997). Risk management in health care institutions. Sudbury, MA: Jones and Bartlett Publishers.

Monagle, J.F. (1985). Risk management: A guide for health care professionals. Rockville, MD: Aspen Systems Corporation.

National Fire Protection Association. (2009). NFPA 101: Life safety code. Quincy, MA: Author.

National Incident Management System (U.S. Department of Homeland Security). (2003). Retrieved June 6, 2010, from http://www.dhs.gov/xlibrary/assets/NIMS-90-web.pdf

North Carolina State University. (2009). NC State University supervisor's safety self assessment checklist. Retrieved April 28, 2009, from http://www.ncsu.edu/ehs/www99/right/super/NCSUSafetySelfAssesment.pdf

Occupational Exposure to Hazardous Chemicals in Laboratories (Occupational Safety & Health Administration), 29 C.F.R. 1910.1450 (2006).

Resource Conservation and Recovery Act (U.S. Environmental Protection Agency), 42 U.S.C. § 6901 (1996).

Safe Medical Devices Act (Food and Drug Administration), 21 C.F.R. § 803 (1990).

San Mateo County Health Services Emergency Medical Services Agency. (1998). Hospital emergency incident command system. Retrieved June 1, 2010, from http://www.heics.com/HEICS98a.pdf

Springhouse Corporation. (1998). Healthcare professional guides: Safety and infection control. Springhouse, PA: Author.

Stanford School of Medicine. (2009). Fire safety checklist. Retrieved April 28, 2009, from http://med.stanford.edu/somsafety/inspections/firesafetyprep.html

Stanford University Environmental Health and Safety. (2009). General office inspection checklist. Retrieved April 28, 2009, from http://www.stanford.edu/dept/EHS/prod/training/checklist/gencheck.pdf

Tai, E. (2001). OSHA compliance management: A guide for long-term health care facilities. Boca Raton, FL: CRC Press, LLC.

Texas Regulations for Control of Radiation (Texas Administrative Code), Title 25 Part 1 Chapter 289 (1999).

Tulane University Office of Environmental Health & Safety. (2009). Tulane University satellite clinic quarterly inspection form. Retrieved April 28, 2009, from http://www.som.tulane.edu/oehs/docs/satelliteInspFrm.pdf

U.S. Department of Agriculture Animal and Plant Health Inspection Service. (1995). APHIS safety inspection checklist (general safety). Retrieved April 28, 2009, from http://www.aphis.usda.gov/library/forms/pdf/aphis256_1.pdf

U.S. Department of Health and Human Services. (1988). Guidelines for protecting the safety and health of health care workers (DHHS publication #88-119). Washington, DC: Government Printing Office.

U.S. Department of Health and Human Services, Centers for Disease Control & Prevention, and National Institutes of Health. (2007). Biosafety in microbiological and biomedical laboratories. Washington, DC: U.S. Government Printing Office.

University of North Carolina, Asheville. (2009). UNCA safety inspection checklist--Clinic environment. Retrieved April 28, 2009, from http://www.unca.edu/fac_mgmt/safety/ClinicChecklist.html

Wisconsin Automobile and Truck Dealers Association. (1998). WATDA's OSHA self-inspection checklist. Retrieved April 28, 2009, from http://www.watda.org/legal/Miscellaneous/OSHA%20Checklist.pdf

Corresponding Author: Susanne M. Savely, Assistant Professor, Environmental Health Section of the Chronic Disease Prevention and Control Research Center, Department of Medicine; Radiation Safety Officer and Manager, Safety Compliance & Training/Radiation Safety, Office of Environmental Safety, Baylor College of Medicine, One Baylor Plaza, MS: BCM 175, Houston, TX 77030-3411. E-mail: savely@bcm.edu.
TABLE 1
Guidelines and Regulations Used to Create Clinic Inspection Checklist

* San Mateo County Health Services Hospital Emergency Incident Command System Guidebook
* National Incident Management System
* City of Houston Fire Standards & Codes
* Hazardous Waste Operations and Emergency Response
* International Fire Code
* The Joint Commission Ambulatory Care and Hospital Standards
* NFPA 101 Life Safety Code
* OSHA "Lab Standard" Occupational Exposure to Hazardous Chemicals in Laboratories
* OSHA Bloodborne Pathogens Standard
* OSHA Hazard Communication Standard
* Texas Regulations for Control of Radiation
* Biosafety in Microbiological and Biomedical Laboratories
* U.S. Environmental Protection Agency Clean Air Act
* U.S. Environmental Protection Agency Clean Water Act
* U.S. Environmental Protection Agency Resource Conservation and Recovery Act
* U.S. Food and Drug Administration Safe Medical Devices Act of 1990

TABLE 2
Clinic Environmental Safety Inspection Checklist *

1. Aware of how to access the Infection Control Manual? (Infection control)
2. Aware of how to triage patients with communicable diseases? (Infection control)
3. Aware of what their on-the-job risks are? (General safety)
4. Aware of who is the institution's Chief Safety Officer? (Emergency preparedness)
5. Aware of how to report a patient or employee injury? (Emergency preparedness)
6. Aware of how to locate the safety policies and procedures? (Emergency preparedness)
7. Aware of what to do if they witness a crime? (Emergency preparedness)
8. Aware of what to do if they feel a potential threat of violent behavior by an employee, visitor, or patient? (Emergency preparedness)
9. Aware of what to do in the event of a bomb threat? (Emergency preparedness)
10. Aware of what to do if emergency power does not come on? (Emergency preparedness)
11. Aware of how long it takes for emergency power to come on in the event of an outage? (Emergency preparedness)
12. Aware of where emergency lighting and outlets are located? (Emergency preparedness)
13. Aware of how the clinic handles equipment and product recalls? (Patient safety)
14. Aware of the Safe Medical Devices Act of 1990? (Emergency preparedness)
15. Aware of who manages the biohazard waste manifests and where they are kept on file? (Waste/infection control)
16. Aware of how to dispose of biohazard waste and sharps? (Waste/infection control)
17. Aware of where spill kits are located? (Emergency preparedness)
18. Aware of how to clean up a chemical spill? (Emergency preparedness)
19. Aware of how to clean up a blood spill? (Emergency preparedness)
20. Aware of where the chemical inventory is located? (Emergency preparedness)
21. Aware of what Material Safety Data Sheets are, what the acronym MSDS means, and where they are located in their clinic? (Emergency preparedness)
22. Aware of the procedure to follow if they have a needle stick or an exposure to an infectious disease? (Emergency preparedness)
23. Aware of whether or not their department has an emergency plan? (Emergency preparedness)
24. Aware of whether or not the building has an emergency guide? (Emergency preparedness)
25. Aware that the institution has an Emergency/Disaster Management Plan? (Emergency preparedness)
26. Aware of the details of the various emergency plans in place for their clinic? (Emergency preparedness)
27. Aware of what number to call to get an update regarding the status of the institution during emergency situations? (Emergency preparedness)
28. Aware of what their role and their department's role is in an emergency? (Emergency preparedness)
29. Aware of where to move themselves and patients to in the event an evacuation is ordered? (Emergency preparedness)
30. Aware of where the nearest fire alarm pull station is located? (Emergency preparedness)
31. Aware of where the nearest fire extinguisher is located? (Emergency preparedness)
32. Aware of what their role is during a fire alarm? (Emergency preparedness)
33. Aware of how many fire drills are conducted in their clinic per year? (Emergency preparedness)
34. Aware of the emergency phone numbers to report a fire? (Emergency preparedness)
35. Aware of where to evacuate patients if there is a fire? (Emergency preparedness)
36. Aware of what the safety acronym RACE stands for? (Emergency preparedness)
37. Fire extinguishers have been inspected within the last year and are properly pressurized. (Emergency preparedness)
38. Fire extinguishers are not obstructed. (Emergency preparedness)
39. Materials are stored 18 inches from the ceiling and away from sprinkler heads. (Emergency preparedness)
40. Corridors and exits are not obstructed. (Emergency preparedness)
41. Smoke and fire doors are kept closed. (Emergency preparedness)
42. Smoke and fire doors open and close properly. (Emergency preparedness)
43. Compressed gas cylinders are properly secured. (Physical safety)
44. Exit signs are illuminated, visible, and directional. (Emergency preparedness)
45. Doors to hazardous areas, such as soiled linen rooms, are kept closed. (Waste/infection control)
46. Electrical cords, plugs, and outlets appear to be in good working condition. (Fire safety)
47. Clinic performs environmental safety self-inspections. (General safety)
48. Daily crash cart inspections are performed and documented. (Emergency preparedness)
49. Crash cart medications are kept locked. (Emergency preparedness)
50. Crash carts are dust and clutter free. (Emergency preparedness)
51. Crash cart supplies are checked daily and restocked as needed. (Emergency preparedness)
52. Crash cart automated external defibrillators are kept charged and are plugged into emergency power. (Emergency preparedness)
53. Medical waste boxes are available. (Waste/infection control)
54. Medical waste boxes are being used appropriately. (Waste/infection control)
55. Medical waste boxes are closed when ready for disposal. (Waste/infection control)
56. Medical waste is kept closed when not in use. (Waste/infection control)
57. Medical waste and sharps containers are not overfilled. (Waste/infection control)
58. Soiled linen is appropriately bagged and stored. (Waste/infection control)
59. Appropriate waste receptacles are used and regular trash and medical waste are separated. (Waste/infection control)
60. All food kept on hand for patients is labeled and dated. (Patient safety)
61. Refrigerators and freezers are clean and defrosted. (Patient safety)
62. Refrigerator and freezer temperatures are checked daily and documented. (Patient safety)
63. Ice machine disinfections are occurring and have been documented. (Infection control)
64. Specimen/tissue/blood refrigerator temperatures are checked daily and documented. (Quality control)
65. Medication preparation area is clean and organized. (Medication management)
66. Dirty equipment and supplies are stored in an area separate from sterile and clean supplies. (Infection control)
67. External shipping cartons are removed from storage areas to protect clean and sterile patient supplies. (Infection control)
68. Under-sink storage is free of patient care items and hazardous chemicals. (Infection control)
69. Clean linen is stored in a separate area from dirty linen and kept covered. (Infection control)
70. Clean and sterile supplies on open shelves are stored at least 8"-10" above the floor and 18" from the ceiling. (Infection control)
71. Hand washing facilities are available. (Infection control)
72. Appropriate environmental cleaning agents are being used for cleaning between patients. (Infection control)
73. Environmental cleaning is being performed on a regularly scheduled basis. (Infection control)
74. Horizontal surfaces are free of dust. (Infection control)
75. Ventilation grills are free of dust. (Infection control)
76. Floors and walls are clean. (Infection control)
77. Ceiling tiles are in place and in good condition. (Infection control)
78. Caulking is clean and in appropriate places. (Infection control)
79. No staff or physician food or drink at the nurse's station or in patient care areas. (Infection control)
80. Removal of gloves and hand washing between patients was observed. (Infection control)
81. Appropriate personal protective equipment is being used. (Infection control)
82. Staff is observed properly handling and segregating medical waste. (Infection control)
83. Aseptic technique is used during procedures. (Infection control)
84. Sterile supplies and medications have not expired. (Infection control)
85. Appropriate hand scrubs or disinfectants are available in treatment and procedure rooms. (Infection control)
86. Personal protective equipment is available and in use for cleaning and disinfection. (Infection control)
87. Workflow is acceptable with adequate space and ventilation. (Infection control)
88. Quality control monitoring of autoclaves is documented in log books. (Infection control)
89. Autoclaves are properly maintained with documentation. (Infection control)
90. Transport of soiled instruments, equipment, and linens is handled appropriately. (Infection control)

* Possible answers: yes = desired answer; no = undesired answer; not applicable (N/A) = does not apply to the clinic. Each question's category is given in parentheses.

TABLE 3
Summary of Answers by Clinic Site *

Onsite 1: N/A 2 (2.2%), No 31 (34.4%), Yes 57 (63.3%)
Onsite 2: N/A 17 (18.9%), No 22 (24.4%), Yes 51 (56.7%)
Onsite 3: N/A 8 (8.9%), No 23 (25.6%), Yes 59 (65.6%)
Onsite 4: N/A 16 (17.8%), No 32 (35.6%), Yes 42 (46.7%)
Onsite 5: N/A 7 (7.8%), No 25 (27.8%), Yes 58 (64.4%)
Onsite 6: N/A 4 (4.4%), No 26 (28.9%), Yes 60 (66.7%)
Onsite 7: N/A 3 (3.3%), No 38 (42.2%), Yes 49 (54.4%)
Onsite 8: N/A 4 (4.4%), No 35 (38.9%), Yes 51 (56.7%)
Offsite 1: N/A 18 (20.0%), No 14 (15.6%), Yes 58 (64.4%)
Offsite 2: N/A 35 (38.9%), No 13 (14.4%), Yes 42 (46.7%)
Offsite 3: N/A 5 (5.6%), No 29 (32.2%), Yes 56 (62.2%)
Offsite 4: N/A 9 (10.0%), No 16 (17.8%), Yes 65 (72.2%)
Offsite 5: N/A 27 (30.0%), No 16 (17.8%), Yes 47 (52.2%)
Offsite 6: N/A 10 (11.1%), No 19 (21.1%), Yes 61 (67.8%)
Offsite 7: N/A 13 (14.4%), No 21 (23.3%), Yes 56 (62.2%)
Offsite 8: N/A 3 (3.3%), No 23 (25.6%), Yes 64 (71.1%)
Offsite 9: N/A 4 (4.4%), No 29 (32.2%), Yes 57 (63.3%)
Offsite 10: N/A 4 (4.4%), No 24 (26.7%), Yes 62 (68.9%)
Offsite 11: N/A 16 (17.8%), No 14 (15.6%), Yes 60 (66.7%)
Offsite 12: N/A 10 (11.1%), No 27 (30.0%), Yes 53 (58.9%)

* Values are number of questions answered (percentage of the 90 questions). N/A = not applicable to that clinic; no = undesired answer; yes = desired answer.

TABLE 4
Nine Questions Answered Yes by Fewer Than 20% of All 20 Clinics

4. Aware of who is the institution's Chief Safety Officer? (Emergency preparedness)
13. Aware of how the clinic handles equipment and product recalls? (Patient safety)
14. Aware of the Safe Medical Devices Act of 1990? (Emergency preparedness)
17. Aware of where spill kits are located? (Emergency preparedness)
20. Aware of where the chemical inventory is located? (Emergency preparedness)
23. Aware of whether or not their department has an emergency plan? (Emergency preparedness)
62. Refrigerator and freezer temperatures are checked daily and documented. (Patient safety)
64. Specimen/tissue/blood refrigerator temperatures are checked daily and documented. (Quality control)
67. External shipping cartons are removed from storage areas to protect clean and sterile patient supplies. (Infection control)

TABLE 5
Twelve Questions That Were Answered Yes by All 20 Clinics

32. Aware of what their role is during a fire alarm? (Emergency preparedness)
38. Fire extinguishers are not obstructed. (Emergency preparedness)
40. Corridors and exits are not obstructed. (Emergency preparedness)
41. Smoke and fire doors are kept closed. (Emergency preparedness)
42. Smoke and fire doors open and close properly. (Emergency preparedness)
46. Electrical cords, plugs, and outlets appear to be in good working condition. (Fire safety)
74. Horizontal surfaces are free of dust. (Infection control)
75. Ventilation grills are free of dust. (Infection control)
76. Floors and walls are clean. (Infection control)
77. Ceiling tiles are in place and in good condition. (Infection control)
78. Caulking is clean and in appropriate places. (Infection control)
86. Personal protective equipment is available and in use for cleaning and disinfection. (Infection control)