REVIEWS & ANALYSES

The Role of the Electronic Health Record in Patient Safety Events

Erin Sparnon, MEng, Senior Patient Safety Analyst, Pennsylvania Patient Safety Authority
William M. Marella, MBA, Program Director, Pennsylvania Patient Safety Authority

Pennsylvania Patient Safety Advisory, Vol. 9, No. 4 (December 2012). ©2012 Pennsylvania Patient Safety Authority

ABSTRACT
As adoption of health information technology solutions like electronic health records (EHRs) has increased across the United States, increasing attention is being paid to the safety and risk profile of these technologies. However, several groups have called out a lack of available safety data as a major challenge to assessing EHR safety, and this study was performed to inform the field about the types of EHR-related errors and problems reported to the Pennsylvania Patient Safety Authority and to serve as a basis for further study. Authority analysts queried the Pennsylvania Patient Safety Reporting System for reports related to EHR technologies and performed an exploratory analysis of 3,099 reports using a previously published classification structure specific to health information technology. The majority of EHR-related reports involved errors in human data entry, such as entry of "wrong" data or the failure to enter data, and a few reports indicated technical failures on the part of the EHR system. This may reflect the clinical mindset of frontline caregivers who report events to the Authority. (Pa Patient Saf Advis 2012 Dec;9[4]:113-21.)

INTRODUCTION
Adoption of electronic medical records (EMRs) and electronic health records (EHRs)* in US healthcare facilities is growing: HIMSS Analytics reports that, as of the second quarter of 2012, over three-quarters of US healthcare facilities had achieved at least stage 3 of its seven-stage EMR Adoption Model.1 Stage 3 reflects a facility having the cumulative capabilities for electronic flowcharts, error checking, and picture archiving and communication systems (PACS) available outside of the radiology department.1

However, as adoption grows, so does concern over the potential safety implications of these systems. The recently released Institute of Medicine report Health IT and Patient Safety: Building Safer Systems for Better Care2 noted a lack of hazard and risk reporting data on health information technology (HIT) as a hindering factor in building safer systems. In response to this need for information on the scope and extent of EHR risks posed by today's implemented systems, Pennsylvania Patient Safety Authority analysts identified EHR events in the Authority's Pennsylvania Patient Safety Reporting System (PA-PSRS).

* For the purposes of this article, the term "EHR" is used to denote a family of technologies that includes electronic medical records and electronic medication administration records, except in instances in which "EHR" constitutes a search or manufacturer-specific term.

METHODS
Authority analysts queried the PA-PSRS database on May 23, 2012, using the keywords "emr," "ehr," "adt," "electronic med," "electronic health," "information system," "drop-down," "default," "selection," "mouse," "no record," and "link," in conjunction with EHR supplier and system names. The query returned 8,003 reports from June 2, 2004, through May 18, 2012. Analysts noted that the search query returned some types of reports in which the term "EHR" was either incidental or EHR involvement could not be confirmed, such as the following:

— An event (e.g., a fall) that was reported in the EHR but for which no EHR systems were involved in or contributed to the event
— Manual errors that were committed outside EHR systems, such as pulling the wrong medication from a cabinet or applying the wrong label to a specimen
— Reports that indicated the use of a paper-based chart or did not specify whether an electronic system was involved

A random sample of approximately 20% of these event reports was created by assigning each of the 8,003 queried reports a random number between 0 and 1 and reviewing those reports with a randomly assigned number between 0 and 0.2. This random sample was manually reviewed by one analyst with a background in clinical and biomedical engineering to classify the events as relevant or not relevant to the topic of patient safety events involving the EHR; 933 (59.5%) of the 1,567 manually reviewed reports were identified as relevant.

With the intent of reducing manual review of irrelevant reports, the data set of manually reviewed event reports (n = 1,567) was divided into training and validation data sets for a machine-learning model. The objective of the model was to estimate the probability of relevance of unlabeled cases using an algorithm trained on manually labeled cases. The training data set contained 70% (n = 1,097) of the manually reviewed reports, while 30% (n = 470) of reports were used in 10-fold cross-validation with stratified sampling. The best-performing model, using a Naïve Bayes kernel classifier, achieved an area under the receiver operating characteristic (ROC) curve of 0.927 ± 0.023 after dropping uncertain predictions (i.e., those with less than 90% confidence). This model was then applied to the remaining 6,436 queried reports that had not been manually classified. The machine-learning tool identified 2,500 of 6,436 reports as relevant.
These 2,500 reports were then manually screened to confirm relevance, and analysts deemed 2,166 of these reports (87%) as relevant to EHRs. In total, 3,099 reports were confirmed as relevant to EHRs (933 from the initial random sample and 2,166 from the machine-learning sample), and these reports were subjected to further analysis. Analysts noted that EHR-related reports are increasing over time, which was to be expected as adoption of EHRs is growing in the United States overall (see Figure 1).

Figure 1. Reports Related to Electronic Health Records (June 2004 through May 2012). [Bar chart showing the number of reports per year, 2004 through May 2012, divided into manually identified and machine-learning-identified reports.]

RESULTS

Classification by Harm Score
Reported events were categorized by their reporter-selected harm score (see Table 1). Of the 3,099 EHR-related events, 2,763 (89%) were reported as "event, no harm" (e.g., an error did occur, but there was no adverse outcome for the patient), and 320 (10%) were reported as "unsafe conditions," which did not result in a harmful event. Fifteen reports involved temporary harm to the patient due to the following: entering wrong medication data (n = 6), administering the wrong medication (n = 3), ignoring a documented allergy (n = 2), failure to enter lab tests (n = 2), and failure to document (n = 2). Only one event report, related to a failure to properly document an allergy, involved significant harm:

    Patient with documented allergy to penicillin received ampicillin and went into shock, possible [sic] due to anaphylaxis. Allergy written on some order sheets and "soft" coded into Meditech but never linked to pharmacy drug dictionary.

Although the vast majority of EHR-related reports did not document actual harm to the patient, analysts believe that further study of EHR-related near misses and close calls is warranted as a proactive measure.

Table 1. Classification of Reports Related to Electronic Health Records, by Harm Score

HARM SCORE* | MACHINE-LEARNING REPORTS | MANUALLY IDENTIFIED REPORTS | TOTAL REPORTS | % OF TOTAL REPORTS
Incident: Unsafe Conditions (harm score A) | 204 | 116 | 320 | 10
Incident: No Harm (harm scores B1 through D) | 1,952 | 811 | 2,763 | 89
Serious Event: Temporary Harm (harm scores E through F) | 10 | 5 | 15 | 0
Serious Event: Significant Harm (harm scores G through I) | 0 | 1 | 1 | 0

* As classified in the Pennsylvania Patient Safety Reporting System database.

Classification by Event Type
EHR-related reports represented many event types in the Authority's classification system (see Table 2); however, the vast majority of reported events (81%) involved medication errors, mostly wrong-drug, -dose, -time, -patient, or -route errors (50%) or omitted dose (10%). The only other event type with a significant number of reports was errors related to procedures, treatments, or tests (13%), most of which involved lab test errors (7%). Analysts attributed this distribution of event types to the wide-reaching nature of potential EHR-related problems. EHR systems are used for the ordering, validation, and administration of medications, laboratory tests, and diagnostic and therapeutic procedures. Therefore, it is not surprising that reported errors related to EHR use are associated with these event types.

Table 2. Classification of Reports Related to Electronic Health Records, by Event Type

EVENT TYPE | MACHINE-LEARNING REPORTS | MANUALLY IDENTIFIED REPORTS | TOTAL REPORTS | % OF TOTAL REPORTS
A. Medication Error | 1,964 | 552 | 2,516 | 81
  1. Dose omission | 257 | 86 | 343 | 11
  2. Extra dose | 125 | 29 | 154 | 5
  3. Wrong | 1,226 | 321 | 1,547 | 50
    a. Dose/overdosage | 755 | 181 | 936 | 30
    b. Dose/underdosage | 62 | 17 | 79 | 3
    c. Drug | 91 | 30 | 121 | 4
    d. Dosage form | 24 | 9 | 33 | 1
    e. Duration | 19 | 6 | 25 | 1
    f. Rate (intravenous) | 14 | 5 | 19 | 1
    g. Route | 20 | 6 | 26 | 1
    h. Strength/concentration | 17 | 3 | 20 | 1
    i. Technique | 8 | 3 | 11 | 0
    j. Time | 72 | 27 | 99 | 3
    k. Patient | 144 | 34 | 178 | 6
  4. Prescription/refill delayed | 15 | 9 | 24 | 1
  5. Medication list incorrect | 58 | 15 | 73 | 2
  6. Monitoring error (includes contraindicated drugs) | 27 | 13 | 40 | 1
  7. Unauthorized drug | 31 | 7 | 38 | 1
  8. Inadequate pain management | 0 | 1 | 1 | 0
  9. Other (specify) | 225 | 71 | 296 | 10
C. Equipment/Supplies/Devices | 1 | 6 | 7 | 0
  3. Equipment not available | 0 | 1 | 1 | 0
  4. Equipment malfunction | 1 | 4 | 5 | 0
  13. Other (specify) | 0 | 1 | 1 | 0
E. Error Related to Procedure/Treatment/Test | 123 | 292 | 415 | 13
  1. Surgery/invasive procedure problem | 1 | 2 | 3 | 0
  2. Laboratory test problem | 66 | 165 | 231 | 7
  3. Radiology/imaging test problem | 12 | 31 | 43 | 1
  4. Referral/consult problem | 7 | 16 | 23 | 1
  5. Respiratory care | 9 | 5 | 14 | 0
  6. Dietary | 1 | 5 | 6 | 0
  7. Other (specify) | 27 | 68 | 95 | 3
F. Complication of Procedure/Treatment/Test | 0 | 7 | 7 | 0
  2. Anesthesia event | 0 | 1 | 1 | 0
  10. Catheter or tube problem | 0 | 1 | 1 | 0
  13. Other (specify) | 0 | 5 | 5 | 0
G. Transfusion | 6 | 9 | 15 | 0
  2. Event related to blood-product administration | 1 | 4 | 5 | 0
  3. Event related to blood-product dispensing or distribution | 1 | 0 | 1 | 0
  8. Wrong patient requested | 1 | 1 | 2 | 0
  13. Other (specify) | 3 | 4 | 7 | 0
I. Other/Miscellaneous | 72 | 67 | 139 | 4
  1. Inappropriate discharge | 0 | 1 | 1 | 0
  5. Other (specify) | 72 | 66 | 138 | 4
Total | 2,166 | 933 | 3,099 | 98*

* Data in this table represents 100% of the reports, but the total percentage listed is less than 100% due to rounding of individual categories.

Relevant cases were further classified by the same analyst according to an HIT-specific taxonomy developed by Magrabi et al.3 This taxonomy includes classifications for problems with data input, transfer, output, general technical issues, and contributing factors (see Figure 2). Analysts considered applying the HIT taxonomy contained in the new Agency for Healthcare Research and Quality (AHRQ) Common Formats for risk reporting; however, insufficient detail was present in the narrative reports to properly apply this taxonomy.

Figure 2. Magrabi et al. Classification of Reports Related to Health Information Technology. [Diagram of the revised classification for health information technology problems; new categories for software problems are underlined.] Reproduced with permission from BMJ Publishing Group Ltd. from Magrabi F, Ong MS, Runciman W, et al. Using FDA reports to inform a classification for health information technology safety problems. J Am Med Inform Assoc 2012 Jan-Feb;19(1):45-53.

Analysts identified four new categories, expanding the Magrabi et al. classification to include specific problems with unit errors in wrong data entry (1.2.1.1), data entered into wrong fields (1.2.1.2), misreading or misinterpreting displayed information (3.4.5), and default values in system configurations (4.4.2.1).

Some reports were tagged with more than one problem type, such as in the following example:

    Patient was ordered albuterol 0.5 mL Q4H [every four hours] and ipratropium 2 mL Q4H nebulized breathing treatments at 8:00 a.m. into ProTouch system. The order was acknowledged by nursing, but nursing did not notify RT [the respiratory therapy department] of new orders. RT did not become aware of orders until eight hours later. Due to limitations of ProTouch, RT cannot acknowledge respiratory orders; therapist on duty was unaware of the new orders until overdue order report run at end of shift (two doses of each medication missed by that time). Patient did not experience any adverse effects from delay in respiratory therapy treatment; patient's respirations were unlabored.

This report was tagged with:
— 3.4.4, not alerted, because the system was not set up to alert respiratory therapists
— 4.4.1, software issue—functionality, because the system does not allow alerting of respiratory therapists

An additional example is as follows:

    A pharmacist entered correct day start time (9/10) for Lovenox®, but interface between pharmacy system and Bridge [administration system] caused the order to default to next day start time. The nurse signed off order without confirming correct order entry and did not "Add Dose" in Bridge to correct start time; patient missed one dose.

This report was tagged with:
— 2.2, system interface issues, because the interface between the pharmacy and Bridge systems changed the order settings
— 3.3, output/display error, because the Bridge system output an incorrect start time
— 3.4.2, missing data (did not look at complete record), because the nurse did not confirm correct order entry
— 4.4.2.1, software issue—system configuration—default, because the Bridge system was configured to change to a default start time

Another report read:

    Acetate component was not ordered under the component section but was ordered in the administration instructions, which is a free-text field that does not link with the TPN [total parenteral nutrition] additives and was missed by pharmacy upon verification and transcription into the TPN program. Acetate should have been ordered as meq/kg and not acetate 50:50, which was in the administration instructions.

This report was tagged with:
— 1.2.1.2, wrong input—wrong field, because the component order was placed in the wrong field
— 3.4.2, missing data (did not look at complete record), because the pharmacist did not pull information from the administration instructions field

Overall, 96% of the reports were tagged with only one or two tags (see Table 3), and 3,946 problems were identified in the 3,099 relevant reports.

COMPARISON WITH OTHER DATA SETS
In general, narrative reports from the Authority database exhibited a very different pattern of problem types than the two sets of data tagged by Magrabi et al. (the US Food and Drug Administration's [FDA] Manufacturer and User Facility Device Experience [MAUDE] database, in which there were 712 problems from 432 reports, and Australia's Advanced Incident Management System, in which there were 117 problems). Analysts noted that the most commonly used tags for reports to the Authority were related to wrong input (applied to 47% of reports), failure to update data (18%), or default system configuration (10%).3 Many of the classifications developed by Magrabi et al.—especially those that focused on failures of the network, hardware, or software—applied to few or no reports. (See Table 4.)
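The per-report tagging described above (one or more Magrabi-style problem codes attached to each narrative report, with 96% of reports carrying one or two tags; see Table 3) lends itself to a simple tally. The sketch below is hypothetical: the report identifiers and tag assignments are invented, and only the taxonomy codes come from the article.

```python
# Hypothetical tally of Magrabi-style problem tags per report, of the kind
# that could produce the tags-per-report distribution in Table 3.
from collections import Counter

def tags_per_report_distribution(tagged_reports):
    """Map number-of-tags -> count of reports carrying that many tags."""
    return Counter(len(tags) for tags in tagged_reports.values())

# Invented report IDs; tag codes follow the extended taxonomy used in the
# article (e.g., 4.4.2.1 = software issue, system configuration default).
reports = {
    "rpt-001": ["1.2.1", "1.2.1.1"],                # wrong input + units error
    "rpt-002": ["2.2", "3.3", "3.4.2", "4.4.2.1"],  # interface/default mix-up
    "rpt-003": ["3.4.4", "4.4.1"],                  # not alerted + functionality
    "rpt-004": ["1.2.3"],                           # failure to update data
}
dist = tags_per_report_distribution(reports)
total_problems = sum(len(tags) for tags in reports.values())
```

Counting problems rather than reports is what makes the article's two denominators differ (3,099 reports versus 3,946 problems): a report with four tags contributes four problems to the problem-level totals.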
Wrong Input
Problems related to wrong input (n = 1,867) spanned a wide range of event types and outcomes: transposition or transcription errors in the entry of orders or administration information, entry of incorrect patient parameters (like weight or blood glucose) that trigger calculations of incorrect therapy, and even entry of the wrong physician name, resulting in reports being sent to the wrong recipient. Authority analysts identified two new categories to describe specific types of wrong-input problems that deserved more attention: 1.2.1.1 wrong input—units error (n = 18) and 1.2.1.2 wrong input—wrong fields (n = 65). Reports tagged with "units error" typically involved mix-ups between patient weight units (lb versus kg) or selection or entry of an incorrect dosing unit for a medication (e.g., weight-based dosing like mg/kg/hr versus non-weight-based dosing like mg/hr), and analysts noted that default values contained in EHR systems were mentioned as contributing factors in three of these reports. Reports tagged with "wrong fields" typically indicated unfamiliarity with the configuration or function of a facility's EHR system. Users were entering data in a field that was inappropriate for the intended data, as in the following example:

    A patient received two extra doses of oral magnesium oxide 400 mg. Order originally placed by physician for [magnesium] oxide 400 mg [twice a day] for two days or four doses. Physician did not place stop date into ProTouch as per proper procedure but instead wrote instructions in the free-text box of ProTouch. When the order was verified by the pharmacist, instructions in the text box [were] not acknowledged. When the nursing staff administered the medication, written instructions [were] not acknowledged. Event [was] discovered by pharmacist after the patient had received six doses of medication.

Table 3. Number of Tags Assigned per Report

NO. OF TAGS PER REPORT | MACHINE-LEARNING REPORTS | MANUALLY IDENTIFIED REPORTS | TOTAL REPORTS (N = 3,099) | % OF TOTAL REPORTS
1 | 1,728 | 616 | 2,344 | 76
2 | 364 | 250 | 614 | 20
3 | 59 | 61 | 120 | 4
4 | 12 | 5 | 17 | 1
5 | 3 | 1 | 4 | 0

Default Values
This classification was created when analysts noted that a large proportion of system configuration issues mentioned errors due to default values. Like wrong-value problems, default-value problems spanned a wide range of event types and outcomes, but reports generally fell into one of two categories: (1) a user failed to modify a prepopulated default value for dose, time, route, or other parameters in an order, or (2) after entry of an order, a system replaced entered information with default values, often for start times. After correspondence with Magrabi,4 the first type of default-value reports ("user failure to modify a default," n = 70) were removed and retagged as 1.2.3 failure to update data, and the second type ("system inserts a default after human entry," n = 221) were tagged with a new code, 4.4.2.1 software issue—system configuration—default.

Failure to Update Data
Problems related to failure to update data (n = 762) largely involved four event types: (1) users failing to transcribe written or verbal orders into an electronic order or pharmacy system, (2) users failing to enter lab results into an information system, (3) users failing to modify a default value to an intended value (as described in the discussion regarding default values), or (4) users reporting that they did not properly document a clinical activity like removing a medication from stock or administering a therapy. Analysts noted that many failure-to-document events involved situations in which documentation was completed in a paper system but not an electronic system (n = 85). By attempting to use both paper and electronic systems in the course of workflow, users created confusing and conflicting situations in which patient care was compromised, such as in the following case:

    A patient was admitted to [the emergency department] with [a urinary tract infection]. A physician prescribed ciprofloxacin 500 mg [by mouth, once]. Patient had been in the [emergency department] for a while, and the previous nurse had administered the dose without documenting it on the physician's order sheet. The next nurse also administered the dose because she did not see it documented. When she went into the EHR, she saw that the previous nurse had documented [the initial administration] in the computer. She called the nurse to double-check that the [medication] had been given. The physician was notified about the double dose.

DISCUSSION
The pattern of reported problems present in the PA-PSRS database was different than that found by Magrabi et al. in FDA's MAUDE database. Analysts attribute this difference in problem patterns to (1) differences in both the databases themselves and the people who populate them and (2) limitations of the existing PA-PSRS data set.

PA-PSRS and MAUDE differ in scope and reporting requirements. The MAUDE database is populated by mandatory and voluntary reports of device failures and device-related errors. Currently, devices and systems like radiology information systems (RIS) and PACS are FDA-cleared medical devices with mandated reporting requirements, while EHR systems, laboratory information systems, and computerized provider order entry (CPOE) and pharmacy (PhIS) systems are not. Therefore, the MAUDE database is likely to contain more reported events related to PACS than CPOE. The query string used for this analysis also differs from the string used by Magrabi et al.; it specifically targeted EHR- and EMR-related events and did not include terms related to RIS or other more broadly defined HIT technologies.

PA-PSRS and MAUDE also differ in the background of reporting individuals. The MAUDE database is typically populated by biomedical and clinical engineers employed by facilities and manufacturers, while the PA-PSRS database is typically populated by risk management professionals who are collecting clinical narrative event reports. Both reporting systems receive reports that are framed by the reporter's experience. Frontline caregivers will likely recognize if they have failed to perform a duty or have entered incorrect information, but they will rarely have enough information to suspect a problem with device components or network components. Therefore, most frontline caregiver reports of system availability errors may indicate that the "computer system was down" (tag 4.1), even if the underlying cause is a device interface issue, a network configuration problem, or an access problem.
Perhaps because of these limitations, few of the contributing factors identified by Magrabi et al. could be applied to queried reports. Although analysts may suspect that EHR-related errors could stem from inadequate training, interruption, or multitasking, analysts could not apply these tags unless the narrative specifically identified these problems.

Table 4. Application of Magrabi et al.* Taxonomy to Queried Reports

EVENT REPORT TAGS | MACHINE-LEARNING REPORTS (NO.) | RANDOM SAMPLE OF REPORTS (NO.) | PA-PSRS† REPORTS (NO.) | PA-PSRS DATA (% OF TAGGED PROBLEMS) | MAGRABI ET AL. MAUDE‡ DATA (%) | MAGRABI ET AL. AIMS§ DATA (%)
1.1 Data capture down or unavailable | 2 | 1 | 3 | 0 | 6 | 2
1.2.1 Wrong input | 1,348 | 519 | 1,867 | 47 | 3 | 17
1.2.1.1 Wrong input—units error | 12 | 6 | 18 | 0 | ** | **
1.2.1.2 Wrong input—wrong fields | 43 | 22 | 65 | 2 | ** | **
1.2.2 Missing data | 22 | 16 | 38 | 1 | <1 | 6
1.2.3 Fail to update data | 490 | 272 | 762 | 18 | <1 | 6
1.2.4 Fail to communicate/carry out task | 9 | 13 | 22 | 1 | 0 | 0
2.1 Network down or slow | 5 | 6 | 11 | 0 | <1 | 10
2.2 System interface issues | 34 | 55 | 89 | 2 | 1 | 9
3.1 Output device down or unavailable | 107 | 59 | 166 | 4 | <1 | 4
3.2 Record unavailable | 15 | 6 | 21 | 1 | <1 | 0
3.3 Output/display error | 113 | 30 | 143 | 4 | 28 | 5
3.4.1 Wrong record retrieved | 46 | 19 | 65 | 2 | <1 | 4
3.4.2 Missing data (did not look at complete record) | 22 | 7 | 29 | 1 | 0 | 0
3.4.3 Didn't look | 15 | 11 | 26 | 1 | <1 | 4
3.4.4 Not alerted | 10 | 17 | 27 | 1 | 0 | 2
3.4.5 Misread/misinterpret | 9 | 1 | 10 | 0 | ** | **
4.1 Computer system down or too slow | 29 | 22 | 51 | 1 | 16 | 9
4.2 Software not available | 1 | 5 | 6 | 0 | 0 | <1
4.3 Unable to login | 3 | 3 | 6 | 0 | 0 | 5
4.4 Software issue | ** | ** | ** | ** | ** | 7
4.4.1 Software issue—functionality | 34 | 16 | 50 | 1 | 32 | **
4.4.2 Software issue—system configuration | 48 | 49 | 97 | 2 | 3 | **
4.4.2.1 Software issue—system configuration—default | 168 | 53 | 221 | 8 | ** | **
4.4.3 Software issue—device interface | 0 | 0 | 0 | 0 | 6 | **
4.4.4 Software issue—network configuration | 2 | 0 | 2 | 0 | <1 | **
4.5 Data loss | 33 | 28 | 61 | 2 | 2 | 2
5.1 Contributing factor—staffing/training | 30 | 24 | 54 | 1 | 0 | 2
5.2.1 Contributing factor—cognitive load—interruption | 1 | 3 | 4 | 0 | 0 | <1
5.2.2 Contributing factor—cognitive load—multitasking | 1 | 2 | 3 | 0 | 0 | <1
5.3.1 Contributing factor—fail to carry out duty—fail to log off | 1 | 3 | 4 | 0 | 0 | 3

Note: Sample sizes are as follows: machine-learning reports (2,166 reports); random sample of reports (933 reports); PA-PSRS reports (3,099 reports); PA-PSRS data (3,946 problems from 3,099 reports); Magrabi et al. MAUDE data (712 problems from 436 reports); Magrabi et al. AIMS data (117 problems).
* Magrabi F, Ong MS, Runciman W, et al. Using FDA reports to inform a classification for health information technology safety problems. J Am Med Inform Assoc 2012 Jan-Feb;19(1):45-53.
† Pennsylvania Patient Safety Authority's Pennsylvania Patient Safety Reporting System
‡ US Food and Drug Administration's Manufacturer and User Facility Device Experience database
§ Australia's Advanced Incident Management System
** Tag not used in analysis

Limitations
Specific limitations of this study may shape the nature and frequency of EHR-related events present in the PA-PSRS database query.

Reporting statutes of PA-PSRS. Pennsylvania healthcare facilities are required to report Serious Events, Incidents, and Infrastructure Failures through the Authority's PA-PSRS. However, Infrastructure Failures are accessible only by the Pennsylvania Department of Health. In many facilities, a failure of a computer, information system, or network may be classified as an Infrastructure Failure and would not appear in the Authority's data set.

Awareness of EHRs as a potential contributing factor to an error. As noted previously, frontline caregivers may not suspect that an EHR system has contributed to a human error. Events in the PA-PSRS database were not picked up by the analysts' query if they did not specifically call out a particular system or EHR in general. Therefore, if a frontline caregiver did not suspect that the configuration of an EHR somehow contributed to their choosing the wrong drug for a patient, they may have simply reported that they selected the wrong drug and not mentioned the EHR.

Query design. This study's query focused on EHR system names and usage terms. Terms related to missing, lost, or corrupted data were not specifically included in the search string, although reports of this type were identified in the study. Further study using a more focused query string could identify more reports of system errors resulting in missing, lost, or corrupted data.

Further refinement of the machine-learning tool. Analysts have not manually confirmed the remaining 3,936 queried reports that were identified as "not EHR-relevant" by the machine-learning algorithm. Therefore, events that were falsely tagged by the machine-learning algorithm as "not EHR-relevant" were excluded from this analysis. Identification of false-negative machine-learning results could allow for refinement of the machine-learning algorithm.

Limitations of narrative reporting. Limitations of narrative reporting affected both the types of reports queried and the tags applied. Unless a narrative report specifically included the search query terms, the report was not captured by this query. Unless specifically mentioned in a narrative report, a problem type or contributing factor could not be tagged.

The limitations of using narrative review to identify EHR-related reports could be alleviated through the use of an EHR- or HIT-specific event taxonomy like that used for the AHRQ Common Formats. Going forward, it may be advantageous for the Authority to include EHR- or HIT-specific options in the event type taxonomy and provide educational materials on the use of this taxonomy. This would prompt users to specify whether they believe EHR systems played a role in the reported event and would reduce the burden of manually reviewing irrelevant queried reports. As in any scientific study, adding to reporter knowledge will likely increase the quality of the reports and decrease the missed risk events, allowing the Authority a greater understanding of HIT risks.

CONCLUSIONS
Overall, 3,946 problems were identified in the 3,099 reports of EHR-related events identified through this query of the Authority's database, and several themes that may prove fruitful for further study were identified in reviewed reports, including the following:

— The types of reported human-related problems (e.g., wrong entry, wrong field, failure to update data) could have many underlying causes, which could not be captured in the current data set of narrative reports. Further study could provide more insight into the root causes of these errors, which may include issues in workflow design or policies and procedures, usability or functionality gaps in the design or configuration of an electronic system, or gaps in the training or understanding of the user population.
— Ongoing study of incident reports can help identify the common types of problems seen with EHRs. The Authority can help improve patient safety by characterizing and systematically addressing these common problem types even in the absence of root-cause data.
— EHR- and HIT-related reports that are classified by reporting facilities as Infrastructure Failures are accessible by the Pennsylvania Department of Health but not by the Authority. Because many facilities classify failures of information technology networks and systems as Infrastructure Failures, this type of report is likely to be underrepresented in the Authority's database. A query of the Infrastructure Failure reports may identify more machine- and system-related reports of EHR and HIT events.
— Adding EHR- and HIT-specific event types and taxonomy to the Authority's reporting system may increase the number and quality of event reports related to EHRs and HIT.
— Dual workflow that uses both paper-based and electronic records seems particularly problematic and may be of interest for further study as more facilities transition between paper-based and electronic systems.
— The configuration of electronic systems, especially the use of default values, seems to lead to certain types of errors in medication orders and documentation. Further study could shed some light on best practices in the use of default values in system configuration.

Acknowledgments
Edward Finley, BS, Pennsylvania Patient Safety Authority, contributed to data acquisition and validity for this article.

NOTES
1. HIMSS Analytics Database. US EMR Adoption Model [online]. 2012 [cited 2012 Jul 26]. http://www.himssanalytics.org/stagesGraph.asp.
2. Institute of Medicine. Health IT and patient safety: building safer systems for better care [online]. 2011 Nov 8 [cited 2012 Jul 25]. http://www.iom.edu/Reports/2011/Health-IT-and-Patient-Safety-Building-Safer-Systems-for-Better-Care.aspx.
3. Magrabi F, Ong MS, Runciman W, et al. Using FDA reports to inform a classification for health information technology safety problems. J Am Med Inform Assoc 2012 Jan-Feb;19(1):45-53.
4. Magrabi, Farah. E-mail to: Pennsylvania Patient Safety Authority. 2012 Oct 31.

PENNSYLVANIA PATIENT SAFETY ADVISORY
This article is reprinted from the Pennsylvania Patient Safety Advisory, Vol. 9, No. 4 (December 2012). The Advisory is a publication of the Pennsylvania Patient Safety Authority, produced by ECRI Institute and ISMP under contract to the Authority. Copyright 2012 by the Pennsylvania Patient Safety Authority. This publication may be reprinted and distributed without restriction, provided it is printed or distributed in its entirety and without alteration. Individual articles may be reprinted in their entirety and without alteration provided the source is clearly attributed.

This publication is disseminated via e-mail. To subscribe, go to http://visitor.constantcontact.com/d.jsp?m=1103390819542&p=oi. To see other articles or issues of the Advisory, visit our website at http://www.patientsafetyauthority.org.

THE PENNSYLVANIA PATIENT SAFETY AUTHORITY AND ITS CONTRACTORS
The Pennsylvania Patient Safety Authority is an independent state agency created by Act 13 of 2002, the Medical Care Availability and Reduction of Error ("Mcare") Act. Consistent with Act 13, ECRI Institute, as contractor for the Authority, is issuing this publication to advise medical facilities of immediate changes that can be instituted to reduce Serious Events and Incidents. For more information about the Pennsylvania Patient Safety Authority, see the Authority's website at http://www.patientsafetyauthority.org.

ECRI Institute, a nonprofit organization, dedicates itself to bringing the discipline of applied scientific research in healthcare to uncover the best approaches to improving patient care. As pioneers in this science for more than 40 years, ECRI Institute marries experience and independence with the objectivity of evidence-based research. More than 5,000 healthcare organizations worldwide rely on ECRI Institute's expertise in patient safety improvement, risk and quality management, and healthcare processes, devices, procedures and drug technology.

The Institute for Safe Medication Practices (ISMP) is an independent, nonprofit organization dedicated solely to medication error prevention and safe medication use. ISMP provides recommendations for the safe use of medications to the healthcare community including healthcare professionals, government agencies, accrediting organizations, and consumers. ISMP's efforts are built on a nonpunitive approach and systems-based solutions.