By Larry Ozeran MD, President, Clinical Informatics Inc.
HIMSS's core objective as an organization is to improve healthcare quality and patient safety through information technology. As hospitals and providers work to implement electronic health records (EHRs) and other IT and management systems, HIMSS is launching a blog series on health IT and patient safety. The series aims to help providers and hospitals identify patient safety risks that have resulted from problematic EHR implementations and mitigate those risks through proactive measures.
We began this series with a focus on goals. Along the way, we have identified several key themes:
• Have a goal for each project.
• Listen to your users, whether you are a vendor or a healthcare organization.
• Prioritize based on available resources and impact on patient safety.
We discussed the challenges of completely avoiding risks to patient safety. In most cases, the most prudent use of available resources is to reduce the likelihood of patient harm and then to identify failures early. If your organization is not failure-ready, a culture change may be needed.
As we close this series, we are going to consider the four primary types of errors that lead to adverse safety events: design error, implementation error, process error, and user error.
We have already discussed the most important aspects of design errors, particularly if they lead to workarounds.
Implementation errors can occur when:
– There is inadequate training of the local IT staff by the vendor,
– The configuration options are unclear or misinterpreted, or
– The organization’s infrastructure cannot support the product (e.g., inadequate power, backup processes, or hardware).
Process errors appear in multiple forms at multiple points. Here are a few common examples:
– The implementation process does not account for disruptions to clinical workflow (e.g., clinical workloads are not reduced enough to leave adequate time for training or for recovering from software mistakes while caring for patients).
– Training isn’t a priority (e.g., limited budgeting, no post-training testing, and limited availability of necessary clinical support).
– There is no systematic mechanism for identifying problems, prioritizing them based on resources and patient danger, and tracking them to resolution.
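A systematic mechanism like the one described above can be sketched as a simple issue tracker that ranks open problems by patient danger and the resources needed to fix them. The danger scale, scoring, and class names below are illustrative assumptions for this sketch, not part of the original post:

```python
from dataclasses import dataclass

# Illustrative danger scale; a real program would define this with clinical input.
DANGER = {"low": 1, "moderate": 2, "high": 3}

@dataclass
class SafetyIssue:
    description: str
    danger: str          # "low" | "moderate" | "high"
    effort_hours: int    # estimated resources needed to fix
    resolved: bool = False

class IssueTracker:
    """Identify, prioritize, and track safety issues to resolution."""

    def __init__(self):
        self._issues = []

    def report(self, issue: SafetyIssue):
        # Identification: every reported problem is recorded, not lost.
        self._issues.append(issue)

    def prioritized(self):
        # Prioritization: highest patient danger first;
        # among equal dangers, the cheapest fix first.
        return sorted(
            (i for i in self._issues if not i.resolved),
            key=lambda i: (-DANGER[i.danger], i.effort_hours),
        )

    def resolve(self, issue: SafetyIssue):
        # Tracking to resolution: resolved issues drop off the queue.
        issue.resolved = True

# Example: a high-danger, low-effort issue outranks a high-danger, costly one.
tracker = IssueTracker()
tracker.report(SafetyIssue("Allergy alert not firing", "high", 40))
tracker.report(SafetyIssue("Slow login screen", "low", 2))
tracker.report(SafetyIssue("Duplicate orders possible", "high", 8))
next_issue = tracker.prioritized()[0]
```

The key design point is that prioritization combines both factors the post names: danger to patients dominates, and available resources break ties, so a cheap fix to a dangerous problem rises to the top.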
User errors are often the result of one of the other error types (e.g., workarounds prompted by inadequate training or poor human-computer interaction design). This is why everyone involved, from the vendor through the organization’s IT staff and training teams to the stakeholders and end-users, has an important impact on patient safety.
As you think about patient safety risks, consider the errors in this cartoon and why (some of us think) it is amusing.
It demonstrates all four major errors that can lead to patient harm.
– Design: The electrical plug is in the front, not the back.
– Implementation: The system is plugged across the walking path rather than behind the unit.
– Process: Rather than making it hard to unplug the unit, a sign is posted which, if missed, lets the person who designed the bad process blame the person who unplugged the unit. The sign would be unnecessary if there were no design or implementation error. In the face of a crisis, the process permits the staff to feel safe and simply “relax,” rather than verify their assumptions.
– User: She tries to work around the cord, but accidentally, perhaps inevitably, unplugs it anyway.
Don’t be passive when it comes to patient safety.
Prepare your organization to be failure-ready by:
– Actively engaging with stakeholders;
– Balancing your resources against prioritized patient safety risks; and
– Always remaining focused on your goals.