By Randy Thomas, FHIMSS
Associate Partner, Performance Analytics, Encore, A Quintiles Company
Implementing an enterprise-wide EHR is a massive, complex undertaking. The needs of many stakeholders must be considered when defining the build requirements; workflow must support ease-of-use and not interfere with patient care delivery and related work processes. Many implementation decisions focus on driving clinician adoption to ensure both quality and efficiency objectives are met (not to mention regulatory requirements related to Meaningful Use). With all the multi-threaded work streams and decision processes involved in planning and executing an EHR implementation, the re-usability of the captured data frequently falls out of scope.
But what exactly is re-usability? Quite simply, it is using data captured in any “source system” (e.g., EHR, admitting/discharge/transfer [ADT], materials management, patient accounting, registration, operating room, emergency department, etc.) for reporting, measurement and analytics. Re-using the data captured in these source systems accelerates the value realized from implementing these systems and supports a virtuous cycle of performance improvement across the enterprise. It all relates to the old adage “you can’t manage what you don’t measure;” you can’t measure something if you don’t have the right data. And that leads back to the decisions made in implementing EHRs and other systems. You need to know what data is needed to measure and analyze what is important to the organization and ensure that data can be consistently, reliably and accurately captured at the point of origin (e.g., at registration, in the care process).
It is unrealistic to expect that every bit of data about a patient should be captured in a discrete form for re-use. There needs to be a balance between supporting ease-of-use in the appropriate workflow and the availability of data for reusability. A good way to strike this balance is to create a list of data elements your organization agrees are necessary for analytics. Frequently this initial list of data elements is driven by the metrics required to support Meaningful Use (MU) utilization and quality eMeasure reporting. Because of the leeway allowed in making implementation decisions, having an MU-certified system does not automatically guarantee all the data needed to meet reporting requirements is captured discretely and consistently. Some detective work is required; it starts with identifying the data needed for each metric and continues by tracing the journey of that data back to the source system and ensuring each data element is captured as expected in the intended workflow. This requires collaboration across a multi-disciplinary team involving experts in quality reporting, data analysis and clinical (or operational) workflow.
The resulting inventory of data elements will likely be shorter than anticipated. Most (if not all) metrics require some data about the patient – patient demographics. So there is a good bit of repetition of this type of data; once you’ve accounted for a data element, it’s in the inventory and available for use as many times as needed. And as you progress through the list of metrics, there will be fewer and fewer net new data elements you need to account for.
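As a rough illustration, the inventory-building process described above can be sketched in code. The metric names and data elements below are hypothetical, chosen only to show how the count of net-new elements shrinks as each metric is added:

```python
# Hypothetical metrics mapped to the data elements each requires.
metrics = {
    "30-day readmission rate": {"patient_id", "admit_date", "discharge_date",
                                "discharge_disposition"},
    "sepsis mortality rate": {"patient_id", "admit_date",
                              "discharge_disposition", "diagnosis_code"},
    "ED throughput": {"patient_id", "admit_date", "admit_type",
                      "discharge_date"},
}

inventory = set()
for metric, elements in metrics.items():
    net_new = elements - inventory          # not yet accounted for
    inventory |= elements
    print(f"{metric}: {len(net_new)} net new data element(s)")

print(f"Total inventory: {len(inventory)} data elements")
```

Shared demographics (patient ID, admit date) are counted once; each later metric contributes only one or two net-new elements, which is the pattern the inventory exercise typically reveals.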
Once you’ve created your inventory of data elements, you need to identify where each data element can be captured in the source system (e.g., EHR, ADT, etc.). This is the “data chain of trust.” Going through this process it is critical that you work across the inevitable silos of EHR implementations. As stated above, EHR implementations are complex and multi-threaded. Different implementation teams could be making contradictory decisions about data capture that do not adversely affect patient care but DO impact the reusability of the data. Discussion and compromise will be needed to design workflow that both supports ease-of-use and captures data reliably and consistently.
To ensure you’re making coordinated decisions, you need to involve all the various teams – nursing, physician, lab, pharmacy, imaging, registration, scheduling, etc. – in “interlock” sessions. And you need to document all the decisions that support the “data chain of trust” for each data element for each metric so as system refinement occurs you know the consequences of changing how a particular data element is captured. Sometimes in this decision process, organizations decide that the trade-off between ease-of-use and data reusability is too great and will opt for ease-of-use. These decisions need to be documented and communicated as well so expectations about what can be measured and analyzed can be managed.
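One lightweight way to document these decisions is a structured record per data element. The fields below are one possible shape, not a standard; every name here is a hypothetical example:

```python
from dataclasses import dataclass, field

@dataclass
class ChainOfTrustEntry:
    """One documented capture decision in the 'data chain of trust'."""
    data_element: str
    source_system: str          # e.g., EHR, ADT, patient accounting
    capture_point: str          # where in the workflow the value originates
    capture_format: str         # discrete field, coded value, free text...
    metrics_served: list = field(default_factory=list)
    tradeoff_notes: str = ""    # e.g., ease-of-use chosen over discrete capture

entry = ChainOfTrustEntry(
    data_element="discharge_disposition",
    source_system="ADT",
    capture_point="discharge workflow (unit clerk)",
    capture_format="coded value",
    metrics_served=["sepsis mortality rate"],
)
```

Keeping `metrics_served` on each entry makes the downstream consequence of any capture change visible during system refinement, and `tradeoff_notes` records the cases where ease-of-use deliberately won out.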
With your documented inventory of data elements married to how that data will be captured in the source systems, data can start flowing into a measurement and analytics environment. There are multiple options available to organizations – from custom data warehouses to purpose-built applications – but the common need for all of them is consistent, reliable data. Applying sound data governance principles and implementing a data profiling discipline to monitor the data will ensure that consistency and reliability.
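A data-profiling check can be as simple as monitoring the completeness and validity of each inventoried element as it lands in the analytics environment. The records and allowed values below are made up for illustration:

```python
# Synthetic records; in practice these come from the source-system feed.
records = [
    {"patient_id": "P1", "discharge_disposition": "home"},
    {"patient_id": "P2", "discharge_disposition": "hospice"},
    {"patient_id": "P3", "discharge_disposition": None},    # missing value
    {"patient_id": "P4", "discharge_disposition": "hme"},   # invalid code
]
allowed = {"home", "hospice", "expired", "snf"}

total = len(records)
values = [r["discharge_disposition"] for r in records]
missing = sum(v is None for v in values)
invalid = sum(v is not None and v not in allowed for v in values)

completeness = (total - missing) / total                    # 3/4 = 75%
validity = (total - missing - invalid) / (total - missing)  # 2/3 ~ 67%
print(f"completeness={completeness:.0%}, validity={validity:.0%}")
```

Run on a schedule, trended over time, and alarmed on thresholds, checks like these are what turn "data governance principles" into an operational discipline.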
How does having accurate data help an organization? Here is one example, examining a sepsis mortality rate.
In this example, the organization currently has a sepsis mortality rate of 13.25%. To lower this rate, they first need to understand the current state and then identify where they might initially focus their attention to achieve improvement.
Define the Population.
- Start with one year of data identifying all patients with a final diagnosis of Sepsis (ICD-9 = 995.91).
- Profile the population to see what characteristics are prominent – average age, admit type, discharge disposition, etc.
- For example, average age was 59, 97% were admitted urgently or emergently, 76% were admitted from home and 20% were discharged to hospice or expired.
- Focus on the expired population compared to the total Sepsis population – length of stay, ICU LOS and variable cost were all higher. What can be improved in the expired population?
Figure 1. Average LOS and Variable Cost per Case
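The population-profiling steps above can be sketched with synthetic records. The field names and values are illustrative only, not the study's actual data:

```python
# Synthetic sepsis population; a real analysis would use the full year of data.
patients = [
    {"age": 62, "admit_type": "emergency", "expired": True,  "los": 14},
    {"age": 55, "admit_type": "urgent",    "expired": False, "los": 6},
    {"age": 60, "admit_type": "emergency", "expired": False, "los": 5},
]

def mean(xs):
    return sum(xs) / len(xs)

# Profile the whole population...
avg_age = mean([p["age"] for p in patients])
urgent_emergent_pct = mean([p["admit_type"] in ("urgent", "emergency")
                            for p in patients])

# ...then compare the expired subgroup against the total population.
expired = [p for p in patients if p["expired"]]
los_all = mean([p["los"] for p in patients])
los_expired = mean([p["los"] for p in expired])
print(f"avg age={avg_age:.0f}, LOS all={los_all:.1f}, LOS expired={los_expired:.1f}")
```

The point of the comparison is the same as in the article: once length of stay, ICU LOS and cost can be computed per subgroup, the expired population's excess stands out as the improvement target.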
Analyze the Data.
- Determine what might influence improved outcomes by investigating the following areas:
- Order set usage
- Physician differentiation based on volumes related to outcomes
- Infectious disease consults and timing
- Antibiotic usage
- Sepsis order set usage is lowest among those patients who expired.
- Only 7% (n=12) of Sepsis patients used the Sepsis order sets; only 1 patient who expired was ordered the Sepsis order set.
- 22% (n=39) of Sepsis patients were never ordered an order set.
- 29% (n=52) of Sepsis patients were ordered order sets other than Sepsis, infection-related or ICU.
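Once order data is captured discretely, tabulating usage by category is straightforward. The categories and counts below are hypothetical, not the figures from the analysis above:

```python
from collections import Counter

# One order-set category per sepsis patient (synthetic data).
usage = ["sepsis", "none", "other", "other", "none",
         "icu", "sepsis", "other", "infection", "none"]

counts = Counter(usage)
total = len(usage)
for category, n in counts.most_common():
    print(f"{category}: n={n} ({n / total:.0%})")
```

This is exactly the kind of breakdown that is impossible when order-set usage lives only in free text or scanned documents.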
Interpret the Data.
- Physicians treating higher volumes of patients had:
- Shorter LOS
- Fewer transfers
- Earlier antibiotic start times
Figure 3. Antibiotic Timing
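Stratifying antibiotic timing by physician volume can be sketched in the same style. The encounters below are synthetic; physician identifiers and hours are invented for illustration:

```python
from collections import defaultdict

# Synthetic encounters: physician, and hours from arrival to first antibiotic.
encounters = [
    {"physician": "A", "abx_hours": 2.0},
    {"physician": "A", "abx_hours": 3.0},
    {"physician": "A", "abx_hours": 2.5},
    {"physician": "B", "abx_hours": 6.0},
    {"physician": "B", "abx_hours": 5.0},
]

by_md = defaultdict(list)
for e in encounters:
    by_md[e["physician"]].append(e["abx_hours"])

# Compare average antibiotic start time, ordered by physician volume.
for md, times in sorted(by_md.items(), key=lambda kv: -len(kv[1])):
    print(f"physician {md}: volume={len(times)}, "
          f"avg antibiotic start={sum(times) / len(times):.1f}h")
```

A real analysis would of course adjust for case mix, but the mechanics – group by physician, compare volume against timing and outcomes – are as simple as shown.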
Identify Improvement Opportunities.
- Four concrete next steps:
- Educate physicians on use of dedicated order sets for specific populations
- Evaluate early recognition criteria, such as vital sign changes and lab results, so antibiotics can be given sooner
- Evaluate appropriate antibiotic therapy
- Evaluate internal consult/transfer policy so experienced physicians are involved in care
Organizations don’t have to begin with a large set of discrete data – but any level of measurement, reporting and analytics requires consistent, reliable, accurate data, starting at the point of capture in the source systems. To accelerate value realization, begin with the data most important to your organization and ensure that data can flow from origin to analytics in a “chain of trust” that is known and transparent. From there, incrementally increase the available data as more stakeholders benefit from access to it and the organization comes to understand why discrete, accurate capture matters. With increasing value realized comes the understanding that “it’s all about the data”!