By Taqee Khaled
The idea that data is poised to break down massive barriers in the American healthcare system has been discussed for the better part of a decade. But recent signals from the federal government indicate a more focused intent to drive healthcare organizations toward turning concept into reality.
This summer, the Centers for Medicare & Medicaid Services (CMS) announced a pilot API program, “Data at the Point of Care” (DPC), to help increase provider access to patient data. The program, which launched July 30, delivers data directly to medical providers through the technology they already use — or any new solutions that may come along.
DPC and interoperability have the potential to relieve the massive burden of medical records for both providers and patients. CMS already serves more than 130 million people, and more than 10,000 boomers age into Medicare eligibility every day. Neither patients nor their doctors can be expected to recall or manage this crushing wave of medical information.
Meanwhile, increased cultural confidence in cloud standards and security makes this an optimal time for both existing and new companies to innovate around patient data. CMS has provided the rules for data use and the fuel for the fire — it’s up to developers to leverage this effectively and generate scalable value. Some companies (like Medal and Datica) have anticipated this market shift well and are ready to operate in this data economy. Others will need to evolve quickly.
But DPC isn’t a data free-for-all. Those who want to access the information must meet strict guidelines set by the government. Developers and healthcare businesses will need to consider cost, business model and experience before setting out on their interoperability journeys.
Cost of implementing solutions
Cost is a major barrier to interoperability because most health organizations are either just starting or are in the midst of legacy platform modernization and building a microservices-oriented architecture. In many cases, this involves rapidly upskilling or rebuilding in-house IT talent. It’s also the reason CMS made the data publicly available in the first place — the agency understood that developing its own interoperability system would be too expensive, so it instead chose to create a data economy.
Most systems are not prepared to take in the large volumes of interoperable data needed to drive meaningful, valuable insights. So, part of companies’ preparation will involve demonstrating an ability to create more value for their patient panels. That starts with meeting the international Fast Healthcare Interoperability Resources (FHIR) API specification to request and receive sufficient data. It’s an important step that should be one of the first tech priorities of future-ready healthcare organizations, and it’s not a small undertaking.
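As a rough illustration of what meeting FHIR API protocols involves, here is a minimal Python sketch of building a FHIR search request and unpacking the Bundle a server returns. The base URL, resource types, and sample Bundle are placeholders for illustration, not real DPC endpoints or data.

```python
import json
from urllib.parse import urlencode

# Hypothetical FHIR server base URL -- a real integration would point at
# a production endpoint and attach OAuth credentials to each request.
FHIR_BASE = "https://example.org/fhir"

def build_search_url(resource_type, **params):
    """Build a FHIR REST search URL, e.g. [base]/Patient?_id=42."""
    query = urlencode(params)
    return f"{FHIR_BASE}/{resource_type}?{query}" if query else f"{FHIR_BASE}/{resource_type}"

def extract_resources(bundle_json):
    """Pull the individual resources out of a FHIR searchset Bundle."""
    bundle = json.loads(bundle_json)
    return [entry["resource"] for entry in bundle.get("entry", [])]

# A truncated, illustrative Bundle like a server might return for a
# claims search (DPC exposes claims data as ExplanationOfBenefit resources).
sample_bundle = json.dumps({
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {"resourceType": "ExplanationOfBenefit", "id": "eob-1"}},
    ],
})

url = build_search_url("ExplanationOfBenefit", patient="123")
resources = extract_resources(sample_bundle)
```

Even this toy version hints at the real work: every system that consumes FHIR data has to construct conformant requests, authenticate, page through Bundles, and map the returned resources into its own clinical workflows.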
Business model updates
Electronic health records (EHR) are hardly a new concept. In 2011, CMS created the Medicare and Medicaid EHR Incentive Programs (now known as the Promoting Interoperability programs) to encourage their adoption, implementation and meaningful use. While a good first step, there has since been continuous friction between EHR workflows and the on-the-ground clinical workflows they purport to support and enable. As a result, initiatives to improve EHR uptake have prioritized technology implementation over the value-returning, growth-enabling functions that should follow from it.
Still, the landscape has undergone broad change in the past eight years. In October, Mayo Clinic completed a $1.5 billion system-wide Epic EHR implementation that allows all 52,000 staff across 90 hospitals and clinics to access and share the same data. Additionally, Mayo patients can check in for appointments electronically, and receive a single consolidated bill no matter where they are treated.
While incumbents like Epic continue to make up large chunks of the landscape, movement by smaller players and accessibility to options like DPC stand to disrupt the market. There’s no question that the largest vendors will need to become much more nimble and valuable to their clinical clients, not just acquisitive of the competition.
The stereotype that seniors are uncomfortable with technology rings less true each year. And with that increasingly digital-forward mindset comes the expectation that healthcare systems and EHR experiences will meet certain implicit standards. By the time millennials and Generation Z age into Medicare, they’ll expect personalized, data-driven services that match the current sea change in retail.
Providers who don’t meet these expectations are taking a risk on several fronts. In an economy of choice, patients will simply move to a group that tailors the experience to their digital ecosystem. Additionally, providers that don’t adopt a digitally forward approach won’t deliver care based on the full body of patient information, incurring increased costs as inefficient care accumulates.
Healthcare companies that prioritize using data to give their customers timely, relevant and valuable experiences with their products and services will emerge from the next five years well ahead of those that do not.
So, what does the future look like?
Interoperability is a word that only exists today because systems are not interoperable. As data sharing and integration become the norm in healthcare over the next decade, the vocabulary of interoperability will fade, and the discussion will shift toward extracting insights that improve health at the population level while controlling costs and minimizing provider burnout. Developers and healthcare organizations should start now by prioritizing system updates that support a modern API ecosystem, which remains the cornerstone of unlocking data at scale for operational improvement and value-driven innovation.
Taqee Khaled is director of strategy at Nerdery.