Healthcare and the Data Center

Updated on November 28, 2018

By Akhil Docca

The healthcare industry – from hospitals to health insurers – is riding the technology wave, which has sparked the development of cutting-edge solutions to better serve patients and improve operational efficiency. To that end, these enterprises are developing and adopting technologies such as mobile health, electronic medical records, medical imaging, and artificial intelligence. This adoption is driving the industry forward and creating a huge amount of information that must be processed, managed, and stored in ways that are both accessible and secure.

Globally, healthcare data exceeded 700 exabytes in 2017 and is expected to grow to 2,314 exabytes by 2020. Mobile health alone, including healthcare-centric wearable devices, was a nearly $18 billion market in 2016. Medical imaging, which accounts for between 7.5% and 10% of total healthcare expenditures, is seeing increases in both the volume of imaging performed and the size of the resulting files. Picture archiving and communication system (PACS) storage requirements for hospitals are growing at more than 20% per year, having already surpassed 8,900 terabytes in 2011.

Add those storage requirements to the fact that the average healthcare organization uploads 6.8 terabytes to the cloud each month, and it’s clear that this volume of activity will have a significant impact on any data center – internal, colocation facility, or third-party provider – that houses this data.
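To put those growth rates in perspective, a back-of-the-envelope projection is straightforward. The short Python sketch below compounds the roughly 20% annual PACS growth figure and accumulates the 6.8 terabytes of monthly cloud uploads over a multi-year horizon; the starting PACS volume and the five-year horizon are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope capacity projection (illustrative assumptions only).
# The 20%/yr growth rate and 6.8 TB/month upload figure come from the article;
# the 500 TB starting PACS volume and 5-year horizon are hypothetical.

PACS_GROWTH_RATE = 0.20            # >20% per year (article figure)
CLOUD_UPLOAD_TB_PER_MONTH = 6.8    # average per healthcare organization (article figure)

def project_storage(start_tb: float, years: int) -> list[float]:
    """Compound the starting PACS volume at the annual growth rate."""
    volumes = [start_tb]
    for _ in range(years):
        volumes.append(volumes[-1] * (1 + PACS_GROWTH_RATE))
    return volumes

if __name__ == "__main__":
    horizon = 5
    pacs = project_storage(start_tb=500.0, years=horizon)  # hypothetical starting point
    cloud_total = CLOUD_UPLOAD_TB_PER_MONTH * 12 * horizon
    for year, tb in enumerate(pacs):
        print(f"Year {year}: ~{tb:,.0f} TB of PACS storage")
    print(f"Cumulative cloud uploads over {horizon} years: ~{cloud_total:,.0f} TB")
```

Even under these modest assumptions, storage demand grows by more than half over five years before cloud activity is counted, which is the kind of trajectory capacity planners have to get ahead of.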

This increase in computing demand drives the need for high-density equipment, which can do more work in the same or less space. Transitioning to this modern computing environment disrupts standard capacity planning processes, such as measuring available power, as well as traditional IT deployment and provisioning. Because this critical information is what enables healthcare organizations to increase operational efficiency and provide better care for their patients, data center operators and managers need to support these processes with as little interference as possible.

Capacity Planning

For modern hospitals and health insurers dealing with this onslaught of sensitive data, and the resulting transition to a high-density computing environment, managing the data center is no longer simply about space and efficiency. It’s about accurate forecasting, process improvement, and maximizing capacity. Managing this high-growth area means coordinating physical and virtual infrastructure, cloud and hybrid solutions, and colocation partnerships; understanding where to place specific workloads for optimum performance; and evaluating and forecasting demand while setting aside the capacity to meet it.

As healthcare providers deal with their increasing capacity needs, the tools used by facility managers must evolve to address the unique security, storage, and management requirements for data at this scale. Data center managers will need to rely on intelligent toolsets to optimize asset location, space utilization, security, cooling, weight, and power.

As healthcare dives headfirst into the world of large-scale data management, the best way that facility managers can stay ahead is to remain flexible by preparing for the eventual impact on operations. The two keys to this flexibility are:

  1. The ability to test potential scenarios or failures without putting existing systems at risk.
  2. Insight and a holistic view into a portfolio that is otherwise spread across multiple Excel sheets and documents, where gathering any meaningful information is extremely difficult.

The only way to achieve these two results is through the use of a digital twin.

The Digital Twin

Digital twins “provide a software representation of a physical asset,” such as a data center, and allow companies to “better understand, predict, and optimize the performance of each unique asset.” A digital twin should capture the data center in its entirety – including IT assets, racks, power networks and cooling distribution – and enable users to test what-if scenarios.

The ideal digital twin of a data center should be able to predict airflow and temperature distribution and to model power failure scenarios, so that potential capacity losses can be understood.

Additionally, a digital twin enables data center managers to design and test any scenario or change to operations, without the risk of physical implementation in a production environment. Use of a digital twin provides the opportunity to optimize administrative processes and account for space availability, capacity utilization, asset requirements, power needs, and future costs. This single-pane view of the optimal data center fosters collaboration in order to remove bottlenecks and speed up the planning process – potentially reducing the time to successful testing and implementation from weeks to days.
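The general idea behind this kind of what-if testing can be shown with a deliberately simplified model. The Python sketch below is a minimal stand-in for a full digital twin (it ignores airflow, cooling, and weight entirely) and simply checks whether a proposed high-density server would exceed a rack's power budget before anything changes on the floor. The rack names, capacities, and power draws are hypothetical examples, not figures from any real facility or product.

```python
# Minimal what-if check against a simplified "digital twin" of rack power.
# All rack names, capacities, and loads below are hypothetical examples;
# a real digital twin would also model airflow, cooling, weight, and networking.
from dataclasses import dataclass, field

@dataclass
class Rack:
    name: str
    power_capacity_kw: float
    loads_kw: list[float] = field(default_factory=list)

    @property
    def used_kw(self) -> float:
        return sum(self.loads_kw)

    def can_host(self, new_load_kw: float, headroom: float = 0.9) -> bool:
        """Test a placement without touching production: keep 10% headroom."""
        return self.used_kw + new_load_kw <= self.power_capacity_kw * headroom

racks = [
    Rack("A01", power_capacity_kw=12.0, loads_kw=[3.5, 4.2]),
    Rack("A02", power_capacity_kw=12.0, loads_kw=[9.8]),
]

new_server_kw = 2.5  # proposed high-density node (hypothetical draw)
for rack in racks:
    verdict = "fits" if rack.can_host(new_server_kw) else "would exceed power budget"
    print(f"Rack {rack.name}: {rack.used_kw:.1f}/{rack.power_capacity_kw:.1f} kW used -> {verdict}")
```

A production digital twin evaluates the same question across thousands of assets and multiple constraints at once, but the workflow is the same: model the change virtually, check it against capacity, and only then commit it to the live environment.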

All this activity has a significant impact on data center operations as well. Data centers that support hospitals, health insurers, and colocation or outsourcing facilities need to be ready to securely support this kind of data and analysis. Without proper systems and controls, this data is potentially at risk. The digital twin can prepare you to manage all of this data and keep you one step ahead in the evolution of the healthcare data center industry.

Akhil Docca is Director of Marketing for Future Facilities.
