We all know what it’s like to have a relative or friend take on the challenge of a health condition. Indeed, over the last 16 months the impact has been acutely understood whatever the condition, whether someone is awaiting a diagnosis, undergoing treatment, or coping with a delayed check-up. Digital healthcare is at a pivotal point: technology innovations are facilitating improvements in individual well-being and personalized care, and not exclusively at the bedside.
Predictive care
By 2030, healthcare systems worldwide will be expected to deliver diagnostics and care that are both proactive and predictive, enabled by artificial intelligence (AI), machine learning (ML) and data-driven analytics, as connected care and bioinformatics commentators forecast. In the very near future, advanced analytics applications, including AI and ML, will greatly improve clinical decision-making and patient care outcomes. Patient health records will be analyzed alongside vast datasets covering populations, conditions, countries, environmental factors, virology data and much more to help manage myriad health conditions.
Understandably, healthcare providers and medical teams alike are excited about the potential for AI-powered diagnostics and precision medicine, primarily because of what this means for improvements in patient care – especially when so many countries expect to care for a larger population of seniors in years to come.
Data-driven clinical informatics
Clinical informatics uses data and a range of tools to support health professionals, from analyzing data to prevent accidents among hospital patients on wards, to running the systems that store and share X-rays, ultrasound and magnetic resonance imaging (MRI) scans. Health Education England forecasts a staggering 672 percent skills gap that will need to be filled to meet demand by 2030.
AI-driven data analytics and the automation of resource-intensive tasks can enable healthcare providers, public or private, to increase the productivity and efficiency of care delivery while optimizing resource use, reducing waiting times and tackling employee burnout. However, few medical or healthcare professionals have yet given much thought to what these trends mean for the underlying critical IT infrastructure required to support this revolution.
These data demands will rely on higher-density processing, which is evolving more quickly than the cooling technologies used within most data centers. They are densities that the majority of PACS administrators and IT departments have never seen before, let alone will be able to accommodate within their current IT infrastructure. And this is before the sheer volume of data being generated starts to pull the analytics, applications and IT hardware to the data source itself – in operating theatres, on wards or at the bedside – for real-time processing, which creates a whole set of other challenges to overcome.
Last year, diagnostics was the most prevalent use of AI within the UK’s National Health Service, which has started to apply deep learning (DL), ML and categorization technology to enormous sets of medical images, building workflows and algorithms that enable faster and more accurate readings at the point of care. This means the processing also needs to happen at the edge, closer to the point of care, giving rise to additional challenges such as available space, acoustics, and physical and data security.
Digital healthcare at the server level and sustainability targets
In addition, the healthcare sector has its own sustainability challenges, producing the equivalent of 4.4 percent of global net emissions. As a result, it is facing the greatest IT infrastructure challenge it has ever seen, and healthcare leaders will have to prioritize sustainability initiatives (with cost savings an additional driver), which they say often go hand-in-hand with technology advancements.
As the data center sector supports digital transformation, we will also increasingly inherit the sustainability challenges of other sectors. IT transformation, mobile devices and the Internet of Things (IoT) are creating enormous volumes of data globally. IDC predicts that 175 zettabytes (175 trillion gigabytes) of new data will be created worldwide in 2025. Gartner forecasts that more than 50 percent of this data will be generated and processed outside the traditional data center or cloud.
In the years ahead, the exponential upsurge in data processing needed to extract patient insights from large datasets will drive the requirement for higher-power computing and, therefore, higher cooling densities. CPU power consumption is on the rise – with Thermal Design Power (TDP) expected to reach 400+ watts – resulting in hotter chips and higher rack densities. The increasing use of high-power GPUs alongside CPUs to accelerate computational workloads is also pushing power consumption much higher, and is driving the need for a thermal management rethink.
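To make the density point concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it (the server count, the 400 W GPU rating, the 200 W per-server overhead) is an assumption chosen for illustration, not data from any specific deployment; only the 400+ watt CPU TDP trajectory comes from the paragraph above.

```python
# Illustrative rack-power arithmetic; all inputs are assumptions, not vendor data.
CPU_TDP_W = 400   # per the 400+ W TDP trajectory noted above
GPU_TDP_W = 400   # assumed high-power accelerator
OTHER_W = 200     # assumed memory, storage, fans and losses per server

def server_power_w(cpus: int, gpus: int) -> int:
    """Rough per-server power draw: CPUs plus GPUs plus fixed overhead."""
    return cpus * CPU_TDP_W + gpus * GPU_TDP_W + OTHER_W

SERVERS_PER_RACK = 16  # hypothetical density for a rack of 2U GPU servers

rack_kw = SERVERS_PER_RACK * server_power_w(cpus=2, gpus=4) / 1000
print(f"~{rack_kw:.0f} kW per rack")  # ~42 kW, versus the low-teens kW typical of air-cooled racks
```

Even with these conservative assumptions, a single accelerated rack lands in the tens of kilowatts, several times what most air-cooled rooms were designed around.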
The conventional way to remove heat from IT server equipment is by blowing cold air through the chassis, but even the most efficient air-cooling systems will tap out before CPUs reach their projected TDP of 400+ watts. Simply blowing more air at the problem is not practical, responsible or sustainable. A successor cooling technology is therefore required. Liquid cooling is the only technology that can take up the mantle, enabling digital transformation inside and outside the data center and making high-quality personalized healthcare possible by 2030.
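The physics behind that claim can be sketched with the standard sensible-heat relation Q = ṁ·cp·ΔT. The figures below (a roughly 40 kW rack, as in the sketch above, and an assumed 10 K air temperature rise across the servers) are illustrative assumptions rather than measurements, but they show the scale of airflow a single high-density rack would demand.

```python
# Back-of-envelope airflow estimate from Q = m_dot * c_p * dT.
# All inputs are illustrative assumptions, not measured values.
AIR_DENSITY_KG_M3 = 1.2  # air at roughly 20 C
AIR_CP_J_KG_K = 1005.0   # specific heat of air at constant pressure

def airflow_m3s(heat_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to carry away heat_w watts of IT load
    with a delta_t_k kelvin inlet-to-outlet temperature rise."""
    mass_flow_kg_s = heat_w / (AIR_CP_J_KG_K * delta_t_k)
    return mass_flow_kg_s / AIR_DENSITY_KG_M3

rack_heat_w = 40_000  # hypothetical ~40 kW rack, per the earlier sketch
delta_t_k = 10.0      # assumed temperature rise across the servers

flow = airflow_m3s(rack_heat_w, delta_t_k)
print(f"{flow:.2f} m^3/s (~{flow * 2118.88:,.0f} CFM)")  # ~3.3 m^3/s, on the order of 7,000 CFM for one rack
```

Moving thousands of cubic feet of air per minute through every rack is where fan power, acoustics and floor space break down. Water’s volumetric heat capacity is roughly 3,500 times that of air, which is the basic reason liquid cooling scales where air cannot.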
David Craig
David Craig is CEO of Iceotope Technologies.