By Finn Raben
The recent headlines about a US healthcare provider misusing patients’ data have raised consumers’ concerns about how their confidential information is actually handled. Last November Ascension, the largest non-profit health system in the US, came under the spotlight for providing Google with access to 50 million private medical records without doctors’ or patients’ knowledge (Project Nightingale).
The Wall Street Journal report, which brought the matter to public attention, explained that the partnership was claimed to be designed to improve patient care by modernizing Ascension’s information system. Thanks to this modernization, doctors would gain new and more efficient tools enabling them to scan entire medical records for relevant data. Although intentions may have been – and may still be – good, big tech companies have an inordinate amount of “power” to gain access to personal data without consumer knowledge, and the fact that they hold that personal information means the data may well be handed on to third parties – this is the data business model so clearly articulated in Shoshana Zuboff’s book “The Age of Surveillance Capitalism”. Targeted advertising, based on patients’ medical histories, might be just a step away.
Health data is regulated by the federal government, unlike most of the data collected by Google. In relation to Project Nightingale, the parties involved claimed that their partnership was fully compliant with federal health-privacy law. However, it was also reported that approximately 150 Google employees had access to sensitive data such as lab results, diagnoses, and hospital records, all of which provide detailed information on people’s health histories. 150 employees, and yet no doctors or patients were notified of the data transfer? This apparently caused considerable concern among those working on the project, who were aware of the possibility of data privacy breaches.
A separate Washington Post report revealed that the National Institutes of Health (NIH) had stopped Google from posting more than 100,000 human chest x-rays because the images contained personally identifiable patient information.
So, what lessons can the US healthcare industry learn from this?
According to ESOMAR’s latest report, “The Trust Paradigm”, 75% of consumers globally are concerned about sharing their personal data. Stories like this one don’t help. One false move can deeply affect consumers’ perception of the industry and of a provider, feeding the already strong mistrust of large-scale digital data collection (and, by implication, its abuse).
The reputational damage is also caused by a lack of understanding of the importance of observing due diligence obligations and meeting the most basic legal and ethical requirements. At a time when healthcare and technology are increasingly intertwined, professionals may find the intersection between the two industries a grey area. The risk of data misuse is high, as is the resultant risk of exacerbating consumer and patient mistrust.
Innovative tools for better healthcare practices are constantly being developed; AI, big data analytics and new hardware can hugely improve healthcare provision and diagnosis. However, big tech – especially when it comes to data collection – still encounters resistance from consumers. According to a recent survey from Rock Health, only 11% of US patients are willing to share their health data with big technology companies.
Patients’ trust is crucial for healthcare providers, as is the industry’s responsibility to make consumers feel secure when sharing their information. The first step towards earning consumers’ trust is to abandon the cloak-and-dagger approach and adopt a transparent attitude when it comes to collecting data. Why do companies believe that keeping this data “secret” is better for all? ESOMAR’s recent report, in fact, shows that 69% of consumers are more likely to share their data if the data collector is clear about why the information is needed and how it will be used.
Giving the right responsibilities to the right professionals is key to successful data practice. Heads of IT are often responsible for data within organizations, but they may not have the right expertise or understanding of data obligations when it comes to citizens’ data. Hire a Data Protection Officer: with the right training and skills, they will be able to do the job effectively and avoid missteps.
But when it comes to tech partners, what do you need to look out for?
You can find out where a partner stands on data protection simply by checking whether they have signed up to an ethical set of codes and guidelines. There are a number of data associations, both local and international, such as ESOMAR, that they can join. These codes and guidelines go further than the law requires; they are expressions of good practice that regulators worldwide frequently recommend, and they offer a great platform for future-proofing your activities.
Stricter legislation is coming; it will raise the standards for data collection, and the public will be increasingly aware of the risk of their data being misused. The healthcare industry needs to stay ahead of this curve and work together to showcase safer data practices, so that consumers don’t have to fear their data being exposed.
Finn Raben has spent most of his career in Market Research. He started at Millward Brown IMS in Dublin, followed by AC Nielsen. Most recently at Synovate, Finn was CEO of Southern Europe. Finn is currently the Director General of ESOMAR.