In a rare partnership, Google and Apple are developing an API to facilitate COVID-19 contact tracing for healthcare organizations on a nationwide scale. Many believe that such contact tracing, i.e., identifying those who may have come into contact with someone already infected with the virus, is a critical step in stopping the spread of the coronavirus.
However, with privacy already a concern for both companies even before the COVID-19 pandemic, there has been some pushback on these efforts. The ACLU is among those raising concerns about the privacy and security of such measures, and the U.S. Congress is discussing bills designed to protect privacy and restore faith in technology.
The ACLU’s concerns with Apple and Google’s tracing app range from governmental overreach to discrimination to whether participation would be mandatory. For their part, the companies updated their proposal to explain how data would be collected and to state that the app would remain in service only until the end of the pandemic.
While the contact tracing app is voluntary, another hurdle lies in encouraging participation. A study conducted by the Washington Post and the University of Maryland found that three out of five Americans would be unwilling or unable to use a contact tracing app. In many ways, this undermines the idea behind the app, calling into question the extent to which privacy laws should be suspended in the interest of public health.
With privacy already a hot-button issue in the United States, the way these concerns are addressed now could set the stage for future policy. Transparency is a start: outlining what data is collected and how it will be used can make citizens more comfortable sharing the information that contact tracing needs to work correctly. Most apps try to keep the process as anonymous as possible. Geolocation is a common element of many of these apps, though the intended anonymity is lost for individuals living in more remote, sparsely populated areas.
Proximity tracing via Bluetooth can help identify potential exposure cases, though some fear that this might be too blunt a method.
“Proximity is only one factor in virus exposure and may be less significant than other factors, such as wearing a face mask,” Sherrese Smith, vice-chair of Paul Hastings’ Data Privacy and Cybersecurity practice, told HealthITSecurity. “Relying solely on proximity risks creating a database that overestimates potential exposure events. If a ‘clean’ proximity record becomes necessary for some individuals to go back to work or get life insurance, over-inclusion can result in real-world negative consequences.”
Other questions concern the precision of these methods: defining what counts as "contact" and eliminating false positives, such as two individuals standing on opposite sides of an apartment wall. Enforcing self-isolation for those who have been exposed is another challenge entirely, one that could render data collection efforts moot if not handled properly.
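The "defining contact" problem above can be made concrete. The sketch below is illustrative only, not the actual Apple/Google API: it treats a contact as enough cumulative minutes of sufficiently strong Bluetooth signal, with both thresholds as hypothetical stand-ins for parameters public health authorities would set.

```python
from dataclasses import dataclass

@dataclass
class Sighting:
    minute: int          # minute offset during the day when another phone was heard
    attenuation_db: int  # signal attenuation; lower roughly means physically closer

ATTENUATION_CUTOFF_DB = 60   # hypothetical "close enough" threshold
MIN_CONTACT_MINUTES = 15     # hypothetical minimum cumulative exposure time

def is_contact(sightings: list) -> bool:
    """Count cumulative minutes spent below the attenuation cutoff."""
    close_minutes = sum(1 for s in sightings if s.attenuation_db <= ATTENUATION_CUTOFF_DB)
    return close_minutes >= MIN_CONTACT_MINUTES

# A weak signal (high attenuation) never triggers a match, which is how a
# neighbor on the far side of an apartment wall could be filtered out,
# while twenty minutes in the same room would count.
through_wall = [Sighting(m, 75) for m in range(30)]
same_room = [Sighting(m, 45) for m in range(20)]
print(is_contact(through_wall))  # False
print(is_contact(same_room))     # True
```

Even in this toy form, the tradeoff is visible: a stricter cutoff misses real exposures, while a looser one sweeps in the false positives critics worry about.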
And although the stated goal for these apps is to stop tracing after the pandemic, it’s entirely possible that these apps could still serve as the basis for surveillance efforts later on. The massive collection of personal data that these apps can accrue makes them an inviting target for hackers as well as phishing efforts aimed at users. The lack of existing privacy legislation for technology also makes it challenging for companies to prove that apps won’t infringe on the rights of users.
The Public Health Emergency Privacy Act, proposed in May by Congressional Democrats, outlines some measures to ensure the safety of contact tracing apps. The bill limits collected data to public health use and requires an ethical review for any proposed additional use. It also aims to hold businesses accountable for data misuse and to make data collection opt-in, although public concerns about privacy could still hamstring the impact of contact tracing apps in this case.
Critics of contact tracing efforts cite the 2001 Patriot Act, which expanded surveillance powers in the name of national security and was framed, in part, as a temporary response to a crisis. Yet the measure lives on, and rolling back such sweeping legislation has historically been an uphill battle. It is also true that the data would still be collected by private companies, even if it is being used to inform public health decisions.
Regardless, finding a balance between keeping citizens safe and not infringing on their privacy is a problem without a definitive solution. Aggregating the data could keep the process anonymous, though there is a tradeoff between anonymity and accuracy. Experts generally agree that safeguards need to exist to secure collected data and limit its use, though these too come with costs that may be hard to minimize in the short time available to deploy a solution.
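The anonymity-versus-accuracy tradeoff mentioned above can be sketched with a simple small-cell suppression rule, a common technique in public health reporting. Everything here is hypothetical: the area names, the case counts, and the minimum group size K are illustrative values, not data from any real program.

```python
K = 5  # hypothetical minimum group size before a count is published

def aggregate(counts_by_area: dict) -> dict:
    # Areas with fewer than K cases are suppressed (None): individuals in
    # sparsely populated areas stay anonymous, but the published dataset
    # becomes less accurate, which is exactly the tradeoff in question.
    return {area: (n if n >= K else None) for area, n in counts_by_area.items()}

print(aggregate({"downtown": 42, "rural_county": 2}))
# {'downtown': 42, 'rural_county': None}
```

Note how the rule bites hardest in remote areas, echoing the earlier point that anonymity is weakest where few people live.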
Joel Landau is founder and chairman of The Allure Group, a network of six New York City-based skilled nursing facilities. He has served as a member or an advisor on a number of boards and committees, including the Medicaid Managed Care Advisory Review Panel (MCCARP), NYS DOH Preventative Health and Health Services Block Grant, NYS DOH Task Force on Long Term Care Financing, and the Brooklyn Chamber of Commerce.