Death and Data Science: Part 2

Why we need to think critically about data usage in the age of COVID-19

A little over a month ago, I wrote this article about our fear of death and how tech companies manipulate that fear to convince us to give up our privacy. As noted in the big data and healthcare chapter of Framing Big Data, discussion of privacy with regard to Big Data in the healthcare sphere is significantly less frequent than in other sectors, such as business and politics (Paganoni 2019). In that chapter, a professor from Duquesne University is quoted as essentially saying, “if you want medical innovation in tech, you have to give up personal privacy,” framing privacy as a trivial luxury that must be sacrificed for the good of advances in medicine. Later, Paganoni cites this post from the Information Commissioner’s Office. The issue discussed in the post itself was that the transfer of a massive amount of data from the Royal Free London NHS Foundation Trust to Google DeepMind was deemed to be in violation of the Data Protection Act. The part that Paganoni cites is:

1. It’s not a choice between privacy or innovation

It’s welcome that the trial looks to have been positive. The Trust has reported successful outcomes. Some may reflect that data protection rights are a small price to pay for this.

But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights. I’ve every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work. This will also be true for the wider NHS as deployments of innovative technologies are considered.

Essentially, we don’t need to compromise our personal privacy for the advancement of medicine, and a lack of privacy is more the result of negligence than necessity. Combine this with the fact that even data we think is anonymized can be incredibly easy to deanonymize, and you would think we should be extra careful with the data we turn over to tech companies, even if they are working on healthcare technology. With the outbreak of COVID-19, however, we’re seeing just the opposite. Take, for example, this map that scores each state by how well it’s social distancing. It scores each state by using smartphone location data to determine the percent decrease in movement between the period before the outbreak and now, and then assigns a letter grade based on that reduction (a rough sketch of this kind of grading follows the quote below). Unacast, of course, insists that the data is fully anonymized. In the Washington Post’s coverage, Unacast chief executive Thomas Walle is quoted:

“Everything here is on the aggregated level,” Walle said. “We can’t tell or disclose if any individual is staying at home or not.”
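To make that scoring mechanism concrete, here is a minimal sketch in Python of how a percent-reduction-to-letter-grade scheme might work. It is purely illustrative: Unacast’s actual methodology, data pipeline, and grade cutoffs have not been published, so the function name, thresholds, and inputs below are my own assumptions.

```python
# Hypothetical sketch only: Unacast's real methodology and grade cutoffs are not public.
# This just illustrates the kind of "percent reduction in movement -> letter grade"
# scoring the map describes, using aggregate (not individual) movement figures.

def social_distancing_grade(baseline_km_per_day: float, current_km_per_day: float) -> str:
    """Return a letter grade based on the percent drop in average daily movement.

    baseline_km_per_day: average daily movement per device before the outbreak
    current_km_per_day:  average daily movement per device now
    Both numbers are assumed to be state-level aggregates.
    """
    if baseline_km_per_day <= 0:
        raise ValueError("baseline movement must be positive")

    percent_reduction = 100 * (baseline_km_per_day - current_km_per_day) / baseline_km_per_day

    # Illustrative cutoffs; the real grading scale is chosen by the data firm.
    if percent_reduction >= 40:
        return "A"
    if percent_reduction >= 30:
        return "B"
    if percent_reduction >= 20:
        return "C"
    if percent_reduction >= 10:
        return "D"
    return "F"


# Example: aggregate movement falling from 20 km to 13 km per day is a 35% drop, or a "B".
print(social_distancing_grade(20.0, 13.0))
```

Even this toy version only ever sees aggregates; the privacy question is whether the pipeline that produces those aggregates really discards the individual location traces along the way.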

However, the article also notes:

Privacy advocates worry data firms like Unacast can be dodgy because they’re gathering locations without real consent from people.
Walle said all of the apps that Unacast acquires location data from must let users know. But he declined to name any of the apps. And we know few people read the privacy policies on apps — the fine print where they disclose the many ways they use your location, such as selling it on to data firms.

When a data collector is not transparent about where it gets its data, as in this case where Unacast declines to name the apps used to collect it, that’s not a good sign. Combine that with the fact that the government is currently talking to tech giants such as Facebook and Google about how to utilize the location data they’ve collected, and you have a potential recipe for disaster. The thing is, these tech giants are private corporations. They’re responsible to no one but their shareholders and therefore have no direct interest in maintaining user privacy. If the government gets its data from them, it can claim it isn’t collecting data directly from private citizens, only from a corporation. Essentially, we have to trust these private corporations to properly anonymize data out of the goodness of their hearts. And while there may certainly be one or two companies that actually work to keep their data sharing transparent and ethical, on the whole we cannot trust businesses to prioritize the public interest unless they are forced to do so, and it looks like they aren’t being forced for the time being.

Slate recently published an article, “You Shouldn’t Have to Give Google Your Data to Access a COVID-19 Test,” which discusses how Google may be developing a system for people to access their COVID-19 test results, but to access those results, you have to hand over far more data than necessary. It’s almost as if our collective fear is being exploited to get us to give corporations data we would normally never give them. The article puts it well:

As COVID-19 spreads through our communities, stories abound of people exploiting the situation for financial gain. Unscrupulous pandemic profiteers hoard protective gear and cleaning supplies to sell at exorbitant prices. Like these commodities, data siphoned by tech companies from COVID-19 screening may be the next asset to be hoarded and exploited.

Just as our fear of death is already being exploited in the healthcare sector when it comes to big data, our even more present fear of COVID-19 is being manipulated to get us to accept a loss of privacy. What’s worse, while this loss of privacy is framed as completely necessary, some of the algorithms that use this personal data don’t even work and can potentially cause harm. In the paper “Emergent Medical Data: Health Information Inferred by Artificial Intelligence,” Emergent Medical Data, or EMD, is defined as “health information inferred by artificial intelligence from otherwise trivial digital traces.” While potential privacy violations are framed as necessary for the public good, Marks notes:

However, there is little evidence to show that EMD-based profiling works. Even worse, it can cause significant harm, and current health privacy and data protection laws contain loopholes that allow public and private entities to mine EMD without people’s knowledge or consent

Our fear of death has already been manipulated to compromise our privacy rights, and now that our fear is heightened, the manipulation is only increasing. This is why we need to be even more critical of what these private companies ask of us and of the steps they do or do not take to protect our rights, because it’s clear they have no intention of doing so unless they are forced to.

[1] Paganoni, Maria Cristina. Framing Big Data: A Linguistic and Discursive Approach. Springer, 2019.

[2] Marks, Mason. “Emergent Medical Data: Health Information Inferred by Artificial Intelligence” (March 14, 2020). U.C. Irvine Law Review (forthcoming 2021). Available at SSRN: https://ssrn.com/abstract=3554118
