The funny thing is, other people seem to get all the benefit.
What started me thinking about this was a report from MobiHealthNews on the FTC's PrivacyCon, particularly a discussion about the monetization of consumer health data. We talk a lot about Big Data, but it is worth remembering that Big Data is made up of lots of little data from people like you and me.
Companies are keen to make money from the data they collect about/from us. The Economist Intelligence Unit just released The Business of Data on this topic. Some of its key findings:
- Over 80% of surveyed organizations are taking steps to generate more value from their data.
- 60% are already generating revenue from their data.
- Over 50% are selling or buying data.
- 86% say their customers trust them with their personal data, although only 34% think they are very effective in being transparent about their use of it.
- 82% believe they are effective at keeping their data secure, although 34% admit to having had a significant breach within the past 12 months.
The Pew Research Center also just released a new report, Privacy and Information Sharing. They found consumers have mixed attitudes towards sharing their data; "it depends" was the prevailing attitude when evaluating various sharing scenarios. They often see benefit in giving up some of their privacy -- we love "free"! -- but worry about how their data will be used and by whom, especially given the risks of security breaches.
One of the scenarios Pew presented was having your doctor upload your personal health data to a secure third party site, in return for access to your medical record and easier appointment scheduling. Somewhat surprisingly, more respondents (52%) found this sharing acceptable than in any of the other, non-health scenarios. Those who found it unacceptable tended to cite doubt about how secure the data really would be, noting the consequences if their data was hacked.
This concern is certainly valid. Being hacked is bad enough, and even de-identification may not protect you: whether de-identified data can stay de-identified is very much in question as datasets get larger and analytic techniques get better.
As a researcher at PrivacyCon said:
What we have to realize is that genetic data is the most personal data out there...We also know this data is inherently identifiable. There's growing recognition that it is not possible to de-identify this data in a way that is not possible to re-identify later. The other thing is, this data is irrevocable. If there's a privacy breach, you can't change it. It's not like your iTunes password.

We're increasingly in a world of biometric screening and DNA sequencing (including DTC sequencing). We are starting to use biometrics as a form of identification, as DNA has been used for many years, most notably in law enforcement. We've gotten used to the prospect of identity theft when it comes to our financial information, but it takes on even more ominous connotations when it comes to our genetic and health information.
In a very real sense, we are our data, with DNA a fantastically effective means of storage. Having this data stolen is especially troubling. I wouldn't want to suddenly run into my clone.
One way or another, Big Data is poised to become a huge market. Pharma in particular stands to benefit, with Big Data potentially revolutionizing drug development and targeting. All of this relies on our data. We may directly benefit from our sharing, or the benefits may accrue to other patients, particularly future ones.
Many people (including President Obama) believe precision/personalized medicine is the future of health care, tailoring treatments and even diet to a person's specific make-up. The success of this depends on Big Data. The President has also pushed for a "moonshot" to cure cancer, and many feel that not only does this need Big Data but also that lack of sharing data has been one of the big problems in cancer research (and, indeed, in all clinical trials).
GE Digital CEO Bill Ruh believes that a detailed software model of a person, which he calls a "digital twin," will become integral to health care (as well as to products in other fields). As he said,
I believe we will have a digital twin at birth, and it will take data off of the sensors everybody is running, and that digital twin will predict things for us about disease and cancer and other things. I believe we will end up with health care being the ultimate digital twin. Without it, I believe we will have data but with no outcome, or value.

Whether we like it or not, whether we realize it or not, monetization of our data is happening. While research suggests we are overwhelmingly willing to share our health information for research, once the implicit value of our contributions becomes clearer, that willingness may be more conditioned upon an explicit return.
Take, for example, Datacoup. They are one of the first companies to allow consumers to sell their data. Consumers upload their data to Datacoup, which then markets it to data purchasers, with the consumers getting paid based on how much those purchasers pay. Datacoup's CEO told the EIU: "If merchants are willing to provide value in exchange for more or better consumer consented data, then you’ll see a vibrant and massive marketplace spawned for the direct exchange of data, and in effect a more direct relationship between consumers and merchants."
I'm all for pooling our data to figure out better ways of helping people, but I'm less sure that I should be doing that for free, especially given the risks of my data being stolen or used for purposes I didn't intend.
If our digital twins are going to have economic value -- as they no doubt will -- shouldn't we share in that?