Who's in the Video
Rita Gunther McGrath, a professor at Columbia Business School, is one of the world’s top experts on innovation and growth. She is the best-selling author of The End of Competitive[…]

RITA GUNTHER MCGRATH: A conservative estimate of how much your personal information is worth to these data brokers is on the order of $240 a year for each of us, for millions and millions of people. And that adds up to a really, really big number. A lot of that data, and the advertising sold against that data, is getting sucked out of conventional sources and going right into the pockets of the big data brokers like Facebook and Google and even Amazon, as well as a whole bevy of smaller data brokers: people who run little websites like Housekeepers Like Me, or sites asking whether you have insurance or whether you're in the market for a new car. And people willingly hand over this information, which then gets aggregated and copied across databases and put into a package that advertisers can use to target you.

I think we're just at the very early stage where a few people are sounding the alarm, but that has not spread out to the masses yet. All those people posting about their grandkids on Facebook don't understand that the minute that thing goes up on the internet, they have lost control of it. They don't own it anymore. And the minute you do a quiz or volunteer any information about yourself, I'm sorry, it's their data, not yours.

I think it kind of crept up on these companies by accident. Think about how Google, for example, got into the data business. They had built this fantastic search engine, but they had no way of monetizing it, and it wasn't until they figured out that you could sell ads against search that they found one. In the beginning, it wasn't collecting personal data. Let's not forget that. In the beginning, it was: if I search on tennis rackets and you show me an ad for tennis rackets, that's kind of OK. That doesn't bother me that much. It's when it starts to get more pervasive. I think where they crossed the line, maybe without intending to, was when they started targeting. When you can say, I can get ads posted to 14-year-olds living in Princeton, New Jersey, who wear Keds tennis shoes, by the time you get that level of precision in targeting, I think that's where you've crossed a line. And I think we've gotten ourselves into a situation where a lot of people are going, hey, what? You can do that?

Just as an example, Facebook actually allowed people renting out apartments to exclude African-Americans, people who are Jewish, and people who are Latino from seeing their ads. That is against the Fair Housing Act of 1968. So to me, that's an example of crossing a very big line.

So instead of just saying, OK, you're searching for tennis rackets, so I'll show you an ad about tennis rackets, they go that extra mile and say: because I know everything you've been writing, and I know that tennis is really important to you, and I know that you spend on average $567 per year on tennis-related paraphernalia, I'm going to sell an advertiser a super-targeted ad to reach just you. To me, that's creepy. It's not so much about privacy; it's about this ability to algorithmically target on characteristics we may not want the world to know about.

An even more insidious use of data, to me, is the recent agreement by 23andMe to hand over their data to one of the big pharma companies. And it was for billions of dollars.
Now, the thing I really find objectionable about that is that you've already paid to have that data analyzed. You have not necessarily given 23andMe permission to further monetize that data to other parties just because they happen to hold it. So to me, there's a huge amount of institutional lag here. Our rules about property rights and your rights to your own data are so far behind practice that I think we're in for a big reckoning.

