Who's in the Video
Oliver Luckett is a technology entrepreneur and currently CEO of ReviloPark, a global culture accelerator. He has served as Head of Innovation at the Walt Disney Company and co-founder of […]

It’s likely that most of us signed up to Facebook before we truly knew how powerful it was or would become. Many of us were too young, or inexperienced in the digital world, to realize that, at the end of the day, we were and are the product Facebook is really selling. We are sorted, packaged and prompted to act (by giving likes, clicking ads, and sharing emotional states and information) so that a supremely valuable commodity – our attention – can be more profitably sold to advertisers. It’s how we end up in echo chambers of like-minded people, and it’s this illusion of agreeability that started to tear in the wake of the election result.


We’re responsible for handing over our data to Facebook, there’s no question about that. But now that users are becoming better informed about data harvesting and algorithmic practices – by outside sources, not by Facebook itself, notes technology entrepreneur Oliver Luckett – we should seriously consider building our digital identities independently of Facebook.

Luckett takes issue with Facebook for its lack of transparency and its monopoly on power. Mark Zuckerberg is the most powerful editor-in-chief in the world, and that terrifies Luckett. From personal experience, Luckett tells the story of how his account was instantly shut down when he sent an image from a medical textbook to a friend over Facebook Messenger. He received a notification that he was under review and was denied access to his account until further notice. With that, he also lost access to websites and apps that were connected to his Facebook profile – Instagram, SoundCloud, Spotify. Imagine losing access to sites and resources you really depend on. “Someone can just be erased from a system without any recourse… That’s too much power,” Luckett says.

More worrying than having your online identity network switched off is the lack of transparency in Facebook’s algorithms and social experiments. Luckett explains these in depth in the video, illuminating how little we know about the way Facebook turns algorithmic dials up and down without our knowledge (but we did press ‘I Agree’ on the T&Cs, so yes, that’s on us), affecting who and what we see. This is particularly significant for businesses that have invested money in audience visibility through Facebook – 44% of the U.S. population accesses news on the social platform, according to a 2016 study by the Pew Research Center and the Knight Foundation. People and businesses are becoming heavily dependent on Facebook and powerless against its decisions.

The safeguard is remembering that Facebook is a choice; carve out your identity and your business so that if Facebook were pulled out from under your feet, it wouldn’t devastate you or your livelihood. Enjoy it for the amazing service that it is, but be wary and informed of how it works beneath the interface.

Oliver Luckett and Michael J. Casey’s book is The Social Organism: A Radical Understanding of Social Media to Transform Your Business and Life.

Oliver Luckett: There's been so much conversation recently about societal bubbles, echo chambers, media bubbles, personal algorithm bubbles. My newsfeed told me that everyone thought the same way I did. And there's a conflict there. One of the biggest reasons those bubbles emerged is that with companies like Facebook, at the end of the day, you're the product they're selling. They're actually selling you on that network; they're selling you to advertisers, and so they need to put a nice polite bow around you, and they need to find look-alike audiences that think like you and look like you.

And the easiest way to do that is to focus on the positive algorithmically: there was never a dislike button, only a like button. And if you have a positive-based signaling system like that, one designed to create a polite community of like-minded people so that I can be targeted more easily by an advertiser, because my information is what's being sold, then the result is going to be a nice polite bubble, an echo chamber, and we're seeing that right now. And people are astonished that they didn't see the other half of this community, literally the half that thought differently from them. I don't see the ideas of that other half. At no point in modern history, or in any human history, has one person had as much control over what we think, what we see, and what we do as Mark Zuckerberg.

He has more power than anyone in human history right now. And the scariest thing about it, because I don't think he's innately a bad person, I don't know him well enough to say one way or the other, is that it's not transparent. We have no idea what dials are being pulled or what knobs are being turned that control the information coming to us. There are two big moments that made me really think about this. The first is that I used to represent a bunch of celebrities who were uncomfortable coming onto platforms like Facebook because they didn't know if it was about them as a celebrity or about them as a high school friend. And so we worked early on with Facebook and with other platforms to set a set of rules: here's how we would engage with a famous person, and here's what a Facebook page would mean. And so we helped those celebrities create content to talk to their audiences and build a direct connection. And suddenly there were people like Mark Wahlberg or Charlize Theron or Hugh Jackman, massive accounts inside of social media, who suddenly looked like massive media networks. The idea that one person could push a button and reach 20 million fans was a big threat to a business model based on Facebook selling the advertising, not the celebrity selling the advertising. And that was really one of the births of the algorithmic suppression of content.
And so we started seeing that on code-drop Tuesdays at Facebook. Suddenly a person who had been reaching five million people out of their 12 million fans, with really great content that people loved and engaged with at 100,000 or 300,000 likes, would just be dialed down. We started seeing it at such a mass scale, and that's where a lot of the ideas in this book came from: I was looking at a trillion pieces of data a month across this network, which at times was connected to over eight billion connections in a system of only 700 million people at the time. So we had a really good snapshot of what was happening.

And as the business model grew for Facebook, it was dial it down, dial it down, weekly, and people started screaming: but you bait-and-switched me. You told me to make content. I went out there and I put money into your system to gain followers, to gain fans. It wasn't just celebrities; it was companies like The New York Times, companies like yours, companies that wanted to reach an audience and build a base. And then suddenly Facebook said, well, you know, I understand that this is based on a trusted relationship between you and your consumer, and that consumer subscribed to you because they really want to hear from you, but we're the business in between now. And that's a hard business to be in from a long-term perspective, because I subscribed to someone because I trust that source and I need that source.

And so that algorithm needs transparency. And at the same time, you read about these studies where Facebook arbitrarily, without anyone knowing, took 300,000 random people and turned up positive content one week and turned up negative content the next week. They just wanted to do an experiment on people. You can't do experiments on people without them agreeing. Oh, it's in our click-through terms of service, 27 pages deep: we're going to do random social experiments on you. Those are things that are really troubling to me, and those are things that in my mind are a rallying cry for transparency of these algorithms, transparency about why I am getting this information. What is the reason behind this? And not six different emotions that I can express, wow or sadness or anger. Like, fuck you, I'm not that stupid. Give me transparency. If 70 percent of millennials are now receiving news information from Facebook, please show me some transparency in the sources, in the trust factor, in what the EdgeRank is. Facebook perfected EdgeRank, a concept that belongs in people's vocabulary: the trusted credit score between myself as a content provider and my audience. We don't know anymore. There is no transparency. It's really arbitrary, and that scares me. That terrifies me. And it's not just in the newsfeed.
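The EdgeRank concept Luckett refers to was publicly described by Facebook engineers (around 2010) as scoring each story by summing, over its interactions, the product of viewer–creator affinity, the weight of the interaction type, and a time-decay factor. The sketch below illustrates that public description only; all numbers and the half-life decay are illustrative assumptions, not Facebook's actual values, and the "dials" Luckett describes would correspond to tuning these weights invisibly.

```python
from dataclasses import dataclass

@dataclass
class Edge:
    affinity: float    # how close the viewer is to the creator (0..1), assumed scale
    weight: float      # interaction type weight, e.g. comment > like > click
    age_hours: float   # how old the interaction is

def edgerank(edges, half_life_hours=24.0):
    """Score a story: closer, heavier, fresher interactions rank it higher.

    Illustrative form of the publicly described EdgeRank:
    sum of affinity * weight * time-decay over each edge.
    """
    return sum(
        e.affinity * e.weight * 0.5 ** (e.age_hours / half_life_hours)
        for e in edges
    )

# A fresh comment from a close friend vs. a stale like from a stranger.
fresh_comment = Edge(affinity=0.9, weight=4.0, age_hours=1.0)
stale_like = Edge(affinity=0.2, weight=1.0, age_hours=72.0)

print(edgerank([fresh_comment]) > edgerank([stale_like]))  # True
```

The opacity Luckett objects to is that any of these multipliers can be changed server-side, silently reducing a page's reach without the page or its subscribers ever seeing the dial move.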

I had an experience where I sent an image from a medical textbook to a guy I was in a fight with, and I was like, you're acting like you have a micropenis. I went on Google search and typed micropenis, and the first image I got was from a medical textbook, of a 47-year-old person, and I dragged that into my Facebook chat window and sent it. And instantly, I was riding passenger in a car, the stereo went off because my Spotify was connected, my Facebook account was blocked, and of course I couldn't get into Instagram, I couldn't get into SoundCloud, because I used my Facebook login there. And then I was like, what? So I tried to log back into Facebook. It said, you're under investigation for international child pornography. I was like, what? So I went back to Google and I was like, okay, that's a medical textbook. It's a 47-year-old. I have proof that this is not child pornography. And then I was enraged, and I was like, who do I call?

So I went to the help line, and they said there's nothing you can do. You're under review. We'll get back to you within three days, maybe. That was the response. I was like, wait, these people have my identity; all these things that I'm relying upon are connected to this system. The guy never received the image on the other side, because I called him and I said, did you get an image from me? He said, no, it just shut off. That's terrifying to me. Someone can just be erased from a system without any recourse, without anything? That's too much power for one person. And so I've been pushing a lot, and what I've been saying lately is that somebody should build identity within the blockchain so that we have an identity system that is mine. In Iceland we have this thing called a Kennitala. It's a unique identifier that is literally your unique ID, with your genetic records, your health records, your financial records all in one, and it allows me to have total transparency across a system with anyone that I'm encountering. So you don't need a payday lender or a check-cashing place or a credit score or any of these third-party parasites that exist in our system, because you have real identity. And so those are the kinds of fundamental things where I think, out of this new awareness, we're going to start realizing that we need incorruptible identity; we need to have our digital identity, as part of who we are as human beings, be part of this system. Because we're relying upon a commercial interest like Facebook to hold our identity, to capture everything it means in our digital life, and they can just flip a switch. Ask anybody that's been blocked by Facebook. There is no recourse. You're at the mercy of somebody in a moderation farm deciding whether or not you're a good person or a bad person, whether or not you deserve an account. That is way too much power for any institution to have over us.
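Luckett doesn't specify a design, but the core property he's asking for, an identity record the owner holds and no company can silently rewrite, is the append-only, hash-linked log that blockchains are built from. The toy sketch below shows only that tamper-evidence mechanism; every field name and structure here is an invented illustration, not any real identity system.

```python
import hashlib
import json

def _digest(record: dict) -> str:
    """Stable SHA-256 fingerprint of a record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append(chain: list, claim: dict) -> None:
    """Add a claim; each entry commits to the hash of the previous one."""
    prev = chain[-1]["hash"] if chain else "genesis"
    record = {"claim": claim, "prev": prev}
    record["hash"] = _digest({"claim": claim, "prev": prev})
    chain.append(record)

def verify(chain: list) -> bool:
    """Tampering with any earlier claim breaks every later link."""
    prev = "genesis"
    for record in chain:
        if record["prev"] != prev:
            return False
        if record["hash"] != _digest({"claim": record["claim"], "prev": prev}):
            return False
        prev = record["hash"]
    return True

chain = []
append(chain, {"type": "name", "value": "example user"})
append(chain, {"type": "health", "value": "record-ref"})
print(verify(chain))   # True

chain[0]["claim"]["value"] = "tampered"
print(verify(chain))   # False: the altered claim no longer matches its hash
```

The point of the sketch is the contrast with Luckett's Facebook story: here the log lives with its owner and any alteration is detectable by anyone, rather than identity living in one company's database behind a switch someone else can flip.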

