This video is fake. It's part of Deep Reckonings — a series of explicitly-marked deepfake videos that imagine morally courageous versions of Mark Zuckerberg, Brett Kavanaugh, and other public figures. Here, an imaginary Zuckerberg reckons with what Facebook has become in a message to his employees.
The next step for Deep Reckonings is to make it real. Which means we need a way to get to Zuckerberg. Chief Product Officer Chris Cox is a longtime friend and colleague of Zuckerberg's and has been publicly critical of Facebook. To encourage Zuckerberg to deliver this message in his own words, post on Facebook: @Chris Cox make this real with @Stephanie Lepp: www.deepreckonings.com/zuckerberg #deepreckonings
CREDITS
Brought to you by: Infinite Lunchbox
Written, directed, and produced by: Stephanie Lepp
Generous support: Mozilla & Topic
Voice of Mark Zuckerberg: Bryan Kopta
Special thanks: Andrew Marantz, Aden van Noppen, David Sauvage, Ryan Nakade, Jenn Beard, Brett Gaylor, Paul Johnson, and Kevin Zawacki
TRANSCRIPT
Hey everyone. I know the boycotts and resignations and everything were tough on our company, and I've been reflecting. And I want to give an honest response.
When we launched Facebook to the world, it was the glory days of social media. Social media was this liberating force that was going to democratize access to information and give everyone a voice. And the truth is: we did that. We gave small publishers the same sophisticated tools the big guys had. And we allowed citizens to speak out against oppressive regimes. And we enabled people to be more creative, and entrepreneurial.
But the other truth is: there was a bad side to all of this. The "small publishers" we empowered ended up including hateful propagandists like Alex Jones. Our algorithms fueled ethnic violence in Myanmar and Sri Lanka. People on our platform have become more prone to hating others, and hating themselves. I was naive about Russian interference in the 2016 election, and I'm still being naive about domestic interference in 2020.
The difference between the way I see Facebook and the way other people see Facebook has gotten too big to ignore. I've been treating Facebook's problems as the result of bad actors exploiting our supposedly neutral platform, instead of as structural flaws inherent to the platform itself. I've been refusing to see Facebook's negative impacts clearly, and seeing them clearly is necessary for making Facebook better. I believed so hard that we were being the change we want to see that I denied the ways in which we weren't. And for that, I'm sorry.
If Facebook is capable of such harm and such good, we need a better way to think about the social impacts of technology. The question isn't whether a technology is inherently good or bad for democracy, but how our technology can serve democracy. The question isn't how to remove dangerous content while maintaining the widest possible definition of free speech, or whether Facebook should be an arbiter of truth. Facebook already is an arbiter of truth! The question is: how do we make editorial decisions that our users can trust, and that fulfill our actual mission, which goes beyond free speech?
As much as I don't like to admit it: our ability to fulfill our mission is constrained by a business model and an economic system that push us to maximize user engagement and shareholder value. And as much as I don't like to admit this either: Facebook has commanding power in our economy, and I have commanding power over Facebook. Which means I could work to change the system that prevents me from having the kind of impact I say I want to have. And that might be the most impactful thing I could do.
This video is fake. And it's explicit that it's fake, so we won't take it down. But what would it take for me to do the real version? What would it take for me to change my mind? Huh. A lot of encouragement.
So, maybe this video will get sent around. Maybe it'll get sent to my friends. And maybe… we can get me to deliver this message in my own words.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.