There’s an old adage that if you owe the bank $100,000 and you can’t pay, you have a problem — but if you owe the bank $100 million and you can’t pay, the bank has a problem.
You could look at that another way: if you’re an ordinary Facebook user and you start putting out posts calling for violence, you’re going to have a problem with your Facebook account. But if you’re the president of the United States and it’s your posts that are calling for things like shooting looters, then Facebook has the problem.
And, indeed, Facebook has a problem.
Last October, Facebook CEO Mark Zuckerberg was blunt as he testified before a U.S. Congressional committee: “If anyone, including a politician, is saying things that can cause, that is calling for violence or could risk imminent physical harm … we will take that content down.”
But after President Trump posted, of the protests over the death of George Floyd, that "Any difficulty and we will assume control but when the looting starts, the shooting starts. Thank you!", Facebook did not take the content down. Twitter, another social media giant, did attach a warning to the same message on its platform.
But Zuckerberg was content to leave the material up, while walking a fine line on whether Trump had gone too far.
“Our policy around incitement of violence allows discussion around state use of force, although I think today’s situation raises important questions about what potential limits of that discussion should be,” Zuckerberg wrote in a post on his own platform.
In other words, as we pointed out above, Trump didn’t have a problem; Facebook did.
Zuckerberg went on to class the issue as one of freedom of expression: “I’ve been struggling with how to respond to the President’s tweets and posts all day. Personally, I have a visceral negative reaction to this kind of divisive and inflammatory rhetoric. … But I’m responsible for reacting not just in my personal capacity but as the leader of an institution committed to free expression.”
Zuckerberg finished, “People can agree or disagree on where we should draw the line, but I hope they understand our overall philosophy is that it is better to have this discussion out in the open, especially when the stakes are so high. … (Ultimately) accountability for those in positions of power can only happen when their speech is scrutinized out in the open.”
In the end, though, the question might be a simpler one. If your business model is to give a megaphone to hateful messages, where does your responsibility lie when someone actually acts on the message you amplified?
Here’s a hint — the buck will stop with you. Or the muck will stick with you.