Trump showed Facebook, Twitter, YouTube can’t moderate their platforms. We need change

It took a mob-fueled insurrection, but Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey finally grasped the enormity of the damage President Donald Trump has done by weaponizing their influential social media platforms. They banned him from continuing to post incendiary lies about the 2020 election: permanently on Twitter and “indefinitely” on Facebook.



But it’s no longer about Trump. Facebook and Twitter need radical change. It’s time for social media companies to let someone else moderate their platforms.

Facebook and its Instagram photo sharing site, Twitter, Google’s YouTube, and other social networks pay tens of thousands of people to identify and respond to bad behavior. Many of those employees are located outside the US, where fewer benefits and lower hourly wages are common.

That process hasn’t worked well, as evidenced by the many stories we’ve had to write about the terrible videos, misleading posts and dangerous lies that need to be taken down but that have already been viewed and shared millions of times.


These screwups are happening on a worldwide scale, on platforms with user bases larger than the population of any country on Earth. Facebook has more than 2.7 billion active users, and its Instagram platform has over 1 billion active users. YouTube serves up videos to a global audience of more than 2 billion people per month. Twitter doesn’t share simple monthly active-user stats anymore, but in 2018 it counted 336 million people logging in each month. The company’s revenue and profits have grown by double digits since then.



If you want accurate election information, find your local officials’ website, the FBI director said. Angela Lang/CNET


Social media is one of the top ways people get their news in the US, and around the world too. Forget the “mainstream media.” It’s these platforms that are the most powerful and influential mechanisms for the dissemination of information — and, unfortunately, misinformation and disinformation. Though the companies running them have been successful at controlling posts by international terrorists and people involved in child exploitation, they’ve pretty much failed at everything else. 


That’s why it’s time for social media content moderation teams to work for an independent, nongovernmental body that’s funded by these companies. Facebook and Twitter reaped more than $20 billion in combined profits last year — they could easily afford it.

We need an industrywide program to deal with this catastrophe of irresponsibility and the lack of repercussions.

We need a justice system for the social world. And we need all the social media companies to sign on or face potentially business-ending lawsuits.

The US government can do this by rewriting the Communications Decency Act of 1996 to offer social media companies and their executives protection from fines and lawsuits only if they meaningfully moderate their platforms. When they don’t, victims of social media’s failures must be able to seek some sense of justice if a company is negligently ignoring its responsibilities.

There’s already a growing debate over Section 230 of the law, which shields these platforms from liability for anything damaging that’s said or posted on them. Lawmakers on both sides of the political spectrum agree changes must be made.


Privacy advocates, social media provocateurs and anyone else who dips into extremism online may argue that this idea infringes on free expression. Maybe it does. But social media can live without neo-Nazis. It can go on without the QAnon child abuse conspiracy, anti-vaxxers and Holocaust deniers. Social media is fine without people streaming mass murder for a half hour before anything’s done. It doesn’t need the president of the United States undermining our democracy and fomenting violence for weeks, capped off by Wednesday’s ugly scene when a violent pro-Trump mob stormed the US Capitol, leaving five people dead, including a Capitol police officer.

This cycle of reckless irresponsibility has to end. It’s time we reckon with what Facebook, Twitter and YouTube have wrought. Whether they can’t or won’t address the situation no longer matters. They must.



This rioter who broke into the Capitol has said in media interviews that he believes the social media-fueled QAnon conspiracy theory. Getty Images

Facebook declined to make Zuckerberg available to discuss policy changes. On Thursday, when he announced the indefinite ban on Trump’s account, Zuckerberg acknowledged the danger that a rogue Facebook post represents when it comes from the president. 

“His decision to use his platform to condone rather than condemn the actions of his supporters at the Capitol building has rightly disturbed people in the US and around the world,” Zuckerberg said. “We believe the risks of allowing the President to continue to use our service during this period are simply too great.”

YouTube declined to make its CEO, Susan Wojcicki, available for an interview. Twitter also declined a request to discuss these issues with Dorsey, with a spokesman writing, “We can’t make this work right now but I’d love to stay in touch on this.”

I’ll be waiting.



The mob that ransacked the US Capitol was spurred on by President Donald Trump. Getty Images


New rules

The reason Facebook, Twitter and every other social media company should pay uncomfortably large sums of their profits into a separate organization to police their content is that everything else clearly isn’t working.

Here’s how it could work: A new, separate organization, let’s call it NetMod (for internet moderators), would be divorced from the profit motive. It wouldn’t operate at the whims of a self-important CEO. Instead, it would be independent, with its own supreme court, as it were, whose members would debate and decide what the laws of social media ought to be and set the rules most social media users have to follow.

Here’s a freebie to start with: “Thou shalt not encourage a mob of violent extremists to ransack the US Capitol.”


The good news is that Facebook has already started with the supreme court aspect, which it calls an “oversight board.” Facebook’s goal with the board is to offer a way for users to appeal moderation decisions they disagree with. The 20-member board is made up of former judges, lawyers and journalists. So far, the oversight board has been about as effective as its name is boring, but it’s a start.

Social media companies also have experience working together to fight international terrorists and child exploitation every day. They’re pretty good at that stuff. NetMod is simply the next step.

NetMod’s rules of operation and how it moderates content would need to be documented and shared publicly too. Aside from posting their terms of service, tech companies rarely share their processes. Facebook and Twitter built websites devoted to publishing political ads on their services, but not all ads. Aside from some leaked training documents and internal memos, we know so little about how these teams operate that many of their critics have bought into conspiracy theories about them too.

Of course, every social network is slightly different from the others, and they should be able to have their own rules for their little fiefdoms. Facebook insists people use their real names, for example. Twitter allows anonymity. NetMod wouldn’t affect that. It’s about setting basic standards for what makes a post or comment unacceptable.

Think of each social network as its own state — which shouldn’t be too hard, considering the active user base of each one dwarfs the population of any state in the US. Each state has its own rules and ways of doing things, but the states all have to follow federal laws. Social networks and NetMod would follow a similar model.



Jack Dorsey testifying before Congress about online harassment and conspiracy theories. Getty Images

Do it

The next step is incentivizing the companies to do this. Both Zuckerberg and Dorsey have appeared on Capitol Hill during the past two years, saying they welcome some form of legislation to help guide their businesses. 

Zuckerberg in particular has already told Congress he believes the Communications Decency Act’s Section 230 needs to be updated. “People want to know that companies are taking responsibility for combatting harmful content — especially illegal activity — on their platforms,” he said during a hearing on Capitol Hill last October. “They want to know that when platforms remove content, they are doing so fairly and transparently. And they want to make sure that platforms are held accountable.”

Section 230’s legal free pass is what allowed the internet to flourish. Cyber-law experts say changing it to allow legal protections to social networks only if they meaningfully moderate their platforms would help push companies to take responsibility for what happens on their sites.

And NetMod would be a natural entity to work with to define what meaningful moderation of unacceptable behavior has to look like in order to earn that legal protection.



Social media does a lot of good. It does a lot of damage too. Getty Images

NetMod would have immediate payoffs too. For instance, the companies would share intelligence, identifying and acting against terrorists, domestic and foreign, who often have accounts across multiple platforms.

“These people use coded language,” said Brian Levin, director of the Center for the Study of Hate and Extremism at California State University, San Bernardino. He tracked how the movements that sprang up to support Trump’s calls to “liberate” states from coronavirus lockdowns in 2020 drew in conspiracy theorists, extremists and small-business owners afraid for their jobs.

“A lot of bears came to that honey,” Levin said.

All this change won’t happen overnight. NetMod can’t make up for more than three decades of neo-Nazi online recruiting. But NetMod will at least get us all on the same page. It’ll create a standard we all can agree on. It can be a start.

“We don’t let people go into their garages and create nuclear materials,” said Danielle Citron, a law professor at the University of Virginia and author of the 2014 book Hate Crimes in Cyberspace. She’s one of the people who wants changes to Section 230, in part because social networks have so much potential to do harm when poorly run.



Social media has upended politics, particularly during Trump’s term. Angela Lang/CNET

NetMod could even help inspire more innovation and competition. Startups could join for a sliding-scale fee, giving them instant access to experts and to tools they’d otherwise spend years building on their own. Smaller social networks like Gab or Parler, both of which often cater to extremists kicked off Twitter and Facebook, could either start meaningfully moderating on their own, join NetMod, or face legal exposure for what their users do and say.

The best part of changing Section 230 and implementing NetMod would be how it would change the darkest parts of the internet. There’s increasing evidence that when you break up a hate group on Facebook or Reddit, it has a harder time acquiring the same influence on other, often less moderated, alternatives. 

I want to make it easier to break them up, and harder for them to find a welcoming new home.

Best of all, this plan would mean that the next world leader who acted like Trump wouldn’t get the kid-glove treatment. Facebook’s Zuckerberg or Twitter’s Dorsey wouldn’t have the choice of whether to let that person do whatever they wanted.

Instead, that next world leader would have to face the NetMods.

Like everyone else.
