As someone who relies on Facebook to steer online traffic to my website, I have noticed a substantial drop-off in political commentary on FB in recent months. Hillary vs. Donald has become a toxic merry-go-round.

Videos of cats and dogs, stupid memes and endless photos related to wedding anniversaries and birthdays and kids’ first day of school are going strong. But I detect an increasing inclination among those who really know politics, nauseated by it all, to just stay silent. Take a pass.

So much of the FB commentary, as we approach a general election campaign that most of Mainstream America finds unsatisfying, consists of fringe websites on the left and right making the case that their team’s opponent is unacceptable – and much worse.

So, imagine if Facebook became a worldwide online network where posts would carry warning labels noting that some of their material derives from phony, or highly questionable, websites and sources. Limit content to mainstream media outlets and require that other posts carry warnings putting them in the category of “silly stuff.”

(One obvious example would be the recent series of 9/11 Truther FB posts claiming that no airliners hit the Twin Towers on Sept. 11, 2001.)

This weeding out of wacko material might spark the first nuclear war (relatively speaking) launched in cyberspace by the trolls.

As Facebook tries to fend off criticism of a lack of partisan balance (in favor of liberal views), and its newest cost-cutting measure – eliminating the actual journalists who oversaw the “Trending” posts process, in favor of computer algorithms – the Poynter Institute, a leading journalism watchdog, has suggested that it’s time for an internal Facebook fact-checking team.

When Facebook last week ditched its hands-on oversight process and turned to an automatic-pilot approach toward its Trending stories, it quickly got burned when the social network highlighted a totally fake story about Fox News host Megyn Kelly.

Poynter responded to this debacle with a brash proposal: Facebook just needs to hire a team of fact-checkers. No need to splurge, either. A team of 10 or so would do the trick.

Poynter’s Alexios Mantzarlis envisioned a process in which fact-checkers would steer away from personal posts and pages. “Your crazy uncle’s tirades” would not be blocked or red-flagged in any way. But public pages and posts would be fair game. Priority would be given to pages categorizing themselves in Facebook’s “Media/News/Publishing” sector and to any high-engagement or high-reach posts.

Here is what Poynter proposes as three possible tasks for FB fact-checkers:

  1. Weed out sources of information that consistently peddle fake news. Facebook could introduce a “three strikes and you’re out” policy that would decrease the reach of pages that consistently share links to fake stories on their own site. Once you’ve shared enough fake stories, your reach automatically gets cut to one-tenth of its potential size (a rough sketch of how such a penalty might be tracked appears after this list). As Buzzfeed hoaxbuster Craig Silverman noted recently, fake news websites are experimenting with business models just like legitimate news outlets. Reducing the traffic Facebook grants them would dramatically decrease the incentive to publish fake stories.

  2. Aggressively reduce the reach of posts that are discovered to be false. Facebook is doing so already, but there are at least two problems with the current system. First, the signal originates from readers rather than professional fact-checkers, and readers may be imprecise or have other motivations to flag the content. Second, Facebook isn’t transparent about how powerful the reach penalty for fake news is, or why de-emphasized stories have been taken down a peg. The current annotation, “many people on Facebook have reported that this story contains false information,” should be spelled out in shiny red letters and made far more prominent.

  3. (Facebook should) spot claims and articles that have been fact-checked elsewhere and offer a “related fact check” underneath the original article, just as it currently shares related pages or stories (see the second sketch below for one naive way to match articles to existing fact checks). This could be deployed only in the most extreme cases, where the story was reported entirely false by established fact-checkers.
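To make the first two tasks concrete, here is a minimal sketch, in Python, of how a “three strikes” reach penalty might be tracked. Nearly everything here is invented for illustration – Poynter supplies the strike metaphor and the one-tenth reach cut, but the class names, the verdict labels, and the idea of a single reach multiplier are assumptions, not Facebook’s actual machinery.

```python
from dataclasses import dataclass

STRIKE_LIMIT = 3        # "three strikes and you're out"
PENALIZED_REACH = 0.10  # reach cut to one-tenth of its potential size

@dataclass
class Page:
    """A public Facebook-style page, as tracked by hypothetical fact-checkers."""
    name: str
    strikes: int = 0

    @property
    def reach_multiplier(self) -> float:
        """Fraction of the page's potential audience its posts can reach."""
        return PENALIZED_REACH if self.strikes >= STRIKE_LIMIT else 1.0

def record_verdict(page: Page, verdict: str) -> None:
    """Apply a fact-checker's ruling on a story the page shared.

    Only an outright "false" ruling counts as a strike; "disputed" or
    "satire" (also invented labels) would not.
    """
    if verdict == "false":
        page.strikes += 1

# A page that repeatedly shares fabricated stories loses ~90% of its reach.
page = Page("totally-fake-news.example")
for _ in range(3):
    record_verdict(page, verdict="false")
print(page.reach_multiplier)  # -> 0.1
```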
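The third task, surfacing a “related fact check,” is at bottom a lookup problem: map an article to a fact check someone has already published. A naive sketch, again with invented data and matching only on normalized URLs (a real system would have to match claims, not web addresses):

```python
# Hypothetical index of articles already debunked by established fact-checkers.
FACT_CHECK_INDEX = {
    "totally-fake-news.example/kelly-fired": "https://www.snopes.com/...",
}

def related_fact_check(article_url: str) -> str | None:
    """Return a fact-check URL to display under the article, if one exists."""
    key = article_url.removeprefix("https://").removeprefix("http://")
    return FACT_CHECK_INDEX.get(key.rstrip("/"))

print(related_fact_check("https://totally-fake-news.example/kelly-fired"))
```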

Surely, a number of far-right trolls would react to this by loudly blasting Zuckerberg for a violation of the First Amendment (never mind that the First Amendment constrains the government, not a private company). Yet hiring fact-checkers to weed out the wackos does not have to compromise his vision. In fact, it would reinforce Facebook’s stated “News Feed Values,” which say the network’s experience should be “informative.”

And it would represent a private company responding to concerns from its users about the trustworthiness of the information available on the network. Most importantly, it would help reduce misinformation and strengthen our democratic voting process.