With less than 100 days until the US midterm elections, Facebook announced that it had discovered dozens of fake accounts designed to spread disinformation and sway voters. Meanwhile, businesses from Starbucks to Pepsi have reeled from false news stories that have spread on social media and threatened their bottom lines.

Fighting false news can seem like a game of whack-a-mole as authorities rush to detect fake accounts, correct the story, and control the fallout. But bigger-picture solutions are being proposed by a mix of academics, entrepreneurs, and public officials who gathered this week at a conference co-organized by the Sanford C. Bernstein & Co. Center for Leadership and Ethics at the Business School, along with the Brown Institute at the School of Journalism and the School of International and Public Affairs.

"We're in a new world where it is going to be increasingly difficult for us — for anybody — to make sure that the information being conveyed is accurate," Professor of Economics Joseph Stiglitz said on the sidelines of the event. Stiglitz, whose insights into information asymmetry in the marketplace earned him a Nobel Prize, noted that the standard idea that market competition is good for consumers is not necessarily true when it comes to the marketplace of disinformation online.

"If the information that young people are getting about the world is disinformation deliberately distorted, then so is their view of the world, and that has all kinds of consequences for the decisions they make," he said.

Here are three ideas for fighting fake news proposed during the conference: a startup that rates news sites on their reliability, behavioral nudges toward sharing accurate news, and a proposal for greater government oversight.

1. NewsGuard

Media entrepreneur Steven Brill announced during the conference that his latest startup, a watchdog called NewsGuard, will launch within the next two weeks. Led by Brill and former Wall Street Journal publisher Gordon Crovitz, NewsGuard's team of some 50 journalists will assess thousands of online news sources for reliability, rating each green for generally reliable or red for generally unreliable. A "Nutrition Label" write-up will detail the nature of each site, giving readers more information about the news sources they see online.

"Our SWAT team will be constantly looking to update news sites where ownership may have changed or standards may have changed for better or worse," said Brill, who founded the American Lawyer magazine and Court TV. "It's common sense, it's human intelligence, it's transparent, it's accountable — it's so the opposite of what we have today."

A web browser plug-in will show users these ratings and reviews, which NewsGuard is also licensing to social media and search companies to include in their products. NewsGuard says it will also help advertisers avoid fake news websites.

NewsGuard's certification scheme resembles the traditional solutions economics would suggest using against fake news, according to Charles Angelucci, assistant professor of finance and economics, who also presented at the conference. Drawing a parallel to how fake drugs of the 19th century were reined in through the 1906 creation of the US Food and Drug Administration, Angelucci suggested that a similar kind of certification system for the news could grow consumer trust and create a higher barrier to entry, making it harder for fake news sites to pop up overnight.

2. Behavioral Nudges

Despite the prevalence of disinformation and fake news, people do care about the truth and can be nudged toward sharing more accurate information, according to David Rand, an associate professor of management science and brain and cognitive sciences at MIT Sloan School of Management who also spoke at the conference.

Rand co-authored a new paper in the journal Cognition that found a correlation between media truth discernment and analytical thinking: the greater a person's analytical reasoning ability, the more likely they are to spot fake news, according to a survey of about 15,000 news consumers. But in another study, reported in a forthcoming paper, Rand found that people across the board are less likely to share misinformation after being reminded of the importance of accuracy, suggesting that social media platforms such as Facebook can help tackle fake news through subtle appeals for accuracy in news sharing.

"By asking people to rate the accuracy of a headline," Rand said, "they became more discerning in subsequent sharing decisions."

The fight against fake news, though, is an arms race, and the bad actors are not sitting still. Pinar Yildirim, an assistant professor of marketing at the Wharton School, explained that fake news focuses on topics that are heavily mentioned, tweeted, and tagged; it succeeds because it arouses strong emotions such as anger and social outrage.

3. FEC Oversight

Fake political news is also a campaign finance problem, according to Ann Ravel, a former chair of the Federal Election Commission. She suggested that social media platforms be required to turn over all information to the FEC so it can monitor when something like a tweet is actually a political advertisement requiring financial disclosure.

"We are less than 100 days from the midterms," Ravel said. "And yet still in the United States there are no laws or regulations relating to social media platforms… Congress has been very slow to respond to what is clearly an issue of electoral integrity and fairness of our electoral processes."

Ravel is pushing for a government solution through her role with the Maplight Digital Deception Project, a nonprofit that aims to bring transparency to government data and spotlight the influence of money in politics. Already, several congressional Democrats have crafted legislation that would criminalize the intentional publication of false information about elections, while Sen. Mark Warner has laid out potential paths to addressing misinformation online, including by making media platforms legally liable for fake news.

But the efficacy of such laws is debatable. Ulf Buermeyer, president of the Society for Civil Rights in Germany, spoke about how his country has begun holding social media platforms responsible for deleting illegal, racist, or slanderous content that runs afoul of the country's laws against hate speech. Platforms face fines of up to $60 million for failing to take down such content within 24 hours, a window that Buermeyer said is forever in the online world.

While congressional action against fake news is unlikely to come anytime soon in the US, it will likely take some kind of legal or legislative move to spur social media platforms to address the problem, said Bruce Kogut, the Sanford C. Bernstein & Co. Professor of Leadership and Ethics. Kogut co-led a university-wide working group on fake news with Mark Hansen of the Brown Institute and Anya Schiffrin of SIPA, which paved the way for the conference.

"We can't rely on the platforms to self-police until they're forced to self-police," Kogut said.

About the Researchers

Joseph Stiglitz
Professor, Economics Division
Professor, Heilbrunn Center for Graham and Dodd Investing
Executive Director and Co-founder, Initiative for Policy Dialogue

Bruce Kogut
Sanford C. Bernstein & Co. Professor of Leadership and Ethics, Management Division
Academic Director of BAID