Media Literacy in an Era of Fake News

from Academic and Higher Education Webinars


Carla Anne Robbins, adjunct senior fellow at CFR, and faculty director of the Master of International Affairs program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs, discusses how to be a discerning consumer of news and media, as part of CFR's Academic Conference Call series.

Learn more about CFR’s resources for the classroom at CFR Campus.

Speaker

Carla Anne Robbins

Adjunct Senior Fellow, Council on Foreign Relations; Faculty Director, Master of International Affairs Program and Clinical Professor, National Security Studies, Marxe School of Public and International Affairs, Baruch College

Presider

Irina A. Faskianos

Vice President, National Program and Outreach, Council on Foreign Relations

FASKIANOS: Good afternoon from New York, and welcome to the CFR Academic Conference Call Series. I’m Irina Faskianos, vice president for the National Program and Outreach here at CFR. Thank you all for joining us for the first call of the semester.

Today’s call is on the record. And the audio and transcript will be available on our website, CFR.org, if you would like to share it with your colleagues or classmates.

We are delighted to have Carla Anne Robbins with us to talk about fake news and how to be a discerning consumer of news and media. Dr. Robbins is an adjunct senior fellow at CFR and leads a roundtable series on national security. She is also faculty director of the Master of International Affairs Program and clinical professor of national security studies at Baruch College’s Marxe School of Public and International Affairs. Previously, she was deputy editorial page editor at The New York Times and the chief diplomatic correspondent at The Wall Street Journal. She has reported from Latin America, Europe, Russia, the Middle East, and won numerous awards for her journalism and foreign policy analysis, including the 2003 Edward Weintal Prize for Diplomatic Reporting from Georgetown University. She was also a Nieman fellow at Harvard University and a media fellow at Stanford University.

Carla, thanks very much for being with us today. We’re very excited to have you leading this call. There’s a statistic out there that nearly 40 percent of Americans read the news online, so I thought it would be great if you could start us off by giving us some background on the challenges Americans face in discerning fake news and how we can identify misinformation.

 

ROBBINS: Well, thank you so much. And hi, students. Hi, everybody out there. Thanks so much for doing this.

First, let’s stipulate—and I’m not going to talk for very long, because as a reporter I’m much more comfortable with questions—(laughs)—asking them as well as answering them. Let’s stipulate first that propaganda is by no means new, and that politicians have been making up stories about their opponents for centuries. So one could say that fake news is not a new phenomenon at all.

What is new about all this, first of all, is the speed of the internet and the speed with which these stories spread, the distance that they can travel; and the nature of social media, which makes it incredibly easy and cheap to target these stories to the sympathetic or the credulous, who can then pass them on. It used to be that the price of entry, to own a newspaper and put fake news out there, was high. Certainly there was yellow journalism; fake news has been around for a very long time. But because it was very, very expensive to print a newspaper, publishers did not want to destroy their credibility too easily, and certainly modern-day newspapers guarded their credibility. You didn’t want to lose the credibility of your audience unless you were something basically sold as a tabloid in a supermarket, trading on the notion of the Loch Ness monster. Of course, maybe the Loch Ness monster is real, but certainly there was a difference between that and The New York Times and The Washington Post.

So, second—so let’s stipulate, first of all, that this is not a new phenomenon. But what is new about this is how pervasive it is, the ease with which it spreads. And the fact of the matter is that pretty much anybody can publish these days. It’s very easy to publish, you know, to set up a website and just press anything out there that you want. And it is also very easy with social media—using Facebook, using Twitter, using a variety of other ways—to push stories out, to amplify them.

There is a term that Ben Nimmo of the Atlantic Council uses to describe the way the Russians use fake news: they don’t just vilify, they also amplify. They push these ideas out there. They create them, but they can also amplify them, because one of the things about Facebook and Twitter and a lot of other social media is that you can target things exquisitely. You can decide that you want to reach a particular group, and you can buy an ad. We have seen stories in The New York Times in the last few days about the way they can target. You can buy ads on Facebook; Facebook has said it believes the Russians bought about $100,000 worth of targeted ads, things that can move up in people’s Facebook feeds. Say you want to reach a particular person who follows soccer, mistrusts people who talk about climate change, and goes to this particular sort of church, or this particular sort of mosque, or this particular sort of synagogue. You can target things in a particularly exquisite way and move things up people’s feeds, so you can find your audience in a way that you couldn’t before.

So there’s an amplification capacity that exists with the internet that did not exist before. So, while fake news has always been out there, the ability to target it and to push it out, to amplify it, has not existed. So what is new about it is the speed with which it can go and the distance with which it can go.

The second point I want to make: When we talk about fake news, we’re really talking about several different types of stories with very different motivations. We’re talking about false stories that are generated really just for cash—to draw eyeballs to websites which then generate advertising revenue. These are just basic clickbait stories; they’re hoax sites. And while we’ve seen very politically motivated ones, we’re also seeing the ones about, you know, getting rid of bags under your eyes—(laughs)—as an older person I suppose I’m aware of that one; or tricks to get rid of belly fat. I mean, you’ve all seen these sorts of things on the internet. What we saw during the political campaign, and very good reporting traced this back, was that some of these sites were just people who had created websites because the more clicks they got, the more ad revenue they could generate. And what they found was that viral political stories, more often than not ones attacking Hillary Clinton, went viral and could draw more ad revenue. So some of them are these hoax sites. Some of them are these political sites that seem to draw political eyeballs for political reasons, but they really just existed to generate advertising revenue.

Some of them are false stories generated for political ends. These are hyper-partisan sites which may or may not be linked to the Russians. We are increasingly seeing that either they are linked to the Russians directly, or sites related to the Russians pick up the information and then amplify it further.

So those are two types: false stories generated for cash, and false stories generated for political ends. Third, there are stories that distort the truth but still have kernels of truth in there somewhere, again for either cash or political ends.

And then, also under the rubric of fake news, are genuine stories and facts that are denounced as fake news. And we hear this time and time again from the president of the United States: You are fake news. That’s one of the things that makes this so hard right now: politicians who are using this term—and you hear this also from Putin and from other people—are damning legitimate news stories that they don’t like, tarring them with this term “fake news” as well.

So I think we want to disaggregate these things and be clear in our mind that this term “fake news” is far too generic. There are many different stories. And there are also people for political ends who are calling things fake news that are very, very much real news.

So I also wanted to very quickly go through some of the stories that are genuinely fake news and that went viral during the campaign. BuzzFeed did a great list, if you guys haven’t seen it: a list of 20 viral fake news stories of the 2016 election.

Number one was Pope Francis endorsed Donald Trump. I just want to be clear, he did not endorse Donald Trump. That had, according to BuzzFeed, 960,000 what they called “Facebook engagements”—that is, clicks/comments/shares. And that was in the—I believe the three months before the elections.

The second was that Hillary Clinton sold arms to ISIS—again, a fake news story that had 780,000 of these likes/shares/clicks on Facebook.

“Just read the law: Hillary is disqualified from holding any federal office.” That had 701,000 of these Facebook engagements, according to BuzzFeed.

Another one: “FBI agent suspected in Hillary email leaks found dead in apparent murder-suicide”—567,000 of these likes/clicks/shares, these Facebook engagements.

None of these stories are true, although the arms to ISIS story does draw on a quote from Julian Assange that is hugely distorted. What Assange actually said was that a Hillary Clinton-led State Department had approved weapons shipments to Libya in 2011, and that those weapons had later ended up in the hands of jihadists. But that’s pretty far away from the truth of insisting that Hillary sold arms to ISIS. But you can see that the truly total hoax stories, the stories that have a kernel of hugely distorted truth, all of these things turn into fake news. And if you’re talking about 900,000, 780,000, 700,000 likes/clicks/shares, that is what we call viral fake news, these distorted-fact stories.

And the other thing: of these 20 stories that BuzzFeed identified, 17 were attack stories against Hillary Clinton, and that tells you something as well.

Two other points—and then we will go on, because I very much want to open this discussion up. One is that we don’t know the extent of the Russian role, how much the Russians are behind these stories. Almost all of the websites that push them out are newly created websites. We see links to the Russians. We certainly see that these stories get picked up on RT, the Russia Today news channel. We see them picked up by bots and other social media trolls that are traced back to the Russians. And we see that the Russian narrative that was out there for a very long time—the one they pushed out in Eastern Europe and in Central Europe, the one we thought the United States was not vulnerable to—came into the U.S. election itself. The Russians like this narrative. They pushed it out in Eastern and Central Europe. They pushed it out in the United States. We don’t know if they’re behind all of these websites, but this is the narrative that they have always pushed out, and they certainly pick up these stories and amplify them hugely.

And so I will stop there. We can talk about how much influence any of these stories really have, versus whether people are just sharing them because they’re bizarre. We can also talk about the cost. And finally, we can talk about how we push back without tipping over in the other direction, which is what I worry about, without ending up advocating censorship. There is a real danger in that, too.

 

FASKIANOS: Carla, thanks very much for that terrific overview.

Let’s open it up now to the students for questions.

 

OPERATOR: Thank you. At this time we will open the floor for questions.

(Gives queuing instructions.)

Our first question will come from Washington and Lee University.

 

Q: Yes. So, given that fake news is largely a decentralized phenomenon, or at least has a decentralized component to it, how do we go about combating this phenomenon?

 

ROBBINS: Well, certainly fact-checking is an important thing, but it is also incredibly labor-intensive. It has this incredible Whac-a-Mole aspect to it. You can’t fact-check every story. It takes just a nanosecond to get a fake story out there, and it could take hours or even days to fact-check it. And the fact-check tends not to go viral the way the great hype story will.

What we did see that is somewhat heartening in all of this is the way things played out in the French election, as opposed to the way they played out in the U.S. election. The French were forewarned before the Macron election, and what they did is set up this whole fact-checking process themselves. The French public and the French journalists were quite aware of what was going on because they had seen the spectacle in the United States. And when, a day and a half before the election, there was suddenly this huge dump of hacked emails allegedly from the Macron campaign—there were all sorts of claims that they were going to show terrible misdeeds by Macron and his campaign—there was a huge amount of skepticism, first of all. And then these researchers sprang into action, and they traced the way this alleged news was getting pushed back to France through a variety of Twitter accounts that were linked to alt-right and Russian accounts, as well as Marine Le Pen accounts. So there was a great deal of skepticism there. And at the end of the day, it really didn’t change anything politically. So I think public education and awareness made the difference; they didn’t have to go Whac-a-Mole knocking everything down.

Now, the French had one other thing which was sui generis, which is a blackout period, so there wasn’t an enormous amount of time for this news to get out. But I think there was also a great deal of skepticism there. So I think that we are all going to get smarter. They didn’t have to play Whac-a-Mole with every one of the things that was leaked before the French vote.

We will see what happens. The Germans are going to have their vote soon, and people are very much aware; they’re much more vigilant in Germany now as well. We suspect that the Russians are sitting on a bunch of leaked emails from Merkel’s party, from the Bundestag, potentially from Merkel herself. We’ll see what happens if they leak that. But there’s much more vigilance now, which perhaps will make it unnecessary to play Whac-a-Mole with every single one of these alleged leaks or fake news stories in the most sensitive of moments, like the run-up to a campaign or a vote.

Does that make sense?

 

Q: Yes.

 

FASKIANOS: Great, thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from Lahore University of Managing Science (sic; Management Sciences).

 

Q: Hello?

 

OPERATOR: Your line is now open.

 

ROBBINS: Hi.

 

FASKIANOS: We can’t hear you. Try again.

 

Q: Am I audible now?

 

FASKIANOS: Yes. Go ahead.

 

Q: Yeah. Thank you very much for the question. I’d just like to ask: you know, there are certain groups that say major news channels have bias as well in their opinions and reporting. What steps do you think the government has to take, and how is the public supposed to deal with such cases?

 

ROBBINS: I am sorry. Could somebody—Irina, I didn’t fully understand—couldn’t fully hear the question. Could someone repeat the question?

FASKIANOS: I couldn’t quite hear—I couldn’t quite understand it either. You were breaking up. Can you try again?

 

Q: Yeah, sure, sure. Am I audible now?

 

FASKIANOS: Yes.

 

ROBBINS: Yes.

 

Q: Yeah. So the question was that there have been groups in recent history that (show up ?) in the major news channels having bias as well in their opinions and reporting on certain issues. So how do you think the public should deal with that? And what steps must be taken by the current government—(inaudible)—where the news channels and the agencies are working, to stop such biases?

 

ROBBINS: So are you—are you saying biased?

 

Q: Yeah.

 

ROBBINS: Well, one of the fundamental notions in the United States, and certainly enshrined in the First Amendment, is that government doesn’t have a role in regulating free speech. And while it is impossible for people to completely do away with their biases, I think there’s a lot of very good journalism in the United States. And I certainly would not want the U.S. government to go around regulating it, as much as I may personally disagree with the ideological agendas of certain news channels or certain newspapers. One of the great things about a democracy is that you have enough different news sources that you can, one hopes, read lots of different things. When I give talks like this, I say to people: How many of you are on Twitter? And lots of people raise their hands. And then I say: How many of you follow people you disagree with? And not so many people raise their hands. So what I say to people is, you don’t just read The New York Times. You don’t just listen to Fox News. You’ve got to listen to a lot of different things. You’ve got to read a lot of different things.

One of the readings that was recommended by the Council before our meeting today did raise the problem that triangulation is not as easy these days, because in this filtered world certain stories only get covered by certain ideological news sources. So you could end up doing a news search and find that fake news gets repeated again and again only within a certain niche, so that the distorted facts get repeated. There is a danger there, so you have to be a much more discerning consumer of news.

But it’s not the responsibility of the government to do this, it’s the responsibility of all of us to be more discerning, skeptical consumers of news. Because once you give the government the power to do that, what the government is going to do is the government’s not going to give you the truth; the government’s going to give you the truth that protects the government. And I certainly don’t want to give the government that power, no matter how annoying I may find particular news channels.

So, yes, I’m not saying that all news organizations are great. All I’m saying is that I don’t want to give the government that power.

 

Q: Right, absolutely.

 

FASKIANOS: Thank you. Let’s take the next question.

 

OPERATOR: Thank you. Our next question comes from University of Pennsylvania.

 

Q: Hi, Mrs. Robbins. I’m Rosie (sp) from University of Pennsylvania. Thank you for making time.

 

ROBBINS: Hi.

 

Q: Hi. So I have a question. First, we see in the news, in the average page or the average tweet, that the president always says this is fake news, this is fake media. So the president is trying to say that there is an alternative fact, another kind of truth. Please allow me to quote a sentence from the propaganda minister of Hitler’s period: if you tell a lie and keep telling it in a disciplined way, people will start believing it as the truth. So my question is: Do you see any long-term projection that this is not just about fake news, that this is a new ideology or a new trend? And is this one of the—(inaudible)—ideas about the (postwar ?) era? Do you have any perspective on it? Thank you.

 

ROBBINS: Well, you know, I’ve covered lots of presidents in my career as a journalist, and I have certainly had my share of fights with politicians who haven’t liked things that I’ve written, both as a reporter and as an editorial writer. But I have never seen a situation in which a president of the United States gets up and describes the press as “the enemy of the American people.” So I think that we are in a very different time. There’s no question about that, in my mind. And I do think that it’s chilling. I really do think it’s chilling.

I found it chilling personally when the president of the United States goes to Poland and stands next to Duda, the president of Poland, who has his own history of trying to suppress a free press. Any American president before him—a Ronald Reagan, a George W. Bush, a Bill Clinton, an Obama, an H.W. Bush—would have said: sir, you are a democracy, and the strength of a democracy is a free press, no matter how critical it is of you; all of our allies are committed to that. Instead, he said to him: So, tell me about your problems with a dishonest press. That’s not the sort of message that we, as the shining example of democracy in the world, have ever sent out before. So, yes, we live in a very chilling time.

I want to believe that this is sui generis. I want to believe that this is a temporary bad time and that we will go back to the values that make our democracy such a great example for the world and that really make our democracy so strong, which is this sort of push and pull that exists between the press and politicians. It’s not a great relationship, but it is one that both sides have always seen as fundamental to the strength of the country. But it’s scary, there’s no question about it.

And I think that, if you look at the polling data, the public is not happy with the press, but they’re not happy with the White House or the Congress or anybody else either. So we in the press have to be self-critical and make sure that we don’t go overboard and see this as a grudge match with President Trump. We have to maintain our dignity in this, and at times I think some of us have lost our dignity. But it is a chilling time. I believe there is a reason why the founders gave the press this protection in the First Amendment. We really all have to fight to maintain it.

 

FASKIANOS: Thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from Stockton University.

 

Q: Can you please talk about the effects of media coverage and public views on crises like the ethnic cleansing of Rohingya in Myanmar?

 

ROBBINS: I’m sorry, what was the first—what was the first word?

 

Q: Can you please talk about the effects.

 

ROBBINS: Oh, the effects of the coverage?

 

Q: Yes, of media coverage and public views on crises like the ethnic cleansing of Rohingya in Myanmar.

 

ROBBINS: Well, sadly, not enough of an effect yet. Certainly, we wouldn’t be aware of what was going on if people weren’t covering it. And, sadly, what’s going on there is horrifying. But that is the role of the press, to go to places like this and to raise awareness of these things, the same way it is the role of incredibly valiant NGOs, the human rights people and the aid people, to play these roles.

I mean, what is going on there is just unacceptable, and I’m saying that as an editorial writer. It’s horrifying, and it’s particularly horrifying given the role of the Aung San Suu Kyi government in all of this.

But nevertheless, you look at the role that the press has played over the years in raising awareness, and I’m quite proud of my colleagues. On the other hand, to raise awareness is one thing; governments then have to either be shamed or pressured into doing something different. Who is going to pressure this government in Myanmar to behave differently, particularly if the U.S. government isn’t taking the lead? Once the awareness is raised, what’s the next step? The press cannot bring economic sanctions. So what other tools are there to change the behavior of the government? That’s the next step in this.

 

FASKIANOS: Thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from St. Edward’s University.

 

Q: Hi. So, as technology is developing very quickly in our society today and is increasingly being used to generate fake news, such as motion-capture technology that can take short videos of faces, map them out, and then use those mappings to generate those faces saying whatever, it’s difficult in this time of incredibly advanced technology to trust even our eyes when it comes to fake news. So, if we can’t believe our eyes and we can’t believe our ears, is there anything that would be a solid indicator of something that we can either identify as fake news or as something that might need some more research, et cetera? (Laughs.)

 

ROBBINS: So are you talking about technology to fight back against the technology? Is that what you’re talking about, or just—

 

Q: I mean, in the right hands, yes, but I’m also asking about what I’ve seen: especially motion-capture technology used to generate fake news, having diplomats or other public figures appear to say things that they definitely did not say.

 

ROBBINS: Well, I mean, listen, all one can do with that is hope that there are legitimate news organizations that are going to get up and say that didn’t happen, and one would hope that this is debunked very quickly. That’s why you want to have as free and open a press as can possibly exist, to push back against things of that sort.

I mean, I’m not exactly sure how I answer a question like that. And luckily, we don’t live in a society in which there’s only one news source. If someone were using that sort of technology to manipulate people’s minds that way, that would be really terrifying. But it is something—that level of vigilance we—(audio break). And as technology gets better and better, we certainly are going to have to be vigilant about that.

I’m sorry, are you looking for another answer? Because maybe I’m not following it.

 

Q: No, that was wonderful. Thank you very much.

 

ROBBINS: But you’re scaring me. (Laughter.)

 

FASKIANOS: Next question.

 

OPERATOR: Thank you. Our next question will come from Ashford University.

 

Q: Hello. Can you hear me?

 

OPERATOR: Yes.

ROBBINS: Yep.

 

Q: OK. Hi, yes. All right, so your statement in response to a question posed earlier said you’re not comfortable with government involvement in response to the phenomenon of fake news, particularly as it could lead to state censorship of information. In the case of the 2016 presidential campaign, there was substantial evidence that the fake news campaign was a component of a concerted cyber influence campaign from the Russian government to sway the election for the candidate of their choice. Considering this was the action of a state actor against an American institution, does the government have a role to play in engaging and countering foreign information operations targeting the American public? Should the U.S. government develop strategies to deter, counter, or preempt foreign cyber influence campaigns? If so, what would that look like? You mentioned the importance of public education as a means of immunization. Could such a strategy involve technical controls, such as controls to shut down bot-run Twitter and Facebook accounts?

 

ROBBINS: Whew, there are many questions in there. Let me try to unpack this a little bit.

I don’t think that the pushing-off point for that is necessarily the censorship issue. I don’t think the government should be shutting down news organizations that it disagrees with, or even ones that carry spurious stories. So let’s just set that aside for a minute, because unless there is someone with absolute lawgiver capabilities, to put it in the rudest terms, we don’t want to give that power to the government.

Certainly, seeing this as an interference, as an assault on U.S. democracy, this is a bilateral issue between the U.S. government and the Russian government that needed to be dealt with more directly. I understand the ambivalence of the Obama administration. It was very concerned that if it pushed back too hard, pushed back too publicly, that people would start doubting the credibility, the legitimacy of the election, particularly at a time when one of the candidates, the leading candidate for the Republican Party, was suggesting that the system was rigged. I understand their ambivalence about this. I personally, if I were writing editorials at the time, would have complained that they weren’t pushing back hard enough.

Yes, this was an attack, and certainly there need to be sanctions. The Russians need to understand that this is absolutely intolerable and that they can’t keep doing it. One should see this the same way one would see some sort of kinetic attack: this was an attack on American democracy. So, whether you want to talk about sanctions or some other way to push back, yes, there has to be some sort of strategy as well.

Your third question was? In that question, your final point, can you say it again? Hello? Are you there?

 

Q: I’m sorry, yes. Can you hear me? So I guess my concern was, with this one particularly when you have the Russians that were—that conducted this campaign, it’s likely we’ll be—

 

ROBBINS: Oh, about shuttering—shutting down Twitter or Facebook accounts?

 

Q: Yes, these technical controls to do that.

 

ROBBINS: Rather than leaving—rather than leaving it to Facebook and Twitter to do it?

 

Q: Or even relying on the news media or private institutions to immunize the American public.

 

ROBBINS: You know, there is a really interesting set of questions here about whether you consider Facebook and Twitter publishing platforms or not. Facebook has been shutting these things down. Twitter has a different policy; they don’t shut things down even if they’re bots. Facebook says it shuts things down if people don’t establish their bona fides; Twitter doesn’t mind if you don’t prove who you are.

You know, there is a cost, to my mind. I would much rather give them more latitude rather than less because of the possibility of tipping over into censorship. So I would rather pressure Facebook and have the public be aware of the role that Facebook plays. Zuckerberg came out immediately after the election and basically scoffed at the notion that they played any role in amplifying this vilification. They seem to be much more aware of it now, much more sensitive to the reputation they’re getting for this. They’re now working with fact-checking organizations, and Google is working with fact-checking organizations. They are trying to somewhat clean up their act, although they’re resisting sharing information, their numbers and their data, with the fact-checking organizations. They have quite a ways to go to clean up their act.

At the same time, I really don’t like the idea of the government getting involved in this, because that’s a very slippery slope. So I would argue against giving the government the power to shut down Facebook accounts or Twitter accounts. It scares me, because I think the step from there to shutting down news organizations is far too short.

 

Q: Thank you.

 

FASKIANOS: Thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from Columbia Business School.

 

Q: Yeah, hello. How are you?

 

ROBBINS: Hey, neighbor.

 

Q: So my question is about, you know, social media being the, you know, source of a lot of fake news. And what do you think about the application of artificial intelligence as a way to counter it?

 

ROBBINS: Well, it is an interesting question. You know, I’m not a technologist, and I am skeptical. There was a very good op-ed in the Times a few months ago about not letting Google become a censor. I worry that if you give machines the ability, if you give the algorithm the power to become the censor, we’re going to end up airbrushing the internet, and all sorts of things are going to disappear. And the reason why this has even come up is because the Germans passed this law that says that if anything vile ends up on the internet through social media, these companies are going to be fined huge amounts of money.

And so suddenly they began to run scared, and they’re scrambling to figure out a way to do this. At least if a human being is curating this, if somebody raises a concern and a human being goes through it, there’s some sort of weighing of it. If instead it’s like a flash trade on the stock market, and they just automatically pull things down, you know they’re going to set the technology in a way that errs on the side of protecting their own profits rather than on the side of protecting free speech.

And so I am very skeptical about it, because I think that things are just going to become airbrushed. It’s going to be like the old Soviet times: things are just going to disappear from the internet that we never even knew were there to begin with. So I don’t trust the algorithm to do this, because I don’t trust the people behind it. They’re not going to make a decision to protect free speech; they’re going to make a decision to protect their bottom line. So at least if it’s curated by human beings, there will be more of a consideration.

Now, maybe there are technology people who can talk me out of it and say they can write the algorithm in a way that protects free speech and really errs on the side of including as much as possible. But, you know, the Supreme Court has struggled forever with how you define the difference between pornography and something that’s just icky and offensive, how you define hate speech, or this question of shouting fire in a crowded theater. These are things that the courts and human beings have struggled with forever. I don’t know if machines are going to do a better job. I’d rather trust humans with it. And I’m not a Luddite.

 

FASKIANOS: Thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from the University of Southern Mississippi.

 

Q: Good morning.

 

OPERATOR: Yes, ma’am.

 

Q: All right. Thank you. My question is—well, first I’m going to talk a little bit. It seems that we live in a country right now of civil discord instead of civil discourse. I believe that conversation has died due to the justification provided by, I guess you could call it, fake news—anything that has an agenda. And we have people out in the streets causing violence. So my question is: how do we educate the uneducated about fact-checking and reliable sources? Because we have kids in the street out creating violence, and they have a justification for that violence from these media outlets and so on. And I know that you do not agree with the government taking control, and I do not either. So do you see in the near future some sort of public education outreach program to help educate these people on what is real and what is fake in today’s media?

 

ROBBINS: Well, I certainly hope so. First of all, I think one of the things we could all do is stop using the term “media” as if it were one big blob. There are many different news sources; some are good, some are bad. There’s a big difference between a supermarket tabloid and The Washington Post or The Wall Street Journal or The New York Times. So one of the things we want to do is encourage students, and encourage our family and our friends, to be discerning consumers and to have a wide palette. It’s the role of educators from the very beginning. It’s the role of parents. This is an educational challenge.

And that’s the way to do it. Certainly, when we talk about public education, education is the best way to do it. There are all sorts of exercises for media literacy. You can say to people: look at a story, deconstruct it, and ask how you would weigh whether it’s true or false. There are ways we can teach students to be critical consumers of information. It’s not rocket science, and it’s possible to do. It needs to be added to school curricula, and it needs to be added early, because this isn’t a luxury. It’s a pretty essential thing for an informed electorate—absolutely essential. And we see the cost when the electorate isn’t informed. As you’ve said, there’s violence, and there are very bad political decisions that get made.

But I also think the politicians have a responsibility. Politicians get up there and say, you know, who are you going to believe, your own eyes or me? Don’t trust the lying, failing New York Times; don’t trust lying CNN; don’t trust anybody. The White House has been saying you can’t trust any of the numbers that you hear, not from OMB but from the CBO, the Congressional Budget Office. The White House said this on the health care bill: don’t trust the numbers from the nonpartisan Congressional Budget Office. How can you have a debate about health care reform in which you’re not going to accept the numbers that come from the nonpartisan CBO? They basically said that was fake news. How can we have a civil or rational discourse about something that is worth billions and billions of dollars if we can’t even agree that there’s a baseline of numbers?

So our politicians have a responsibility to say: Let’s have a civil discourse. Let’s all agree that there are respected sources of information. The president gets up and says that all polls that don’t say he overwhelmingly won the election are fake polls, that all polls that don’t say he won the popular vote are fake polls. Politicians have a responsibility to say: even when things are not attractive to me, that doesn’t necessarily make them fake news. So the responsibility lies with parents and with educators, but it also goes all the way up to the White House.

 

FASKIANOS: Next question.

 

OPERATOR: Thank you. Our next question will come from Colorado College.

 

Q: Hello. First off, thanks so much for making time to answer all of our questions today. Some of this has been touched on already, but do you believe that regulation of algorithms and filtering news, allowing parties to have equally exposed platforms, is a justifiable means to meet a democratic end, even though algorithmic regulation would infringe on the freedom of speech you mentioned?

 

ROBBINS: No. (Laughs.) I don’t like regulation. It scares me. I don’t like the idea of getting the government in the business of censorship—I think that’s what your question is, and that’s my answer. I’m willing to accept the weaknesses of certain news organizations for the sake of a guarantee of free speech, because I believe that there’s enough pluralism in the news environment to ultimately be a corrective.

 

FASKIANOS: Thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from Augusta University.

 

Q: Hi.

 

ROBBINS: Hi.

 

FASKIANOS: Go ahead.

 

Q: In one of the readings, it talked about how in the 2016 election the older demographic might be the one using the traditional press or print media more than the younger generations. Given that Facebook was number three across the board as a cited news source, I wondered how the press is competing with these new online or digital methods. And considering how easy they are to manipulate, and how you talked about the easy entry point for producing fake news, how is the media addressing this in competing with these news sources?

 

ROBBINS: Well, I don’t think Facebook is actually a news source. I mean, Facebook’s a platform. And when people say they get their news from Facebook, a lot of that is news from other sources that is shared on Facebook. So people may be reading stories that they see on Facebook. And certainly The New York Times, The Washington Post, all the, quote, “mainstream media” sources have relationships with Facebook. You can read news stories on Facebook. I know these are evolving relationships—certainly when I was at the Times we tried all sorts of different relationships with Google and Facebook, and they’re evolving over time. So I don’t think it’s necessarily a competition.

The problem with Facebook, of course, and you saw this in the story that was in the Times recently, is that people can buy ads on Facebook and target them very high up in your newsfeed. And someone can look at an ad that purports to be news and say to themselves: well, that must have equal weight to a story that I’m seeing in my newsfeed from The Washington Post. That’s the problem. It’s when people are giving equal weight to something they’re seeing from CNN and something in their newsfeed coming from EndTheFed.com, which is one of these recently created websites that carried three of those viral stories that I read to you. That’s the problem.

And so I don’t think Facebook is necessarily the problem. Facebook is just—is just a messenger. And the challenge is really to make sure that people can break out of a bubble in which they somehow give equal weight to or spend more time reading things from their newsfeed that come from EndTheFed.com than they do to reading things from a plurality of much more legitimate news sources.

And, you know, once again, I’ve given up distinguishing between hard copy and online. I will tell you that when I was at the Journal and things were just going digital, I really cared about the play in the hard-copy paper: did I make it into a good position on the front page, versus just being online? That is so last century—(laughs)—as far as I’m concerned. I mean, I still get hard-copy papers delivered to the house, in part because my husband still likes reading the paper in hard copy and in part because I feel like I’m supporting something; I think it gives a lot more money to the paper, to The New York Times, than a digital subscription does.

But I don’t read the hard-copy paper. I am so totally into the iPad app. I’m so totally into what comes out on the browser. It’s so much more interactive, so much cooler visually. There’s The Upshot. There are so many more things that you can do digitally. And I’m an old person. So I can tell you that I think it’s a distinction that’s disappearing. I don’t think there’s a competition between print and digital. I think the question is people who don’t distinguish between legitimate news sources and illegitimate ones.

 

FASKIANOS: Thank you. Next question.

 

OPERATOR: Thank you. Our next question will come from Johns Hopkins University.

 

Q: Hi. I’m just wondering, where is the science on this and what is the state of the field? And what studies and authors should we look at to get smarter on this?

 

ROBBINS: Getting the science on what?

 

Q: Studies on fake news or misinformation.

 

ROBBINS: OK. Well, the Council already shared some things with you, and we can get you more. There are some Harvard studies that are quite interesting right now, not necessarily on fake news but on influencers, on what people were reading in the run-up to the election, and on the influence of Breitbart on conservative groups. There’s a whole bunch of really interesting reading out there. I can share some of this with Irina, and she can share it with you. I’m sure, Irina, we can do that, right?

And there’s also a whole bunch of really interesting work about what the Russians are doing and how they’ve been trying to influence politics in Eastern Europe, Central Europe, and now in the United States. I have a piece coming out with the Aspen Strategy Group which we can share with you. So there’s a lot of writing out there on this, and we will compile a list and post it with the transcript of this call.

 

Q: Thank you.

 

FASKIANOS: Yes, absolutely. And some of the resources that we sent out, as Carla said, come from the Shorenstein Center at Harvard and the Pew Research Center; they’re doing a lot of great work. So we can send out a list of resources for you.

 

Q: Thanks.

 

FASKIANOS: Next question.

 

OPERATOR: Thank you. Our next question will come from University of Houston.

 

Q: What is your—what is your—

 

ROBBINS: Are you people OK out there?

 

Q: Can you hear us now?

 

ROBBINS: Just wanted to make sure you people in Houston are OK.

 

Q: Oh, yeah, we’re good. Yeah. So here’s my question. What is your logic behind trusting a group of people rather than software in regards to curation?

 

ROBBINS: Well, because I’m a newspaper editor. (Laughs.) And I know that I took my job extremely responsibly. One has to trust a group of people who believe in the mission of communication and journalism and truth. It’s not like I’m going to choose some random person on the street, one hopes, to do it. I did this for years. I decided what needed to be in a story and what didn’t, erring on the side of getting the best and the most information out there. So it’s a question of the professional mission, that’s all. That’s my logic.

 

FASKIANOS: And judgement too, I would say.

 

ROBBINS: Yeah.

 

FASKIANOS: Next question.

 

OPERATOR: Thank you. Our next question will come from Washington and Lee University.

 

Q: Hi. So you probably saw the interview the other day with Ezra Klein and Hillary Clinton. And in that interview, I thought that Ezra made a good point, that a lot of the media that was pro-Hillary spent a lot of their time covering the fake news and responding to the fake news that was being spread by the Trump campaign and other supporters of President Trump. And President Trump’s campaign really just didn’t respond to any fact-checking or other accusations of falsehood throughout the process, and seemed to succeed with that strategy. It seems like at a certain point people have decided whether or not they’re going to trust certain sources, and whether they’re going to even bother reading an article from The New York Times or from Fox. So is it better for media to spend their time trying to combat fake news, or just to ignore it completely and try to spread their message instead?

 

ROBBINS: Well, I don’t accept the premise that the media is pro-Hillary or anti-Hillary. There are organizations that are declaredly ideological, but there’s a lot of straight reporting out there. And while the editorial page of The New York Times was pro-Hillary, I don’t accept the notion that the front page of The New York Times made its decisions about what to cover because it was pro-Hillary. In fact, if anything, I think the point of that discussion was that the Clinton campaign felt they covered far too many things that were negative to Hillary. So how would you call The New York Times pro-Hillary from a news point of view? Basically what you’re saying is that here’s a pro-Hillary news organization that was running anti-Hillary things. So why does that make it a pro-Hillary news organization?

I mean, to move past the foundational premise, the other question here is: does one cover fake news? And I think that you can ignore stories. Take the Seth Rich story—I don’t know if you guys know about this, but it’s the story about the DNC staffer who sadly was murdered in Washington, and the notion, planted in part by Julian Assange and WikiLeaks, that he was murdered because he may have been the source for the DNC emails, and that somehow the DNC or the Hillary Clinton campaign was behind his murder. Fox News was running with this notion; Sean Hannity was pushing it big time. Finally Fox News backed off, but RT, the Russian television network, is continuing to push it.

And many people in the, quote, “mainstream media” stayed away from it for a very long time, until it got big pickup. At some point you have to cover something if it’s resonant; if not, when people Google it, the only thing they find is the people who are confirming it. That’s a problem. If you don’t cover it and you don’t try to debunk it, then basically the people who are pushing the fake news own the entire conversation. On the other hand, if you keep running with it, you amplify it.

So it’s a challenge. There’s no question that it’s a challenge. And you don’t want to ignore it completely because what happens if, God forbid, it were true? So it is—you know, it’s a constant challenge itself.

I don’t think it was the responsibility of The New York Times not to cover the Clinton email debacle. Certainly, if James Comey was paying attention to it, they had to cover it. Fox News was being hagiographic toward Donald Trump, and I’m sure the Clinton campaign regrets that The New York Times didn’t behave toward her the way Fox News behaved toward Trump, but that’s not the responsibility of a legitimate news organization like The New York Times. One can argue that the Times may have run too many stories on the emails; that certainly is what the Clinton campaign would say. But news editors at a legitimate news organization aren’t going to make decisions based on whether or not it helps a campaign.

 

FASKIANOS: Let’s see if we can squeeze one last question in.

 

OPERATOR: Thank you. Our last question will come from St. Edward’s University.

 

Q: Hello. How can we combat biased information consumption? Because a lot of people hold information that aligns with their beliefs and ideas above information that doesn’t. And will we—will it ever be possible to change this issue and make it better, or will this forever be a problem in our society?

 

ROBBINS: Well, I mean, this goes back to the education and the political leadership notion and all of that. I think all we can do is hope that people get outside their media comfort zone. As I said, I say to my students: I love Twitter. I know Twitter is sort of a newsperson’s (passion ?), and I know it’s more of an older-person thing, but I encourage my students to go on Twitter. It’s a great way to curate the internet and a great way to curate their news sources. And I say to them: follow people you agree with, follow people you don’t agree with, and follow straight news organizations. The best thing you can do is try to get as many different views of the world as possible, be an informed consumer of news, and make up your own mind at the end of the day. You don’t make up your own news, as they used to say—(laughs)—on “Saturday Night Live,” and they may still say it, but certainly you need to use your own judgment. And the beginning of that is reading critically as much as you can. I know that you guys, having dialed into this, are the smart people, the people who care and have inquisitive brains. So go out there and spread the word.

 

Q: All right. Thank you.

 

ROBBINS: Thank you, guys.

 

FASKIANOS: Carla, thank you very much for being with us, and to all of you for your great questions. We really appreciate it, and we’ll do some follow up—send out some follow-up links.

 

ROBBINS: Thank you, guys. Great questions.

 

FASKIANOS: Absolutely.

Our next call will be on Wednesday, September 27th, at noon again with Samantha Power. She is the Anna Lindh Professor of the Practice of Global Leadership and Public Policy at Harvard Kennedy School. She will lead a conversation on the role of the United Nations in global governance.

So, in the meantime, I hope you will follow us on CFR Campus on Twitter—@CFR_Campus, talking about Twitter—for information about new CFR resources and upcoming events. Check out our website, CFR.org, for information. You can filter by topic and region, as well as look at articles from Foreign Affairs, the preeminent journal that we publish.

So thank you all again. Thank you, Carla. And we look forward to your continued participation.

(END)
