
Michael McConnell on Facebook’s Oversight Board and Content Moderation (Two Think Minimum)

Tom Lenard:

Hello, and welcome back to TPI’s podcast, Two Think Minimum. It’s Friday, May 14th, 2021, and I’m Tom Lenard, President Emeritus and Senior Fellow at the Technology Policy Institute. I’m joined by Scott Wallsten, TPI’s President and Senior Fellow, and TPI Senior Fellow Sarah Oh. 

Today, we’re delighted to have as our guest Michael McConnell. If you’ve been following the news at all lately, you probably know Michael is co-chair of the Facebook Oversight Board, which last week published its decision in the case involving President Trump’s access to Facebook following the January 6th riot at the Capitol. Michael is also the Richard and Frances Mallery Professor and Director of the Constitutional Law Center at Stanford Law School and a Senior Fellow at the Hoover Institution. From 2002 to 2009, he served as a Circuit Judge on the United States Court of Appeals for the 10th Circuit. Michael has previously held chaired professorships at the University of Chicago and the University of Utah and visiting professorships at Harvard and NYU. He has published widely in the fields of constitutional law and theory, has argued 15 cases in the US Supreme Court, and served as a law clerk to Supreme Court Justice William Brennan and DC Circuit Judge J. Skelly Wright. He has been an Assistant General Counsel of OMB, where I had the pleasure of working with him on regulatory reform issues, and a member of the President’s Intelligence Oversight Board. 

Welcome, Michael, it’s a pleasure to have you here.

Michael McConnell:

Thanks for having me, and it’s great to see you again.

Tom Lenard:

So, I think probably most people who are listening to this podcast know, but maybe you could start out by explaining briefly what Facebook’s Oversight Board does, what its purpose is, and what powers it has.

Michael McConnell:

So, the Oversight Board was set up by Facebook to be an independent body, to sort of look over their shoulder and review certain aspects of their content moderation policies. It is independent in the sense that Facebook endowed the money, so Facebook no longer has any financial connection to the board. They selected the initial co-chairs and participated in the selection of the initial members, but after that, it becomes a self-perpetuating board where we will choose successors and so forth. Currently, there are 20 members from all over the world, every continent other than Antarctica, and I’m looking for some penguins. They might do a good job. 

About a quarter of the members are from the United States, and I think anyone who looks at the membership list with any fair eye will say that indeed, these are people who are independent of Facebook. Many of them have spent their careers criticizing Facebook; otherwise, no one has any connection to Facebook. And so far we’ve had, what, nine or 10 decisions, and all but one or two of those have actually reversed Facebook’s content moderation decisions. 

And we have authority over appeals by users of takedowns and by third parties objecting to the leaving up of content, and with respect to those, Facebook has pledged that it will follow them compulsorily, that it will obey what the Oversight Board tells it to do. So, takedowns and leave-ups; but in addition to that, we have authority to make policy recommendations. Facebook has not agreed to follow those, but it has agreed to consider them and, within 30 days, to provide a written public response, so that if they follow them, good, and if they don’t, then the whole world will know why. 

Tom Lenard:

Were you at all involved in the design of the board, or was it basically designed before you came on?

Michael McConnell: 

The basic design came from Facebook before anyone was appointed, but a lot of the details have been worked out since then, and I’ve been involved in all of those. 

Tom Lenard:

Maybe you can also go over what the key elements were in the board’s decision on the Trump case? 

Michael McConnell:

So, as probably most people know… this is an educated audience, but on January 6th, then-President Trump issued two statements over Facebook while the riot was taking place at the Capitol, and I won’t quote those just now, but let me just say that they praised people engaged in the riot. That was accompanied, and I want to stress and acknowledge this, by at least perfunctory calls on them to be peaceful, but the bulk of the messages were to, I would say, egg them on, repeating claims about the historic landslide election being stolen away from them and how terribly they had been treated and so forth.

So, on that day, Facebook removed those particular posts, and then the next day suspended Mr. Trump’s Facebook account altogether, and then announced that this would be “indefinite,” but at least for two weeks, I think leading most of us to believe that there would be a review within those two weeks and then some sort of firm decision. 

Facebook rules do not allow for an indefinite suspension. They have to give people an answer. They can leave them up. They can take them down. They can suspend their accounts permanently. They can suspend their accounts for a specific period of time. But they can’t just leave them in Facebook limbo where you never know when or why you might get back on. And we assumed, and when I say “we,” it’s not the board, because the case hadn’t come to the board yet, but I think most Americans would naturally assume that during that two-week period, Facebook would be reaching some sort of concrete decision about what to do with Mr. Trump’s account.

Instead, they just tossed the case to the Oversight Board without reaching any decision as to how long the suspension would last or what the terms would be, and they referred the case to us for a decision. And what we decided last week was, first of all, we agreed that Facebook was justified in removing those posts, that they violated the dangerous organizations and individuals standard under the Facebook Community Standards by praising people in the course of a riot. 

And we also agreed that there was sufficient justification for suspending his account altogether, at least for a certain period of time. Facebook referred, and I think this is important, to official guidance from the Department of Homeland Security that there was a heightened danger of continued violence having to do with the inauguration and the election returns and so forth, which would extend ultimately to the end of April.

So, the second thing we held is that Facebook was justified in suspending Mr. Trump’s account for a certain period of time, but we said that the indefinite suspension was not justifiable, essentially for two reasons. One is that it’s not in Facebook’s rules, and one of our jobs is simply to make sure that Facebook applies its rules consistently. But also, that kind of indefiniteness violates principles of freedom of expression recognized in international law, as well as US constitutional law; it’s simply too vague. It means on any given day, Facebook has this important speaker sort of dancing on a string. Are they going to let him back in? Are they going to keep him off? They need to make a concrete decision that is specific. In addition to that, we issued a large number of policy recommendations, some of which had to do with transparency and clarity of the rules that would apply to all users.

But also, we had a few things to say about the treatment of political posters, of people who are either heads of state or high political officers. In particular, we urged Facebook, in cases in which violence is in the offing, that is, where posts present some kind of realistic problem of actual violence or unrest, and here we’re thinking not just about January 6th but about episodes around the world that have been much more serious in their way than that in terms of loss of life, that in those cases Facebook should not apply any kind of special political-importance rule but should favor safety over political expression. But we also have a recommendation which is designed, and this is really about that issue more than it is about Mr. Trump, to urge Facebook to be attentive to the differences between heads of state and the political opposition, and the way in which its standards can be manipulated and exploited to shut down members of the political opposition.

Scott Wallsten:

I’m curious. It hasn’t been very long since that decision, but is your sense that Facebook will take those recommendations seriously and try to develop policies to respond to them? 

Michael McConnell:

I think so. We haven’t been in business for very long, so there’s not much of a track record. So far, the only policy recommendations or decisions that Facebook has resisted, that they’ve pushed back against, came from a decision out of France having to do with supposed COVID misinformation. 

Tom Lenard:

You stress the independence of the board, but do the board members have direct contact with Facebook employees about these issues?

Michael McConnell:

No. Now, I can’t tell you for sure that no member of the board ever has any contact with any Facebook employee. They may be on panels; I mean, there are occasions when they brush shoulders. But I have certainly never talked to anyone at Facebook about any case. I might, in my capacity as Co-Chair, sometimes talk to Facebook about the architecture of the system of review, because changes are still being made, but never about a case, and I would be really surprised if any member of the board ever did. It would just be contrary to the way we do business.

Sarah Oh:

From your experience on the 10th Circuit, how would you tweak the architecture of the appeals process? How do you compare this current experience with your past judicial experience?

Michael McConnell:

That’s a wonderful question. It’s kind of fun to see the way in which different members of the board are influenced on that kind of meta-question by their prior experience, and I think it’s probably fair to say that I favor a more judicialized view of the way we do business. I want the opinions to speak for themselves rather than having members out sort of commenting on the opinions. That’s the way judges do business. I want the decisions to be confined to the actual matter brought to us, rather than for us to range abroad. If you read the Trump decision, I think you’ll see that there are some dissenting members who wanted to go farther and to range farther afield. What are some of the other judicial elements… I want the record, I mean, we call it the record in the courts, I want it to be fairly clear what we are deciding on the basis of.

You’ll see that there are public comments. I think we need to improve the way we process those, but I think it’s a very good structure. Now, that’s a little bit more like the Administrative Procedure Act and a regulatory body than it is a court, but maybe my experience with regulation makes me very open to it. I think it’s a very good idea for informed people around the world, and especially organizations that have expertise, to chime in. We learn from those things. I don’t see our job as to decide cases according to how many people express what views about why. I don’t see it as testing public opinion in that sense. I see it as bringing expertise and ideas and thinking to bear, so that if an organization that only has one member has something really important and useful and sensible and coherent to say, I’m going to pay attention to them, and I’m going to ignore the literally thousands of individual comments that just express an opinion about how it ought to come out. 

We are not meant to be a democratic body; we are meant to be a deliberative body, more like a court than like a legislature. And what’s really important for us is to try to hold Facebook’s feet to the fire and hold them to a higher standard of clarity and consistency and transparency. I think anybody who’s read the Community Standards will recognize that there’s a lot of work to be done on clarity, and I’m learning more and more just how huge the consistency problem is. 

And I think the single thing that frustrates Facebook users more than anything else is that they don’t understand why they’ve been treated the way they have. The explanations that they get are opaque. That has a serious impact on freedom of expression.

May I mention just one very specific example of that: oftentimes posts are taken down, even though they concern important matters, because of a single word or phrase that is objectionable. We had a case in which an Armenian user was raising alarms about Azerbaijani treatment of Armenian churches in the disputed area that there was just a war over, an area in eastern Armenia or western Azerbaijan depending on how you look at it, and it contained a word which experts tell us is a serious slur against Azerbaijanis, and so the whole thing was taken down. 

But Facebook never just told the poster; the poster had no idea that that was why the post was taken down. They could have put their post back up the next day if they had known that it was just one word that was the problem, but because Facebook doesn’t tell people why their posts are being taken down, they don’t know what to do. And I think if people actually knew why they were treated the way they were, not only would people feel better, they would feel that they’re being treated more fairly and more transparently, and all of that’s important. But more important, I think, is that they would then be able to repost, and freedom of expression would be enhanced.

Tom Lenard:

So, the Trump case had to do with posts that incite violence or could incite violence, which is difficult because it involves the president, but substantively, it seems actually somewhat easier than other issues might be. So for example, issues having to do with “fake news,” since there are so many different opinions on what constitutes “fake news,” and labeling something “fake news” obviously has the potential to stifle legitimate debate. How would the board, or how does the board, approach that type of issue?

Michael McConnell:

Well, Tom, I agree with you. I think misinformation and disinformation is a particularly difficult area, and as you say, I think more difficult than the Trump case. I mean, obviously the Trump case is very politically fraught, but at heart, you know, it wasn’t that hard to say that his posts were inappropriate and needed to be taken down. How long and so forth, there are obviously issues there. Disinformation, as I understand it, is deliberate, the telling of lies, and misinformation is the dissemination of inaccurate information. This is very tricky. We’ve had some cases that nibble around the edges of this problem, like the COVID case from France. So, when you ask, “How does the board approach this?” I have to tell you, I don’t know yet, but it’s going to be a very difficult problem. I do believe that out there in the world, there is a myth that we know what misinformation is, and all we have to do is remove it, whereas the range of shades of gray about truth and untruth is infinite.

I’ve taught First Amendment law since the mid-eighties, since I left the federal government, and the US Supreme Court and other courts have confronted this issue a lot more than the Facebook Oversight Board has, and the court, I think rightly, has been very hesitant to allow suppression of speech merely on the ground that it’s false. New York Times v. Sullivan is the most important case on modern free press law, and it was about defamation, which is the publication of false information that sullies a person’s reputation, and the court said that there has to be breathing space for that, because otherwise, if every deviation… this was about a Civil Rights Era advertisement and [inaudible] factual errors, right. And some of them were more contested than others, and the court said just because it’s an error doesn’t mean that it can be suppressed. And the most conspicuous case, United States v. Alvarez, had to do with the Stolen Valor Act, which prohibits people from deliberately claiming to have won Congressional Medals of Honor, and there are apparently a thousand or so cases of these claims, I forget now. Some prominent politicians, whom I will not name, have done this, and so Congress had passed a statute prohibiting it, and the US Supreme Court said that that statute is unconstitutional because it is kind of a blunderbuss approach, and it doesn’t take into account the many nuances of the situation. 

It is my guess that as members of the board confront many and varied issues in different contexts, we will become as attuned to the tricky and nuanced character of this problem as the American courts are, but it may take some time, because there is a powerful sort of undercurrent of rhetoric out there that misinformation is bad for democracy and just needs to be taken down. It’s not that that isn’t true. It is true. It’s just that it’s too unspecific. It’s too unnuanced. Misinformation is bad for democracy, but preventing misinformation would be even worse for democracy across the board. Also, truth: who gets to decide? Facebook relies upon thirty-some-odd outside fact-checkers, but I bet a cookie that any of those fact-checking organizations has its own point of view and that they could probably be profitably fact-checked themselves. 

Scott Wallsten:

So, I want to come back for a second to the board itself and what its purpose is. How will you know whether the board has done a good job? I mean, the board’s charter… I don’t know if it’s the charter, but it says the purpose of the board is basically to make decisions. Will it be the decisions that you make that matter, or will it be whether Facebook listens to them that matters? 

Michael McConnell: 

I find it difficult to distinguish between those two things, because the decisions are made for a reason. If we were like some continental European courts used to be and just announced a decision without a reason, it would be different; but we give a reason, and embedded in that reason for the decision are judgments about general questions of policy. So, I think they’re bound up together, and our policy recommendations, I think, come out of and are related to specific cases. 

The direction I was thinking you were going, Scott, is how do we know whether we succeed, and I do want to urge against a utopian expectation. We are not going to make the internet okay. The internet and social media have vastly increased opportunities for communication by ordinary people in ways that are both wonderful and terrible. It’s having effects on people’s character, but it’s also relieving loneliness. It’s facilitating and enabling people to express themselves on political issues, so that it isn’t just the newspaper publishers anymore who can get out editorials. The good and the bad are both out there, and we are not going to be able to solve the bad. If, at the end of 10 years, we have made Facebook’s administration better, if their rules are clearer and more sensible and not overbroad and not discretionary, if there’s more attentiveness to consistency of treatment and to informing users, and if we’ve made it a little bit better, I’ll be happy. I have no expectation that we’re going to make it all good.

Tom Lenard:

Well, I’d like to pursue one comment that you make in the decision which I found interesting. I think there’s a paragraph or two saying that you sought clarification from Facebook about the extent to which its design decisions, including algorithms, amplified Trump’s posts, and presumably amplify other objectionable content, and you said that you didn’t receive a reply from Facebook. Do you want to amplify on that a little bit or…?

Michael McConnell:

I personally am not surprised we didn’t get a reply, and on this, I’m just speaking for myself. This is a huge question. What kind of response could they give? It would require a twelve-volume treatise to get into this, but that doesn’t mean it isn’t something I profoundly hope the smartest people at Facebook are pondering, because of the amplification. The way I understand the basic business model and the use of algorithms is that they are designed to bring to individual users’ attention the things that they’re most likely to want to see. At that level of generality, that’s a good thing. I mean, it’s catering to the customers, and if I’m going to get an ad, I’d rather get an ad for something I’m interested in than something I’m not interested in. The problem is that human beings are faulty. I may betray my Protestant, Calvinist roots here, but who was it who said that Original Sin is the most empirically verifiable principle in the entire Bible? Human beings are faulty. We have things built into our psyches, including attraction to extreme events, and so a really pleasant, nice, optimistic, boring story is going to languish, and something that stirs people up is going to get a lot of attention. I don’t think Facebook deliberately stirs things up for the purpose of stirring them up, but by wanting to serve up to people and bring to their attention the things that they’re most likely to want to click on, we are catering not just to the good, but to the bad in human nature. 

And I think this is possibly one of the worst things about the social media environment: the way that it amplifies some of the worst of the worst in a way that prior forms of mass communication did not. It is true that newspapers will have explosive headlines and inflammatory headlines and so forth, but the same newspaper went out to everybody. So, it isn’t like serving up to each person what’s going to get them all excited, right? And I think the Trump incident is just a great example of this, but it’s also just an example of how hard it is to figure out what to do without losing the consumer-preference aspect, which I think is a good thing about the internet.

Tom Lenard:

Of course, different people read different newspapers, so there’s a self-selection there as well. But I had another question about something that was discussed in the report, which involves safety, and you know, the idea that Facebook should be a safe place. Now obviously, in recent years you have, on college campuses and elsewhere, many examples of people saying that a lot of speech makes them feel unsafe. How would the board approach those types of issues?

Michael McConnell:

I’m sure members of the board have different views on that; I’m confident that they do. I personally think that the word safety has undergone a bit of mission creep, especially on college campuses, where we began by being worried about speech that actually encourages violence or retaliation or intimidation or concrete harm and have moved to the idea that people should be insulated from speech that they wish other people didn’t have, like political opinions that they disagree with. I think we should not go that far. Facebook uses the term, but it’s not really very well defined. So, one of the things the board may do over time is to try to draw the line between safety meaning actual safety and safety meaning being insulated, being put into a bubble. 

Scott Wallsten:

If we have a minute, I have a question, maybe this isn’t fair to do at the end, but it’s about your book, The President Who Would Not Be King. And this might not be quite well phrased, because you talked about executive orders being mostly the president implementing powers that were given to him or her by the Constitution, though that could be challenged, but how would you rate Biden so far on his use of executive orders?

Michael McConnell:

There’s been a trend over the last several presidencies of doing things by executive order that earlier presidents would have done in a somewhat less flamboyant way. I think most of this is just pure PR. If President Biden issues an executive order saying, “Hey, over there at Homeland Security, you need to think about how to treat children better,” I mean, there’s nothing illegal about that, but it’s also not really an executive order, right? It doesn’t do anything. So, a lot of the executive orders are just fluff, but some of them, I think, have been excellent, both as a matter of policy and because some of them have reversed unilateral decisions of the prior administration that I think were overreaches. There have been some that I think were unlawful. In fact, one of the early ones, which ordered DHS to stop deporting anyone, was enjoined by the courts, because presidents don’t have the right simply to say that statutes passed by Congress are no longer going to be enforced. Of course, they can exercise prosecutorial discretion about the use of resources and so forth, but they can’t just repeal statutes by executive decree. Relatively speaking, I think there have only been a few of President Biden’s orders that step over that line.

Tom Lenard:

Well, Michael, it has been great to reconnect with you after all these years, and we really appreciate your taking the time to discuss all these issues with us. I think it was a very interesting podcast. So, thank you very much.

Michael McConnell:

Thank you for having me.
