
John Samples on The Oversight Board and Content Moderation

[00:00:00.090] – Tom Lenard

Hello, and welcome back to the Technology Policy Institute’s podcast, Two Think Minimum. It’s Monday, May 8, 2023. I’m Tom Lenard, President Emeritus and Senior Fellow at TPI, and I’m joined by my colleague, TPI Senior Fellow Sarah Oh Lam. Last month, the Facebook Oversight Board issued a policy advisory opinion on the removal of COVID-19 misinformation. We’re delighted to have as our guest today John Samples, who is a member of the Oversight Board, to discuss the report and, more generally, how platforms should treat complex issues like COVID information. John is Vice President of the Cato Institute. He founded and directs Cato’s Center for Representative Government, which studies the First Amendment, government institutional failure, and public opinion. John is also currently working on a book-length manuscript about social media and speech regulation that extends and updates his policy analysis of why government should not regulate content moderation of social media. We’re delighted to welcome John to the podcast.

[00:01:04.880] – John Samples

Thank you very much, Tom. Mentioning the book reminds me that as I was thinking through the issues for this podcast, I realized that, because the book is about the same sort of issues that Meta deals with, I might very easily, without intending to, confuse my own views with Meta’s rules or Meta’s position. Right? Because when you think about it, you have to think about what they’ve done, and naturally, like everyone else, I think I’m right. So I might misrepresent things, but what I’ll try to convey is that there is a difference here between Meta and its rules, the Oversight Board and what it does, and me. And I’ll try to indicate where I’m voicing my own view today.

[00:01:51.620] – Tom Lenard

That’s good. How close is the book to being finished?

[00:01:54.850] – John Samples

I’ve turned in 60,000 words, and with the last bit of that I realized that I needed to write a chapter on elections and then maybe a concluding chapter, and I think the elections chapter will be okay. The elections part is very interesting, and I just can’t omit it. I’m more interested in institutions, and most of the book is about that and about trying to come to terms with them. It became narrower over time. It’s largely about Meta, but fortunately, I think Meta has done the most in trying to build these institutions, and so, like them or not, they’re out there. As far as I know, almost all of it’s public. You can find stuff about them. The answer is it’s going to be finished. I’ve written three books now, and when you ask someone that question, people always say a book is done when it’s done. You think, yeah, you’re just procrastinating. But no, it’s true. It really is. You only get one shot with a book, and you’ve got to make it as good as possible and make as much of a contribution as you can. That’s the trade-off with getting it done, getting it off your back, sleeping through the night, and all those sorts of things.

[00:03:06.010] – Tom Lenard

I’m sure most of our audience is somewhat familiar with the Oversight Board, but maybe you could just refresh us on what the Oversight Board’s remit is and what its authority is.

[00:03:17.170] – John Samples

The quickest deep history is that, as many people will remember, for many years all social media were really widely praised, and there was no real problem. Content moderation existed. Content moderation means taking down content from a social media site (suppression would be another word for it), or affecting content in such a way as to reduce its audience; those are the two major concerns. All of that was pretty noncontroversial until 2016 and the presidential election then. But other things happened throughout the world too that made it more controversial. Of course, we live in a political system in the United States, and this is true elsewhere too, that’s highly polarized, and that put content moderation on the public agenda in a big way. Mark Zuckerberg’s response at Meta was to come up with the ideal of a kind of court-like system. Now, Meta began doing content moderation, as far as I can tell, as early as 2008, when they were really quite small, but they also began to think about appeals processes very early. One of my friends, Kate Klonick, has written a history of this: they really started looking for ways for appeals to go forward.

[00:04:39.100] – John Samples

And they were developing internal appeals processes throughout the 2010s. The Oversight Board was, as it were, an inside-outside appeals process. It was inside in that it was created by Meta, but it was an external process because it was charged with being independent. There was also an institutional element to it, because you had a group of trustees that sat between the Oversight Board and Meta itself. Also, Meta made a financial commitment that was going to continue for up to six years. So in that sense, it was an attempt to have a court that would pass judgment on the propriety of Meta’s takedown decisions. It’s a unique thing, in that Meta is a business and has responsibilities to its shareholders, of course. The interesting question, I think, for economists and people who study corporate governance, is: in what sense does this independent board fit in that story? I think it is actually an attempt by Meta to build legitimacy for its content moderation by making it about more than Meta, so that it’s not just Meta’s decisions being made. The short story is it looks like a court, and it had 20 members, a few more than that now. It hears cases and also makes policy recommendations for Meta.

[00:06:02.750] – Tom Lenard

You’re saying, I think, that most of what the Oversight Board does is act as a court with respect to particular decisions or complaints about things that have been taken down or have not been taken down. But this COVID thing, the policy advisory opinion, is a little different from that. My question is, how frequently does the board issue policy advisory opinions?

[00:06:26.190] – John Samples

So this one would be the third, I believe. The answer is not that often, though I would expect them to grow in number in the future. The one thing I would also say is every case decision looks like a court decision. Somebody wins, someone loses. Usually, we say, Meta, put that content back up. That’s the most frequent decision. But along the way, the board can also make policy recommendations that grow out of the case. So you have policy recommendations all the time. These documents just bring together an issue. This specific one runs 42 pages, and they require a lot of staff time and deliberation. The other thing to keep in mind is that they involve the entire platform, which is to say 3 billion daily users. Both Meta and the Oversight Board, in endorsing this kind of policy advice, have to think about the entire platform. Now, the other thing I would add, on both the policy recommendations that come with cases and these policy advisory documents, is that Meta is not obligated to carry them out. If we say put that content back up, they have agreed that they will do that.

[00:07:45.210] – John Samples

They’re not obligated in the same way to act on policy recommendations. So it’s important that the Oversight Board make the best possible case, in terms of both what Meta should do and what’s in the best interest of the company. I think if you talk to my colleagues, a lot of us would say that the things we recommend are often things the company should want to do and hasn’t done for some reason. And we’re trying to get them to do that from an outsider’s perspective.

[00:08:17.970] – Tom Lenard

What was the genesis of this particular study? Did Meta ask you to do it?

[00:08:21.730] – John Samples

Policy advisory requests come from Meta, and they have to come that way. Although sometimes, I would say informally, there’s a kind of mutual musing of interest, in which we have a series of cases. In this case we had a couple of cases about COVID. There’s political controversy, political debate and discussion. Then the Oversight Board thinks it looks like a good thing, and Meta thinks having the Oversight Board weigh in is a good thing. But as for the actual formality, in this case, they came to us last year. It had been a couple of years since they started what is really a special regulation related to the COVID pandemic. By the fall of last year, Meta said, we’ve taken down 27 million pieces of content, and now the pandemic seems to be lessening. A couple of weeks after this, President Biden said he thought the pandemic would be over soon. The question to the board was, should Meta pull back? Should they change the way they’ve been going about content moderation in light of the pandemic?

[00:09:36.320] – John Samples

And what we did with this PAO is talk about that, but also talk about a number of issues related to Meta’s work during the pandemic. I think if you look at American media, you see a lot about what happened and is happening during the pandemic in terms of content moderation on social media across the board. What we tried to do here was say, look, that discussion is going on. We wanted to offer some concrete suggestions, concrete ideas, that in a sense respond to a lot of the things being said right now. So instead of trying to decide these really contentious issues about content moderation and what happened and who did what, we tried to come up with some ideas that everyone should be able to agree to, and that Meta should be able to act on down the line. So it’s a pragmatic thing, really. Being pragmatic for 3 billion people is a challenge.

[00:10:31.670] – Tom Lenard

What are some of the more prominent, or, obviously, you said there are 27 million things that were taken down, but what are some of the most notable examples of the types of things?

[00:10:41.300] – John Samples

One of the interesting things that we discovered during this work, and it was really public, actually, was that Meta, fairly early in the whole undertaking, came up with 80 different claims that they said they would take down on sight. Now, I should say that Meta had also come up with a process. The crucial date in all of this was January 30, 2020, when the World Health Organization declared a global emergency. By the way, the other crucial date here was last Wednesday, when the World Health Organization decided that the emergency had passed. So on that January 30, Meta came up pretty quickly with a three-fold test. One, is there a global emergency? Yes, there was. Two, is the content false? And three, does the content cause what was called imminent physical harm? What that mostly meant, first of all, was content telling people to do things that weren’t going to work, or very extreme things that were wrong. The general idea was that it would make the pandemic go on longer. And of course, by a year in, it all became pretty much about vaccinations, I think, or at least that was the big new thing.

[00:11:59.800] – John Samples

They had publicized 80 propositions. I looked at these propositions, and even though I’m not an expert in the field or anything like that, a lot of them were things like: if you get vaccinated, your body will become metallic, or your body will become magnetized. There was a lot of misinformation like that. The claims that turned out to be truly contentious, in the sense that they turned out to be right, were generally not in those 80 propositions. The one exception that was put up and then taken down, for a period of about three or four months in 2021, was the claim that the virus had a human origin, a lab origin. That indeed was on the list for a while, but they took it off when it became more open to question. So it was treated initially, I think, as a kind of conspiracy theory, or perhaps as a hate speech idea, but that didn’t persist. And we all come from a public choice background, so we expect that when rules get started, they’re going to continue forever no matter what.

[00:13:14.560] – John Samples

Bayesian updating is not a central idea in public choice, but in this case, it seems like they stopped it on that point.

[00:13:23.140] – Sarah Lam

On the lab leak theory, do you think the Oversight Board was fast enough? Because a lot of the criticism about online platforms and misinformation is about these hot topics. Time seems to be of the essence. You don’t have years to deliberate. Is that something that has come up, the timeliness?

[00:13:43.930] – John Samples

In the case of the lab leak, Meta did that on its own. The other thing I would say about this is that if we assume, and I have no reason not to assume it, that those 80 prohibited propositions were the only basis for taking stuff down about COVID, then you could contest a takedown against those standards. You could go there and say, this is wrong, this is right, or whatever. But yes, it is true, and we’re trying to do better in getting more of a rapid response, and we’re beginning efforts to do much more on that. The court analogy for the Oversight Board, I think, was part of it, in the sense that the deliberation in these cases takes a while. I think those concerns have been taken to heart and we’re going to do better. It’s a challenge.

[00:14:37.190] – Tom Lenard

The lab leak theory, and taking it down, brings up a couple of other questions, I think. One is, regardless of whether it’s true or not, it’s not obvious to me how that information presents a risk of imminent harm one way or the other.

[00:14:51.890] – John Samples

I think in that case, my impression at the time was that there was a great concern about conspiracy theories, and it was labeled as a conspiracy theory. And as you may recall, at least in public it was also thought of as a form of hate speech, which, to be specific in the Meta context, would mean making claims that a group of people were, in this case, spreading disease or something like that. Perhaps that’s one of the reasons it went back up. I should say it’s still not clear whether the lab leak theory is true; we’ve heard that several parts of the US government think it’s plausible. One of the problems, to me, just to be more general about content moderation, is that there’s an imminent harm idea, and there are some things that clearly might cause imminent harm, but what you really want to decide for your platform is what kinds of things people should be talking about. We don’t need to have debates about everything. In a sense, Meta has decided, and it’s not pure free speech, but still, we don’t have debates about whether vaccinations make your body become a giant magnet.

[00:16:02.570] – John Samples

There’s a lot of debate that we do need, and that’s the line-drawing business that Meta at some point becomes involved in. The whole thing is immensely complicated by the fact that it’s not like reading it in the paper or something. They have to do it for 3 billion people every day. The estimates are that 3 million pieces of content have to be dealt with daily. That 27 million number they took down during COVID, which I mentioned earlier, I did some calculations: that was about 1% of the total content that they dealt with during the two years of COVID. The number is huge. But the problem is, every day presents another, and this is where I think economists really could shine in talking more about content moderation. Meta has said a lot of it is done, initially at least, by algorithm, by machine learning. It has to be. You’re dealing with millions and millions of pieces daily. And of course, those kinds of algorithms face the trade-off between false positives and false negatives. This is perhaps still the least understood aspect of all of this: you have to make a choice.
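A rough back-of-the-envelope check of the “about 1%” figure mentioned here. Both inputs are approximations quoted in the conversation itself, not official Meta statistics:

```python
# Back-of-the-envelope check of the "about 1%" share of COVID takedowns.
# Both inputs are rough figures quoted in the conversation.
daily_moderated = 3_000_000        # pieces of content dealt with per day
days = 2 * 365                     # roughly two years of pandemic moderation
covid_takedowns = 27_000_000       # COVID-related removals cited by Meta

total_moderated = daily_moderated * days   # about 2.19 billion pieces
share = covid_takedowns / total_moderated
print(f"{share:.1%}")              # prints 1.2%, i.e. roughly 1%
```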

[00:17:16.500] – John Samples

For example, and this is my supposition: Meta’s algorithms, as you might not be surprised to hear, perk up a little bit when they see the verb “kill,” and they might not worry too much about false positives. I don’t know that for sure, but you can see why they might not. They might say, look, there are bunches of posts out there where the “kill” is just metaphoric, or meaningless, or just people talking. But there’s a small number of cases where there’s a real possibility of something bad happening and Meta getting blamed for it. So you have to pay a price in false positives, that is, suppressing speech that doesn’t deserve to be suppressed. It is a problem in many ways, unless, as it were, the Pareto frontier shifts outward and the technology can get you to a minimal cost in false positives while still catching the bad stuff. And then the problem is that it looks intentional, and that’s why you have an appeals process. The appeals process can correct some of that, but in some ways humans, too, are subject to the same problems.
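The false-positive/false-negative trade-off described here can be sketched as a toy threshold rule. Everything below is hypothetical and invented for illustration: the example posts, the scoring function, and the thresholds bear no relation to Meta’s actual systems.

```python
# Toy illustration of the false-positive / false-negative trade-off in
# automated moderation. Posts, scores, and thresholds are all invented.
posts = [
    ("I'm going to kill it at the gym today", False),  # benign, metaphoric "kill"
    ("that joke killed me", False),                    # benign
    ("I will kill you tomorrow", True),                # genuinely threatening
]

def risk_score(text: str) -> float:
    # Crude stand-in for a classifier: flag any text containing "kill".
    return 0.9 if "kill" in text.lower() else 0.1

def moderate(threshold: float):
    # Remove everything scoring at or above the threshold, then count errors.
    removed = [(t, harmful) for t, harmful in posts if risk_score(t) >= threshold]
    false_positives = sum(1 for _, harmful in removed if not harmful)
    false_negatives = sum(1 for t, harmful in posts
                          if harmful and risk_score(t) < threshold)
    return false_positives, false_negatives

# Aggressive threshold: catches the threat but suppresses benign speech.
print(moderate(0.5))   # prints (2, 0): two false positives, no false negatives
# Permissive threshold: no benign speech removed, but the threat stays up.
print(moderate(0.95))  # prints (0, 1)
```

Lowering the threshold never reduces both error counts at once; only a better scoring function (the “frontier shifting outward”) can do that.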

[00:18:27.890] – Sarah Lam

Actually, to go back to the timeliness point: with a controversy like the lab leak hypothesis, you need time to unwrap it. After two years, it’s becoming more obvious that it could be a high-probability event. So timeliness and false positives might be kind of the same idea. How long do you let the truth come out, versus making a type one error or a type two error? Do you have a grid for that? I guess it is that balance; you’d prefer one type of error over another.

[00:19:03.760] – John Samples

So there’s Meta, which has its views, and inevitably Meta is going to be the most important, because of the appeals process. We got a million appeals the first year, so that appeals process will go on, but most takedowns are not going to be appealed, and Meta has a view of those. Our view is set by our charter, which specifically says that voice, which is their version of free speech, is the paramount value for Meta and therefore for the Oversight Board. So in cases of doubt, choose voice, right? I often think of Chief Justice Roberts, who said in an opinion something like: in cases of doubt, the censor does not get the benefit of the doubt. And that’s, I think, a fair view of what we’re doing. If you look at the cases, that’s what we’re doing, and that’s our charge. In this case, with COVID, I think we tried to walk that line. One of the issues is the size of the platform, because there were places that had not had COVID yet but might have information issues and things like that.

[00:20:13.900] – John Samples

We left an opening there for Meta to still continue, at some levels, to take down content. I do think the actual size of the platform, as I thought more about this and as the board worked on this, is a big issue, because there are lots of different contexts, and causing imminent physical harm varies a great deal; it can vary by region, by country, by whatever. But the other issue here is that Meta was very clear that they want one rule for the entire platform, and they do that because they have to moderate or regulate the platform and its content by machines. That’s part of the history too. I mean, it’s partly just the size of the platform, but part of the history was that they used to rely mostly on users to complain about stuff. And again, Kate Klonick, whom I mentioned before, wrote the history of this. She said they found out almost immediately that people sent in all kinds of complaints about content that didn’t violate any rules but that they just didn’t like. So the human reviewer, as it were, had problems. One rule is what you’re going to have, and that’s a challenge too, I think, and it was a challenge for this policy advice.

[00:21:30.880] – Tom Lenard

The lab leak issue, I think, brings up two issues that are more general and, at least to me, very important and interesting in terms of the whole COVID matter and other scientific issues. The first is that when you read the report, and maybe Meta had no choice about this, it seems very deferential to the public health establishment: to the WHO and the health agencies, at least in the United States, and probably all over the world. But as time has gone on throughout the course of the pandemic, the public health establishment has taken somewhat of a hit in terms of its reputation. One can argue whether that’s justified or not, but there are serious critics. They’re sometimes belittled as being out of the mainstream, but I think serious people believe that, in the United States, when all is said and done, our approach was not the best in many respects. I guess it brings up the question of how you avoid having a monopoly on information provided just by the public health agencies or the WHO. And the second issue is one that you talk about in one of your recommendations.

[00:22:39.390] – Tom Lenard

Recommendation 13, on state actor requests. When I look at the lab leak thing, obviously I don’t know, and you probably do know, but it strikes me that there was pressure from the government, at least in the early part of the pandemic, not to give too much credence to the lab leak theory. It also cast aspersions on the NIH, which, depending on which day you read the newspapers, did or didn’t provide financial support to the Wuhan lab. At any rate, there was politics involved. It wasn’t just a matter of facts, right?

[00:23:11.940] – John Samples

First of all, I would say I don’t know exactly why they restored the content or stopped taking down lab leak content. I don’t know what the internal process was. I do know there’s another example of Meta’s public efforts, which is that Meta has a sort of engagement process. What that means is that they essentially meet with organized groups, organized interests involved in a policy area. Sometimes it’s about a country, sometimes a region; here it was to figure out what to do about COVID. An account of that was published by a man named Peter Stern, who’s the head of that team. And in that publication, he revealed that part of the debate they had, and most of these people are experts of one sort or another, was that some people had expressed a concern that relying too much on the public health authorities, particularly the World Health Organization, might, over time, discredit them. But I do think that given the way they had set up the decision process, which was the emergency, falsity, and harm, they were inclined to go to a recognized expert, and not just a recognized expert in the United States but throughout the world.

[00:24:30.580] – John Samples

That meant a tendency toward the World Health Organization, although we know from these lawsuits in America that there was also the CDC. Now to the issue, though: certainly on the board there was the concern you just expressed, that it’s not a good idea to have monopoly control, that there’s even a lack of contention. Mistakes can be made, even absent worse things with worse motivations. So our response in the PAO is both a call for going through and making sure that the prohibitions they are acting on are still justified, and a call for a much more variegated decision-making process with more interests represented, and to make sure the process is not biased, so that you hear not just organized interests but a broader range of voices. I think what will probably happen, now that the global emergency is over, is that Meta will look to more local, that is, national or regional, health authorities. But still, there’s the same issue.

[00:25:47.600] – John Samples

You want a process that’s vigorous and filled with debate. And the article I mentioned by Peter Stern does suggest that that happened, but we don’t know to what extent. I think there was a concern about too much concentration of power in itself. Even if no mistakes are made, do you want this process over time? The PAO suggests no; you want to move toward a more pluralistic process and debate than you had. The other issue was the governments, obviously, in different ways. Members of the Oversight Board have concerns about government, sometimes from experience, sometimes from our traditions and our Constitution and legal arguments. On the one side, there’s the question of what government requests were made, and being at least transparent about these kinds of situations. It’s going to be tough to keep the two separate, and it turned out to be that way. But I think there needs to be clear transparency about what governments were looking for, and that’s what’s called for here. The other side of it, of course, which I think we know from public debate and which is harder for Meta or anyone to deal with, is that you end up with the government and these companies entangled in ways that are unclear in themselves.

[00:27:07.980] – John Samples

They may even be unclear to the people involved in the entanglement. Right. What we have tried to do is set a clear criterion for when they need to be transparent about it. But some of the entanglement, I think, goes beyond that. More often it was Meta asking government officials about particular claims. And I think for people who are classical liberals or similar, there’s always a concern when that kind of entanglement takes place. But even in the United States, such entanglements don’t necessarily rise to the level of a constitutional violation. So we tried to get as much as we could into a reasonably clear criterion for revealing government actions and how they relate to social media. And I will say, on my own part: yes, Meta has 200 million American users; these are extensive platforms for speech and for the run-up to elections and so on. The interactions with government may be inevitable, but they are a matter of concern. That’s what we tried to act on here in a pragmatic way.

[00:28:24.370] – Tom Lenard

I know the report is intended to be prospective rather than retrospective, but obviously what you do prospectively depends on what you’ve learned. There is a somewhat well-known incident involving a group of experts, MDs, who recommended an alternative approach which they called focused protection, which basically gets to the lockdown issue. It said we shouldn’t be stopping everything; we should focus protection on the most vulnerable, essentially, and have other people go about their business, get the kids back in school, et cetera. There’s evidence of emails among the highest officials at NIH saying we should shut this down. I’m curious whether, and maybe you know, or you don’t know, or maybe you can’t say, that was conveyed to social media platforms like Meta, how explicitly or implicitly it was conveyed, and whether they had a policy of, if not taking down, then suppressing the importance of those posts.

[00:29:25.250] – John Samples

What I will say is that the Oversight Board was interested in this question, and Meta was asked about it. And Meta said that their policy of suppression did not concern policy positions. That is to say, what they were after was content that told you to take false cures, that told you the virus was no more dangerous than the flu, that told you the magnetic thing, or that the vaccinations would cause something totally bizarre. They were trying to get at people offering medical advice, they said. I have heard people say that the issue you raise, with that particular group of experts, concerned Twitter more than Meta. I don’t know if that’s true or not, but I do know what Meta’s position on it is. I think Meta has a lot of legal issues related to all of this, and that probably means we’ll learn more over time; it also makes it more complicated for them to be straightforward in publicizing their own efforts. But that’s their position: that they stayed away from policy. The other thing I would say is I think it’s going to be interesting going forward.

[00:30:43.500] – John Samples

One of Meta’s big issues is politicians and elected officials. For reasons you can imagine, if you start taking down their speech, you’re really going straight into elections and democracy. Well, now Robert Kennedy Jr. is running for President, and Meta has a policy; they have these fact-checking networks. And indeed, in a sense, the World Health Organization or the public health authorities were fact-checkers about the truth or falsity of claims. Well, Mr. Kennedy makes claims, and I think the line to be drawn there is that if he says things that really lead to imminent harm, they will take the content down even though he’s running for office. But generally speaking, the idea at Meta is, look, these people are running for office. Voters have a right to hear what they say. I think it will be interesting to see how that works out. I’ve already seen some people forwarding his material that I thought in the past might have been taken down. But he’s a presidential candidate now. He’s running for office.

[00:31:52.740] – Tom Lenard

His claims about vaccines are not limited to COVID vaccines; I think he’s been on this thing about vaccines, or childhood vaccines, causing autism for a while. I will say that a number of years ago that was a somewhat respectable view. It was a view that was taken seriously by a lot of people. I mean, they weren’t all considered cranks, I don’t think, when that view was first expressed.

[00:32:16.010] – John Samples

This is the false positive problem in another version, which is that sometimes false positives aren’t clearly false positives for some time. Right. Ideas that later turn out to be plausible or true are sometimes met as absurdity, or dangerous absurdity, and so on. So one of the tricky problems here, and I think one way to get at it, is to focus on claims that violate the laws of physics. That’s the magnetism thing. But inevitably, and I think this concerns many people too, in some countries a lot of the false claims are going to be tied up with religion, with religious leaders as well as politicians, and that has its own set of sensitivities. The public officials part does get a lot of complaints. But you can see, I think, that it’s a sound principle: voters cannot be protected from themselves. That would make a liberal democracy much less liberal and much less democratic.

[00:33:20.470] – Tom Lenard

After somebody takes a vaccine they could just go up with a magnet and see what happens.

[00:33:25.250] – John Samples

They could do testing. I do think COVID was interesting from an economics point of view, because the strongest argument was that, since the virus had an incubation period, you actually didn’t have to protect people from speech they might agree with; you were really protecting third parties from the effects of their agreeing with the speech. In other words, they could spread the virus to third parties. It’s a hard argument to make, maybe, but it at least potentially makes sense, and it keeps you away from paternalism. You don’t have to protect people from themselves, which I take to be a pretty important thing. Whether it’s the US government or Meta or a social media group, it’s really a core thing that people need to be able to hear and to make their own decisions.

[00:34:15.590] – Tom Lenard

I think the whole theory of required vaccinations is about the externalities: that it shouldn’t be purely an individual decision because it affects a lot of other people. When this whole thing started out, I thought that was obviously true. Now that we’ve lived through it, I see that, even though my personal view is that people should be vaccinated, there have been some negative results from the thing, one of which is that a fair number of people, I think, though I’m not sure of the data, have stopped taking other vaccinations.

[00:34:46.740] – John Samples

Well, the other interesting question here, which the board got into some, and which to me is also typical of where economics is really intellectually interesting, is the question of causality. Think about it the way an economist would: let’s just say there are these bad ideas on Facebook. How do we know they cause harm? It could be that, yes, they’re all out there, but they cause minimal harm. I always point to the United States in the late 19th century, when three states pulled back and rescinded vaccine mandates because of the power of the anti-vaxxer movement. There’s a study involving Americans that seems properly designed to me, and it found that 28% of the American participants intended from the start not to get the vaccination. The causal effects can be quite minimal, actually, depending on the situation. You have to try to figure that out. I do think there is some evidence that it makes some difference, but it’s also a mistake to think that the material on social media caused a large part of the differences, that it had big effects, and that by getting rid of it you would get rid of the whole problem of people not getting vaccinated, or much of it.

[00:36:11.160] – John Samples

It depends on the country; it depends on how open the country is to vaccinations and to following public health (inaudible), but it’s not clear. I think the board did a decent job of noting that the issue was there, and that there is evidence it had some effect. The causality question matters precisely because the answer isn’t obvious. That’s what I like about economics: you ask questions about things everyone takes to be obvious, and it turns out they’re not obvious.

[00:36:39.600] – Tom Lenard

I think that’s one of the things we like about economics, too. This has been a great discussion. I think we’re basically out of time. We appreciate you taking the time to talk to us. I think this whole discussion about what is COVID information and misinformation, as well as other scientific subjects, is going to be ongoing.

[00:36:58.900] – John Samples

Let’s go on for a while, then we’ll get together and take stock of what economics could have contributed to this kind of operation. I just think it really is true that people think of economists, as you two will well know, as if it’s going to be Keynesians versus Monetarists or something like that. No, that’s not it. It’s these basic ideas that all economists believe. They’re just so powerful. Law is powerful too, I’ll grant that. But you need this extra thing, people who think like this, to help you get better decisions.

[00:37:32.540] – Tom Lenard

I think the basic failure, which economists should have been pointing out, was the failure to consider trade-offs. That was, I think, the largest policy failure, and trade-offs are obviously at the core of the way economists think about issues.

[00:37:47.030] – John Samples

I’ll finish with this. I don’t know this to be true about Meta, but I do think it’s useful for citizens in general to think back to January 30, 2020. If you were Mark Zuckerberg or the policy team at Meta, this stuff is coming out. There’s a great deal of uncertainty. You really hadn’t experienced anything like this in 40, 50, 60 years. Then you have the problem that you’re running a business. It’s got a brand, and who knows how many people are going to die from this? You could be blamed for it. I think that’s something that’s not appreciated, and it’s a legitimate concern. I don’t know what judgment I would make about where that led them, but you can see how they ended up. And then there’s hindsight bias; all of us have that. Looking forward from January 30, 2020, that was something on the business side.

[00:38:41.560] – Tom Lenard

I don’t know that social media, including Facebook or Meta, was approaching this any differently than other types of media.

[00:38:48.140] – John Samples

That’s an interesting question. I just think the participatory element and the scale of social media are different. In my years at the Oversight Board, the one thing I would say is that I’ve come to appreciate scale more than when I started.

[00:39:02.780] – Tom Lenard

Well, thanks again. This was great.

[00:39:05.360] – John Samples

Okay, thank you.
