Does Big Tech Need its Own Regulator? with Neil Chilson

Dr. Sarah Oh:

Hi, and welcome back to Two Think Minimum. Today, we have Neil Chilson as our guest. Neil Chilson is a Senior Research Fellow for Technology and Innovation at the Charles Koch Institute. Prior to joining CKI, Chilson was the Federal Trade Commission's Chief Technologist, where he focused on the economics of privacy and established the FTC's Blockchain Working Group, among other things. Prior to his appointment, Chilson was an advisor to then-Acting FTC Chairman Maureen Ohlhausen. In both roles, he advised Chairman Ohlhausen and worked with commission staff on nearly every major tech-related case, report, workshop, and proceeding. Chilson is a regular contributor to multiple news outlets, including The Washington Post, USA Today, The Seattle Times, and Morning Consult. Chilson holds a law degree and a master's degree in computer science. And in the interest of full disclosure, we should let listeners know that we received funding from your employer.

Scott Wallsten:

And we’ll see whether that’s still true after the interview.

Dr. Sarah Oh:

Thanks, Neil, for joining us today. You've been busy over at CKI and Stand Together, with a recent chapter in the Global Antitrust Institute's Digital Economy Report entitled "Does Big Tech Need Its Own Regulator?" Could you tell us a bit about what you wrote in that chapter?

Neil Chilson:

Sure, Sarah and Scott. It's great to be here. I'm a longtime listener and a big fan of the work at TPI. So my chapter is one of, I believe, 34 or 38 chapters. It's a very big report. They're sort of standalone chapters, but they're grouped thematically. And my chapter takes on this idea that has grown in cachet, that we need a new regulator for big tech companies in order to govern all of the issues that surround them. It's largely come out of sort of antitrust camps, but a lot of the issues that we hear about around tech companies get lumped in together, and that is one of the challenges I hope we'll talk about with a new specialized regulator. I take on four reports specifically, the most prominent and most fleshed-out proposals for a new tech regulator. I summarize those reports and then take a sort of cost-benefit approach. The largest promised benefit of a specialized regulator is typically the need for specialized expertise, so I look at how well the reports argue that there's a need for new specialized regulatory expertise and how they identify that expertise. And the biggest cost is primarily the risk of regulatory capture, so I look at how the reports discuss regulatory capture, if they do, and how they suggest mitigating it, if they do. Ultimately, I conclude that some of the reports make stronger cases than others for a need for regulatory expertise, but none of them are particularly good at identifying specifically what expertise isn't already being satisfied by another agency and so needs to be housed in a new one. But even worse, almost none of the reports talk about the largest risk, regulatory capture, and the ones that do offer a sort of slapped-on fix at the end that seems to contradict a lot of the statements made throughout the rest of the reports. And I'm happy to dig into any of the reports and those topics further.

Scott Wallsten:

So, on regulatory capture, tell us why you think that would be an issue. And also why would it be worse for a new agency than it is for existing agencies?

Neil Chilson:

Yeah, so all agencies tend towards capture. That's something that Stigler told us, and it's a bit ironic that one of the reports I'm addressing here comes from a center named after George Stigler, a fact that I've pointed out whenever I can. It's not that new agencies are necessarily more susceptible to capture; it's that specialized agencies are more susceptible to capture than generalized agencies. And that's the point I make in the article. This makes some sense: the narrower the audience for your regulation, the more likely you are to run into the common public choice problem of a very focused interest group that can impose dispersed costs on a very broad group of people for whom it's basically not in their interest to really pay attention to what's happening. That's much more common in a specialized agency like, for example, the FCC than it is in a more generalized agency, like, say, the FTC. And I've talked a lot in other work about the difference between those two. Just to make this a little more concrete, I spent some time as an attorney advisor to a commissioner at the FTC, and I had friends who were attorney advisors to commissioners at the FCC. If you just put our calendars next to each other, the main difference you would see is that my friends at the FCC were constantly having meetings with industry groups and industry participants who were begging for their time to talk about how to shape the regulation, or not to do the regulation at all. If you looked at my calendar at the FTC, nobody was asking to come in and talk to us about that. Pretty much the only time we met with industry people was when we were suing them. So that's a concrete example of the difference in the conversation between industry and the regulator in a generalized enforcement agency, like the FTC, versus a specialized regulatory agency like the FCC.

Scott Wallsten:

So, I mean, I completely agree with that, but if regulatory capture is an issue in the foreseeable future, why are tech companies opposed to a digital regulator?

Neil Chilson:

Well, it's not clear that they are, to be totally honest. We have increasingly heard calls from Facebook, for example, saying that they want to be regulated in some of these spaces. But I don't know that there have been a lot of statements from the companies about the form of the regulator. In fact, that's not uncommon across all of the calls for increased regulation. Most people don't talk much about the form of the regulation, about whether it should be in a new agency or not, or if they do, they say it offhand without digging too deep. So, I'm not that surprised that the companies haven't really said something specific on this, but we've seen examples in the past of calls for specialized regulation. Perhaps the most prototypical example is, again, the FCC. In the early days it didn't look exactly like an agency, but when AT&T formed a sort of agreement with DOJ not to sue them on antitrust anymore if they provided universal service, it had a flavor of this, right? Where it was like, well, we're now regulated by you, but it's in our interest, because we can use some of this regulation to eliminate competitors. There are some other examples that are more esoteric, around trucking and some other issues, but there is a long history of agencies that are stood up for specific industries becoming, over time, tools of those industries and a way to protect against competition within the industry in particular, and even more so against disruptive competition that just doesn't fit into a regulatory category. And one thing I point out, as have the academics who have studied this, is that this doesn't even have to be nefarious, right? It's not that the industry necessarily has to be purposely trying to keep competitors out by shaping the regulation, although that happens a lot. If the architecture of the regulation just fits the current industry, then when a new player or a new disruptive technology that breaks the business model of the existing industry comes about, it's very difficult for that new competitor to fit into a regulatory box, and so they tend to face a lot more regulatory risk upfront. Examples of this are increasingly common in the ride-sharing or resource-sharing economy. Airbnb and Uber had some examples of this, but in many cases they were able to take it to the customers first. That doesn't always work for all the business models that don't fit into a regulatory model.

Scott Wallsten:

So there are other kinds of capture that can happen at an agency. Maybe capture isn't even necessarily the right word, but Sam Peltzman's work shows that there are all kinds of interest groups, and regulators are always being lobbied by everybody, from every possible side of an issue. And ultimately that can lead them to be captured by any of them. But going back, why do the people advocating these new regulators say we need this? I mean, part of it is that they believe that the current generalized agencies have failed.

Neil Chilson:

Yeah, that's true. And you know, it's interesting, because of the four reports that I look at, which are the four most prominent ones, like I said, that propose a new agency, three of them basically say, we think it might be good to have a separate agency, but we're not 100% sure, and actually we'd probably be fine with something else. The Stigler Center report is probably the best example of this. Throughout the bodies of the individual committee reports, and then the body of the main report, they repeatedly call for a standalone agency. But then in the executive summary, they say, well, but we recommend that it at least initially be established as a branch of the Federal Trade Commission, because the FTC has a long history of not being captured. So, that's pretty common among these reports. The one report that really doubles down and says, no, absolutely we need a separate agency, is the Shorenstein report, led by Tom Wheeler and a couple of other folks from the FCC. Gene Kimmelman, I think, is a coauthor on that as well. And they argue that essentially a new agency is needed because the tech industry is so different from what other regulators deal with. They call for things like an agency that has digital DNA, but they not only propose a new agency for tech, they propose a new administrative law model, essentially, built around a committee of stakeholders, I'm blanking on what exactly they call it right now, who would present proposals to the agency, and then the agency could vote on those proposals. Which, in many ways, I think is almost a formula for regulatory capture, because it basically lets the people who are in the room set the agenda, and it actually sets up a room for those people to get together and come up with proposals for the agency.

Neil Chilson:

But the main reason all the papers argue for a new agency is the need for specialized expertise. And I get into that somewhat, because while a lot of them sort of hand-wave about what exactly the needed expertise is, they don't really break down the different types of expertise that agencies could have. In my paper, I talk about the fact that agencies have several different types of specialized expertise. One of them is regulatory expertise, expertise in the process of making rules or enforcing laws. Another is industry expertise, which is specialized knowledge about the particular industry that's being regulated. And you can see the value of having an agency with some expertise in making rules, but a lot of agencies have that, so that type of expertise is not particularly rare or hard to get. Industry expertise is often hard to get, but it raises some other concerns.

Neil Chilson:

And in fact, in this industry in particular, even calling it an industry is a bit of a stretch, right? It's a sector in many ways; these companies have really different business models. All of the proposals talk about the same four companies, right? Sometimes five companies, but those companies do really different things. Probably the proposal that's the most honest about this is the Competition and Markets Authority report, which builds on some of the work that Jason Furman did for the UK, and which actually limits its focus. It talks a little upfront about the tech industry more generally, but then it limits its focus to Google and Facebook's advertising, which is much more like a specific industry. And I think the reason these reports avoid really clearly defining the industry is that they're walking a fine line here, right? Part of their argument is that these companies are monopolies. So, if you lump them all into the same industry, or say they're all in the same market, you're undercutting some of your competition complaints. But if you say that they're all so different from each other, and don't compete with each other, that they're really each in their own market, then the question is, if that's true, how can you make an agency that's specialized in all of them? Because they're so different from each other. What most of the papers actually point to as distinguishing factors are things like network effects, in conjunction with the fact that these are multi-sided platforms. The question is, are those things that you need a new agency to be expert in? Are we completely lacking agencies that have some expertise in those areas? And the answer is no. The FTC and DOJ have both looked at platform economics. They've brought cases in this space. And network effects, many agencies know about; the FCC has been focusing on network effects for a long time, and it's the same with the FTC. So where they do point out the need for some specialized expertise, it's not clear that we're missing it in the current regulatory structure.

Scott Wallsten:

A couple of things about these proposals bother me, sort of along the lines of what you're saying. One, the idea that digital is a description of anything is absurd. It's as if we decided we need to regulate companies that take notes using pencils. Being digital tells you nothing about how one company compares to another. I guess maybe it's a stand-in for the word platform, but even that's not quite it. And also, I find proposals suggesting a new agency to be kind of lazy, actually, because that way you don't have to actually define the problem. You can say, "Oh, let's just create this." There's some sort of vaguely defined problem, we know it's there, and a new agency will take care of it. Obviously, I'm grossly overstating it, especially, like you said, for the report Jason worked on, where they discussed things like the advertising market. But that seems like it's kind of part of your conclusion from these studies?

Neil Chilson:

Yeah. You know, I make a sort of awkward metaphor comparing identifying an industry to something that data scientists do: cluster analysis. Thinking back, I now realize that usually when you make a metaphor, you should make it to something that people probably already understand. Given my computer science background, it was the obvious analogy to me, but maybe my paper now has to teach people about cluster analysis before they even understand the metaphor. The main point being that if you're going to group things into an industry, you have to identify what makes them like each other, but also different from other things. And these papers just don't do that well.
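[Editor's note: for readers unfamiliar with the metaphor, cluster analysis groups data points so that items within a cluster are more similar to each other than to items outside it. Below is a minimal illustrative sketch in Python using scikit-learn's k-means; the companies and feature values are hypothetical, invented purely for illustration, and are not drawn from Chilson's paper.

```python
# Hypothetical sketch of cluster analysis (not from Chilson's paper).
# K-means groups "companies" by how similar their made-up features are.
import numpy as np
from sklearn.cluster import KMeans

companies = ["A", "B", "C", "D", "E", "F"]
# Hypothetical features: [share of revenue from ads, share from retail]
features = np.array([
    [0.9, 0.0],  # mostly advertising
    [0.8, 0.1],
    [0.1, 0.8],  # mostly retail
    [0.0, 0.9],
    [0.5, 0.4],  # mixed business model
    [0.4, 0.5],
])

# Ask for two clusters; k-means puts nearby points in the same cluster.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
for name, label in zip(companies, kmeans.labels_):
    print(f"Company {name} -> cluster {label}")
```

As the sketch suggests, the clusters you get depend entirely on which features you choose to measure, which is Chilson's point: grouping companies into an "industry" requires first saying what makes them like each other and unlike everything else.]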

Dr. Sarah Oh:

I agree. I mean, most brick-and-mortar industries seem to be going digital now anyway, right? So that's the scope of the problem. Walmart is going up against Prime soon, and cloud and AI are touching all the older industries. So digital seems to be an all-encompassing word that would absorb all of commerce. Did they mention that at all?

Neil Chilson:

So actually, some of them are somewhat honest about that. The Stigler Center report and the Shorenstein proposal define the industry extremely broadly. Shorenstein, I have the quote right here, says: "Consumer facing digital activities of companies with significant strategic market status." That's how they define the industry that would be regulated by this agency, which by my reading includes basically any large consumer company with an online presence, like Walmart or Target. It's hard to believe that you could have an agency that could be expert in that wide a swath of the economy. And so, the general expertise rationale for a specific agency kind of breaks down there. Others are more narrow. Like I said, the CMA, to the extent it talks about a specialized regulator, is focused really just on Facebook and Google's advertising issues.

Neil Chilson:

Harold Feld, at Public Knowledge, probably makes the most well-thought-out attempt at defining an industry, and he points out that there are a lot of challenges in defining it. His ultimate conclusion is that the shape of the sector may not become clear for some time, and Congress may need to revisit its initial decision about how to define the industry. Which, again, I think admits that we're not really sure what the agency would be specialized in. We can't define it, and we may have to revise it over time, which puts the cart before the horse, given that specialization is the primary rationale for creating a new agency.

Dr. Sarah Oh:

Do you see analogs with the antitrust question of defining the market? For all these antitrust cases dealing with the big FAANG companies, defining the market is the main question. And it seems like it's the same question here for regulation. How do you define the industry?

Neil Chilson:

Yeah. You know, I think they're similar questions, the big difference being that Congress has many times in the past created agencies without defining a market. There's no legal requirement that they actually define one, right? And that's the big difference between this and an antitrust case. So my argument is not that Congress has to define a market in order to create a new agency. It's just that creating one only makes a ton of sense if you at least understand what you're trying to regulate. And I also think there are some differences between markets and industries. You could make an argument that there's a cluster of companies that are similar enough to each other, even if they do different things in different markets, that it might make sense to have a single regulator. But you have to make that case, and I just think most of these papers don't.

Scott Wallsten:

You wrote more than a year ago, I guess, in Morning Consult, that Americans were optimistic about technology. To the extent that that's still true, what explains the strong push for new regulations on these companies? When you wrote, I think the UK report was the only one not focused on the US, but several other countries have made proposals like this. What accounts for the popularity of these ideas if people are happy with technology? Or maybe things have changed and they're not?

Neil Chilson:

Well, I still think that people generally are very pleased with the companies that they use in the tech sector. Amazon consistently polls as one of the most popular companies in the country, certainly much more popular than Congress. So, to me, the question is, is there a disconnect between the sentiment that people feel for these companies and the political bang that you can get out of attacking them? I do think that people have concerns about the companies, but consumers are complicated, right? You can be pretty happy with something and still want more out of it. The question is whether you can get more by regulating them, and in this case in particular, whether you can get more for the consumer by having a specialized regulator for these companies. I don't know that I'm doing a good job of answering your question, but my sense is that there's a lot of political benefit to proposing ways to hurt or to regulate some of these big companies, even if that is disconnected from the general sentiment of the popular culture. And that's just in the US. In other countries, you can add a whole other layer of things, a sort of protectionist, anti-American, "big companies taking our data" flavor, that may make the motivations of regulators in those countries totally different.

Dr. Sarah Oh:

In those proposals, do they talk about the scope of regulation? Would it be a mix of privacy, Section 230, and antitrust? What exactly would they be making rules about?

Neil Chilson:

So again, several of them are pretty vague. Some of them are broad, and some of them are vague. The Stigler Center, I should say, is very specific that it's very broad: there's a bunch of different issues, all of them, according to the report, more or less rooted in market power problems, but the solutions might be things very different from antitrust law. So it does appear that the regulatory authority something like the Stigler Center report would advocate for would extend not just across antitrust issues, but competition policy issues more broadly, and then perhaps also things like privacy regulation. Some of the reports, Feld's report for example, include content moderation and some other big sticky issues in there as well. So they're sort of comprehensive, right? If it has to do with these four companies, generally speaking, if there's a problem in that space, the assumption is that this new specialized regulator would cover it. And in some ways, they are taking as their model the Federal Communications Commission, which sort of works that way, right? There are some specific limits on it, it has to be communications over wires or airwaves, but that's pretty broad. And the FCC has done lots of things that are adjacent to that, like billing and some other issues that maybe overlap with other agencies. But generally speaking the model is: this industry, any problem in that industry, is going to be handled by this specialized regulator.

Scott Wallsten:

So now that Sarah has opened the can of worms that is content moderation, I thought that's something you think about a lot. What would be the right policy approach to dealing with content moderation issues right now? Setting aside political realities, what would you want to see happen?

Neil Chilson:

Well, if we're just talking from a policy perspective, about what government should do, in many ways what government can do is limited by the First Amendment here. These companies are private actors; content moderation is deciding what content is going to be up on the website that you run. That's an editorial function, and I think the First Amendment limits what government can do about it. There's a really different question that maybe you're also asking, which is what should companies do? And I don't think there's any one answer. Companies have different audiences that they're trying to reach and different consumers that they're trying to serve. And this goes to another thing I'm working on, which is my book on emergent order. I think that a lot of these companies, because they're full of engineers who are used to designing a solution that works from the top, although many of them have experience with the fact that that doesn't always work particularly well, are taking a top-down approach to content moderation as well. And what we're seeing is that that has not just technical challenges, right? It doesn't scale very well. It also has some real policy problems, because when you put one person, or one group, in charge of moderating content on a platform, that's who gets lobbied by the politicians or by the interest groups about how to make decisions in that space. And what people want is one solution that will fit everybody. That's hard for a platform like Facebook that serves billions of people, and lots of different interest groups, and lots of different communities that all have different local standards. So, I look to models like what Reddit does, or Wikipedia, which have a much more emergent approach to content moderation, where you have local decision-makers who make content moderation rules for the community that they're in, and users can step in and out of those communities depending on how well they like them. Now, that has downsides too. There are harms that happen, and those are challenging. And it also requires a certain toleration of content that might bother you, right? You have to have a more tolerant culture to have something that works well, like a Reddit space. So, yes, there are trade-offs, but I would like to see a lot more experimentation with the more emergent, decentralized models of content moderation.

Scott Wallsten:

Have you seen companies like Facebook be interested in that? I mean, it seems like they've got an impossible problem to deal with. You'd think they'd be trying lots of things.

Neil Chilson:

You know, I have no special insight into what Facebook is doing on this. Maybe they have a skunkworks project that's doing something like this, but Twitter has mentioned some interest in approaches that would allow people to at least filter their own feeds based on third-party algorithms that people might bring, kind of building on some stuff that Mike Masnick has written about protocols, not platforms. So I do think that some of the companies are interested in figuring it out. And like I was saying earlier, for Facebook that's one way to get rid of the responsibility, right? You can't lobby me, because my users are doing the moderation, and I don't even pay attention to it. Facebook has approached that more by emphasizing group chats as a function rather than social media feeds, and that trend will probably continue, though it raises some other issues for people who are concerned about encryption. But both of those alternatives are still way better than what Facebook has at least nominally suggested, which is that the ultimate solution should be for the government to set the rules so that they don't have to figure out what to do. And if you think the problem is impossible for Facebook to decide, I think it's even more impossible for the government to decide, even setting aside the First Amendment problems.

Scott Wallsten:

So, tell us about that book. You announced on Twitter that you're writing a book on emergent order, and I think you promised it by December, right?

Neil Chilson:

That Tweet, I haven't quite figured out how to update. That's another problem with Twitter, right? I can't update that Tweet. So, the book is actually now coming out in April. No surprise that COVID slowed a few things down, but the book is called Out of Control: Enabling Emergent Order in Public Policy and Private Life. And my goal for the book is that the reader walks away with a sort of gut-level, instinctual appreciation for the fact that there are many areas of our life, personal and public, that nobody has control over, but that are orderly. We have examples of this all around us. Cities developed this way, and ant colonies are a classic example. Your body is an emergent system where you don't really have direct control over much of it, but you can influence how it behaves. And so, the book is about one-third explaining what emergent order is, the fact that we can have order and patterns without having somebody in control. One-third talks about the policy implications of that, using examples such as content moderation, privacy, and COVID-19 responses. And then one-third, frankly the hardest part of the book for me to write, is a sort of self-help section. It's kind of more self-helpy, right? It's basically: if you thought about the world and really recognized the fact that even you yourself are largely a process rather than an object, and that you are an emergent system yourself, how would that affect how you think about building habits, and how would that affect how you think about taking action to help the community and the large institutions that you are part of, which are also emergent systems themselves? So that part is a little more self-helpy, and maybe philosophical, but it was pretty fun to think through that issue too.

Scott Wallsten:

So, in the first third of the book, when you're defining it, you're not making any normative claims. You're just saying this emergent order exists in many areas. And then the second part is, given that, how should policy respond to it? So what are the ways that policy doesn't currently respond to it, but should?

Neil Chilson:

Well, as I was saying, one of the big challenges in policy is that when people see big, complicated problems, they want to put somebody in charge of them, right? This ties directly back to what I was saying about content moderation. You get a team together of the smartest people, and you design a solution. That is very appealing, but in many cases it can have negative side effects, because you may not fully understand the problem. And for most complex problems, you don't really fully understand the problem. So it won't be a surprise to anybody who has followed me on Twitter for long, or read some of my past work, but one of the lessons policy can take from this is that, first of all, policymakers should be humble about what they can know and what they can achieve. And they should look for incremental solutions that can evolve over time. The two polar opposites you can think of are something like a common law approach, which evolves by applying general principles on a case-by-case basis, based on the facts of a specific instance, versus more prescriptive regulatory approaches, which try to set up all the rules and define how things will work upfront, and then measure all future behavior against that set of rules. One of those has a much bigger knowledge problem than the other. It's much harder to gather all the knowledge needed to set out rules that will apply, typically perpetually, until they're modified by the regulator. Whereas with the case-by-case approach, if you kind of know what you're aiming for, and that's what you focus on, you can iterate over time and evolve the principle over time. In the content moderation space, I already gave that example. And in privacy, I compare and contrast GDPR- or CCPA-like approaches with the FTC's general-principles approach of policing harm to consumers, deception, and unfair acts or practices. This is not to say that these incremental approaches don't have downsides; they do. They're not perfect. But the trade-offs, particularly in fast-changing areas, I think are better suited to making sure that we can continue to try new things, and, as a society, continue to incrementally improve and adapt to the problems that we all face together.

Scott Wallsten:

It sounds a little bit like, at an even higher level, this is a way to resolve tensions between different things that you believe. You know, you've got libertarian leanings, and yet you've spent a lot of time in government. You're not a crazy "let's get rid of the government" person, and so this seems like a different way to think about how government should interact with the rest of the world.

Neil Chilson:

Yeah. I think that's exactly right. I mean, the reason I'm writing this book, honestly, is because somebody asked, "Neil, what would be the book that you'd hand me that explains how you think about things?" And that tension right there, the tension between individual action and the importance of institutions, I think is core to the book. Emergent order is key to understanding the tension between those two. The importance of institutions is something I've gotten more and more interested in. How do you resolve that with individual liberty? This is a fight that's very, very old, but it's resurgent on the right in many ways, around, as the short version of it, how should government make people be good? There's a growing movement on the right, I would say, to say, well, people aren't really free until you teach them how to be free, and so if you value freedom, you should value the institutions that teach people how to be free. The problem with that approach is that what people mean by teach is really different. And so…

Scott Wallsten:

Well sometimes what people mean by free is different, too.

Neil Chilson:

Well, I think that's exactly right. And so the question is, how do you resolve that? One way that I think about it is realizing that the institutions we participate in, whether that's our families, or our communities, our schools, our churches, our governments, we can help influence them, but they definitely influence us as well. We really need to be cognizant of that, and it should make us think hard about what institutions we participate in, because they will shape us. But it also should give us a sort of moral sense: we should want to make the institutions that we participate in better. In some ways, because we're getting benefits from them, we should work hard to make sure that we are helping make them better, for people in the future and for people who are here right now. I really do think, personally, that we have a responsibility to do that as individuals, that one of the best ways to use our freedom is to make the institutions that shape us and shape others better. And I think that's something that's sort of been lost. It's easy to lose if you just look at all the flaws in the institutions that we currently have. It can be easy to be dispirited and say, well, that's why we shouldn't have any of these, but really they're pretty important to how we work together as humans.

Scott Wallsten:

That’s actually a pretty hopeful message about government or the way government could be. And it’s definitely a lot different from what we seem to hear in the news every day now. I’m really looking forward to that book, and then we’ll have you back because we need to do a book talk.

Neil Chilson:

I would love to do that. I will reach out to you to book that.

Scott Wallsten:

Sure. Excellent. Well, we should probably wrap up. We’ve gone well over time, but Neil, thanks so much for being with us. I really liked our conversation today.

Neil Chilson:

It was great. I really enjoyed it. Thanks for the interesting questions, and I hope to be back sometime soon.

Scott Wallsten:

Thanks. Talk to you soon.
