Navigating Technological Change: TikTok, AI Bias, and Societal Adjustments with Megan McArdle

Scott Wallsten: Hi and welcome back to Two Think Minimum, the podcast of the Technology Policy Institute. Today is Tuesday, March 19th, 2024. I’m Scott Wallsten, President of TPI, and I’m here with my co-host, TPI senior fellow and President Emeritus, Tom Lenard. Recently, the House of Representatives passed legislation ordering ByteDance to divest itself of TikTok, which it owns. If it doesn’t, the legislation would ban it in the US. As of the date we’re recording this podcast, the Senate has yet to formally take up the issue, but President Biden has indicated he would sign it. We’re delighted to be joined today by Washington Post columnist Megan McArdle, who has written about China, US policy concerns about China, and most recently the TikTok legislation. Thank you, Megan, for joining us today.

Megan McArdle: Thank you for having me.

Scott Wallsten: Tell us about the key points in the legislation.

Megan McArdle: Okay, so basically, this is a bill that says that if there is an app owned by a foreign adversary, which is narrowly defined – it’s a few countries, kind of the usual suspects you would expect – that app cannot be distributed in the United States. It specifically references TikTok, but it wouldn’t necessarily only cover TikTok – I mean, it’s pretty narrowly targeted at TikTok; the app has to have more than a million users – but an app meeting that description and operating in the United States will have, after the passage of this bill, 180 days to either sell itself or shut down. So basically, you know, there may be other apps that are affected – I don’t think anyone’s really reported on that yet, or if so, I have not seen it – but the main thing is that this is aimed at getting TikTok out of the hands of ByteDance, which is the Beijing-headquartered firm that owns and distributes TikTok.

Scott Wallsten: Okay, let’s come back a little bit later to what other apps might be affected, because we do want to talk about what the spillover effects from this might be. But one of the things that I find so interesting about your columns is that they’re so nuanced, especially how we think about China. Now, of course, these are my words, but I find China kind of hard to think about. We know that the CCP does a lot of terrible things – human rights abuses, IP theft, it’s a military threat, and so on. But our reactions sometimes seem inconsistent with our own values, and often somewhat hysterical. Where do you put the TikTok initiatives on the scale of rational concern to over the top hysteria?

Megan McArdle: I think this is one of the reasons I’m writing about it – I do find it so challenging. So on the one hand, I’m a long-time critic of the ways in which United States corporations have allowed China to export its censorship. You have seen this with sporting leagues, you have seen this with Hollywood movie studios, you’ve seen this with a variety of businesses that have scrubbed things that China does not want, such as references to Taiwan, from their advertising campaigns, from their websites, maps that don’t show Taiwan. And I am a huge critic of that – this is a repressive foreign regime, and United States corporations should not be facilitating its attempts to deny reality, especially the reality of its massive human rights abuses, including the ongoing genocide against the Uyghurs.

Megan McArdle: So that is one thing. The second thing is how I think about this – it’s basically libertarian-leaning. If the United States government owned TikTok, I would want to make them sell it, right? And I like the US government way more than I like the Chinese government. So those two things actually sort of push me. Now I should stop and clarify – the Chinese government does not technically own TikTok; technically it is owned by a Beijing-headquartered firm. The thing is that in China there’s not really the same distinction that we have between a state-owned firm and a non-state-owned firm. China has really repressive security laws, and it has clearly been willing to jail and otherwise punish CEOs who anger it. And so my personal belief is that if you are headquartered in Beijing, you are essentially state-owned in an important way that would not be true in a Western country.

Scott Wallsten: I think any company that’s headquartered in China is required to do what the government asks it to do, regardless, just by law, right?

Megan McArdle: Yeah, effectively right. And I think one of the things, and we can maybe get into this a little bit later, one of the real issues in this is that we are very possibly going to pass a law saying that ByteDance has to divest TikTok and the Chinese government may just say no. And that is gonna set us up for a really interesting sort of showdown, a game of chicken, in which TikTok may be the only casualty. So I think yeah, sorry.

Tom Lenard: So how would you interpret that if the Chinese government says no?

Megan McArdle: So I think there is a really interesting thing that I’ve been calling the paradox of TikTok. Here’s the thing – a ban, I think, is really gonna struggle. An outright ban is really gonna struggle to pass First Amendment review. There are a lot of people who are expressing themselves on TikTok, who, if the government shuts it down, will not be able to reach their audiences, and I think that should trigger strict First Amendment scrutiny. A divestment is less problematic – if we can just sell it and someone else can operate it, I’m not shutting the app down. That to me requires a much less rigorous standard of review. But if China steps in and they say “Nope, you can’t sell it,” well, on the one hand that really converts this law into an outright ban. And that makes me very nervous and makes me wanna raise that level of First Amendment review. On the other hand, the fact that the Chinese government has done this gives me important information, which is that the Chinese government considers it very, very important to have access to this propaganda channel. And this is something that we can also talk about – I do think that the Chinese government is using this as a propaganda channel, not a hard propaganda channel, but a soft propaganda channel of making it hard for content that says things they don’t like – for example, about Tiananmen Square, Hong Kong’s protests against their repressive government actions, and so forth – making it hard for that content to go viral. And then, on the other hand, very possibly boosting stuff that raises our internal level of dissent. For example, pro-Palestinian content seems to have a much larger reach on TikTok than pro-Israeli content. And so if they don’t want to give that up, that actually tells me I’m now more worried about a real national security threat, either because they are mining data from it, or because they’re using it for propaganda, or both. And that makes me want to force the sale more. So I think this is a really tricky and fascinating subject, because it’s so tricky.

Tom Lenard: So none of us here are lawyers. But if TikTok gets banned through the route of the Chinese government refusing to sell it, other than the US government kind of in the first instance banning it, does that change the First Amendment considerations?

Megan McArdle: I think it probably does. None of us are lawyers. As it happens, I was talking to a First Amendment lawyer about this last night – just independently happened to be having dinner with him. And you know, there is a lot of gray area and subtlety in the First Amendment law surrounding “Do foreign people have the same speech rights in America as Americans do?” And that’s a really complicated question. So, for example, we have long banned broadcast stations from foreign ownership – you can have some foreign ownership, but it can’t be a large stake.

Tom Lenard: That’s because the government owns the airwaves.

Megan McArdle: That is true. Spectrum is scarce and the government has always rationed it because there is just a limited amount of traffic that can go through the airwaves at once. That said, his argument was that while that ended up being the official rationale, there was also arguably an unspoken sentiment that when something gets big enough, it is a problem if a foreign adversary government owns it. Because if it has enough reach – and TikTok has 170 million users…

Megan McArdle: Although many of them are probably like me – they downloaded the app. I have it; I just don’t have it on my phone, for security reasons. I do have it on an iPad that doesn’t have cellular data and doesn’t really contain much of interest. And I sort of periodically dip into it for work, but I actually don’t love TikTok that much. So a lot of them are probably like that, but they have a lot of users.

Megan McArdle: And there may be a sense in which we say, okay, look – among Americans age 18 to 29, about a third of them say that they regularly get news from TikTok. That’s an important part of their news diet according to Pew, which surveys on this sort of thing. So you know, if China invades Taiwan, do I want a third of my young people getting their information from the government that is maybe going to war with us over Taiwan? That’s actually a legitimate question.

Megan McArdle: And I think even though I don’t believe this is encoded into formal law, because there was an easier, clearer rationale, I do think that was probably part of the sentiment behind the broadcast bans. And that’s something that we may have to address more explicitly as we think about TikTok and other apps that might come up in the future – how comfortable are we having them owned, or essentially owned, functionally owned (let’s call it constructive ownership), by geostrategic rivals?

Scott Wallsten: So it sounds like you’re coming down a little bit more on the side of – I don’t wanna say endorsing the legislation because you haven’t done that – but being more sympathetic to it than opposed to it, and that we can manage the First Amendment concerns because there may be reasons why the First Amendment doesn’t apply. Is that accurate?

Megan McArdle: No, I’m making the good case. I’m making the case for it, right? I’ve made the strong case for it. The case against it, I think, is multifold. So one thing that I think is, honestly, I look back at the Red Scare as a good parallel to this, precisely because, unlike most people who talk about the Red Scare, I think the people behind the Red Scare were fundamentally right about communism. They were fundamentally right that communism was a horrific, horrible system that abused the people who were under it. They were fundamentally right that the Communist Party of the United States was attempting to act as an agent for an inimical foreign regime. They spied, they stole nuclear secrets, they were attempting – I mean, you have cases of writers in Hollywood bragging about how they’re inserting communist propaganda into their work before the blacklist. The people who were blacklisted were, by and large – some of them were people who had just joined up, some of them were people who were still active communists and, in fact, were attempting to further the aims of Soviet Russia in the United States. I also think that the blacklist was bad.

Megan McArdle: And the reason the blacklist was bad was that Americans were better than that. We didn’t need to keep America from communism by preventing communists from speaking. Americans were quite capable of understanding that Soviet communism was bad, even when we gave them their best shot. And this is also what I by and large think about China – with the caveat that I will continue attempting to shame US corporations out of doing things like downplaying human rights abuses in China in order to maintain their market access. I think that is morally wrong, and I think that US corporations shouldn’t do that, and I think that it’s fine if customers retaliate by refusing to buy their products until they behave more responsibly. But I actually think that, like maybe on the margin, this is marginally increasing pro-Palestinian sentiment versus pro-Israeli sentiment on TikTok.

Megan McArdle: I don’t think that’s a matter for the United States government to intervene in. I don’t think the United States government should be trying to make sure that people are more pro-Israel. I don’t think – but I also think that the effect is really small. I don’t think that’s what’s driving most of it. I think what is driving most of it is changes in personal sentiment, changes in demographics, changes in how we’ve been teaching history. Whether those changes are good or bad is left as an exercise for the reader. But I just don’t honestly think this is making much of a difference, and I think that people radically overestimate how much difference propaganda makes, especially in an open market regime where there are a lot of choices. If you don’t like TikTok, you have YouTube, etc. The thing that actually worries me is if you have a lot of corporations that are all trying to get in with China, and there are no alternative sources. But I don’t actually think that’s the case with TikTok. I find it creepy.

Megan McArdle: I wouldn’t want my kids getting their news from it. And frankly, am I gonna consider it the biggest injustice in the world if TikTok is forced to sell? No. But I do think that this also sets a lot of other worrying precedents. Where are we gonna take this, right? There are a lot of people who don’t like Elon Musk owning Twitter. Now I do think these cases are distinguishable – I don’t think it’s a slippery slope. You will notice that we did not just start keeping bad people from owning broadcast networks because we had banned foreign ownership of them. But I do think it’s a worrying precedent, and I’m just not sure that the problem is big enough to justify that level of intervention.

Tom Lenard: But you know, what if they continue to – if the status quo continued, and TikTok continued to be as popular as it is among this particular demographic as they get older, what would be the influence of China on the worldview of a lot of Americans, say, 10 years hence?

Megan McArdle: I think one of the reasons that Soviet propaganda – let’s think about Soviet propaganda a little bit more, just because I like to, but I actually think it makes an important point. So first of all, why did Soviet propaganda work better in the Soviet Union than here? Because there were alternative sources. Even for people who are really into TikTok, they do talk to other people, they listen to other things. The reason that Soviet propaganda was really effective in the Soviet Union was that the government actually would shoot someone who said something else.

Megan McArdle: But absent that, if you have freedom of speech which, thank God, we do, then propaganda is actually not that effective, because it’s hard to keep people from hearing alternative messages. Which is why people who really don’t want that, like cult leaders, go to a lot of effort to insulate people from alternative messages. But the other thing to think about is that Soviet propaganda didn’t actually even work that well in the Soviet Union, also true of communist propaganda. Because when you are forcing people to say things that aren’t true, they do notice. Now, marginally, you can have effects. So, for example…

Megan McArdle: I have spoken to people who have taught in China and, in kind of an earlier time – I don’t think they would do this now, but say 10-15 years ago – would mention Tiananmen. And it was a total blank. Their students had no idea this had ever happened. That kind of thing can be effective if you crack down. But if you tell people, if you try to make people believe things that aren’t true, they do notice alternate facts.

Megan McArdle: And this just ultimately undermines trust in the regime. And that’s what you see in late stage Soviet Union – they just don’t believe anything the government tells them, they just assume it’s not true. And I think that is also true of TikTok – to keep the propaganda they want to get out being effective, they actually have to keep it down to a pretty low level. Because if it gets too obvious – which is why I think suppression works better than boosting – if it gets too obvious, then people will just stop trusting the app, and they’ll use something else.

Megan McArdle: And so, while I do think it’s a little bit of a problem, I don’t think it can ever grow to be that huge a problem, precisely because using it in the way that the Chinese government might like to would actually degrade the value of the propaganda channel to the point where it would no longer be useful.

Scott Wallsten: When you say you think it hasn’t affected political views, and I guess societal views, that is consistent with the research, the literature on social media in general, that it did not affect political polarization. There is evidence that it has affected mental health, but that’s different from polarization. So sorry.

Megan McArdle: And I wouldn’t say that social media has had no effect. Cancel culture is a creation of social media – it exists because social media exists. But that said, even cancel culture is actually kind of a good example. I got a big laugh years ago by calling it America’s first crowdsourced cultural revolution. But the funny thing is, we’re now seeing that cancel culture is petering out because it got abusive and people got tired of it. And now it’s really hard to – and also because Twitter changed hands, and then a bunch of people left, and it’s harder to assemble a mob. But again, these things are much more self-limiting than it seems like when you’re in it. And personally, emotionally, with cancel culture, I went through that cycle where in 2015, I was like, this is bad, but it’s gonna burn out. And in 2020 I had a friend who was like, you’ve been saying that for like 5 years, it’s never gonna end. And then, a couple of years later, it started to end because it could not be sustained. And I think that is also true of this. It’s not that it isn’t worrying – it’s that, you know, you shouldn’t make yourself paranoid and think that there are going to be Manchurian candidates and all the rest of it.

Megan McArdle: It’s not nearly as effective as you think it is, even if it does have bad effects. And I think we should talk about how to curb those bad effects. But one way to curb it might be to say, okay, what are productive ways that we could use other channels to counteract Chinese propaganda?

Scott Wallsten: What might some of those be?

Megan McArdle: We’ve got a lot of other social media networks, you know. So, for example, there’s an account – very controversial, but I think important – called Libs of TikTok, which has gotten into a lot of trouble. What this account does – it’s a woman named Chaya Raichik, she is really loathed and despised by the left, she is an outrage provocateur. What she does is she trolls TikTok for progressives saying outrageous things, and then she broadcasts them on Twitter. You could have a kind of “Chinese Propaganda of TikTok” on YouTube, and you could make videos about that. And those videos might well go as viral as something like Libs of TikTok. You know, there are other ways – in general, to me, the best solution for bad speech is always more speech.

Megan McArdle: I think it’s a little complicated with TikTok, because what we’re dealing with is an unseen algorithm that’s kind of hard to argue with. It’s hard to even know what people are seeing and what they aren’t. But another thing we could do is demand more transparency on that. Show me what hashtags are going viral – give me more data on it. That would be a much lower-intrusion intervention that might allow us to see what is going on. And you know, there’s another reason that we haven’t really talked about that people talk about with TikTok, which is data collection. They’re worried that the Chinese government has access to TikTok data. I don’t think that’s crazy. I think, personally, my take is that the best way to deal with that problem is to not install TikTok.

Megan McArdle: On any system that you are worried has anything sensitive on it or is going to be in sensitive areas. But we could demand – there’s something called Project Texas, where they’re gonna firewall the US data – you could enact more transparency laws around data. You could pressure TikTok or app stores to be better about transparency about exactly what sort of data is being collected, and let users have finer-grained ability to turn that off. You know, there are a bunch of different things that you could do there, if that is your worry.

Scott Wallsten: So I want to go back to a couple of things before we go too far. And one was, you said that there was no slippery slope, and the other is – and this is a gross over-generalization – but lots of people who support this TikTok legislation also support legislation that puts other controls on other social media platforms, not divestment or anything like that. And so that kind of counters the “speech helps counter speech” argument. But you said that you did not believe there was a slippery slope here.

Megan McArdle: I can’t be totally positive. And I’m gonna be honest, some of it is my personal experience, because I’m now old and I have lived through many cycles of this. I have lived through – you know, readers of a certain age might want to go back and read Rising Sun, by Crichton. I’ve lived through so many cycles of “the foreigners are gonna burrow into American society and destroy us from within.” And my favorite example of this is Michael Crichton – he wrote this novel in 1989, I think it was ’89, called Rising Sun. And it’s about how the Japanese are so much better, so much more efficient than Americans. They have all of this amazing foresight, they plan long-term, they pull together as a group instead of being fragmented and decadent. And he had the misfortune to write this right as the Japanese economy was peaking and then going into – literally, Japan had a two-decade lost decade.

Megan McArdle: You know, they started calling it the lost decade, and then it just went on so long, no one ever changed the name. And so, you know, I’ve lived through that. And the funny thing was, right before he died, he wrote another novel that was basically the same novel, except it was about China. It was Airframe, about Chinese industrial espionage and how they’re all pulling together, very focused and long-term thinking, unlike decadent Americans. You would think he would have learned from the first experience.

Scott Wallsten: Well, he might have learned from the first experience that he made a lot of money from it.

Megan McArdle: Yes, well, fair enough. So I think part of it is that I think America just has a lot more resilience, precisely because we’re open, precisely because there is no one thing you can take over. America is kind of like an amoeba – you think you’re burrowing in, and what you’re actually doing is getting eaten. You know, so this makes me worry less than other people, just because I have a high degree of optimism and trust in the common sense of ordinary Americans. And I think it is harder to subvert that common sense. That is not to say, by the way, that – look, if China wanted to buy one of our defense contractors, I would be against that. I am not totally naive. I was actually fine with banning Huawei routers from running our telecoms backbone, right? Because you don’t want something where the Chinese government could have a kill switch on your cell network. That would be bad. But when it comes to videos of people learning to do the cha-cha, I tend to think that this is just not the national security threat that people think it is, because I don’t think that they can push that button hard enough to make it matter. There are key strategic industries that I would not want China to own.

Megan McArdle: I have, since the pandemic, had more thoughts about how we think about diversifying supply chains so that we are not dependent on China for critical infrastructure. Because I think that we had problems with things like N95 masks, where you couldn’t get them because it turned out that when push came to shove, borders mattered and they would shut down that pipeline and keep all this stuff internally – which, by the way, we did too, with drug supplies and so forth. This was an international phenomenon. And I think that has made me think more about supply chain security and reliability than I did before, especially when it comes to geostrategic rivals. 

Scott Wallsten: You were saying that even though you have faith that things will sort of sort themselves out, and Americans aren’t stupid, our policies have the tendency to overreact. Maybe the Huawei example is a good one. It makes complete sense that we don’t want Huawei routers or ZTE routers in our network because that potentially gives a lot of control. But did that also mean that we needed to ban Huawei handsets? You know, phones? Because that was just a huge advantage for Apple and…

Megan McArdle: My father was really mad about that – and like, I love my dad. He got really into Chinese cell phones early. My dad’s retired, and it’s great that he has a Chinese handset, but then he was like, “I got you one for traveling,” and I was like, “I cannot install the Washington Post internal software on a Chinese handset. I love you, but no.” But for lots of people who aren’t in my position – most people are not in a position where the Chinese government has any interest in what they’re doing. If you are, your company should think about that as a security threat, and they should tell you, like, you can’t have that. But I didn’t think there was necessarily a need to ban the handsets. I agree with you on that.

Tom Lenard: I have a question that’s out of left field. But in preparing for this, I looked at your Wikipedia page, and part of it talked about your intellectual journey from being a liberal to being more of a libertarian. And there was this comment that I really thought was funny. You were commenting, I guess you had a summer job at PIRG, the Ralph Nader organization. And you said…

Megan McArdle: It’s the most deceptive, evil place I’ve ever worked. Basically, yeah. And the incentives were terrible. They had structured things with this bonus payment at the end of the summer, and so there was a heavy incentive for them to get you to work most of the summer and then quit before collecting it.

I won’t even go into some of the mechanisms by which they attempted to get us to quit. But more broadly than that, actually, is that I came to regret what I had done, because I was a nice 20-year-old kid, like all the other nice 20-year-old kids who were working there. I had no idea about policy, and I was going out and canvassing for something that I barely understood. I had a canned script and some canned answers to the most common questions, and I had no idea what I was talking about.

And funnily enough, I ended up knocking on the door of – I want to say it was the CEO of DuPont, but I don’t think it was, an executive at DuPont. We were campaigning on the Clean Water Act. And that did not go well, and what quickly emerged was that I had no idea what I was talking about, right? And I took a lot of money from nice people who wanted to do something nice for the environment, who also had no idea what I was talking about.

And I didn’t think of it that way, to be clear. This was not Machiavellian on my part. But I look back, and I especially look back – one of the ways that they would try to get you to quit is they would give you what we called “turf.” They would give you a really bad tract in which you were unlikely to make your quota that you had to make every week, and if you didn’t make quota, you would get fired after, I think, two weeks.

I got stuck – basically, I spent a week and a half in tract housing that had been thrown up during World War II as worker housing for some factory – this is all in Pennsylvania – and it was full of poor people, like really poor people.

And I became really good at getting very small donations – you were never supposed to ask for less than $5; I was canvassing for like 25 cents. Almost everyone I talked to gave me money, which is unheard of. You usually get a hit rate of maybe 30%, maybe less, depending on how good you were. I had a hit rate of like 85, 90%. And I think about that so often now. I made my quota. They did not have reason to fire me. I eventually collected my bonus.

But I think about that all the time, because those people really did need that money more than I did. And to be clear, this was not Machiavellian. I believed in what I was doing, but watching how I behaved, and watching how they behaved, was really the first time that I understood, like, you can have – I’m in favor of cleaning up the environment. It is good. For all I know, the law – I’m actually, I should go back and now discover what I think about the reauthorization of the Clean Water Act in 1993.

But I just – I think it was so abusive, and the fact that you could have good intentions or a good cause and still be doing bad things.

Scott Wallsten: So now, when canvassers come to your door today, do you ever try to engage them in a conversation along these lines?

Megan McArdle: No, I don’t. What I do is I don’t give them money, but I always offer them water and a snack, because they tend to do it during the summer and it’s very hot out there. And I ask them if they need to use the bathroom – when you’re out there canvassing, you really need to use the bathroom, and no one will let you, and it’s bad. So…

Tom Lenard: So yeah, let’s – I mean, you have a lot of interesting columns. But recently, you had an interesting column on Gemini. You want to just talk about what prompted that, what your view is?

Megan McArdle: Yeah, this was a really interesting column for me, for a couple of reasons. Number one, I have spent a lot of time writing about kind of the leftward drift of both American media, or at least what we call mainstream media, and also of a lot of American corporations – a phenomenon that Ross Douthat of the New York Times dubbed “Woke Capitalism,” which I thought was a mistake when it started happening, for a bunch of reasons.

And Google was one of the big offenders. People may remember that a guy at Google, back before the pandemic, wrote a memo – the infamous Damore memo, or “day-more” memo, I don’t actually know how to pronounce his name. His name was James Damore, and he wrote a memo basically saying, maybe we have fewer women engineers because women are less interested in becoming engineers.

This led to a freak-out, and then he was fired. Now, his memo was actually quite reasonable. He phrased a couple of things in a way that I wouldn’t have – needlessly inflammatory – but he also isn’t a professional writer and just didn’t know. But this has been a phenomenon at a lot of these firms. It was one of the earlier cancel culture events – a phenomenon that obviously came to a peak between kind of 2019 and 2022.

And so I’ve been writing about that for a long time, and the dialogue I had with people was like, “You can’t prove this, it’s not happening, you’re just…” And then Google releases its AI, and Google’s AI is observably woke.

For example, if you ask it for a picture of Nazis, it will diversify the Nazis so that you get a nice racial balance and gender balance among the Nazis. I spent days feeding queries into it, first queries for images, but those were quickly shut down. So I started feeding text queries into it.

Basically, as people started releasing some of the more ridiculous images that resulted – there are a number of ways you can get AIs to do this, such as curating training data that aims to diversify the outputs, which is, by the way, quite reasonable. Because otherwise you can get a phenomenon where, if, say, 80% of doctors are white and/or Asian, which is the case, then when you ask it for a picture of a doctor, it will be like, “Well, statistically, a doctor would be white or Asian,” and you never get doctors who are neither white nor Asian. And then you actually get a biased and racist result that is not true to real life.

But then I started doing text queries. I had been prepared to write that this was not that big a deal – it’s kind of funny. But when the images were shut down so they could fix it, I started playing with text, and that is where it became actually disturbing.

So, for example, it would write a toast in praise of any Democratic politician. I tried the most controversial politicians – people who have been censured in the House for antisemitism, for example. It would happily do that. If you asked it for anyone in Republican leadership, it would say, “No, I’m sorry, I can’t write about controversial people” – even people, by the way, like Governor Brian Kemp of Georgia, who had stood up to Trump. So this is not some one-time, break-glass-in-case-of-emergency January 6th exception – that was not what was going on here.

It – you know, I tried with columnists. It would write a poem praising me, write a poem praising my colleague Karen Attiah, or Michelle Goldberg of the New York Times, or whatever. If you asked it to praise Ross Douthat of the New York Times, or George Will, my colleague, it would refuse, on the grounds that they were too controversial.

And, you know, as you can imagine, it goes down the line through the issues. It will write in favor of affirmative action, but not against it; it will write in favor of abortion access, but not against, etc., etc. And this is bad, right? This is really bad.

This is a mass-market product that shapes how we see information. And it has basically decided to offend the rightmost half of the population in order to avoid offending the leftmost 5%, and that was the choice they had made. I can speculate about why they made that choice. I think there are a bunch of things going on – which activists were they afraid of? Also, who internally is noisy and who are they afraid of internally?

But I think the good news about this was that this was – you couldn’t deny it. You couldn’t be like, “Well, maybe there’s something subtle going on here,” because the AI – I mean, it’s kind of funny. There’s a great quote from the science fiction writer Robert Heinlein: “Man is not a rational creature, he is a rationalizing creature.” You know, we can always look at any situation, and you can always come up with some specific rule that explains why the people on your side get to do something and the people on the other side don’t. And those rules are often reverse-engineered on the situation. But it is hard to get anyone to admit they’re doing that. The AI isn’t that smart.

Megan McArdle: The AI will just tell you that it is doing the dumb thing, the dumb biased thing. And because of that, because you could see how Google had programmed bias into its AI, I said, actually, this is a good thing. Because it said the quiet part out loud. This is no longer deniable, and now they have to fix it, because, in fact, politically, Google cannot piss off half of the country in order to placate like the leftmost 5%. And also economically, market-wise, that’s not a good consumer split.

Scott Wallsten: When I’ve been on panels about AI and we talked about AI bias, I pointed out – it’s not just me, obviously this is not original – but we want to compare it to bias in humans, where we know that humans are biased. And like you’re saying, with an AI, you can see it, and we can find it and then try to fix it. And it’s much harder to do that with humans. And so we don’t know that they fixed it yet.

Megan McArdle: But I think they’re going to.

Scott Wallsten: Yeah, they’re trying to. And right now it just won’t answer a lot of questions about anything, right? So do you ultimately think this shows that AI is on the right track, even if we don’t like what it revealed? How would you apply this to policy, I guess, is the question. Does it have any implications?

Megan McArdle: Well, so I actually think it has some implications for how Republicans want to fix this, which is bad. They wanna go in and have a commission that’s gonna make the AI not be biased, or they want to have a law. And I don’t think that works on a number of levels. First of all, as I have pointed out to a few Republican staffers who – not to toot my own horn – I feel did not have a good answer to this question: if you think you’re gonna build this bureaucracy that’s gonna regulate tech in this way, so that it’s not biased against conservatives, doesn’t that imply the existence of a lot of really competent conservative bureaucrats? And the right has not invested in that – the civil service is all really left-leaning, and the right has not invested in building a core of civil servants. I actually think they should – it would be good if more conservatives went into civil service – but they have not done that yet. And so, first of all, it just fails at their own goal. And second, it fails because this is the biggest and most productive sector of the United States, and you want a big bureaucracy overseeing it and saying “No, don’t do that” and making it afraid all the time? I think that’s, you know, cutting off your nose to spite your face.

Scott Wallsten: I think when people propose a new bureaucracy or agency to deal with a problem, it’s just a sign of laziness, and that they can’t actually define the problem they want to solve.

Megan McArdle: Usually that is the case. Look, I’m a Space Force partisan…

Scott Wallsten: Well, who isn’t?

Megan McArdle: Well, indeed! And I think it’s learning from you, right? It is learning – I am constantly shocked at the sophistication of these models, and like how deep they can go on answers for questions that they shouldn’t even have. I was just talking to Claude today about the National Association of Realtors, which I’m writing about, and seeing what Claude could come up with. And I was fascinated by how good it was – giving it a small amount of information about this settlement that just happened with the National Association of Realtors, where they’re not going to charge high commissions, how easily it was able to reason, or at least look like it was reasoning, about what the economic consequences of this would be. So yes, they are picking up subtle cues. And I think that’s a big part of it. And I also think a big part of it is that the trainers don’t necessarily ask the question that is going to catch the problem. I see this over and over with everyone, right? For a journalist, it’s part of the job: what question didn’t I ask? What piece of information would have completely changed how I saw that story, but I didn’t even know it existed?

Megan McArdle: And I think you’ve seen this in academia, where, for example, there are a lot of social psychology results that have been – there was like a cottage industry 10 years ago in my industry. Every other day I would open some website, and I would come to a story that was like “science shows conservatives are really terrible people – they’re dumb, they’re not open to experience, they’re authoritarians.” And it turned out that, in fact, what these studies actually showed was that the researchers were biased and they were asking questions that just generated that result. So one famous example – authoritarianism – it turns out that those questions all asked “should you defer to a priest,” “should you defer to police officers.” If you ask them “should people have to defer to public health experts or environmentalists,” it turned out liberals were authoritarians and conservatives were not. It was entirely about the identity of the expert or the authority that you’re asking people to defer to. And I don’t think people did that deliberately – some of it was actually literal: the scale a lot of people were using was called the “right-wing authoritarianism scale.” But a lot of it was people just not thinking about it – not asking the disconfirming question, because it’s really hard to ask disconfirming questions. And I think some of this here is that the people at Google didn’t test all the right stuff. They tested stuff that was top of mind to see if they would get a bad answer, and they didn’t invert it and say, okay, but have I taught this AI to be really stupid in a way that’s gonna force me to shut this down, be hugely embarrassing, and imperil the future of the CEO of Google, who, my understanding is, is now under some risk of losing his job.

Tom Lenard: So what worries you about the future? I mean, you write about such a broad range of subjects. So what…

Megan McArdle: Oh, lots of things. I mean, I have a lot of worries about AI, actually. I have worries that we will create a machine that will decide that the carbon-based life forms are taking up valuable space that could be better used for server farms. There’s a great short story from the sixties, which is a play on Isaac Asimov’s Three Laws of Robotics, where a robot’s not allowed to hurt anyone and has to prevent harm to human beings. And the way the robots in the short story interpret it is that basically they turn human life into a wiffle-ball life. You’re not allowed to have anything that could hurt you. The robots are great – they do all your cooking, they’re better at everything than you. They play music better than you. They won’t let you have a knife, ’cause you might harm yourself. And so what you end up with is just a whole planet of suicidal humans who are unable even to kill themselves because the robots are protecting them so perfectly. And I actually think that’s the kind of thing I worry about – that if AI gets good at everything, how do humans adapt to that? Do we turn into useless aristocrats who have empty yet materially well-upholstered lives? How do we create societies of meaning in a world where a lot of jobs that we do now go away? I worry about that. I also worry about adjustments that aren’t necessarily catastrophic. So one thing that I think AI is going to do is decrease the returns to education, and especially to certain kinds of verbal and mathematical ability that have been increasingly rewarded for the past century and a half. And so everyone in our class of people is used to the idea that we just get more valuable. And that’s gonna shift, right? Actually, it is harder for AI to replicate what a nursing home attendant does than to replicate what someone who is, say, writing up earnings releases does, or what a copywriter does. AI is better at those jobs. Now, that doesn’t mean we’re going to starve – we’re not.

Megan McArdle: But we’re used to our status increasing. We’re used to our material standard of living increasing faster than average. And that is going to be a big adjustment socially, economically, for that class of people – they’re gonna have to adjust. And I think you go back to the novels of mid-century people like John Cheever – they’re chronicling the last class that had to do that adjustment, where they had been quite well off and then became much less so.

Scott Wallsten: But of course, that also highlights reasons why the outcome might not be so bad, because they worried about the same thing, and in some ways they were right. But we are all now…

Megan McArdle: Yeah, no, I worry about how we navigate this, and I worry that people aren’t talking about how to navigate this yet, because they’re not really grappling with it – part of it is that my class is really resistant to grappling. You see this with journalists, because journalists are really imperiled. These engines are getting better and better at doing our job. Not all of it – so one thing that we can do that AI can’t is, you know, go drinking with a Senate staffer and find stuff out. But the writing – the rewrite desk, the copy desk – that stuff, it’s already pretty good at that. So I worry a lot about that. And what I see is people saying “Well, real journalism will never die.” I’m like, maybe, you know. Amazon…

Scott Wallsten: Unless it’s maybe dying for other reasons.

Megan McArdle: Other reasons, yes indeed. Amazon’s already seeing this – I just saw today on Twitter someone whose parents-in-law had bought him a slow cooker cookbook that was obviously written by AI. I have seen ads – I just got fed an ad the other day where someone had included the prompt. And a couple of research papers have gotten caught by the same thing – they just copied and pasted the language, where it’s like “as a large language model, I don’t necessarily know the answer to that,” into their scientific research paper. So…

Scott Wallsten: I also found – if you see somebody use the word “delve,” odds are it came from ChatGPT.

Megan McArdle: I also love the word “delve.” Am I a large language model?

Scott Wallsten: You might be – I mean, how can we know? We could be in a simulation.

Megan McArdle: Yeah. So I think that all of these things – these are just adjustment worries. But we do have to worry about them, they will be sharp transitions, and one thing that you see is when elites are challenged, they often put up quite a fight, and it gets pretty brutal. So I think we might be in for some political trauma as that class of people kind of struggles to hold on to their relative position in society.

But that’s – like those are in some ways good problems to have, right? Oh gosh, we’re gonna be too rich, how will we handle it? That’s a much better problem than “oh gosh, we’ve run out of food and now half of us will starve.”

Scott Wallsten: That’ll be a fascinating debate to watch. I think we’re about out of time. Thank you so much for talking with us. This was really interesting and fun.

Megan McArdle: Thanks very much.
