AI and Tech in Europe with European Parliament’s Peter Brown on Two Think Minimum

00:00:14.690 –> 00:00:24.439

Scott Wallsten: Hi and welcome to Two Think Minimum, the podcast of the Technology Policy Institute. Today is Monday, July seventeenth, and I’m Scott Wallsten, President of TPI.

00:00:24.470 –> 00:00:33.290

Scott Wallsten: Today, discussions about AI, and in particular generative AI, seem to be everywhere, with governments thinking about whether and how to regulate it.

00:00:33.440 –> 00:00:51.820

Scott Wallsten: The EU in particular has been aggressive about regulating tech; it’s implementing the Digital Markets Act and the Digital Services Act which regulate how large platforms behave, and the Parliament recently approved a draft AI act which, as its name implies, would regulate artificial intelligence.

00:00:51.860 –> 00:01:12.840

Scott Wallsten: So we’re gonna talk about Europe and tech today, mostly as it has to do with AI, and so we’re thrilled to have with us Peter Brown to educate us. Peter is a senior policy adviser at the Strategy and Innovation unit of the European Parliament. He recently returned to Brussels after serving as a senior advisor of technology policy in the European Parliament’s liaison office in Washington, DC.

00:01:13.040 –> 00:01:33.080

Scott Wallsten: Prior to working with the European Parliament, he’s advised several Fortune 500 companies and many national governments and international organizations on technology strategy and governance in cloud, cyber, Internet of things, AI, and data. He’s been engaged in global standardization policy and IT standards development for more than 25 years. Peter, thanks for being here.

00:01:33.380 –> 00:01:47.370

Peter: You’re very welcome. Good to see you again, too. 

Scott: So why don’t we start off by – just tell us about the AI act. What is it? What does it hope to do, or what does it aim to do, and what is motivating European legislators?

00:01:48.360 –> 00:02:43.350

Peter: Well, I think we have to remember that the sort of context in which the European Union is operating is unlike, say, the US and other geopolitical areas. It has, as part of its treaty, very much a sort of consumer- and citizen-centric approach to legislation. So whilst there are clearly economic and societal benefits to the uptake of artificial intelligence and other emerging technologies, there are concerns about the implications that AI systems might have for fundamental rights, which are protected under, for example, the European Union’s Charter of Fundamental Rights, as well as safety concerns for users when AI technologies are embedded in products and services.

00:02:43.350 –> 00:02:59.150

Peter: So the general approach for legislating on AI has been what the European Commission, in its initial proposal, talked about as a sort of human-centric approach to AI, to ensure that Europeans can benefit from new technologies developed according to the EU’s values and principles.

00:02:59.150 –> 00:03:23.229

Peter: It started off in, I think it was early 2020, with a white paper on artificial intelligence; the European Commission set up a high-level expert group with input from a range of experts, 50-plus experts from the private sector, from research, and from public sector agencies, to look at the whole approach.

00:03:23.230 –> 00:04:17.160

Peter: Their initial approach was to have a sort of more non-binding approach, with a set of ethics guidelines for trustworthy AI and a number of policy and investment recommendations. But it shifted, largely as a result of that high-level expert group, to an approach which said that you needed a set of harmonized rules for the development, placing on the market, and use of AI systems.

00:04:17.160 –> 00:04:33.660

Peter: So that’s basically where the initial Commission position started from, and the legislation, or the draft legislation as it was presented to the Parliament, covered these sort of technology-neutral definitions of AI systems, took a risk-based approach to how AI is deployed, and laid down different requirements and obligations for the development, placing on the market, and use of AI systems within the EU.

00:04:33.740 –> 00:05:03.150

Scott Wallsten: When doing this, did they consider how this might affect the direction that technology develops? I mean, they have this idea that this will push it in a good direction. But we’re still so close to the beginning of this, and we don’t really know where it’s going to go. Did they approach this in sort of a cost-benefit sense, or did they just approach it as, ‘well, these are all the bad things that can happen, so let’s make sure they can’t’?

00:05:07.580 –> 00:05:36.489

Peter: No, I understand that. And I think the European Union has a range of instruments and a range of initiatives, which go from that more protective approach that you’re talking about, where, you know, you tend to err on the side of caution – the so-called precautionary principle that governs a lot of consumer protection, data protection, and privacy legislation in the European Union.

00:05:36.490 –> 00:06:15.799

Peter: But it goes all the way through to investing in innovation; looking at ways that the European Union, through its various industrial policies and research programs, can actually promote new technologies, obviously for the good. And so, in that sense, I think there is this balance to be struck between what might be seen as a sort of preemptive and rather conservative, cautious approach that maybe typifies a lot of, certainly outsider, views of what the European Union does in terms of policy,

00:06:15.890 –> 00:06:48

Peter: with the recognition that this is not something which is gonna go away, that it is emergent, as you say – both AI and the range of other technologies. And as such you’ve got to strike a balance between encouraging that innovation and not sort of killing it before it gets anywhere. But, at the same time, trying to stay faithful to those guiding principles which are enshrined in the European Union’s treaties and Charter of Fundamental Rights. 

00:06:48 –> 00:07:26.330

Scott: So, you know, kind of embedded in your answer – and I don’t know, maybe I’m not hearing it the right way, I don’t know if you intended it this way – but it also sounds like there’s a little bit of protectionism in there. And of course that’s always what Americans say when they hear about various European rules and regulations: that you want to promote innovation and the development of, I assume, companies and organizations in Europe. But I actually haven’t heard that as part of the AI Act, and I don’t know what companies it would be benefiting in Europe. Did you mean it that way? I’m using the word protectionist in a negative sense.

00:07:26.340 –> 00:08:33

Peter: I don’t think it’s, you know, trying to wall off a European market against the sort of aggressive American big tech firms, or whatever – the sort of bogeyman that is sometimes all you hear about. And when you look at the key legislators involved, both from the European Commission, who initially proposed this, and the lead legislators in the European Parliament that have been promoting the draft in a way that sort of crystallizes many of the European Parliament’s priorities, you’ve got a group of people that are clearly tech-savvy, on the ball in terms of where technology is heading, understanding some of the core issues at heart. And without, you know, any offense to the US Congress, but what we’ve seen over the years in the US Congress – I mean, its demographic is different. The average age of members of Congress is probably higher than it is for the European Parliament. There is a bit of a generational gap.

00:08:33 –> 00:08:36

Scott: The average age of Congress is higher than that of most nursing homes, so…

00:08:36 –> 00:09:35.600

Peter: I wouldn’t dare comment on that. I have to be diplomatic, but I think there’s a grain of truth in what I’m saying. I remember the very first hearing with Facebook, you know, and some of the almost cringe-worthy questions from some of the members of Congress which showed that their understanding of technology was very superficial. They were sort of talking to the gallery without really understanding some of the issues here. 

I have to say – and in terms of my own age and experience I’m probably closer to the age of the average US legislator than the average European Parliament one – I’ve been very impressed that the key legislators on the European side, both in Parliament and the people pushing it from the European Commission initially, are really on the ball. They really understand the technologies in depth.

00:09:35.600 –> 00:10:25.239

Peter: And I think it’s far from their wishes to create a sort of walled garden, you know, a protectionist approach for the European Union, but rather to establish a good balance between innovation and those requirements which are really, you know, central to the European Union’s mission of defense of human rights, defense of the individual, of customers, and of citizens. So I think the fact that you’ve got a pretty well informed legislature there has been a benefit here. There are quibbles about details of the policy, that’s clear, but I think most of the people that I’ve spoken to who have engaged with the Parliament and the Commission over the last 2 years have been impressed by the level and quality of debate that we’ve seen.

00:10:25.570 –> 00:11:04.090

Scott Wallsten: This is a good chance, I think, to step back a bit and maybe you can tell us a little about how bills become laws in the European Parliament. I mean a lot of our listeners know in great detail how that happens here. You know, Senator so and so is supporting this because they need x, y, and z. And we know the nitty gritty, although to me the Schoolhouse Rock is still the canonical explanation of how a bill becomes a law, but, you know. Tell us a little bit about what is the European Parliament, and who are the actors that matter there, and what it is that they want.

00:11:04.580 –> 00:11:43.919

Peter: Okay, yeah. So, I mean, from a maybe slightly dry, academic point of view, it’s a relatively easy cascade of things to get our heads around. We start with the treaties of the European Union. This is unlike a national legislature, where, basically, if an elected member decides that they want to propose a piece of legislation on whatever, the legislature is sovereign in terms of what it wants to legislate on.

00:11:43.930 –> 00:12:15.249

Peter: In the European Union’s case, there is a very clear and very detailed description of what the competences of the European Union are, particularly vis-à-vis the authority and competencies of the individual Member States; and therefore a large part of the treaty – particularly the so-called Treaty on the Functioning of the European Union, one of the core elements here – lays out very clearly what the European Union can and cannot do.

00:12:15.250 –> 00:12:39.980

Peter: And in sum, there are certain areas of policy which are explicitly laid out. So, for example, agriculture policy, industrial policy, competition policy, international trade, energy. These are areas which are explicitly covered in the treaty.

00:12:40.640 –> 00:12:58.199

Peter: Others are a little more sort of indirect, like the so-called completion of the internal market, which means realizing in practice this theoretical goal of having a single European market of goods, services, people, and flow of capital between the EU Member States, without any friction, without any barriers. And that

00:12:58.340 –> 00:13:29.009

Peter: so-called legal basis – now, I’ll come back to that in a moment – gives a lot of wiggle room for legislators to say, well, we think this particular piece of legislation, or this particular idea, is important for the functioning of the European internal market, and therefore that’s our basis for working on it. In addition, and this is relevant particularly for the AI Act, you’ve also got references in the treaty to privacy and particularly to data protection.

00:13:29.130 –> 00:14:06.279

Peter: So you’ve got the treaties to start with, which lay down what the European Union can and cannot do.

Then the actual process starts with the European Commission. Interestingly, and again in contrast to many national legislatures, the European Parliament, as a legislature, does not have a right of legislative initiative. In other words, anything which is proposed as a future EU law has to be initiated – with one or two very minor exceptions – by the European Commission.

00:14:06.480 –> 00:14:24.629

Peter: The European Commission does this in 2 ways. Firstly, it examines what needs to be done in a particular area, what it thinks should be done, and looks for the basis in the treaties, the so-called legal basis, to act and say, ‘Okay, we can. We have the authority to work in this area, and we will propose something’. And second –

00:14:24.900 –> 00:14:46

Scott: What does that mean in practice? I mean, how does that affect the agenda that’s proposed and the rules that end up moving through the Parliament? I guess, who are the people that set the agenda?

00:14:46 –> 00:15:10.110

Peter: Okay, so, what you have is the European Parliament elections every five years. Then Parliament has a role, together with the Member States, to nominate, and to have confirmed, a new President of the European Commission; who then, once nominated, goes about finding the other members of the Commission, the so-called College of Commissioners. There is a sort of

00:15:10.250 –> 00:15:31.379

Peter: negotiation, then, between the Member States and the Parliament to establish a College of Commissioners who will be given responsibility for particular portfolios, as with individual secretaries or ministers in a national government.

00:15:31.560 –> 00:16:06.049

Peter: Now, that College of Commissioners, once confirmed – just as a little aside, you know, the confirmation process was actually inspired very much by the confirmation processes in the US Congress for senior executive officials – once that Commission is established, it has an obligation, not a legal one under the treaties but a sort of political obligation, to lay out a program of work of what it thinks its 5-year mission should try and achieve.

00:16:06.280 –> 00:16:24.990

Peter: It then breaks that down into what’s called an annual legislative program. Year by year, normally around September, the Commission presents a program of work for the following calendar year, saying, ‘These are the areas that we think are important’.

00:16:25.000 –> 00:16:57.010

Peter: Now, for this current legislature, 2019 to 2024, the Commission laid down 5 sort of main pillars of work, one of which was the so-called digital agenda for Europe. And under that digital agenda, in each of the annual programs that we’ve seen coming from the Commission, there have been a whole range of legislative proposals. So that’s the high-level political framework.

00:16:57.210 –> 00:17:22.650

Peter: Then that is put into practice with the Commission making individual legislative proposals based on a legal basis in the treaties, as I mentioned. They make a proposal and it comes to the Parliament as the one branch of the legislature and to the Council of the European Union, which represents the Member States.

00:17:22.670 –> 00:17:51.110

Peter: It’s a bit facile of me to make a comparison with the House of Representatives and the Senate, but there are some parallels. Basically, the Parliament, together with the Council, has to take the initial proposal from the Commission and agree with each other – the Council and Parliament – on a position; and, you know, they each, on their own side, make amendments and propose changes to the Commission proposal.

00:17:51.110 –> 00:18:22.620

Peter: They then go into the equivalent of a conference committee in Congress to negotiate and come up with the final text. And once that process has been finished, the draft legislation, which was initiated maybe up to a year or longer before by the Commission, gets to the point of being signed off as law, often with a specific date for entry into force of the law – and, as in many of the tech areas, dates or deadlines by which certain provisions of the law have to be enacted.

00:18:22.880 –> 00:18:44.549

Scott Wallsten: So, for the AI Act, who are the key people? You know, sponsoring it, involved in it, pushing it. I mean, here if a bill is from Elizabeth Warren, people will think one thing, and if it’s from Ted Cruz, people think another thing. What’s the analog here for the AI act?

00:18:45.030 –> 00:19:24.149

Peter: There isn’t a comfortable and easy analogy to make, for 2 reasons. One, it’s a large, complex, multinational institution, with 27 Member States, I think 107 national parties represented within the European Parliament, and 700-plus members. Most of the members of Parliament are organized into transnational party political caucuses – the 7 major caucuses.

00:19:24.350 –> 00:19:54.730

Peter: So there isn’t a simple majority and minority, you know, with the chair of a committee and a ranking member like you have in the House and Senate. Because no single political family has a plurality or an absolute majority of votes in the Parliament, there has to be compromise, and that compromise is normally a cooperation between those, or a number of those, political caucuses – those political groups, as we call them.

00:19:54.940 –> 00:20:13.000

Peter: That in itself, I think, is a big difference. Let’s take, for example, the AI Act. The proposal comes from the Commission, and the Parliament, first of all, decides which should be the lead standing committee or committees that should take responsibility for the piece of legislation.

00:20:13.130 –> 00:20:46.970

Peter: Then, once that’s determined, within each committee they will decide, through negotiations and discussions, which members or which political group should take the sort of leadership positions, the key legislative positions – the position we call, rather elegantly, using the French word, ‘rapporteur’ – who becomes the sort of spokesperson for that committee, and is the main negotiator and lead legislator for that committee, and who will have been agreed upon between the political groups.

00:20:47.060 –> 00:21:15.689

Peter: Behind that lead legislator there are what we call the shadow rapporteurs – basically, the other political groups nominate somebody as well to keep an eye on the lead rapporteur and to advise, push back, and negotiate, particularly when there are positions where there isn’t a broad consensus, and to work behind the scenes to try and build that consensus.

00:21:15.830 –> 00:21:53.950

Peter: So a lot of the process in Parliament, distinct from the sort of bipartisan approach that, for better or worse, you see in many national parliaments, means that a broad consensus is necessary, because the numerical reality today in the European Parliament is that nothing will get adopted unless it has the support of at least 3 of the 7 political groups – assuming that they follow their respective party line or internal party discipline, which is not always a given.

00:21:54.320 –> 00:21:58.370

Scott Wallsten: What does it mean for the draft legislation to be approved? I mean, is that a signal that it is almost certain that this will become a law?

00:21:59.780 –> 00:22:31.079

Peter: Not necessarily. We have had situations where the Parliament has come to a consensus on something, and its position is very different from that of the Commission; and the Council, say, representing the Member States of the European Union, might have a very different position. The negotiations may or may not get to a point where they have a compromise, and could break down, and you end up with no legislation at all.

00:22:31.230 –> 00:23:01.459

Peter: In most situations there is a final position. Where we are at the moment with the AI Act, for example, is that the Commission made its proposal, and the Council adopted its position, I think, in December last year – relatively quickly. Parliament took much longer and, I think, went into more detail, maybe, than the Council.

00:23:01.470 –> 00:23:24.259

Peter: It adopted its position in June, and having done so, that Parliament position is effectively the negotiating mandate now for the leadership of the Parliamentary committees that are driving that draft. That’s their negotiating mandate to go talk with the Council to try and agree a final compromise on those areas where there is still disagreement.

00:23:25.240 –> 00:23:50.409

Scott Wallsten: So let’s come back to the act itself for a moment. How will we know if it turned out to be successful, if it passes? I mean, if Skynet doesn’t take over, will they say, ‘well, there you go, it all worked out well’, or do they hope to see – well, what is it that they hope to see? How will we know if it’s successful, if it succeeds or fails?

00:23:51.630 –> 00:24:26.610

Peter: It’s a good question. I mean, the cynical side of me says, well, if it doesn’t work, political leaders, whether in the Parliament or in the Commission, will say that’s because you didn’t fully support the position that we’d argued for; and if it succeeds, they’ll say it’s because you did accept the compromise that we put forward. Trying to assess criteria for success in this sort of thing is always going to be difficult.

00:24:27.000 –> 00:24:52.490

Peter: That said, I hesitate, but I think it’s necessary, to make the comparison with, say, GDPR, which was the last big sort of tech-focused horizontal piece of legislation – horizontal in the sense that it has broad application across all areas. And I think most people, and I think the Parliament would be amongst the first, would say

00:24:52.540 –> 00:25:24.319

Peter: we didn’t get it all right. We set a very high bar for what we thought was necessary to protect the fundamental rights and privacy of European citizens, and we didn’t get it all right, particularly in areas of international cooperation – witness the fact that we are now seeing a third attempt to get an agreement with the US on the exchange of personal data. Everyone’s crossing their fingers and hoping that it’ll stand up in courts of law this time.

00:25:24.320 –> 00:25:58.019

Peter: But clearly it was a very ambitious project, and a lot of it worked, and there were a lot of complications in its application. And I think there are people quite legitimately saying, you know, we could have done better, and there are other things we should have addressed there. I haven’t spoken to anybody – legislators, staffers, or anybody – who believes that the AI Act is going to be the sort of be-all and end-all of legislation in this area.

00:25:58.090 –> 00:26:24.999

Peter: The approach was rather to identify the core concerns, particularly about risk: what is an unacceptable risk in the potential deployment of AI systems, where you cross a red line – or you don’t want organizations crossing a red line – and you’re saying, you know, we will not do this in the European Union.

00:26:25.110 –> 00:26:42.329

Peter: And then, for the rest, trying to identify what is low or minimal risk, which requires no intervention; and what is limited risk, which may need either some self-certification or attestations by providers of solutions, saying, well, we’ve

00:26:42.330 –> 00:27:27.219

Peter: looked at the AI system we’re deploying, and, you know, we’re being transparent with you and sharing what we think is how our system works, and we don’t think there is a big risk. And of course, the big focus is on the regulated high-risk AI systems, where there are much clearer approaches in terms of what needs to be done. And I think it’s precisely in that area where, in terms of identifying not just success criteria but whether it’s been successful or not, I think it will not be easy to judge but can be assessed, because

00:27:27.250 –> 00:27:46.549

Peter: everything falling within that high-risk category is going to be subject to a series of conformance criteria and proofs by the producers of the AI systems that they are in conformity with the provisions.

00:27:46.960 –> 00:27:51.759

Scott: So what’s an example of a very high risk AI?

00:27:52.610 –> 00:28:35.309

Peter: So, let’s think. One of the most controversial areas has been the use of biometrics for identifying people – you know, natural persons. There is a sort of red line that the Commission laid down and the European Parliament strengthened, which was about the use of biometric identification in real-time situations, where AI is used in real time on a large public – say, for example, in the public space, in a sports stadium, in a shopping mall, or whatever. That sort of

00:28:36.000 –> 00:29:11.269

Peter: real-time biometric identification of individuals was unacceptable in terms of the potential infringements on private life. However, even in the high-risk areas, they’re saying, well, biometric identification can be used, and there are plenty of situations where it’s actually of benefit both for the individual and for the organization using it. I mean, think of things like border control management and boarding planes.

00:29:11.320 –> 00:29:33.590

Peter: I mean, on my most recent arrival in the US, you know, the border control official just took a photo of me, matched it against my passport on file, and saw that I’m the person I claim to be – no more questions, I’m through. So there are clearly benefits to biometric identification.

00:29:33.720 –> 00:29:58.819

Peter: However, if you start using biometric identification to categorize people – you know, particularly protected groups – categorizing people by race, by ethnicity, or by gender, and then using that identification for other purposes, then you get into that sort of uncomfortable area where the European Union is saying, no, we don’t like that.

00:29:59.210 –> 00:30:49.909

Peter: So biometric ID, I think, is one of those areas; it’s one of the 8 specific areas laid down in the draft act, which covers in detail the sort of provisions that people need to implement in order to stay on the right side of the law there. But there are others, you know: the operation of critical infrastructure, the field of education, law enforcement, the administration of justice, democratic processes, employment.

00:30:49.920 –> 00:31:02.560

Scott Wallsten: How have different groups reacted to it, from, you know, companies to public interest groups? I mean, here, when you hear from any major company, any organization that is doing AI, they will say we think there should be some regulation – and, of course, be much less specific as to what.

00:31:02.610 –> 00:31:19.790

Scott Wallsten: How do they engage with the Parliament? I guess it’s 2 questions. What does lobbying look like there? Not just from companies, but from all groups. I mean, lobbying has a bad name but you need to have some kind of interface. And what have the different groups thought about it, and how have they tried to influence it?

00:31:20.250 –> 00:31:47.069

Peter: Right. The first one is relatively straightforward in the sense that, yes, lobbying is there. I don’t think it has such a bad name as some people might think. We recognize that different interest groups want to promote their interests and want to peddle influence where they can. What we do have is a so-called transparency register, which requires that anybody who is lobbying

00:31:47.070 –> 00:32:09.420

Peter: the European institutions, and their individual components in particular, is registered. So you actually have a register of lobbyists with a declaration of their interests. You know where they’re coming from, and who they represent if they are not representing themselves directly. There is an element of transparency already in the lobbying process, which is, I think, good.

00:32:09.520 –> 00:32:45.390

Peter: I personally think – and I think this is shared by a large majority of people in the European Parliament – that the rules about how and under what circumstances lobbyists can access elected members and influence them maybe need tightening up, or need to be clarified more. But the general sense is that lobbying is part of the daily work of the institution. I don’t think anybody would see it otherwise.

00:32:45.430 –> 00:33:08.000

Peter: So in that sense, I think it’s there, and we recognize it, and people work with it. Your second question, in terms of how that has manifested itself in the AI Act – you can actually see, and again it’s a matter of public record, in the European Parliament’s resolution, which lays down

00:33:08.440 –> 00:33:39.929

Peter: its point of view and its proposed amendments to the draft legislation, in the annexe to that, there is actually a complete list of every single organization that has lobbied the Parliament during the passage of the piece of legislation: who they’ve spoken to, when the meetings have taken place, and who’s been involved. So again, there is an element of transparency there, which I think is helpful.

00:33:39.930 –> 00:34:09.370

Peter: But it also highlights the vast range of organizations that have been approaching the Parliament. We’ve had everybody, from the European Association of Consumer Bodies right through to the big tech companies themselves, or trade associations, and others who represent those companies. So we’ve had the whole panoply of different representations.

00:34:09.540 –> 00:34:44.950

Peter: I think, in terms of the substance of what they have lobbied on – well, first of all, I think a lot of private sector bodies are naturally averse to regulation. They believe in a free market; they want it to operate with as few impediments as possible. That does fly in the face of a lot of the core principles of the European Union. But,

00:34:44.980 –> 00:35:12.440

Peter: generally, private sector entities will obviously try and lobby to minimize the regulation. That said, when it’s clear that there is going to be some regulation, I think the biggest thing they want – without speaking for them, that’s their job – is clarity and lack of ambiguity.

00:35:12.520 –>  00:35:44.839

Peter: And as long as the rules are clear and they apply to everybody – whether they like what the rule is saying or not – then, as long as it’s coherent and fair in terms of applying across the entire market, there’s a recognition of ‘okay, we all have to play by the same rule book, and we can deal with that’. Again, with reference to GDPR, I thought it was interesting that a lot of the big tech companies were very skeptical about the GDPR and about its initial drafts,

00:35:44.900 –> 00:36:00.490

Peter: but once it was approved many of the US big tech companies – Facebook, Microsoft, and so on were amongst the first to fully conform with the provisions of GDPR. And they said, ‘well, if this is what the law is going to be, we want to be there and conform with it as soon as possible.’

00:36:00.810 –> 00:36:12.199

Scott Wallsten: I mean, that’s not really surprising, right? One of the effects is to make entry harder. So of course the big guys like it.

00:36:12.490 –> 00:36:46.460

Peter: Exactly, and this for me has always been a bit of an irony about EU legislation in the tech area. And, you know, I don’t have too much reservation in saying this, but my opinion was very much that there was a risk with GDPR – which I think has been proven to be legitimate – which was that the people who are going to be able to conform the most easily are those who have the economies of scale built into their organizations to be able to do so, and

00:36:46.690 –> 00:37:07.679

Peter: it costs a lot of money to conform with GDPR – a lot of effort, legal, technical, and other changes in organizations. That’s clearly easier in a large multinational corporation than it is for a small or struggling medium-sized enterprise.

00:37:08.490 –> 00:37:32.289

Peter: And I think, again without being out of line, my biggest concern for the AI Act, as with a number of other policy areas in tech – and I’m not saying this is the case, because it clearly isn’t in many areas – is that if one of your objectives is to try to sort of

00:37:32.850 –> 00:38:03.949

Peter: rein in the sort of overbearing power of a lot of the big tech companies, you have to be careful how you go about that, because you could end up just hurting your domestic or other small businesses that want to try and get a foot in the market, because they just don’t have the wherewithal to do the assessments, the conformity, the certification regimes that are laid down by these various pieces of legislation.

00:38:04.050 –> 00:38:35.350

Scott Wallsten: So to push you a little further into the danger zone in terms of keeping your employment – and I’ll just be very blunt while recognizing that this is actually a nuanced question – a lot of people will look at this and say, ‘okay, Europe keeps passing these rules on the digital economy, and, you know what, there are no big European tech companies, and with a very small number of exceptions, tech innovation does not come from Europe anymore.’

00:38:35.360 –> 00:38:44.250

Scott Wallsten: Why does the European Parliament look at this landscape and think, ‘You know what we need? We need more of what we’ve done that didn’t work.’

00:38:44.550 –> 00:38:53.000

Peter: First of all, I’ll push back on what you are saying. 

00:38:53.000 –> 00:38:54.000

Scott Wallsten: Yeah, please. 

00:38:54.000 –> 00:39:44.180

Peter: I think Siemens, SAP, Nokia, and plenty of other corporations in Europe would disagree with you that there isn’t a European tech market. I think there is a historical issue there, which is not an EU problem or potential solution, which is, yes, there has been historically a lack of investment in emerging technologies within individual countries in the European Union. Or there’s been a divestment of that early innovation – you can see that from ICL in Britain right through to, most recently, the acquisition of ARM, which was one of the biggest breakthroughs in semiconductor chip design that the world has seen in the last few decades. We just – Europe or its individual Member States – didn’t see it as a strategic priority to hold on to some of those. And I think that’s part of the problem we have.

00:40:05.140 –> 00:40:42.519

Peter: This issue is recognized by the European Union: we don’t have the economies of scale. We have 27 Member States, each with their own agenda, each with their own priorities, each with their own investment ideas and R&D programs, and what the European Union can provide on top of that is relatively modest compared with what the individual Member States could do, or what all the Member States together could provide. So yeah, there’s a sort of recognition that there is a bit of catching up to do.

00:40:42.590 –> 00:41:13.730

Peter: I don’t think anybody really believes that the way to encourage European tech is by, you know, beating US or other big tech over the head and hoping they just get the hell out of the European market. I don’t think anybody seriously believes that. There are probably a few people who would believe that. But I think there is this sense of tempering

00:41:13.810 –> 00:41:45.090

Peter: the completely free market model that the US has traditionally had with broader societal concerns which are encapsulated in the EU treaties, which does make the European Union as a polity different and have a different balance of interests in how it does policy. So in that sense, yes, it’ll tend to be much less sort of unfettered free market.

00:41:45.350 –> 00:42:28.310

Scott Wallsten: Is there an affirmative tech policy in Europe? I mean these, what we’ve been talking about, are, I don’t want to say that they’re negative, but it’s about controlling tech and how it develops, but are there initiatives to encourage it? So far Threads isn’t available yet there, right? Because Meta is worried about complying with DSA. And if people like Threads instead of Twitter, that’s obviously already consumer harm. So that’s kind of on the negative side.  What’s the affirmative agenda?

00:42:29.120 –> 00:43:12.390

Peter: I think there are a couple of clear examples. One is the so-called Horizon Europe program from the European Union, which is its major tech investment and R&D program, and which has allowed the European Union to stay pretty much at the forefront of certain key areas like quantum computing. That’s not an area where either the US or, for that matter, China has been the sort of unchallenged number one. It’s where I think a lot of European companies are putting their weight.

00:43:12.460 –> 00:43:58.599

Peter: The other area is the European version of the Chips Act – the desire to invest in silicon chip manufacturing and to secure the supply chain that that involves, in such a way that you are able to build, manufacture, and deploy in Europe without the risk of supply chain disruptions, or of having to rely on third-country suppliers to your market. So I think those are a couple of areas. In the area of AI, that’s something that’s starting to be looked at. I think we probably talked about this when we first chatted back in April at the roundtable in Washington.

00:43:58.610 –> 00:44:23.739

Peter: I think the European Union has something in the area of AI which nobody else can really challenge it on, and that’s the whole area of multilingualism. Because of the nature of the European Union and the fact that goods and services can be freely exchanged across the European Union Member States, that poses a challenge, whether it’s just the simple translation of

00:44:23.740 –> 00:44:54.799

Peter: regulatory and conformity statements in documentation, whether it’s in user guides or in real time, in being able to converse with, say, a hospital administrator when you’re on holiday in a third country and you don’t speak the language. So for the European Union to invest in the multilingual issues around AI, I think is something which would be extremely welcome and where there is a clear value proposition that nobody else can provide.

00:44:56.250 –> 00:45:10.049

Scott Wallsten: Okay, I think that takes us to the end of our time, and that’s a good place to stop because it’s a positive note. Peter, thank you so much for talking with us today.
