Jane Horvath on Privacy Policy

Scott (00:00):

Hi, and welcome back to Two Think Minimum, the Technology Policy Institute’s podcast. Today is Tuesday, January 17th, 2023, and this is our first podcast of the new year. I’m Scott Wallsten, president of TPI, and I’m here with my co-host, TPI senior fellow and President Emeritus Tom Lenard. And today we’re delighted to have Jane Horvath with us. Jane recently became a partner at Gibson Dunn, where she is co-chair of the Privacy, Cybersecurity, and Data Innovation Practice Group. She worked at Apple from 2011 to 2022, where she built and led Apple’s privacy legal team, most recently serving as Apple’s Chief Privacy Officer. Before that, she served as Global Privacy Counsel at Google, Chief Privacy and Civil Liberties Officer at the US Department of Justice, and general counsel at Digital City, which was an America Online (AOL) subsidiary. And now she comes full circle, having begun her legal career as an associate at Gibson Dunn. And by the way, that should serve as a reminder to bosses everywhere that people who work for you now might come back as your boss. So, you know, treat everybody well. Jane, thank you for being here today.

Jane (01:03):

Thanks for inviting me to speak with you.

Scott (01:06):

So let’s just talk, you know, at the very, very highest level. You’ve been working on privacy policy for a long time. Tell us, you know, sort of generally, how have approaches to privacy, in popular, legal, and policy senses, changed since you started working on it?

Jane (01:22):

Very good question. When I first started practicing law, there was no privacy profession. I had a computer science degree, I was a technology lawyer, and I went into a then-startup, America Online. And we were served with a search warrant by the FBI on the launch of Operation Innocent Images, and it made front-page news in the Washington Post. It was an investigation- people can use technology for good, and in this instance they were using it for bad, to trade child pornography- but it made front-page news and there were a lot of questions. What was AOL doing with people’s data? Was AOL working with the government and just transferring data? I mean, some of the same issues that we still see and are grappling with today. And at that point I was the junior lawyer, and my boss came in and said, you need to write a privacy policy.

Jane (02:17):

And you know, what’s that? Well, a privacy policy tells people what we’re doing with their data. It was very much about, you know, looking at how we use their data, share their data, et cetera. And it was one of the first privacy policies drafted- not because I was a privacy lawyer, probably because I was a junior lawyer and someone needed to do it. But technology has grown, and more and more of our data has been put online. Europe, of course, was far ahead on this: in ’95 it passed the first privacy law, the European privacy directive. I would say US law is still catching up to Europe. Privacy here hasn’t been as heavily regulated, although the FTC does have authority over unfair and deceptive trade practices, so anything a company says about its privacy practices is enforceable by the FTC.

Jane (03:14):

And we now have some states that are passing laws, and we might see a federal privacy law. In some ways, the lack of regulation has enabled companies to come up with and drive privacy themselves. To a large extent, that’s what we did at Apple. We were looking at the law as a floor, but we were also looking at what was the right thing to do. And so I feel like, because it wasn’t so regulated, it has enabled a lot of really interesting privacy enhancing technologies to be developed. But I think we are at a point where we do need an omnibus privacy law. Having a bunch of states passing privacy laws that are not always consistent will make it much more difficult to transact business in the US without a uniform omnibus law.

Scott (04:11):

So you said a couple of things that I’m not sure go together. You said that Europe- the EU- had the first privacy law in 1995 and that they’re still ahead of us, but at the same time, that the lack of privacy rules here allowed companies to innovate on privacy. So how do those two things go together?

Jane (04:32):

I think they go together. I mean, most large companies are still governed by Europe- the companies I worked for always had to look to Europe. Back then there was the directive, and as companies matured, there was that European floor pushing companies to do the right thing. But it also gave companies the ability to innovate and engineer around some of the problems. That’s one of the most exciting things I saw at Apple. I worked very, very closely with our set of privacy engineers, and I always said that that was a secret sauce. Something could be legal simply because a law is written at a point in time- technology evolves, and it takes a while to amend laws.

Jane (05:29):

But with engineering, you can engineer things and get ahead of the law on what is the right thing to do. I can give you an example. We were discussing what data was lawful to collect off of a device. It was non-identifiable, so it was legal- not controlled by European privacy law- but the privacy engineers could come in and say, do we need to collect all that data off of every device? Could we sample? There was no legal requirement to do that; it was just the right thing to do. And so I think marrying this innovation in the privacy engineering space with the legal requirements and policy requirements is really the secret sauce. So fast forward: we do not have a baseline privacy law in the US. We have states that are enacting various privacy laws, some of which are not consistent with each other, and I think that creates a lot of legal uncertainty. That’s why I’m saying it would be very good for the US, like Europe, to have a baseline omnibus privacy law at this point.
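
To make the sampling idea concrete, here is a minimal sketch- not Apple’s actual implementation- of how a device-side sampling gate might work. The send_to_server helper and the 5% rate are purely illustrative assumptions:

```python
import random

SAMPLE_RATE = 0.05  # illustrative: roughly 1 in 20 devices ever reports

def send_to_server(metrics: dict) -> None:
    # Stand-in for a real upload; a production client would batch,
    # strip identifiers, and encrypt in transit.
    print("uploading", metrics)

def maybe_report(metrics: dict) -> None:
    """Decide locally whether this device reports at all.

    The coin flip happens on the device, so the server only ever sees
    data from a small random fraction of the fleet- the rest of the
    devices send nothing.
    """
    if random.random() < SAMPLE_RATE:
        send_to_server(metrics)

maybe_report({"crash_count": 0, "battery_cycles": 312})
```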

Tom (06:47):

So, I’ve been covering this area on and off for probably about the same length of time as you have- certainly since about 2000 or before. And every year people say we need a privacy bill, and when you survey consumers, they say they want more privacy. But year after year, no federal privacy bill has been enacted. So I guess my first question is, how do you explain that? A defect in democracy, or <laugh>?

Jane (07:19):

I would say it is democracy. But also, we live in a federated system, so the states have decided to act, and California has led the way. California also led the way, if you’ll recall, on data breach notification- they were the first state to pass a data breach law, and now we’ve got, I think, close to 50 states that have different data breach laws governing what you do in the event of a data security incident. It’s really interesting for me to see, because- I’m gonna go again to Europe, because I think Europe is a good example- the Data Protection Directive in ’95 was put in place to encourage trade among all the EU countries, because they felt countries would not trade with each other if there wasn’t some uniform level of privacy protection. And a directive is basically an order for each member state to pass a privacy law based on a set of requirements…

Jane (08:28):

And under the directive, it came to pass that different member states were passing inconsistent laws- each member state was not completely consistent with the others on its privacy law. So that is when Europe drafted and passed a privacy regulation. A regulation differs from a directive in that every member state has to have the same law, and that creates uniformity across the EU. So I feel like we are at the point in the US where Europe was then, with inconsistent regulations that will be less economically efficient for companies that do business in all 50 states. We likewise need a uniform level of privacy protection across all states. It doesn’t make sense that a citizen in a state without a privacy law has less protection than a citizen who lives in a state with one.

Tom (09:34):

But some people, you know, some people say that the reason, you know, virtually all or all of the great tech companies were developed here and not in Europe is at least in part due to the more stringent regulations around data that the Europeans have. So you obviously don’t agree with that.

Jane (09:55):

I really don’t agree with it. I think that’s been an excuse for a long time. It might sometimes be harder to do something, but there are ways you can create a product that does not trigger a privacy issue. You know, you can use end-to-end encryption for storing data or for data processing. You can use device identifiers, you can process data on device. All of those technologies are privacy-friendly technologies, they’re innovative, and they don’t result in a privacy problem. And those technologies are utilized by companies in the US, and there is no privacy law that requires them to do that. I think there has been a lot of hiding behind this idea that we are not going to be able to innovate unless everybody has access to everybody’s data. And I think you’d find that the computer science departments of most universities would disagree with that as well.
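
As one illustration of the device-identifier approach Jane mentions, here is a minimal sketch (an assumed design, not any company’s actual implementation): analytics are keyed to a random, resettable identifier generated on the device, rather than to a stable hardware serial or user account:

```python
import uuid

class DeviceIdentity:
    """A random, resettable per-device identifier.

    Nothing here derives from hardware serials or user accounts, so a
    server can deduplicate events without learning who the user is,
    and the user can break the link at any time by resetting.
    """

    def __init__(self) -> None:
        self._id = uuid.uuid4()

    @property
    def id(self) -> str:
        return str(self._id)

    def reset(self) -> None:
        # Rotating the identifier severs any history tied to the old one.
        self._id = uuid.uuid4()

identity = DeviceIdentity()
print(identity.id)
identity.reset()
print(identity.id)  # events under the old ID can't be joined to new ones
```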

Scott (11:08):

It’s too big a claim to say that any one thing is responsible for all the innovation in a given area. But what sorts of things should we look for to know whether a set of rules encourages efficiency? Like, what could we look for in Europe that would help us know whether their approach to privacy is better- or maybe better isn’t even the right word, but different? What would the effect look like? What should we be looking for?

Jane (11:33):

If you compare European businesses to US businesses and look only through the window of a privacy law, you’re losing all the other regulations that are in effect in Europe- it’s not just privacy regulation. So I think it’s very hard to say that it’s only privacy regulation that has kept a European company from being a champion, because every large American company that operates in Europe is also governed by European privacy law, and has been for a very long time. So I think it’s hard to argue that this innovation deficit between Europe and the US is purely a result of the fact that Europe has privacy laws and the US doesn’t.

Scott (12:25):

Actually, that sort of leads to another question I was gonna ask. I mean, doesn’t it seem like the EU is the de facto privacy regulator now, since so many US companies have to follow it, more or less?

Jane (12:35):

I would say that the US is rapidly becoming an outlier on privacy regulation. China has a privacy law now; so do Japan and Korea. Privacy laws are spreading throughout the world, and it’s rapidly becoming the case that the US is an outlier by not having a baseline privacy law.

Tom (12:58):

So you were talking about the inconsistent state laws, and that was one of the rationales for passing a federal law. But of course, one of the big stumbling blocks for a federal law is the whole issue of preemption. I mean, if you don’t preempt, the problem of inconsistency is still there. So-

Jane (13:19):

Yeah, absolutely, and that has been the big debate. We got very close, I thought- the closest we’d been- this summer, and the big debate was private right of action versus preemption. There was a grand bargain around that that fell apart. But I think it’s not just preemption; it’s also, how do you enforce this privacy law?

Tom (13:48):

Right, right. In the absence of federal legislation, it looks like the FTC might promulgate a privacy rulemaking that would essentially be a substitute for legislation. I guess a couple of questions: first, do you think that’s likely, and would it be a substitute for federal legislation? And-

Jane (14:11):

I think the FTC is very much involved in trying to formulate a privacy rulemaking- they sent out a request for comment- and it will be very, very interesting to see what ultimately comes out. They’re probably focused on their statutory authority to issue privacy rules; they have to have the authority to issue rules, and they can’t go outside the scope of their authorities. So it will be interesting to see. I would argue that their rulemaking most likely would not go as far as a federal privacy law. As we look at a federal privacy law, the one thing consumers feel is that they’re out of control- that they don’t know what’s going on with their data. Looking back to the fair information privacy principles- believe it or not, the US developed those back in the seventies, in the context of healthcare at HHS- they were built around consent and transparency, and they tried to encourage innovation around transparency. I think, you know, we’ve seen with the cookie consents that are popping up all over the world that that’s quite frustrating for consumers; it’s hard for them to understand. So whatever law we come up with needs to, I think, be principles-based, to allow innovation around how to meet those principles.

Tom (15:52):

When I was looking at this issue a while back, it seemed that a lot of the privacy advocates, even, had moved beyond the notice-and-choice type of framework, basically saying that it was infeasible- you know, consumers never read the notices, they just agreed to them, so the notices didn’t really do anything. What’s a substitute for that? Or do you agree with that?

Jane (16:15):

On some level, consent is difficult, you know, and I think there are certain segments of data- sensitive data- that need to have different rules around them. In the EU, you have to have a lawful basis for processing data, and I think outlawing certain kinds of processing becomes difficult. That’s why I’m saying that whatever framework we come up with needs to encourage innovation around that framework. It’s hard to think of what the other option is besides transparency and choice. How does someone understand what’s going on with their data if no one tells them what’s happening and allows them to consent? It’s a harder question, and I haven’t heard a good alternative to notice and choice.

Scott (17:12):

Right. And it seems like it almost becomes impossible- the more transparent you make it, the harder it is for somebody to understand. We end up with these privacy policies that are so long, nobody ever reads them.

Jane (17:25):

Well- and I think a risk-based approach helps. We were discussing earlier the differences between Europe and the US, and I think looking at the riskiness of data matters. I can go back to the cookie consents: you want to open a website, and the website is using cookies for 10 different things, many of which are not at all risky with respect to your data. That risk-based approach is something we could incorporate in the US. I know the UK is revamping its privacy law, and they are trying to move away from these cookie consents, because I think they’re quite irritating to most consumers- they want to open a website, and they don’t even understand what a cookie is. When we’re looking at different levels of data, there is data that is always going to be more sensitive than other data. Your location data, for example, is quite sensitive. You can learn a lot about someone by tracking their location all day long- where they live, where they work, where they frequently shop, their habits. Data that isn’t tied so closely to your daily habits- data that may be pseudonymous- should be treated with a wholly different set of requirements.
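
One way to picture the risk-based approach Jane describes is a tiering table in code. This is a minimal sketch with hypothetical tiers and field names- a real program would derive the classification from legal review, not a hard-coded dictionary:

```python
from enum import Enum

class RiskTier(Enum):
    LOW = "low"        # e.g. pseudonymous diagnostics
    MEDIUM = "medium"  # e.g. account email
    HIGH = "high"      # e.g. precise location, health data

# Hypothetical classification of data fields by sensitivity.
FIELD_RISK = {
    "crash_report_id": RiskTier.LOW,
    "email": RiskTier.MEDIUM,
    "precise_location": RiskTier.HIGH,
}

def requires_explicit_consent(field: str) -> bool:
    """High-risk fields get an explicit prompt; unknown fields are
    treated as high-risk by default."""
    return FIELD_RISK.get(field, RiskTier.HIGH) is RiskTier.HIGH

print(requires_explicit_consent("precise_location"))  # True
print(requires_explicit_consent("crash_report_id"))   # False
```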

Tom (19:00):

So you worked for two of the great companies of the information technology age, Google and Apple. Every company has its own culture- what are some of the interesting differences between the two that people might want to know about?

Jane (19:23):

So, full disclaimer: I haven’t worked at Google for over 12 years, so I was there at a very different time, and my description may differ from the culture now. But when I was there, I like to describe it almost like a daycare center. There were so many different experiments going on, lots of energy. We tried out a lot of different things and released a lot of things in beta, so it was a very public company- there was a lot going on, and the public was very involved with what was going on there. Apple, on the other hand, when I moved over, was a company that had been around for 30 years. It develops hardware. It felt, in a sense, a lot more grown up. It knew it was a business, and more than anything the culture was one of secrecy. There’s always a lot going on at Apple, but you will never know what it is until whatever is going to be released is really ready to be released- it’s not generally released in beta. I would say that’s one of the biggest differences: the secrecy. And again, my knowledge of Google is older, so that’s my observation.

Scott (20:46):

So, you know, we hear about the secrecy on the product side- just from reading biographies and stories about Apple- but how did that affect you working on legal issues? When your job is to prepare everyone for what they need to worry about and to think about how to integrate these issues into the products, how did you find out what people were doing and what you needed to work on <laugh> when the culture was, “don’t talk about it”?

Jane (21:09):

Well, you know, in some ways it is a relationship company, and you live or die on your relationships, so it behooved you to reach out and be a team player. I never felt like I wasn’t included in anything, and primarily that was because privacy was a corporate value- the company was, and is, hugely committed to the issue, and no one wanted to be on the wrong side of it. So I always felt like I was sought out for consultation; our whole team was. And because of the secrecy, it was really fun: no one knew what you were doing, and you were really excited about what was being developed. I got to serve that role of the consumer- have you thought about this, or thought about that, and do you think that’s creepy? The corporate culture was one of excitement about bringing out a new product and rolling it out, and you felt like you had a stake in that. As a lawyer, that’s really exciting, because you were actually part of something bigger.

Tom (22:34):

Particularly from your vantage point- I mean, Google is a company that basically earns its living through advertising and through collecting data, which is the basis for the advertising business. Apple is different: it’s not primarily an advertising company, it’s a hardware company. So wouldn’t you say the role of a privacy officer in the two companies is different, or has been historically different?

Jane (23:07):

I don’t think it should be different, and I don’t think it was necessarily different. When I was at Google, there was a privacy person assigned to every product that touched personal data, and there was a deep commitment to privacy at Google as well. The outcome in the product itself might have been different, of course, because of the way the revenues were generated. But both companies are focused on personalization of the product, and when I moved to Apple, we were driving a lot of that personalization to happen on device, so the data never came up to Apple- Apple didn’t need to actually know what was going on. That’s because Apple primarily made its revenues from selling devices. You could drive most of that heavy personal-data processing on device, and so your device got to be really, really smart, but Apple didn’t- because Apple didn’t need that data.
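
A minimal sketch of that on-device pattern (illustrative only, not Apple’s implementation): the model of the user’s habits lives and updates locally, and there is deliberately no path that uploads it:

```python
from collections import Counter

class OnDeviceRanker:
    """Learns which apps a user opens most, entirely on the device.

    The counts never leave this object- there is no upload path- so a
    vendor's servers never see the user's behavior; only the device
    itself gets smarter.
    """

    def __init__(self) -> None:
        self._opens: Counter = Counter()

    def record_open(self, app: str) -> None:
        self._opens[app] += 1

    def suggestions(self, n: int = 3) -> list:
        # Most frequently opened apps first.
        return [app for app, _ in self._opens.most_common(n)]

ranker = OnDeviceRanker()
for app in ["maps", "mail", "maps", "music", "maps", "mail"]:
    ranker.record_open(app)
print(ranker.suggestions())  # ['maps', 'mail', 'music']
```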

Scott (24:18):

Why did you go into privacy in the first place? I mean, was it because, when you were at Digital City, somebody came in and, like you said, asked you to write a privacy policy? Or was there something about it at the time that made you realize this was going to be a big deal and only get more important?

Jane (24:35):

I sometimes feel like I was in the right place at the right time as the field was growing. There were very few people who had experience in privacy. Then 9/11 came along- that was another moment in privacy. The Patriot Act was passed, and Congress funded privacy offices; the first one it funded was at the Department of Homeland Security, and such offices became statutorily required in departments of the US government grappling with national security work that would impact personal data. My resume was passed on to White House personnel for one of those positions. There just weren’t that many people with experience out there, and they needed to fill the job. That’s how I ended up at the US Department of Justice- it was a required position, it had been funded, and DOJ had not hired for it. I had privacy experience because, after AOL, I started doing some European privacy consulting with another woman- not a lawyer- who saw privacy as the next big thing.

Jane (25:49):

And so I moved into DOJ. While I was there, we did a lot of negotiations with Europe, because the Patriot Act had more or less stopped data sharing between the two blocs, the US and Europe, over distrust about privacy. So we negotiated what then became the Umbrella Agreement. It took a long time- I started the negotiation, and I think it was seven years before the Umbrella Agreement for law enforcement data sharing was signed. But that really gave me a very good view into European privacy and the European regulatory framework for privacy.

Scott (26:40):

You know, that brings up the point that we’ve only talked about corporate privacy- privacy with respect to companies and private businesses- and not privacy with respect to the government. After 9/11, the government’s objective, to grossly oversimplify, was to make sure that we had no privacy. Obviously I’m overstating it, but should the rules and laws we’re talking about for data apply to the government as well?

Jane (27:10):

Well, the government has a Privacy Act. Of course, a lot of its provisions don’t apply to some of the intelligence agencies- there are a lot of carve-outs- but the government actually does have a Privacy Act. Could it be updated? Yes, absolutely. I do think- and Europe has also done this- that you need a privacy law that governs government and surveillance, and then you need a commercial privacy act. The two are different; combining them would just make things more difficult.

Scott (27:47):

Do you think we focus on one more than the other? Or are they just so different that they don’t even belong in the same sentence?

Jane (27:55):

I’m not saying that they’re totally different, but there are differences, particularly in our form of government, around redress- the FTC is not going to oversee the government and enforce privacy requirements against it. The fundamentals of privacy are the same: people need to know what’s happening with their data. But there are probably a lot fewer choices when it comes to surveillance; it is just different. The government collects an enormous amount of data- for taxes, for example- and you can’t tell the IRS it can’t have your data, so there’s really no choice there. But you do need to be able to correct your data- that’s a standard privacy provision- and of course that should apply to government collections as well as commercial ones. So there’s a lot of interplay; it’s just very different as far as the business case.

Scott (29:00):

We’re kind of running out of time, but I’d like to ask one question that’s not about privacy, though you’ve probably thought about it recently. What do you think about the FTC’s moves on non-competes? I mean, I imagine you’re not allowed to, you know, develop a Jane Horvath smartphone and sell it, but have you thought about this at all?

Jane (29:18):

It’s not really my area of practice. I’ve only thought of it in the context of, you know, the FTC being quite active now- they’re active in drafting privacy rules, they’re active in a lot of different areas. And it’s interesting to watch and see the impact that FTC actions writ large could have on clients.

Tom (29:43):

Well, she may not really have one- California doesn’t allow non-competes, right?

Jane (29:49):

She’s not a non-compete expert. I think that it’s very difficult to enforce a non-compete in California.

Scott (29:56):

Right. So you may have a smartphone coming out.

Jane (30:00):

<Laugh> Me?

Scott (30:02):

<Laugh>. You’re right. So I think we actually are out of time now. Jane, thank you so much for joining us. I thought this was a really interesting conversation and it’s certainly one that’s not going away anytime soon.

Jane (30:14):

No, thanks so much for inviting me to speak today.
