“Privacy Legislation in 2019? Maureen Ohlhausen and Alan Raul” (Two Think Minimum Podcast)

Two Think Minimum Podcast Transcript
Episode 015: “Privacy Legislation in 2019? Maureen Ohlhausen and Alan Raul”
Recorded on: February 26, 2019

Scott: 00:01 Welcome back to TPI’s podcast Two Think Minimum. It’s Tuesday, February 26th, 2019 and I’m Scott Wallsten, President and Senior Fellow of the Technology Policy Institute. Today we’re excited to talk with Maureen Ohlhausen and Alan Raul.

Maureen Ohlhausen is currently Practice Group Chair and Partner of Antitrust and Competition Law at Baker Botts in Washington, D.C. Before that she served as Acting Chairman and Commissioner of the Federal Trade Commission, where she directed all aspects of the agency’s activities, from antitrust merger review and conduct enforcement, to consumer protection enforcement, to policy formulation on privacy and technology issues.

Alan Raul is founder and lead partner of Sidley Austin’s Privacy and Cybersecurity practice in Washington, D.C. He represents companies on privacy and cybersecurity issues, including global data protection and compliance issues, data breaches, consumer protection issues, and Internet law before the FTC, state attorneys general, SEC, Department of Justice, and other government agencies.

I’m joined today by Tom Lenard, Senior Fellow and President Emeritus of TPI. Welcome Maureen and Alan. We last saw you on a privacy panel we hosted at the National Press Club on January 16th, earlier this year. It’s now late February 2019 and this week Congress is holding hearings on federal data privacy. So, I wanted to start with the big question. Will there be privacy legislation this year?

Maureen: 01:20  Scott, that’s a very good question. I’ve been in this business for quite a while, and I would say these are the best chances that I’ve seen during my time. In the 111th Congress there was a lot of debate about privacy bills, and several bills were floated, but then that sort of fell away. There’s been interest in data security legislation, but again, nothing has passed. I think there’s a lot of energy behind doing something in this space, both on data security and privacy, a lot of it being driven by what’s happening in Europe with the GDPR and by the California privacy legislation as well. But whether it will actually get across the finish line is still the big question.

Alan: 02:12  I agree with Maureen, there’s more energy than ever. Best chance ever. But, if what you’re talking about is federal privacy legislation, which, being the kind of nitpicky lawyer that I am, I noticed you alluded to the “federal.” Maureen mentioned California, and of course Europe. Other states are chomping at the bit to get into the action, Washington State perhaps leading the charge, others interested as well. So, there’s certainly momentum around the country and around the world. This is probably the most auspicious environment if one favors federal privacy legislation, but the devil will be in the details, and preemption, which I’m sure we’ll get to, will no doubt influence whether it gets across the finish line or not.

Tom: 03:08  Not everyone knows that we do have some general privacy laws in this country that are administered and enforced by the FTC. Maureen maybe you can just take a minute or two and explain what it is the FTC does and what the FTC’s approach is in enforcing privacy.

Maureen: 03:32  Tom, thanks for asking that because I think that has gotten a little bit lost in the debate and gets a little underappreciated. The FTC is the general consumer protection agency in the U.S., and under its organic statute, the FTC Act, it can prevent deceptive and unfair practices. It has used its deception authority when companies have made promises about what data they would collect, how they would safeguard it, or with whom they would share it, and then didn’t live up to those promises. And so, there’s been quite a bit of enforcement in that space. And then on the unfairness prong of the FTC’s authority, it has used that quite effectively: even if a company didn’t make a promise, if it has used data in a way that causes a substantial injury to consumers, the FTC can bring an enforcement action.

In the data security area, it’s that authority the FTC has relied on for bringing challenges against companies not taking reasonable precautions to secure and protect data. Where there have been big data breaches or hacks and a company hasn’t taken reasonable precautions, the FTC has acted in that space. Of course, we’re focused a lot on online issues, but privacy is much broader than that. Think back to the Fair Credit Reporting Act (FCRA), which has a lot of privacy elements in it. The FTC enforces that. The Children’s Online Privacy Protection Act (COPPA). There’s health information privacy, which [the Department of Health and Human Services] HHS enforces, but the FTC has brought more than 500 privacy and data security cases, both online and offline, through this very general authority that it has. So, there’s the question of whether it needs more tools, whether it needs more resources, but it has used the sort of general authority it’s been given fairly effectively and at times aggressively in the privacy space.

Tom: 05:32  Do you think, does [the FTC] need more tools or more resources?  

Maureen: 05:37  I think it’s being asked to do more things right now. There’s this big concern about, “do consumers have control over their data?” Is this a competition issue? I think more resources might be of use, not endless amounts of resources, but yeah, maybe a little bit more. As for more tools, one of the questions is whether the remedial authority that the FTC has is sufficient. In previous iterations of data security and breach notification legislation the FTC has been in favor of getting civil penalty authority. Right now the FTC can get redress for consumers: if you can say there was a harm to consumers and you can identify the amount of money they were harmed, the FTC can get that money back. But it can’t penalize companies unless they violated either an order that was already in place or a rule that had already been authorized, like the Children’s Online Privacy Protection Act.

One of the issues for data security is whether companies have the right incentives to protect data sufficiently, and whether giving civil penalty authority to the FTC would allow it to encourage companies to take those kinds of precautions. Because with data security, when sensitive data gets exposed, consumers might suffer things like fraud and identity theft, but it can be very hard to trace that back to any particular data breach, because there have been a lot of breaches that expose people’s social security numbers and account numbers. Drawing that line between the harm and the breach is difficult. That’s why previous commissions have supported, myself included, getting the FTC civil penalty authority for data breaches. The other question, about privacy, is, I think, a little more challenging.

Alan: 07:40  I think the question of whether the FTC can seek a penalty in the first instance, as Maureen says, without there having been a consent order in place or a specific rule that was violated, is very much in play right now. Whereas several years ago perhaps you would’ve seen the business community oppose that adamantly, I think there’s greater flexibility now on whether the FTC could seek, in appropriate cases with a sufficient showing, a penalty in the first instance without having to wait for, let’s say, a decree to have been ostensibly violated. And I think that there are a number of problems with the existing consent decrees that the FTC has entered into. It’s understandable that companies seek to settle when the weight of the federal government is arrayed against them, just as companies settle in civil litigation in other contexts.

But oftentimes what has happened is that in order to avoid continued investigations, companies settle and end up being subject to an FTC consent decree for 20 years. That has been a de facto sword of Damocles over many companies, tech companies that in 20 years may not exist or certainly not be recognizable. I think there could be considerable reform of the consent decree process, and as part of that perhaps civil penalty authority in the first instance is something to be considered. With regard to the incentives that Maureen mentioned, especially with data breaches and data security, I honestly think that companies have tremendous incentives to protect their information. That’s not to say they do it perfectly or that they do it reliably. I do think that the FTC has done a really very admirable job of trying to avoid “gotcha” cases, as they would put it, and rather go after more serious violations. But today the plaintiffs’ bar is very aggressive in bringing cases on data breaches, and the Securities and Exchange Commission is also in the mix for sanctioning egregious data breach violations that have not been disclosed by public companies appropriately; shareholders have filed suit, and there have been derivative actions. I think there’s an awful lot of incentive to protect this data, and I think companies generally try quite hard; whether they do quite well is a function of the risks that they face. Just like the federal government and government agencies, including the FTC and a lot of the intelligence community, they’ve experienced breaches. So, this is a subject that may be remedied, may be enhanced, not so much by incentivizing enforcement, but rather by the federal government taking additional steps to protect us all from hackers.

Scott: 10:50  It’s interesting the way you framed this, because proponents of a GDPR-style solution say that the status quo with the FTC is too weak, and Alan, you were basically describing it as too strong.

Alan: 11:06  Yeah.

Scott: 11:07  You mentioned that companies get stuck with consent decrees that last for too long. It might be too harsh. How do you square that? I mean, with them saying that the FTC is too weak and that’s why you need reforms, and you’re saying it’s too strong.

Alan: 11:20  I don’t really think, at least not in candid sidebar conversations with European officials, that they would say that the FTC is too weak. I think they view the FTC as a role model and as an agency whose enforcement authorities, whose enforcement prowess, whose enforcement activities they want to emulate. I think that they would like to really transform themselves, and by they, I’m talking about the European Data Protection Authorities that enforce the GDPR. They’d like to be the FTC on steroids, and now that they have penalty authority of up to 4% of annual revenue, they view that as the cudgel that will convert them into FTC-like agencies.

What the FTC has, though, with regard to the Section 5 authority that Maureen talked about as kind of a general privacy and data security enforcement tool, is really a statute that allows the FTC to go after abuses. Real abuses, where a private actor has acted egregiously to impose some harm on individuals. For the most part, I think the FTC is sensitive to acting on the basis of harm and abuse, although we can talk a little bit more about that later. Under the GDPR, the Europeans don’t quite accept the model that what should be sanctioned, what should be disciplined, what should be enforced against is when a private organization does something wrong that causes harm or injury to citizens or consumers. Instead, you get a GDPR model that is infinite in its opportunities for violation, with now a sanctioning authority of 4% that is draconian. So, there are so many ways you can violate it, they don’t worry about harm and injury that much, and they have phenomenal sums that they can impose as penalties. So that’s going to be a pretty dangerous environment.

Tom: 13:31  Speaking about the Europeans: now in the U.S., with the Federal Trade Commission, I don’t know how much either of you can talk about this, but the FTC is now investigating Facebook for violation of the consent decree that it is under. If one believes the press, we’re talking about billion or multi-billion dollar fines, much, much larger, as far as I know, than have ever been imposed by the FTC before. Do you think somehow the FTC feels pressure from what the Europeans are doing and that’s why they’re talking about such large numbers?

Alan: 14:13  I’m going to speak only generally about it and not comment on the specifics of the case. But do I think that the Federal Trade Commission and other arms of both the federal and state governments are under some pressure to take actions in the privacy sphere? I think the answer to that would be yeah, of course. Privacy issues have captured the public’s attention, and I’ll say even in America recently, such that the zeitgeist has changed. Whereas previously, I think one could fairly and reasonably say that Americans were fairly copacetic about sharing information and uses of information.

Things started to change first in Europe, I think with the Snowden revelations, even though that didn’t involve commercial privacy at all, but rather the intelligence community. Then there was the recent credit bureau data breach, where the victim company conceded that just about all of the country was affected and that very sensitive data, including passport information, was exposed. I think that made people start thinking about this. But then, in my view, another noncommercial development is what changed the public psychology, and that’s the manipulation of the election by the Russians, doing so in our social media networks based on information that we share with each other and with our social media, and doing it apparently very effectively with the expenditure involved.

I’m not sure how much money was at stake, but I’m sure it was more in the sophistication than in the dollars. But I think when people realize that their information can be put to manipulative uses that have consequences for what they believe and maybe how they vote, I think that’s troubling. I think that the sensibilities have changed in the political marketplace, if you will. But to go to your point, Tom, about whether they’re talking multi-billion: I think $22.5 million was the largest fine ever before. The French data protection authority, known by its French acronym, the CNIL, just imposed I think the largest fine in Europe, which was 50 million euros, about $57 million. So, if somebody ratchets it up to the billions, that would be quite a leap.

Tom: 16:47  I think you’re right. All these issues are kind of conflated together in the public mind, and then those of us who work in privacy, to some extent, try to cabin off the issue of the use of private data for commercial purposes and targeted advertising and all of that. But it’s all being conflated together with election meddling and national security issues, which are really different issues. They’re really all different issues. Do you think it’s appropriate to conflate them together, or did it just happen and that’s what politics is?

Alan: 17:30 I think the Europeans conflated them, if you will, because the data protection authorities in Europe, which don’t have authority over national security, basically took the perceived excesses that were leaked by Snowden and took it out on the U.S. tech companies. How convenient that the European enforcers go after U.S. companies based on perceived national security activities that really benefited not only the United States, but in many ways Europe as well. I think that the conflation really arises again because of the odd jurisdictional organization of Europe. Not odd, but rather the commercial privacy regulators there really taking action based on their concerns over national security issues. But, if you look at where the problems really are, and I want to take the opportunity to praise Maureen in her tenure as Acting Chairman at the FTC for embarking on an informational injuries workshop and study of what is and ought to be an actionable injury to enforce in the privacy realm. And it’s not easy to say. Now, if your identity or your money has been stolen, no doubt about it, that’s really injurious.

If your voting interests and your beliefs have been manipulated, concretely, that I would say is certainly actionable on a social level. But I was listening this morning to the House hearing on privacy, which was very interesting, and the testimony was very illuminating and erudite. But the one thing that is missing from testimony about privacy legislation is: what are the problems? I think we really need to think about that. If you go to a hearing involving the Food and Drug Administration, they tell you about the problems. The problems are that people are going to get sick or that medicine is not being approved quickly enough to prevent people from getting sick. You go to an environmental hearing, people are exposed to toxins and pollutants and this could have an impact on their health, and you might debate the extent of the injury, but you know what the injuries are.

I listened to the entire hearing this morning, and there was very little discussion of problems other than identity theft and what I’ll call the Cambridge Analytica election interference. And I say that as a real believer that privacy is important, because personal autonomy and dignity really count, and people should not have their expectations just rejected or ignored, and that’s important. But we really ought to take a hard look at what is injurious, what is harmful, what is actionable, and regulation ought to be based on that, and I would submit that the Europeans don’t do that. I think the FTC has done a pretty good job of focusing on harms.

Maureen: 20:32  Thank you Alan for that compliment, because I do think that is important: what is the problem we’re trying to solve for here? Now, there is this bigger tech backlash going on, and privacy has become part of that. There’s a reaction against American companies around the world. There is a reaction against companies just being big and seen as powerful, and these critiques about privacy I think are part of this larger issue. And then there is an issue that is, I think, a subtext in this that doesn’t get surfaced as much but should: there’s a big fight going on about advertising dollars, right? The tech companies, the tech platforms, have captured a lot of the advertising dollars that used to go to traditional newspapers, traditional media. And so, I think that is also driving some of the critiques here. I mean, not totally.

I think there have been privacy issues and privacy concerns, but I think we need to look at it as part of this bigger issue. And one of the things that really struck me, Alan, when you mentioned that CNIL decision against Google, is the fact that if you have a detailed statute, if you have this idea that here is what consent and notice is supposed to look like, why don’t you put out some guidance first? The GDPR is in some ways a very detailed regulation, but it doesn’t give all that much guidance, I think. And if what you’re trying to focus on is giving consumers more control rather than gotcha enforcement, maybe you should give the companies a little better idea of what you think is sufficient notice, what you think is a sufficient legitimate interest. Because that’s another issue: with this enormous fine possibly hanging over their heads for essentially not being able to discern what this particular DPA wants, companies have to think, is it too many clicks to get through to all the information? That was one thing that kind of jumped out at me. Maybe there should be a little more guidance here if what we’re actually trying to do is make sure consumers have the tools that they want for controlling their privacy.

Scott: 23:13  Did the FTC have to do something like that when trying to define deception and unfairness, give companies guidance on what exactly those would be?

Maureen: 23:21  The FTC did do that, because there had been big debates previously about the FTC’s deception and unfairness authority, and they were quite the political issue of their day, and so the FTC adopted a deception policy statement and then an unfairness policy statement, which Congress eventually codified. For unfairness, rather than saying, “oh, it just means something that some regulator or three regulators don’t like,” by statute it has to be an act or practice that causes substantial injury to the consumer, that the consumer can’t reasonably avoid, and that’s not outweighed by countervailing benefits to competition or to consumers. In the policy statement, the FTC tried to give a little more substance to what those types of injuries would be. And that is one of the reasons why I did the informational injury workshop at the FTC, because I thought it was important to understand what substantial injury is.

I think it’s fairly well established and most people would agree: you lose money, yes, we think losing money is an injury. But where else does it go? I think that having your sensitive medical information exposed, on the Internet or not on the Internet, would qualify as a substantial injury, and I think the fact that we have HIPAA suggests that Congress would agree. Most people would agree. I brought a case against a revenge porn website to say exposing those kinds of intimate photos, I think most people would agree, was an injury. But it’s not a money injury. What I wouldn’t necessarily agree with is where the Europeans might go with this, the idea that I get to control what everyone thinks of me, or that people can’t say Maureen Ohlhausen has blue eyes and is five foot two unless I allow people to reveal that about me.

Scott: 25:20  I thought you were like 6’4”.

Maureen: 25:21  Yeah, I am 6’4”. But I think it’s not an endless amount, just because one person feels, “I don’t want someone to know that.” It’s hard to draw the line, but I think it’s important that we at least try.

Tom: 25:42  Notwithstanding everything you all are saying, I think many of the proponents of privacy legislation in this country are looking to the European approach, and some to California, as a model. What do you all think the effects of that would be…would it be a competitive advantage?

Maureen: 26:10  I have several concerns. I think it’s interesting that in Europe they were really pushing towards having a digital single market. The European Union is an attempt to combine all these separate economies to get the benefits of the kind of unified economy we’ve enjoyed in the U.S. The data protection regulation, the GDPR, was supposed to be part of this bigger push to have a more uniform approach. I would be concerned if in the U.S. we start sort of breaking apart our unified market, ironically, kind of in reaction to the GDPR. I think that’s one concern. What are the other things that I would point to? Just last week I re-read the FTC’s Data Broker Report, and one thing that report identifies is the protections that are already in place, the Fair Credit Reporting Act and other types of anti-discrimination statutes, and it also identifies the benefits to consumers from data, from the ability of companies to access data and to advertise to them or to create new products and things like that. So, I think we need to be careful about that. I would say there may be changes we need to make at the margin, but I would be concerned about doing that without being cautious about losing the benefits our economy has enjoyed, and consumers have enjoyed, by having a uniform standard and having the ability for companies to use data to innovate.

Alan: 28:00  Tom, I think you asked about what impact it would have if we do emulate the EU’s GDPR over here, which seems to be the direction that we’re going. Certainly California and Alastair Mactaggart, the gentleman whose ballot initiative is what ultimately resulted in the California Consumer Privacy Act, the CCPA: he was trying to emulate the EU. The Washington State legislation consciously seeks to emulate it. And at hearings since September, basically members of Congress on both sides of the aisle have talked about EU-based models with the rights they grant. So, there is a likelihood, and I’ll say a danger, of moving in that direction, which is, in my view, overly bureaucratized, with the development of too many different hurdles, most of which are not commensurate with the benefits they provide to anyone.

I will say, to give the EU and the GDPR their due, the focus on transparency and disclosing what the practices of companies are, I think, actually is a positive thing. If you ask me what I think the real problem in the privacy sphere is, as opposed to cybersecurity, it’s that people really don’t understand what’s going on. If there were more transparency, yes, there are some practices that probably would be disincentivized through greater disclosure and general knowledge. But people are, I think, largely comfortable with a lot of the commercial practices and the commercial uses of data; they’re really just not all that knowledgeable, and there are, especially in the Internet ecosystem, a lot of companies that nobody knows anything about. And I think it might be fair to say that not only do consumers not really know what is going on with the advertising ecosystem, that’s also true of a lot of companies.

Sometimes when a development erupts into the media, it turns out that really nobody understood what was going on. I think there ought to be better understanding, but fewer rules, more focus on disclosures. Let the marketplace decide whether they are or they’re not comfortable, but maybe bring some of the companies out of the shadows. And to go to Maureen’s point about guidance, maybe provide some more guidance so that it’s not just about catching people in the gotcha thing. Just very recently, what has been styled as a CCPA fix bill was introduced by a California state senator with the support of the Attorney General, Xavier Becerra, and this is not a fix that industry or companies would like. But one of the fixes that the Attorney General insisted on was that he not be obligated to provide guidance to companies who want to comply. He is quoted, I read this just recently, as saying, “I don’t give free legal advice to companies.” But the original bill said, give guidance to companies so that they can comply. I think that attitude of really trying to hide the ball is not good for anybody, and it can result in excessive enforcement, unwarranted enforcement. If compliance is the objective, make it easier.

Tom: 31:34  Speaking of transparency, one aspect of this is the privacy notices. And we do know that almost all consumers just don’t read notices. Even if they were better written and more clearly written, I’m not clear how much that would change. But do you think companies themselves, just in the way they approach these issues in public, are derelict in not explaining better how they use data? They do have a business model, some of them, which is based on monetizing data. That’s not an evil thing. There are many benefits that flow from that. But they kind of act like, “You caught us. We’re actually making money off this data!” Do you think they could explain it better, and maybe they are just reaping the costs of never having explained it?

Maureen: 32:29  I think that it would be wise for them to make a little more of the case for the benefits of data availability and data usage in innovation and competition, and how much advertising-supported content and services consumers are enjoying. I think consumers can understand some of that trade-off, and perhaps making it a little more explicit would be useful. I do think one of the challenges for the California law is saying, well, consumers can opt out of that, but you still have to give them the same service. I don’t know how that’s going to work. But I do think that it could be beneficial if they were a little clearer about that. I also agree with Alan that GDPR is not all that. I mean, certainly it has some very good provisions in it, and more transparency might be beneficial. But, kind of going back to your guidance point, one of the things that I really liked about the Children’s Online Privacy Protection Act (COPPA) is the fact that it permits companies, or it could be a trade association or a consumer group or something, to come up with a program for getting parental consent for the collection of children’s data; then the FTC reviews it, and they get a safe harbor if they comply with that.

I think that’s one of the things that should be explored if we are considering a general privacy law. Because the other thing that is a little concerning to me is that we’re talking about this in terms of big online platforms, but a general privacy law is going to have effects economy-wide, industry-wide. There are all different ways data interfaces with the consumer’s experience, and different ways data is collected and used and shared, and it is hard to come up with a clear one-size-fits-all rule with a big penalty hanging over the head of a company if it doesn’t get it right. Again, I think perhaps something that gives a little more guidance to companies, whether it’s through this kind of COPPA safe harbor approach, would help. Even the GDPR talks about codes of conduct. I think there’s some interesting thinking to be done in that space.

Scott: 35:06  I thought you made this point at our January conference, which might be obvious to legal people who follow this, but a company violating its own terms of service is by definition a harm, right?

Maureen: 35:19  Yes, it is.

Scott: 35:20  So, I mean the sort of rules that you’re talking about, you can’t come up with practices and then not follow them.

Maureen: 35:26  Right. Exactly.

Alan: 35:28  That’s a great point, Scott. I did want to say that while privacy policies have been much maligned in the notice and choice model, I don’t think we should sell them short. One, for the very point that you make, that companies are self-governed by their privacy policies. If they don’t comply with them, they can get in trouble. It would assuredly be a deception case from the FTC, although I would argue that there still would need to be substantial injury perhaps, but we can talk about that some other time. But I can say from representing companies that they spend a great deal of time both drafting their privacy policies and then seeking to adhere to them, internally analyzing, defending, and justifying any changes from their privacy policy. At least many companies take them very seriously internally.

Now, while it’s true that rare is the consumer who will read a privacy policy, that’s also true about securities filings under the SEC laws, and I think people believe that that regime works pretty well. Why? Because there are experts who pore over them, read them very closely, and then disseminate the information to the marketplace. There are privacy advocates who read these policies very closely and are acutely sensitive to any changes in these policies, and there is a blogosphere out there that picks up on these things very quickly. I think it would be erroneous to conclude that just because each consumer doesn’t read a policy before taking some action, the policies are not governing the companies, both internally and through FTC or State Attorney General discipline, or by contractual or other legal action against them, but also just through self-governance. Most companies are legitimate, take these things pretty seriously, and don’t just slap out a policy and not comply with it. I don’t want to be Pollyannaish here, but big companies, the ones who get the most data, take these things seriously, and at least in my experience, they’re trying hard to comply with them.

Tom: 37:55  I’ve been watching the clock, but there are a couple more things, if we have time, that I’d like to touch on, which are two of the more controversial aspects of what people are talking about in the context of a privacy law. One is preemption of the states, and the other is…to talk about California, just today I was reading that the California Attorney General was going to expand the California law to include a private right of action. And of course those two things have both been talked about in the federal context as well. Alan, give us your take on those two things?

Alan: 38:41  On preemption, I think it’s always been one of the factors that one imagines is in a trade-off or compromise on legislation between the legislators who are more inclined toward the business community and those who are more inclined toward the advocacy or consumer protection community. With regard to preemption of state law in the privacy realm, this is inherently digital. While Maureen is certainly right, and mentioned a couple of times, that privacy legislation and policy doesn’t just affect the large Internet companies and platforms, nonetheless this is very much a digital issue. The information age is less territorial, less geographically finite than the prior commercial ages. So, if ever there were a field where you would think that national standards should govern…

It’s where digital information travels everywhere and is everywhere at the same time. So, it does seem like it’s naturally an area where a federal standard or national standard should prevail over individual state standards. We’ve seen in the data breach context, with 50 state laws plus D.C. and Puerto Rico, Guam and the Virgin Islands and so on, that the proliferation of those standards has not contributed to data security. Rather, it’s contributed to uncertainty, it’s contributed to bureaucratic administrative burdens, legal time. Thank you.

Maureen: 40:23  Yes.

Alan: 40:23  We appreciate that, but it really hasn’t added substantively at all to consumer protection. That problem would be even worse with privacy. I’m sure this is true for Maureen as well, but I’m working with a lot of companies on California compliance, because even though it’s not in effect until 2020, they’re starting to get ready. And the notion that one state, even if it’s as large as California, could set what will be de facto national standards, that’s not appropriate either. So, I think preemption has got to be part of the equation. I can’t imagine that there’s ultimately a compromise that prevails without there being significant preemption. I mean, the trade-off will be maybe stronger standards. But I would commend to all of us, in addition to Maureen’s very excellent informational injuries workshop addressing that, the administration’s request for comments that they put out with the NTIA at the Commerce Department looking for a new privacy framework.

There is a better way to do this. I think that the administration’s request for comments is very sound. It really focuses on cost-benefit type analysis, taking privacy seriously as a social good, but nonetheless one that should be balanced against the impacts on innovation and prosperity, and where it’s not just rules-based, it’s outcomes-based. So, are we really protecting people or are we just layering on more rules? I could talk about the private right of action too, but I’m sure Maureen is interested in speaking about it as well.

Maureen: 41:55  Yes. I completely agree with what Alan said and I do think we need to keep in mind, we want privacy and prosperity. Both values are important to Americans and consumers.

Tom: 42:10  I’ve heard that phrase before.

Maureen: 42:11  Yes. I often like to look at what has worked okay in the past. When you’re talking about preemption, that doesn’t necessarily mean that there’s no role for the states. If you think about, again, the Children’s Online Privacy Protection Act (COPPA), the state AGs can enforce that, but they enforce it at the federal standard, so you get some of the benefits of having the local involvement, if there’s something that’s more of a local kind of impact or even a broader impact, but also the benefits of a uniform approach. On the private right of action, I’m certainly concerned about whether this will just lead to class action lawsuits. The FTC has weighed in on various class actions where the settlement doesn’t seem to be serving consumers very well and may be more in the interest of the lawyers, as is having 50 different breach notification laws.

Because if you’ve ever tried to do one of those, which I had to do recently for a client, you’re saying, well, okay, there’s this state and this state and this state, and whose data was affected? Well, now we add on this state. You might say, well, you pick the most rigorous state and comply with that. But there are some states that say, well, first you have to tell our state police forensics unit before you tell consumers. And then it’s like, well, what if there’s another state that says, no, you have to tell the consumers first? I mean, you can start to get into some real conundrums there, and I don’t think that it’s making consumers any better off to have that type of a balkanized approach in privacy.

Scott: 44:04  I think probably with that, we should leave it there. This issue isn’t going away, and we can keep talking about it for a long time, and I’m sure we will. Thank you both very much for being on this podcast, and we look forward to doing it again some time.

Maureen: 44:20  Thank you.

Alan: 44:20  Thank you.

Tom: 44:21  Thank you very much.
