CloudChat Interview Series
Episode 4: Dissecting Obama’s CyberSecurity Executive Order
Dr. Eugene Spafford, Executive Director, CERIAS Center, Purdue University
CipherCloud’s Dr. Chenxi Wang interviewed Dr. Eugene Spafford, executive director of the CERIAS center at Purdue University. Below is a summary of the interview conversation.
Question: Spaf, as the executive director of CERIAS, the cybersecurity center at Purdue University, what do you think of Obama’s latest executive order, which deemed malicious cyber activities a national threat and attempted to establish sanctions for individuals perpetrating those activities? How do you differentiate between what’s considered legitimate research and malicious cyber activity?
Dr. Eugene H. Spafford: The way we’ve looked at this, even before this order came out, is that if we’re doing research on systems that are currently in use by others, and we don’t have permission to do that research and have not considered the potential impact on the users of those systems, then that’s not friendly research.
Question: A lot of research falls into this borderline category, some of which has helped elevate the collective security capabilities of companies because vulnerabilities were found. How do you think this E.O. is going to impact that kind of research? Is it going to diminish those activities, or are we going to see them moving underground more and more?
Dr. Spafford: There are two parts to this. First of all, I’d push back on calling some of what has been done research, because I’m in an environment where, if we’re going to do research that affects people, we have to go through an institutional review board that reviews it to ensure that we are respecting the privacy and the decision-making capabilities of anyone who’s involved.
A lot of the ad-hoc behavior of going out and breaking into systems and running scans against systems without permission does not fall under the category of research by that measure at all, and is frankly of questionable ethics. Has it helped security overall? In some cases, it has. It has also hurt many of us in the profession because it’s created an image of us as not being as careful or considerate of others as we should be.
The executive order is problematic in several ways, but having read it carefully and talked to a few people in government who’ve been associated with it, I believe its intent is really for people outside the US, because doing some of the things in that executive order, such as freezing assets, could easily be challenged under US law as lacking due process or not falling within the necessary guidelines.
The executive order has been put out because there is such a problem with people outside the US who are constantly scanning our systems, trying to break in, and running malware; we can identify who they are, but they’re beyond our reach. They are not extraditable.
Question: If we’re talking about hackers beyond the US borders, do we have the jurisdiction to even carry out the sanctions proposed?
Dr. Spafford: We actually do have jurisdiction in many cases. They may have assets in US banks or US firms, and those are the assets that could be frozen. There are other measures in the executive order that are regularly carried out against citizens of other countries who’ve been transferring nuclear or biological materials for weapons-making, among other behavior.
Take, for instance, the sanctions we have against Iran. The targets are outside our borders, but we’re able to execute the sanctions because there is a US interest within the borders. It’s an interesting question whether this is the best way for things to occur. Considering that many of these countries are also conducting espionage against the US, or are tolerating criminal behavior for political reasons and are unwilling to cooperate with extradition or investigation, there’s a limit to what we can do to discourage it. This seems to be a step that may be reasonable.
Question: Section 1(2)(B) of the executive order allows the Secretary of the Treasury, in consultation with the Attorney General and the Secretary of State, to make a determination that a person or an entity has materially provided support for malicious activities. Do you think these three entities, the Secretary of the Treasury, the Attorney General, and the Secretary of State, have the know-how, the necessary technical insight, to correctly determine what is malicious activity and what is borderline research?
Dr. Spafford: They’re the ones who have the administrative authority. The Secretary of the Treasury is the one who has to carry out many of the sanctions, the Secretary of State considers the foreign-policy issues, and the Attorney General considers issues of law, such as whether laws or treaties apply.
Their opinions are necessary, but probably not sufficient. That’s a case where the law enforcement agencies, the intelligence agencies, the Department of Defense, and the Department of Homeland Security would likely identify the instances and the perpetrators, and provide that information to those three to make the determination.
Question: Should they have an expert panel, a technology expert panel that’s from the security research community, to help the government organizations make that determination?
Dr. Spafford: For purposes of what’s being done here, I don’t think that’s really necessary. There is a lot of technical expertise in government. This isn’t being done in a vacuum. People who read these orders need to understand that the statement there is not exclusive, it is the minimum. The necessary three parties all have to make that determination, but they’re not limited to that group and in fact they can go out to other organizations.
We have law enforcement: the FBI, the Secret Service, the Department of Homeland Security. There are the various intelligence agencies, and possibly state agencies. They can also reach into the private sector: individual corporations, NIST. They have a lot of resources to draw upon, so I don’t think it’s going to be a case where those resources aren’t available. It would be a concern if they started making determinations without that input, if decisions were only being made by the lawyers and the people at Treasury. That’s where I think the concern is valid.
Question: Do we have due process for the security research community to provide input to the government in a way that is constructive and consistent?
Dr. Spafford: I’m not sure that we have any consistent processes within the security community. There are issues that involve both policy and technology where better communication would certainly be warranted, not only in a case like this, where we’re looking at activities and trying to determine their severity, but also in the discussion going on now, of great concern, about whether to put backdoors or access mechanisms into encryption.
It seems to me that some of the policy makers, particularly in the law enforcement community, are ill informed about the consequences.
Question: Do you think it is even possible to have a universal backdoor in any piece of software that will enable ONLY the authorized party to have access to the information? Certainly that kind of thing has been done before, but has it been done at the scale we’re talking about? Has it been done with that kind of impact on society and on privacy, considering that the government will be working with companies like Microsoft and Intel, who have a huge footprint in the market?
Dr. Spafford: This is not solely a technology issue; it involves policy and human aspects as well. It IS possible to build in a mechanism that can only be opened with a key. We know how to do encryption, we know how to do multi-part keys, and so on. The problem is: who has control of the keys, and how do we prevent them from leaking? The big problem, as I see it, is that people in the US government are not considering that most of the companies involved are international. They’re going to sell in many countries, and the law enforcement agencies and governments in all of those countries will want the same access.
The problem is how you keep a secret when so many entities share that key. In fact, some of the parties that we’re concerned about may be affiliated with some of those governments, and they’re going to use additional encryption on top, which is not going to be easily breakable. It simply is not workable as a regime to expect that there be some kind of mandated backdoor whose keys are only shared with select US government agencies. Not going to happen. Can’t happen.
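Spafford’s point that the cryptography is solvable but the key-custody problem is not can be made concrete with a small sketch. The following is a minimal, hypothetical illustration (not any real escrow scheme; all names are invented) of a two-share XOR key split: either share alone is statistically independent of the key, but anyone holding both shares recovers it, so every additional government handed a set of shares is one more place the secret can leak from.

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares; both are required to recover it."""
    share_a = secrets.token_bytes(len(key))           # uniformly random mask
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the shares back together to reconstruct the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

escrow_key = secrets.token_bytes(32)                  # hypothetical backdoor key
a, b = split_key(escrow_key)
assert recover_key(a, b) == escrow_key                # both shares suffice
```

The math is the easy part, which is exactly the observation above: the unsolved problem is that every jurisdiction demanding its own copy of the shares multiplies the opportunities for compromise.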
I hope whoever mandates that is also willing to cover the cost of replacing all the software and hardware shipped with that mechanism in place once it’s broken, and to indemnify all the companies involved if they inadvertently violate the law in various countries because the encryption is broken.
Question: Many of us create technologies that could be used for defense as well as for offense. What happens if the research community has created something in the public domain that inadvertently went into the hands of hackers and created damage down the line? Does this executive order provide protection for the creator of the technology in these scenarios, or will the security research community be walking on eggshells from now on?
Dr. Spafford: I’m certainly not one of the people who will be applying this order, so I can’t tell you specifically, but my reading of it, and my conversations with parties that are involved, suggest that it’s not going to be applied against people operating within the United States, because there are protections here under the law: the Constitution, due process, freedom of speech, and so on. Applying it within the US does not seem likely.
Question: But the security research community is global.
Dr. Spafford: It is, and that’s where I begin to have some concerns, because the interpretation they have to reach involves motivation. In criminal proceedings you have to do that anyhow: what’s the mens rea, the state of mind of the perpetrator? If someone has a career of talking about writing tools to break into government sites and then writes something, that’s not going to work in their favor.
If it’s somebody in an academic or industry environment who regularly produces tools to test security, writes about them, and shows up at conferences like this, it’ll be very difficult to say the tool was written with malicious intent.
Question: The line is getting blurred a little in some cases, and this executive order has certainly put a lot of people on edge. In your conversations with research colleagues, does this come up as a concern?
Dr. Spafford: A bigger concern over several years has been the Digital Millennium Copyright Act, which does apply in the US and covers any research that might be usable to break a copy-protection scheme; that has dissuaded some people from working in the area. It is potentially unconstitutional, but there’s been no trial about it. That has worried people more, in my experience.
This executive order has yet to be applied, as far as we know, and it’s going to take a little time to see. I haven’t heard from anyone in the academic community yet who’s expressed concern about this specifically, nor within the general industry community where I have contacts.
Question: Does the security industry, as we know today, have the right responsible disclosure practices for things we find that could potentially cause damage? Do we have it right? Should we do something differently?
Dr. Spafford: The whole ecosystem is probably wrong, and I won’t say the problem is simply in the disclosure arena.
Question: Which aspects of the ecosystem?
Dr. Spafford: In the sense that there are organizations producing goods and services that have vulnerabilities, and they’re purposely dodging those vulnerabilities, trying to hide them or to not find them, explicitly going out of their way to ignore reports. That’s a problem for public safety. We don’t have appropriate measures in place, a good tradition of being able to hold them accountable.
That’s one of the reasons why much of the disclosure activity has been to publish without informing the vendor or giving them a chance to fix things ahead of time; or there’s the question of how long we give them to fix it. We’ve grown into an environment where many individuals and organizations publish things right away, or within such a short amount of time that vendors can’t adequately develop and deploy fixes for the vulnerabilities. Because of that, it has also led to an environment where those vulnerabilities have become valuable to sell to intelligence organizations, criminal organizations, and others.
Zero-days, for example. The question of responsible disclosure cannot be distanced from that zero day market, nor can it be distanced from the issue of corporate or government liability for failure to produce and deploy secure systems. It’s not really any one group. It’s a complex ecosystem.
Question: What does a zero-day cost these days on the underground market?
Dr. Spafford: No idea. I haven’t sold any. (Chenxi: Last I checked, it was $40,000 for a zero-day.) Depending on who’s buying and what it’s capable of doing, I suspect the price can be considerably higher. I believe there are several governments, criminal organizations, and others who are willing to pay very large amounts of money if it enables them to compromise one of their targets.
Question: Do you remember how many zero days were in Stuxnet?
Dr. Spafford: I think it was four. (Chenxi: Four. That’s astounding.) There are a lot more out there. If we look at things like Heartbleed, for instance, and think about how long some of these vulnerabilities have been present, there are underlying, fundamental problems in the way systems are produced, software is produced, and security is managed. That, of course, is at the heart of this show, but we have just begun, just really scratched the surface of what needs to be done.
Question: Thank you Spaf. Before I let you go, what are you working on these days? What excites you?
Dr. Spafford: There are two things I’ve been working on. One is more of an application area that I’m working on with a graduate student: we are looking at how to put together a system and a process. You could think of it as security for BYOD. We’re looking at endangered populations that have traditionally gotten very little support.
In particular, we’re looking at groups in shelters who are getting away from abusive relationships. They have that electronic lifeline, but if their location is revealed, it can be fatal. It’s a very interesting environment. There’s no money there because those individuals don’t have funds or resources, so we’re trying to investigate what we can do in a cost-effective manner.
(Chenxi: It’s a good cause.)
Very good cause. The second one I’ve got some students working on: we’ve developed a formalism for how deception, deceit, and misdirection can be used in security, and we’re investigating some places where people haven’t looked at using it before. One of the thesis projects now is setting up systems that respond as if they haven’t been patched, as if the vulnerability is live.
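A minimal sketch of the kind of deception being described, with entirely hypothetical names and banners: a decoy service answers fingerprinting probes with a pre-patch version string, so a scanner believes the vulnerability is still live and spends its effort against a monitored system whose probes are all logged.

```python
# Hypothetical decoy: advertise an old, vulnerable-looking version banner
# while the real host is fully patched, and record every probe for analysis.
PATCHED_BANNER = "ExampleHTTPd/2.4.10"   # what the server actually runs
DECOY_BANNER = "ExampleHTTPd/2.2.3"      # version with a well-known (fixed) flaw

probe_log: list[str] = []                # attacker interactions worth studying

def handle_probe(request_line: str, deceive: bool = True) -> str:
    """Answer a fingerprinting probe; in deception mode, claim the old version."""
    if deceive:
        probe_log.append(request_line)   # the probe itself is intelligence
    banner = DECOY_BANNER if deceive else PATCHED_BANNER
    return f"HTTP/1.1 200 OK\r\nServer: {banner}\r\n\r\n"

response = handle_probe("HEAD / HTTP/1.1")
assert "2.2.3" in response               # scanner sees the "unpatched" version
```

The value here is asymmetric information: the attacker’s model of the system is deliberately wrong, while the defender learns who is probing and how.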
(Chenxi: This is like game theory. Game theory for defense and offense.)
That’s actually one of the other thesis projects. There is a version of game theory called hypergames that deals with imprecise and incorrect information. We’ve been pushing forward in developing some of the hypergames technology.
Question: I’m interested in your thoughts on the American public’s total apathy toward the ongoing loss of privacy. It’s shocking that all this warrantless government spying keeps rolling out and isn’t being challenged.
Dr. Spafford: I’m distressed by it. Every generation sees a loss of privacy as technology comes along. Depending upon our ages, we remember when things were different. Some younger people, even now at college age, have always had social media available to them; their notion of privacy is quite a bit different from that of someone like me who has a few gray hairs, what’s left of them. It’s a very different view.
We’re being monitored here. Part of it is a perception about privacy and its loss, but I’m more troubled by the fact that the public in general seems to accept a great deal of invasion of our space and our information in the name of preventing terrorism or stopping crime.
Things like the Stingray, where law enforcement is dropping prosecutions rather than revealing that they’re using it. That’s very bothersome to me as a matter of the rule of law. But there is a policy issue there that the public, not our community, but the public at large, is not bothering to take on, either at the ballot box or in general: how much are we willing to give up in return for that potential increment of safety?
Right now we have some people in Congress who are trying to stop the renewal of the Patriot Act, and yet at the same time Senator McConnell has just introduced a bill to reauthorize it. Where is the debate on that? Where it does occur in government, it’s often behind closed doors.
We need to do a better job of educating the public about what privacy is possible. Much of the media makes criminal and terrorist activity seem all around us all the time, although such events are very rare. The bigger issue is that we’ve done a very poor job of engaging the public.