The New New Internet: 2010 saw the public disclosure of Operation Aurora, the hacking of Google allegedly carried out by China. What is a very prevalent threat that we might see in 2011?
Kelly Collins: Related to cyber war, I think if some entity was able to create a cyber weapon, like the computer worm that recently targeted nuclear facilities in the Middle East, and if that sort of capability were focused on the United States, not even necessarily on a defense system, but at the global grid, that of course is of significant concern and why the issue of software security has become so much more prominent of late. One point I'd like to share is that when it comes to software security, financial institutions all over the world are more inclined to understand why and how they need to protect their software layer because they are such an obvious target. This is less the case with major infrastructure organizations and agencies. While it's true that 80 percent of our critical infrastructure is privately owned, the government has an advisory role over practically all of it. We're working to help the government ensure that the software layer is effectively defended at as high a level as possible. That's where we have focused so much of our passion and energy: trying to get those critical industries and the many government entities we work with to understand the importance of protecting their software layer first and foremost in order to prevent these kinds of dramatic attacks.
The New New Internet: As the leader of Fortify's Public Sector division, what's a typical day like for you?
Kelly Collins: Any given day, I may be involved in a policy matter, a technical matter, a hands-on assessment of source code, or discussions with our government customers, partners, consultants, and systems integrators. Speaking of the hands-on assessments, it's interesting to watch how people react when they see the results of that first source code scan, especially if an organization hasn't had much of a focus on securing the application layer. In our experience running demonstration scans of DoD applications, even those that have passed certification and accreditation, we'll find that on average 8-10 percent of the code will contain security vulnerabilities serious enough that, had they been detected earlier, the software would not have been allowed on the network. It's a very dramatic demonstration.
The New New Internet: Recently, Fortify was acquired by HP and I was wondering if you could talk about that process and what that means for Fortify.
Kelly Collins: It's been a very thoughtful process. As far as arranged marriages go, it was probably one of the best we could have hoped for. HP is making a very strong investment in security, and I think it will likely be a leading player in the security marketplace in the future, with the acquisitions of Fortify, ArcSight, and TippingPoint already made and potentially others moving forward. We're very pleased to be in a position to leverage the gravitas of a company with the size, the reputation, and the brand that HP has. We believe that being part of HP will help with the policy issue as well, because it is now apparent that many critical industries are starting to endorse the use of assessments to ensure that their software layer is secure. Our feeling is that with the global scale and focus of HP behind us, we will be able to continue to invest in our technology and expand it to the fullest extent possible.
The New New Internet: Cloud computing is building a lot of buzz. I was wondering what your thoughts were on the subject, particularly with regard to the security implications of the cloud.
Kelly Collins: I'll make a couple of comments, and then I would like Rob to comment because he has more depth in this area than I do. When you think of the components that make up the cloud, you'll find that it is populated primarily by applications. Because the cloud lacks the layered defenses of traditional networks and opens up access from essentially anywhere, security is vitally important. Cloud applications are even less protected than those that run behind traditional perimeter defenses. I think you'll find that the security strategies that are 'good enough' in closed networks will be woefully inadequate in the cloud. I also want to say that even though an application functions in the cloud, outside of a traditional network, the onus of securing that application is still on the organization that created it or procured it.
The New New Internet: Rob, is there anything you would like to add?
Rob Roy: I think Kelly pretty much covered it. I would say that cloud computing, as a utility, is inevitable and that it will pose a number of challenges from a security point of view for government IT and security teams. One in particular is that by moving applications to the cloud, they will have to go from a closed environment that they know very well (their own environment, which they've had control of for decades, all the way down to the IT infrastructure and the on/off power switches) and hand things over to somebody else, a third party. There are so many questions they need to answer, questions they've honed over the past twenty years creating their environment. They now have to apply their learning and their internal controls to that outside entity. It's just a matter of time, but I will go back to Kelly's statement: going to the cloud does not absolve anyone of security. If anything, it amplifies the requirements, because they are giving up more control over their infrastructure.
The New New Internet: Those were all of my questions. Kelly, are there any points we haven't touched on?
Kelly Collins: There is one other concept that is related to the cloud, but also to our lessons learned working with global financial institutions. These organizations are adamant that they would never accept software delivery from a third-party supplier unless that supplier could prove to them that they had done source code analysis of that software and had removed or remediated all critical vulnerabilities. In essence, that was part of the requirements they passed down to their supply chain. It was a surprise for our government customers to learn that financial institutions were so stringent about security. While discussing the implications for the cloud, one of our Army customers came up with an analogy. He said, 'It will allow me to create a concept of a clean cloud and a dirty cloud. The clean cloud would have all of the high-level service level agreements. It would have a stronger security stamp of approval.' From that analogy came the idea that government customers should be every bit as demanding of their software suppliers as financial institutions. As a government entity, if somebody is providing you a software deliverable, you should be empowered to ensure that the software doesn't contain critical vulnerabilities, to whatever level you specify. If it doesn't pass muster, then you should either push it back to the supplier to fix, or you might have to place it into a 'dirty cloud' category where users would be told to 'handle with caution.' We found that very helpful in trying to explain to our customers that they probably should be much more insistent about the quality of the software they receive.