Expert Input on Breach Detection


SoundOff is a new section of The New New Internet featuring the thoughts and perspectives of top industry experts and thought leaders on the biggest issues facing the cyber marketplace. These executives are drawn from the U.S. government contracting industry.

Expert Input on Breach Detection

We hear a lot about best practices for responding to a cyber attack, but some say that detecting that a breach has occurred can be the real challenge. In the Computer Security Institute’s Computer Crime and Security Survey 2010/2011, respondents were asked which security solutions ranked highest on their wish lists. Many named tools that would improve their visibility—better log management, security information and event management, security data visualization, security dashboards and the like. Many organizations – especially federal agencies entrusted with critical data and the personal information of our nation’s citizens – are particularly vulnerable to cyber attacks. Without the appropriate solutions and tools in place, we are still playing defense when it comes to detecting cyber attacks before the damage is done. Here is how one cybersecurity expert responded on the issue.

Leigh Palmer, VP of advanced programs – information technology & cybersecurity solutions, BAE Systems

Leigh Palmer — To be proactive, government entities and agencies must explore the motives behind the unique attacks that they will likely encounter. This examination entails a thoughtful assessment of the data they hold, an evaluation of the data their adversary is pursuing, and most importantly, developing a clear understanding of how the stolen data will be used for financial profit or tactical/strategic gain.

Looking at an enterprise through this risk-based framework allows agencies to focus limited resources on protecting the most prized (and most at-risk) assets. While many attacks are easily thwarted, separating “evidence of concern” from the rest of the attack data collected remains a human-intensive task. An analytic methodology for triaging incoming alerts allows limited human capital to be focused on the issues of most concern.
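The triage idea Palmer describes can be sketched in code. The following is a minimal, hypothetical illustration, not BAE Systems’ actual methodology; the alert fields, values and weighting scheme are invented for the example.

```python
# A toy risk-based alert triage: rank alerts so analysts see likely
# attacks against high-value assets first. All fields are illustrative.

def triage_score(alert):
    """Score an alert by how much human attention it deserves."""
    # Weight detection confidence by severity and by the value of the
    # targeted asset, reflecting a risk-based view of the enterprise.
    return alert["severity"] * alert["asset_value"] * alert["confidence"]

alerts = [
    {"id": "a1", "severity": 2, "asset_value": 1, "confidence": 0.9},
    {"id": "a2", "severity": 5, "asset_value": 10, "confidence": 0.7},
    {"id": "a3", "severity": 4, "asset_value": 3, "confidence": 0.2},
]

# Highest-priority "evidence of concern" first; the long tail can be
# handled in batches or by automation.
queue = sorted(alerts, key=triage_score, reverse=True)
print([a["id"] for a in queue])  # ['a2', 'a3', 'a1']
```

The design choice is simply to make prioritization explicit and repeatable: any scoring function, however crude, lets scarce analysts work the queue from the top instead of wading through raw attack data.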

Top Experts on Cloud Exit Plans

Bill Luti, vice president for cybersecurity, DMI


Bill Luti — A lot of research published lately suggests that while many federal CIOs are working hard to implement the “cloud-first” strategy of moving data to the cloud, most have no entrance or exit strategies for cloud services and cloud service providers. This is essential because, when the time comes for an exit, most of the protections and safeguards you need must have been established at the very start of the relationship. Your entrance and exit strategy needs to work in three dimensions: contractual, technical and procedural.

From a contractual point of view, it’s essential that exit considerations be built into the agreement up front, with a strong emphasis on ensuring your continuity of operations. What are the appropriate Service Level Agreements that need to be in place? Under what conditions does a violation of those SLAs trigger a potential exit? What happens if the cloud service vendor goes into receivership? What assurances do you have that, once your data is transferred to you or to another cloud service provider, it has been completely removed from your old vendor’s servers? What is your liability if some of your data is stolen from the service provider after you’ve left?

Technically, this may seem counterintuitive, but for many government agencies a winning strategy might be to specify that the agency actually owns the equipment being used by the service provider. It might also be useful to specify no multitenancy for your equipment. Again, this may seem to run counter to the idealized view of cloud computing, but it may become very important in the event of a default on the part of the service provider, or to facilitate a clean migration if you need to move your data.

Finally, procedures need to be specified to cover exactly what happens if you decide to make a transition. It’s unwise to assume this will be a smooth, easy process unless it’s laid out in advance. Whom do you contact if your vendor goes out of business? How will your data or equipment be transferred back to you or to another service provider? How will your service provider make certain that all of your data is thoroughly removed?

The cloud service model offers tremendous advantages. At DMI, we’ve used it to enable incredibly rapid deployment of applications and services for our clients. But for many agencies, this is uncharted territory, and it’s important to go in thinking carefully about all the possible outcomes.

Emily Stampiglia, senior director, federal sales, VCE


Emily Stampiglia — The transition to cloud services is a journey rather than an end state, and both the risks and the benefits must be carefully weighed. The reality is that there are many paths to the cloud, and many reasons why CIOs may opt for measured journeys rather than aggressively moving to subscription-based Software-as-a-Service models. Most CIOs are now driving virtualization programs as a first step on the path to cloud computing. Virtualized environments enable organizations to start delivering cost-effective IT that can be rapidly provisioned while still offering high availability and dynamic scaling. Deploying virtualization on a standardized converged infrastructure platform enables rapid implementation with dramatic reductions in operations and maintenance costs.

The biggest barriers to cloud computing stem from uncertainties around security and privacy and other trust considerations such as compliance, performance and availability. Federal CIOs must build a risk mitigation plan to ensure their responsibility to protect the organization’s information is fulfilled. The reality of today’s cyber threats is that information anywhere, whether in existing physical environments, in private clouds or on public infrastructure, has the potential to be hacked if proper precautions are not taken. Unlike commercial data, where the risk can usually be measured in dollars lost, losing sensitive federal government data can have far more significant consequences. Technologies exist to prevent data breaches; however, it is frequently not a failure of technology but rather human interaction with the technology that causes problems. Government CIOs should explore public clouds for appropriate use cases and should ensure that the cloud provider offers Service Level Agreement assurances demonstrating its ability to secure information, satisfy regulatory and compliance requirements and provide performance and availability guarantees.

The cloud provider must have policies addressing mandatory reporting of a data breach, along with mitigation plans for remediating a spill. For more mission-critical applications or highly sensitive information, a more pragmatic approach may be to build a converged infrastructure as the foundation for a private cloud model, where agencies own and control their own infrastructure but provide cloud services to their internal and community customers. This path may allow government to better leverage its existing security and governance processes and technologies along the journey to cloud computing.

Michael Mikuta, VP of technology strategy–cloud computing and mobile, HPTi


Michael Mikuta — It is interesting to consider the conversations about cloud exit strategy, given that most customers are still in the early adoption phase of cloud implementations. Concern about lock-in is nothing new: Which hardware vendor should we choose? Which database technology? Which application development framework? And so on.

The point is that just because something is wrapped in the cloud moniker, don’t forget that the same principles used in the past still apply, and the same selection criteria should be leveraged where possible. Portability can be maintained through the proper application of virtualization and decoupling patterns. On the flip side, for those customers utilizing MapReduce as an emergent data processing paradigm, consider how we may be able to share data processing algorithms across clouds, ushering in a much more sophisticated mechanism for smart data sharing. Exit simple search, enter communitywide analytics: across intel solutions, across financial solutions, across security log analysis and so on.
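To make the MapReduce point concrete, here is a toy, single-process sketch of the map/reduce pattern applied to security log analysis. The log format, field positions and function names are invented for illustration; real deployments would run such functions on a distributed framework (e.g., Hadoop) rather than in plain Python.

```python
from collections import defaultdict

# Minimal stand-in for the MapReduce pattern: count failed logins per
# source IP across a set of log lines. All log data is illustrative.

def map_phase(log_line):
    # Emit a (source_ip, 1) pair for each failed-login record.
    parts = log_line.split()
    if parts[1] == "FAILED_LOGIN":
        yield (parts[0], 1)

def reduce_phase(pairs):
    # Sum counts per key; on a real cluster this runs per partition.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = [
    "10.0.0.5 FAILED_LOGIN alice",
    "10.0.0.5 FAILED_LOGIN root",
    "10.0.0.9 OK bob",
]

pairs = [kv for line in logs for kv in map_phase(line)]
print(reduce_phase(pairs))  # {'10.0.0.5': 2}
```

Because the analysis is expressed as small, portable map and reduce functions over records, the same algorithm could in principle run against log data held in different clouds — the kind of cross-cloud, communitywide analytics Mikuta describes.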


Filed in: FedTech SoundOff

© 2014 ExecutiveBiz. All rights reserved.