Infosecurity Europe 2011.

A selection of exhibitors' papers from Infosecurity Europe, held on 19th-21st April 2011 at Earl's Court, London. The event provided a free education programme, with exhibitors showcasing new and emerging technologies and offering practical and professional expertise. www.infosec.co.uk

Is Your Stored Data Protected?

Michael Willett, Samsung and Trusted Computing Group

Today, companies of all kinds must protect against a constant barrage of potential and actual data theft and security breaches, including organized crime attacks on enterprise data storage facilities. Litigation risks, compliance issues, strong data protection laws in the E.U. and pending data breach notification legislation in the U.K. mean that companies have been forced to find ways to protect data-at-rest, even if they've not experienced direct attacks.

The data storage industry, working collectively in the Trusted Computing Group, has standardized and deployed innovative, simple and powerful technology to secure data where it lives - in storage. Encrypting directly on the drive offers many benefits, and standardizing that functionality provides a common interface for management software and for interoperability.

Previous-generation software encryption solutions typically have had cost, complexity, and usability issues. In contrast, new self-encrypting drives (SEDs) put the encryption engine in hardware, directly inside the storage system. From the outside, an SED functions as an ordinary drive, processing reads and writes. But, deep inside the drive electronics, just before the data 'bits' are written to the physical media, an encryption engine applies real-time encryption to the data stream, so that the 'bits' on the media are encrypted and therefore unreadable to an unauthorized adversary.

Conversely, 'bits' read from the media are decrypted before leaving the drive, completely transparently to the end user. In the enterprise, drives are managed by an array controller, or RAID (redundant array of independent disks) controller, which handles connection and communication with the enterprise storage disks. In a data center, it is the array controller that enables the deployment of self-encrypting drives.
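To make that write/read path concrete, the following is a minimal Python sketch of the concept only, not of any vendor's firmware or of the TCG Opal command set: a toy 'drive' that holds its own media encryption key, encrypts each sector on the way in and decrypts it on the way out, so the host only ever sees plaintext. The class and method names are invented for illustration, and AES-CTR with a per-sector nonce stands in for the disk-oriented modes (such as AES-XTS) that real drives typically use.

    # A minimal, illustrative model of a self-encrypting drive (SED).
    # Data is encrypted just before it reaches the "media" and decrypted on the
    # way out, so the host only ever sees plaintext. Requires the 'cryptography'
    # package. AES-CTR with a per-sector nonce is used for brevity only.
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    SECTOR_SIZE = 512  # bytes per sector in this toy model

    class ToySelfEncryptingDrive:
        def __init__(self):
            # The media encryption key is generated inside the "drive" and never
            # leaves it, mirroring how SEDs ship from the factory already keyed.
            self._media_key = os.urandom(32)   # AES-256 key
            self._media = {}                   # sector number -> ciphertext

        def _cipher(self, lba):
            # Derive a per-sector nonce from the logical block address so that
            # identical plaintext in different sectors encrypts differently.
            nonce = lba.to_bytes(16, "big")
            return Cipher(algorithms.AES(self._media_key), modes.CTR(nonce))

        def write(self, lba, data):
            enc = self._cipher(lba).encryptor()
            self._media[lba] = enc.update(data) + enc.finalize()  # ciphertext at rest

        def read(self, lba):
            dec = self._cipher(lba).decryptor()
            return dec.update(self._media[lba]) + dec.finalize()  # plaintext out

    drive = ToySelfEncryptingDrive()
    drive.write(0, b"top secret".ljust(SECTOR_SIZE, b"\0"))
    assert drive.read(0).rstrip(b"\0") == b"top secret"   # host sees plaintext
    assert b"top secret" not in drive._media[0]           # media holds only ciphertext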

Research and testing by Trusted Strategies reveals stark differences in performance between SEDs and software full-drive encryption (FDE). Three leading FDE software products were pitted against an SED in a series of intensive read/write tests; in a typical test, the SED was 79 percent, 132 percent and 144 percent faster than the three software products respectively.

Other benefits of self-encrypting drives include:

* Transparency: SEDs come from the factory with the encryption key already generated on-board and the drive already encrypting. The drives are always encrypting; conversely, software-based keys are provisioned by the user.

* Ease of management: No encryption key to manage externally.

* Life-cycle costs: The cost of an SED is pro-rated into the initial drive cost; software has continuing life-cycle costs. Additional savings result from higher reliability and lower maintenance of SEDs.

* Disposal or re-purposing cost: With an SED, erasing the on-board encryption key renders the encrypted data unreadable in microseconds (see the sketch after this list). The "clean" drive can be re-used, disposed of, or shipped out for warranty repair; software-based encryption often relies on lengthy data-overwriting procedures or even destruction of the drive itself.

* Re-encryption: With an SED, there is no need to ever re-encrypt the data; software-based encryption key changes require whole drive re-encryption.

* Performance: No degradation in SED performance.

* Standardization: The whole drive industry is building to the TCG SED specifications; software is proprietary.

* No interference with processes such as compression, de-duplication, or DLP (data loss prevention); software encryption is necessarily upstream from storage and can interfere with such processes.
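As a footnote to the disposal point above, the following equally illustrative extension of the toy drive sketched earlier shows why cryptographic erase is near-instant: replacing the on-board media key leaves the ciphertext on the media but makes it undecipherable, with no overwriting pass required.

    # Illustrative extension of the ToySelfEncryptingDrive sketched earlier:
    # a cryptographic erase simply replaces the on-board media key, so every
    # previously written sector becomes undecipherable without any overwriting.
    class ToyDriveWithCryptoErase(ToySelfEncryptingDrive):
        def crypto_erase(self):
            self._media_key = os.urandom(32)   # old ciphertext is now unreadable

    drive = ToyDriveWithCryptoErase()
    drive.write(0, b"customer records".ljust(SECTOR_SIZE, b"\0"))
    drive.crypto_erase()
    assert drive.read(0)[:16] != b"customer records"   # reads back as random bytes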

Eric Ouellet, Senior Vice President of Gartner, has noted, "Many organizations are considering drive-level security for its simplicity in helping secure sensitive data through the hardware lifecycle from initial setup, to upgrade transitions, and disposal".

In summary, self-encrypting drives are widely available, as is management software that supports them and the TCG specifications. Given their availability, benefits and protection against potentially crippling data breaches, SEDs should be part of a top-down review and risk assessment for sensitive and personal corporate data.

Trusted Computing Group exhibited at Infosecurity Europe 2011.

Security Threats: The Human Factor in Espionage

Giri Sivanesan, CISSP and CLAS, Pentura Limited

People will always be the weakest link in the battle to protect corporate information and data from attackers. Organisations have been the target of 'electronic espionage' for as long as information has been held in digital form. As business has become dependent on technology to process, store, transmit and manage information, electronic forms of espionage have skyrocketed.

Over the past ten years, the internet has kept its appeal as a low-risk and anonymous forum for perpetrating espionage attacks against corporations and governments across the globe. As the capabilities of security technology improve, hackers are now targeting individuals to get at the electronic information they want - this raises an intriguing mix of problems and issues for today's corporate security managers.

Where are we now?

Attacks on businesses are increasingly being committed with a similar modus operandi to corporate espionage carried out by foreign states and state sponsored attackers. With the emergence of global markets and global competition, businesses are now the target of espionage, carried out by competing businesses, states or state-sponsored businesses.

In recent years, espionage attacks on private and public sector businesses have been a dominant feature in the news, covering subjects ranging from technology houses losing prototype phones to electronic espionage networks such as 'GhostNet', reported to have penetrated the networks of hundreds of organisations worldwide. All of these have raised the profile of espionage to new heights, forcing public and private sector organisations to question whether their most critical information assets are adequately protected from espionage.

Many people assume the threat of espionage has disappeared; however, the Director General of MI5 once said that there were more foreign intelligence officers operating in London now than at any time since the end of the Cold War. With the emergence of global markets and global competition, espionage has evolved.

Who's Vulnerable?

Most businesses assume that espionage is a threat that does not belong on their risk register. They believe that espionage is about stealing state secrets, or information about foreign policy, defence or military research; however, this is no longer the case.

Espionage may involve covert techniques and sophisticated technical and non-technical attacks, and as more business and commercial information becomes available online, such attacks become far easier. Attackers can identify particular networks, computers or individuals, often by aggregating numerous disparate pieces of information, to target their attacks.

Businesses cannot manage the threat of espionage accurately without understanding the extent of the potential threat they face. As the number of organisations that have been financially impacted by espionage grows, the need to address it becomes ever more acute. Counter-espionage is about identifying the vulnerabilities that might be exploited by a competitor and putting in place the relevant controls to mitigate those risks.

Who's a target?

Organisations supporting Critical National Infrastructure (CNI) such as water, gas, electricity, financial services and telecommunications are by nature of their national importance at risk from foreign state sponsored espionage, such as Stuxnet, which has now found its way onto the black market.

It is important for industries and organisations with high value intellectual property to understand the risks they face. Espionage succeeds by exploiting deficiencies in physical, logical or personnel security controls. Identifying viable 'attack vectors' to use for espionage may be a relatively long term process but in many ways, the most valuable and effective attack vector for an attacker is a person.

People have characteristics that can make them particularly vulnerable and useful to those who want to carry out attacks. The motivations of people who abuse their access to provide confidential information to business competitors are complex and varied. However, insider knowledge and access can increase the impact of an attack significantly, even where the insider's role is only one of facilitation, for example in a cyber attack.

Many types of electronic espionage attacks involve software programs and tools designed to provide an attacker with the capability to gain access to sensitive information. If the stakes are high enough, an adversary will invest time and effort in developing specifically built and coded attack software designed to exploit vulnerabilities in the applications or networks used by their targets. These attacks are unlikely to be detected by commercial anti-malware tools and help ensure that the attackers are able to access information without raising the suspicion of the target.

There are occasions when electronic espionage alone does not succeed, and attackers may need to combine technical and non-technical techniques. For example, an attacker may use human sources to unwittingly plant spyware, or to provide the confidential technical information needed to get top-secret product designs past existing network security controls. This range of techniques gives the adversary a distinct advantage when targeting organisations that still segregate their physical, IT and information security activities.

Human sources such as disgruntled employees or low wage, temporary staff - who perhaps have less loyalty to their employer - are also more easily convinced to obtain confidential documents as part of elaborate attacks that they may not be fully aware of.

How it's done

The cultivation of human sources begins with a planned acquaintance with the target, which the adversary will try to make appear as normal as possible. There have been espionage cases in the past where cultivation and recruitment of the target has taken place over a matter of weeks, months and in some cases, years. Through any means necessary, the target will be cultivated and prepared for their role as an agent of espionage.

There have been other cases where the process has been much quicker; 'cyber recruitment' can be almost instantaneous. In some instances the targets might not even be aware that they have been exploited and may become an 'unconscious' agent for an attacker. The process takes time, meticulous planning and skill, but once ready to assume the role, the human source can provide a rich and versatile source of information and intelligence, whether 'conscious' or 'unconscious' of that role.

A typical example would be:

"You are attending an industry conference overseas as a key member of the research team for a large technology company. During the trip you meet an old colleague that you know personally and hold in high regard. At the conference, your colleague introduces you to a friend who shares similar technology interests and is very flattering with respect to your published work. Over the duration of the event, you get to know him well and he is keen to learn more about your technology research at work."

Question: How can you tell a normal business introduction from premeditated espionage?

For espionage, an introduction to the target is often sought through someone with direct access to the target - an access agent - such as the mutual friend cited in the example above. The target in this example is far more likely to trust a friend of a colleague than a complete stranger.

How to avoid it

In a downward market, when employment prospects may be uncertain or rewards less substantial, the risk of insiders being involved in an attack increases if personal income is under threat: employees are far more likely to accept cash bribes or gifts as part of a cultivation process. Once proprietary information is in the hands of a competitor, it becomes very easy for them to start eating away at a rival's profits.

The Achilles heel for most organisations is the network printer: most corporate security systems are rendered useless once a user sends a document containing trade secrets to a network printer, where anyone can walk past and pick up the document before the authorised user does.

Employment vetting is arguably the most popular way for organisations to mitigate insider threats. Detailed employment screening and psychometric profiling may help to identify personality traits that suggest an employee is susceptible to cultivation. In the majority of cases, however, vetting is limited to basic security checks carried out as a condition of a new employment contract rather than as an ongoing requirement of employment.

Aside from vetting, many organisations choose to institute segregation-of-duties controls that require two or more employees to complete a business task. Whilst this may increase the administrative burden, such controls can make an attack significantly harder by requiring the complicity of two or potentially three people. Whistle-blowing procedures are also commonly used in large organisations to detect insider threats.

Conclusion

Clear and concise security policies, accurately aligned to an organisation's security risks, should underpin all efforts to manage insider threats and attacks that exploit an organisation's personnel. Together with a strong organisational security culture, education, thorough background checks and aftercare, organisations can develop an effective risk management programme to counter insider and other types of adversarial attack.

Pentura Limited exhibited at Infosecurity Europe 2011.

Risk Assessment Defines Datacenter Security Requirements

Mina Zele, Ph.D., Security Specialist, Astec

With the growing amount of stored information and the increasing complexity of data centres, providing appropriate security measures is becoming a demanding task for the data centre manager. A data centre outage due to environmental or technical causes can have a devastating effect on the business, not to mention that the sensitivity and value of stored data make it appealing to hackers and to internal users. The data centre manager is often faced with uncertainty over whether the data centre receives protection appropriate to business needs and to contractual and regulatory obligations.

Risk management of data centres is becoming particularly important with the emergence of cloud computing, since customer organizations require cloud providers to proactively mitigate the risks that could compromise their data. A systematic analysis and evaluation of vulnerabilities, threats and risks is strongly recommended in such a case, because it reveals the most critical weaknesses in the organizational procedures and technical mechanisms implemented to protect the data centre. The benefit of risk assessment is that the results can be used to justify the cost and prioritization of investments to achieve a better level of information security and compliance with legislation.

Organizations often store data with highly diverse security requirements regarding availability, integrity and confidentiality, arising from business needs, contractual obligations or legislation. Risk assessment used to be a task for IT personnel, who lacked an understanding of the relative value of data and consequently misestimated the consequences of data loss, unavailability or disclosure. Accurate risk estimation is a joint effort of business managers, IT personnel and information asset owners, achieved through close cooperation between all of them. Since risk assessment can be quite a complex process, involving large amounts of data and producing complex results, it can easily become uncontrollable. Therefore, the use of a dedicated risk assessment tool is usually welcomed by business and data centre managers.

Risk assessment tools are one of the developing fields in information security, and they are generally quite expensive, especially because they are usually just one component of information security management. It is therefore important to use a tool that is genuinely beneficial to your organization. CISOs and security consultants strongly support the idea of risk assessment tools as "a must"; however, such tools only become advantageous if they are adapted to the organization's risk assessment methodology. This includes configuring access rights on the basis of each employee's role in the risk assessment process, customizing the matrix used to calculate the risks, and choosing relevant threats and vulnerabilities from the available databases. The value of such tools is that analysis and interpretation of results are presented separately for technical staff and business management: the data centre manager is provided with reports showing meaningful information on risk levels and proposed actions to mitigate the risks, and these reports can be presented to the senior management responsible for approving the information security development strategy.

What makes a risk assessment tool really useful? First of all, the tool should enable intuitive and fast risk assessment that does not require long training; this also implies that annual risk updates will be simple. Experience shows that the ranking of input data (threat likelihood and level of threat impact) should be adjusted to human perception and must not be too granular: qualitative ranking turns out to be more useful and better describes the actual situation. The tool should support a methodology in line with the guidelines of the ISO/IEC 27005:2008 standard. Finally, it is important that the results are transparent and informative to both business and technical staff, with the possibility of export to other formats.
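As an illustration of the kind of qualitative matrix such a tool might let an organization customize, here is a minimal sketch in Python; the likelihood and impact levels, the matrix values and the sample register entries are invented for illustration and are not drawn from ISO/IEC 27005 or from any particular product.

    # Minimal sketch of a qualitative risk matrix of the kind an organization
    # might configure in a risk assessment tool. All levels, matrix values and
    # register entries below are illustrative.
    RISK_MATRIX = {
        # likelihood -> impact ("minor", "moderate", "severe") -> risk level
        "low":    {"minor": "low",    "moderate": "low",    "severe": "medium"},
        "medium": {"minor": "low",    "moderate": "medium", "severe": "high"},
        "high":   {"minor": "medium", "moderate": "high",   "severe": "high"},
    }
    ACTION = {"low": "accept and monitor", "medium": "plan mitigation", "high": "mitigate now"}

    def assess(asset, threat, likelihood, impact):
        """Look up the qualitative risk level and a suggested action."""
        risk = RISK_MATRIX[likelihood][impact]
        return {"asset": asset, "threat": threat, "risk": risk, "action": ACTION[risk]}

    # Illustrative register entries for a data centre
    register = [
        assess("customer database", "disclosure by insider", "medium", "severe"),
        assess("cooling system", "environmental failure", "low", "severe"),
        assess("test server", "malware infection", "high", "minor"),
    ]
    for entry in sorted(register, key=lambda e: ["high", "medium", "low"].index(e["risk"])):
        print(f"{entry['risk']:<7} {entry['action']:<19} {entry['asset']} / {entry['threat']}")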

Risk assessment is therefore a key process in monitoring and improving the security of data centres, with the goal of prioritizing investments in technical and organizational security improvements.

Astec d.o.o. exhibited at Infosecurity Europe 2011.

End-point Security Under The Microscope

Stefan Frei, Research Analyst Director, Secunia

Most private and corporate Internet users face security challenges on a daily basis. Unpatched end-points with a plethora of insecure programs installed represent a breeding ground for cybercriminals. Findings from the Secunia Yearly Report 2010 show that, typically, 50% of users have more than 66 programs from more than 22 different vendors installed on their end-points. Vulnerabilities affecting a typical end-point pose a real threat to the end-user's host.

To track the security of typical users, a representative software portfolio containing the Top-50 most prevalent programs installed on end-points was built, based on analysis of anonymous 2010 scan results from Secunia Personal Software Inspector (PSI) users. The analysis showed that the typical software portfolio consists of programs from 14 different vendors: 26 programs from Microsoft and 24 programs from third-party (non-Microsoft) vendors.

An alarming trend has reared its ugly head: vulnerabilities specifically affecting the typical Top-50 software portfolio have increased almost four-fold in three years, or by 71% in the last 12 months alone, irrespective of the choice of operating system. In fact, results showed that the operating system accounts for only 13% of vulnerabilities on the end-point, on average.

Significantly, third-party (non-Microsoft) programs are found to be the main culprits responsible for this significant increase in vulnerabilities. For example, in 2010 an end-point with the Top-50 portfolio and Windows XP had 3.83 times more vulnerabilities in the 24 third-party programs than in the 26 Microsoft programs, and 5.22 times more vulnerabilities in the 24 third-party programs than in the operating system. The vulnerabilities are relevant as more than 50% are rated as "Highly" or "Extremely critical", providing the attacker with full system access remotely over the network.

Patch complexity has a measurable effect on end-point security. Data from the Secunia PSI also showed that less than 2% of the Microsoft programs were found to be insecure, while third-party programs ranked between 7% and 12%. With programs from 14 different vendors, users have to master approximately 14 different update mechanisms to keep their end-points secured and patched: 31% of the vulnerabilities in 2010 could be covered by a single Microsoft update mechanism patching the operating system and the 26 Microsoft programs, whereas the remaining 69% of vulnerabilities required 13 further update mechanisms to patch the 24 third-party programs.
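To make the "one mechanism versus thirteen" arithmetic concrete, the following sketch groups a hypothetical program inventory by update mechanism and totals the share of vulnerabilities each mechanism covers; the program names and counts are invented for illustration and are not Secunia data.

    # Illustrative only: group an end-point's installed programs by the update
    # mechanism that patches them, and total the vulnerabilities each mechanism
    # is responsible for. Program names and counts are invented.
    from collections import defaultdict

    # (program, update mechanism, vulnerabilities in the period)
    inventory = [
        ("Windows XP",        "Microsoft Update", 35),
        ("Office",            "Microsoft Update", 20),
        ("Internet Explorer", "Microsoft Update", 25),
        ("PDF reader",        "vendor A updater", 60),
        ("Media player",      "vendor B updater", 45),
        ("Browser plug-in",   "vendor C updater", 55),
    ]

    per_mechanism = defaultdict(int)
    for program, mechanism, vulns in inventory:
        per_mechanism[mechanism] += vulns

    total = sum(per_mechanism.values())
    print(f"{len(per_mechanism)} distinct update mechanisms needed")
    for mechanism, vulns in sorted(per_mechanism.items(), key=lambda kv: -kv[1]):
        print(f"{mechanism:<18} covers {vulns / total:6.1%} of the vulnerabilities")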

Although vendors do not share update processes or procedures, they are only partially to blame. In the majority of cases, users hold the power to patch their programs firmly in their own hands: in the last two years, 66% of vulnerabilities had a patch available on the day of disclosure and could have been fixed on the spot. This highlights the current lack of vendor-user communication and of a unified, industry-wide patch process, which almost certainly leads to incomplete patch levels.

Patching is often viewed as a secondary security measure below anti-virus and perimeter protection, which, in contrast, are often viewed as top priority. Yet anti-virus has limitations and is not as effective as commonly perceived, because cybercriminals know how to systematically bypass anti-virus detection. A security patch provides better security than any number of anti-virus or other detection signatures, as it eliminates the root cause; both should therefore be used.

As software vendors are still largely unable to release vulnerability-free software, effective vulnerability management is crucial. The lack of effective update mechanisms exposes end-users to significant risk, as vulnerable software tends to "survive" for a long time before being updated for reasons other than security. Both private and corporate end-users need to become more aware of these risks and embrace the practice of regular updating. Patching tools, such as the free Secunia Personal Software Inspector (PSI), remove the headache from this process by automating everything from vulnerability scanning through to security patch installation.

Secunia exhibited at Infosecurity Europe 2011.

Chink in the Cyber Armour?

Prescott B. Winter, CTO for the Public Sector, ArcSight

When evaluating our effectiveness in dealing with cyber defence, we have two layers of uncertainty and risk: how effectively are we protecting critical information, and what are the adversaries actually getting and how might they use it against us?

In today's interconnected world, we regularly deal with systems of overwhelming complexity and largely unmeasured risk, and that is before they become potential targets of explicit attack. Multiple risks, including software errors and network management failures, are compounded by the certainty that adversaries seek to cut through our defences, and the resulting uncertainty as to how much of our critical information has been compromised or exactly how such a breach could affect our organisation's future success or failure.

The fate of nations is not decided by wars alone, but by disease, technology, economics and information, among other salient factors. Increasingly, information underlies and facilitates these other domains as never before. The modern military is largely dependent on huge information flows for all phases of its activities; net-centric warfare is the model of the day. The treatment of disease and public health is now built on prodigious amounts of information, not just for the understanding and treatment of the disease itself, but for all the issues concerning patient status and tracking, insurance and liabilities, etc. Technology and economics, our ability to innovate, create, market and profit, are impossible today without detailed, and protected, information.

What is not so often appreciated is that much of this information is effective only as long as we control it and manage its use. The fact that we have this information and can act on it, while our competitors and adversaries must wait until we have chosen to act, is a decisive advantage in all these domains. The time advantage resulting from knowing something uniquely and first can translate into a leadership position in many domains. The continued bleeding of critical information through cyber attack may not convey the impression of catastrophe we associate with physical incidents such as last year's Deepwater Horizon rig disaster, but it can have a profound cumulative effect on our way of life and on an organisation's leadership position in its industry.

I would argue that there is a growing awareness, in some quarters but not broadly enough in either government or the private sector, that critical information is now more at risk of intrusion and theft than ever before, and that this matters. If an organisation holds any significant amount of information that is critical to the success of the enterprise and which, if stolen, could confer an immediate advantage on a competitor or adversary, those competitors and adversaries will approach obtaining it as a fundamental strategic issue, making them persistent and implacable foes. This growing awareness that adversaries will seek to obtain protected information is leading to efforts to respond more effectively and to find ways to measure the risk.

The loss of unique control over critical information should be regarded as an unacceptable outcome at the highest levels of any organisation, and the enterprises that understand this are devoting significant resources to cyber risk management and response.

ArcSight exhibited at Infosecurity Europe 2011.