A recent online poll by IT security firm Napatech found that a quarter of respondent companies had suffered a network intrusion. The interactive poll, involving more than 300 attendees at one of our online events, found that 25% of respondents had experienced an intrusion incident, with 44% of these incidents occurring within the past 12 months.
Network intrusion events are not just an irritation, as they were back in the early days of IT networking; they can be commercially damaging. Unlike the altruistic 1980s, when the 'hackers' tended to be fellow engineers who also had access to the dial-up modems, expertise and other IT resources needed to gain access to businesses' online assets, the majority of attacks today are carried out by sophisticated criminal organisations attempting to steal data or hijack computing resources for their own illegal use.
These Napatech online poll results were confirmed by a PricewaterhouseCoopers-sponsored survey, which found that 83% of smaller organisations had experienced a security incident in the last year, compared with 45% two years earlier. The PwC/Infosecurity survey also revealed that 90% of all organisations had increased their expenditure on IT security technology, whilst smaller businesses are now spending 10% of their IT budget on security issues, compared with 7% two years ago. The report attributes the rise partly to the increasing use of cloud computing (computing services provided over the internet) and social networks within enterprises.
Delving into the study reveals that 15% of large companies noted that their IT resources had been accessed by an unauthorised outsider in the last 12 months. As a result, Intrusion Prevention Systems (IPS) are under pressure to keep up, both in terms of handling the volume of data and in having the raw horsepower to run more sophisticated algorithms. A quarter of respondents had suffered a 'denial-of-service' attack, double the number logged in the previous survey carried out two years earlier.
The report also found that the rate of adoption of newer technologies has accelerated over the last two years, with most respondents now using wireless networking, remote access and VoIP (Voice over IP) technologies. In addition, the number of organisations allowing staff to have remote access to their systems has also increased with around 90% of large companies now offering this facility. These figures confirm that the number of intrusion incidents is on the rise. This, in turn, is also forcing most organisations to increase the proportion of their IT budget they spend on security technologies.
While many IT security professionals regard IPS as a natural extension of Intrusion Detection Systems (IDS) technology, it is actually another type of access control mechanism, rather than purely a sister IT security platform to IDS. In fact, the term IPS is a lot younger than IDS, and was first used in the late 1990s by Andrew Plato, a technical consultant with a major IT security vendor that developed the industry's first IDS platform.
In its purest form, an IPS makes a number of access control decisions based on the content of the application, rather than taking a traditional firewall approach of monitoring IP addresses, ports and other connective links. Back in 1998, Plato opined that a good IPS should feature a sophisticated analytical engine, but one that generates as few false positives as possible. Provided this is the case, he said at the time, then a good IPS has a number of advantages over IDS, since it can sit in line with an IP traffic flow and analyse the data stream in real time.
In addition, most modern IPS solutions also have the ability to analyse protocols, such as FTP, HTTP and SMTP, and make decisions on whether to allow – or quarantine – the IP packets as required, even if the data are encrypted.
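To make the distinction concrete, the following is a minimal sketch of content-based filtering of the kind described above: decisions are made on what the application payload contains, not on IP addresses or ports. The protocol names, byte patterns and quarantine policy are illustrative assumptions, not any vendor's actual rule set.

```python
# Minimal sketch of content-based (rather than port-based) filtering.
# The signature patterns below are illustrative assumptions only.

ALLOW, QUARANTINE = "allow", "quarantine"

# Simple payload signatures keyed by application protocol.
RULES = {
    "HTTP": [b"../..", b"<script>"],     # path traversal, script injection
    "SMTP": [b".exe\r\n", b".scr\r\n"],  # executable attachment names
    "FTP":  [b"SITE EXEC"],              # remote command execution
}

def inspect(protocol: str, payload: bytes) -> str:
    """Decide per packet based on application content, not on IP/port."""
    for signature in RULES.get(protocol, []):
        if signature in payload:
            return QUARANTINE
    return ALLOW

print(inspect("HTTP", b"GET /../../etc/passwd HTTP/1.1"))  # quarantine
print(inspect("SMTP", b"Subject: hello\r\n\r\nplain text"))  # allow
```

A production IPS applies thousands of such rules per protocol, which is precisely why raw processing horsepower becomes the limiting factor at high line rates.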
But are today's IPS platforms up to the task of scanning IP traffic at the high speeds needed in a modern IT environment? The problem facing IT professionals is that, with the internet growing at between 40 and 60% per year, according to the Atlas Internet Observatory – and against the backdrop of a mobile data explosion – it is important that IPS technology keeps up with this bandwidth growth and does not become the bottleneck in the network.
It is also becoming clear that, on a typical modern network, users are placing a very heavy load on each port of a multi-10 gigabit (G) port system and, while there are IPS products available that are capable of supporting a multiple 10 gigabit/s (Gbps) port topology, providing continuous 10 Gbps throughput on these ports is something of a challenge.
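A back-of-the-envelope calculation shows why sustained 10 Gbps throughput is so demanding. Assuming the worst case of minimum-size 64-byte Ethernet frames (plus the 20 bytes of preamble and interframe gap each frame occupies on the wire):

```python
# Worst-case packet rate on one 10G port, assuming minimum-size
# Ethernet frames (64 bytes + 20 bytes preamble/interframe gap).
line_rate_bps = 10e9
frame_bits = (64 + 20) * 8           # 672 bits on the wire per frame

packets_per_second = line_rate_bps / frame_bits
print(f"{packets_per_second / 1e6:.2f} Mpps")  # ~14.88 Mpps per port

# Time budget the analysis engine has for each packet:
ns_per_packet = 1e9 / packets_per_second
print(f"{ns_per_packet:.1f} ns per packet")    # ~67.2 ns
```

Roughly 67 nanoseconds per packet, per port, leaves very little headroom for sophisticated analysis algorithms, and the budget shrinks further on multi-port systems.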
The most worrying part of this development is how IPS platforms can be scaled to meet the needs of the 40G and 100G technologies that are set to be introduced to the IT/networking mix in the next few years.
Until a few years ago, it could be argued that IPS platforms were up to the task, especially since most adopted a core five-stage, real-time analysis process that steps through a number of stages as IT threats are encountered while monitoring the data streams flowing both into and out of an organisation's IT resources. The first stage is to bandwidth-throttle any suspicious IP traffic, giving the organisation's IT security software a chance to analyse the data stream – let's take the example of an email message stream – and deal with suspect messages and/or attachments in real time. If the data are found to be suspect, but do not conform to known infection signatures, the second stage is to analyse the message's header; if an infection is found, the data can be quarantined.
Third stage of the analysis process
The third stage in the analysis process involves performing user management and address validation, typically by applying a number of automated checks to verify whether the message comes from a source previously known to be dangerous.
The fourth stage involves applying an anti-malware and anti-hacking analysis engine to anything suspicious that has passed the first three stages. The fifth stage typically involves using the analysis engine to set aside anything that still looks suspicious for later, manual, analysis by the IT security staff concerned.
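The five stages described above can be sketched as a simple decision pipeline over a message stream. The field names, thresholds and verdict strings below are illustrative assumptions made for clarity; real IPS engines implement each stage in specialised hardware or software.

```python
# Illustrative sketch of the five-stage real-time analysis process
# described above, using an email message stream as the example.
# All field names, thresholds and verdicts are assumptions.

KNOWN_BAD_SOURCES = {"203.0.113.7"}   # example address (RFC 5737 range)

def analyse(msg: dict) -> str:
    # Stage 1: throttle suspicious traffic so analysis can keep pace.
    if msg.get("suspicious"):
        msg["rate_limited"] = True

    # Stage 2: header analysis; quarantine on a known infection.
    if msg.get("header_infected"):
        return "quarantined (header)"

    # Stage 3: user management and source-address validation.
    if msg.get("source") in KNOWN_BAD_SOURCES:
        return "quarantined (bad source)"

    # Stage 4: anti-malware / anti-hacking engine on the remainder.
    if msg.get("malware_score", 0) > 0.9:
        return "blocked (malware)"

    # Stage 5: anything still suspicious is parked for manual review.
    if msg.get("suspicious"):
        return "held for manual review"

    return "delivered"

print(analyse({"source": "198.51.100.1"}))                     # delivered
print(analyse({"source": "203.0.113.7", "suspicious": True}))  # quarantined (bad source)
```

The ordering matters: the cheap checks (headers, known-bad addresses) run first so that the expensive stage-four engine only sees traffic that has survived the earlier filters.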
However, businesses today are not just increasing their IT security budgets; they are also raising their game when it comes to security strategies. Given this scenario, the key challenge now is scaling these systems to keep up with the increasing bandwidth generated by richer content in emails and on websites, more video and teleconferencing and the transition to cloud computing already taking place.
All these innovative services provide a new, high-speed avenue of attack for hackers, and because of this, network security systems need to react in real time to contain the problem. To keep up with these high-speed, real-time demands, the traditional approach of network security appliance vendors has been to invest in developing customised, proprietary hardware. However, a new approach is emerging in which off-the-shelf, standard PC server hardware is used, negating the need for hardware development.
The Napatech poll revealed that the majority of network security appliances being used are still based on proprietary hardware, but for every three proprietary systems, there are now two systems based on standard PC server hardware. In the past, PC servers have not been powerful enough to meet the demands of security applications like IPS, but the latest generation of PC servers provide significant processing power and a strong roadmap of increased performance to come.
Our researchers have found that it has now become more economical to build network security appliances based on standard PC server hardware, using Napatech's intelligent real-time network analysis adapters to ensure high levels of performance. But one of the most compelling reasons for considering this approach is the ability to scale performance. At various IT events throughout this year, Napatech has demonstrated a full-throughput 10Gbps IPS system based on eight instances of a standard SNORT application running in parallel. This technique takes advantage of the multiple CPU cores available in modern PC servers. Napatech can support up to 32 CPU cores, so as the number of cores increases and the power of each core ramps up, the ability to scale performance increases.
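The principle behind running parallel IDS/IPS instances is flow-aware load distribution: every packet belonging to the same flow must reach the same instance so that stateful analysis still works. The sketch below illustrates the idea with a simple CRC hash over the flow 5-tuple; the actual distribution logic in a capture adapter is implemented in hardware, and this hashing scheme is an assumption for illustration only.

```python
# Sketch of spreading traffic across parallel IDS/IPS instances
# (e.g. multiple SNORT processes, one per CPU core). Hashing the
# flow 5-tuple keeps all packets of a flow on the same instance.
import zlib

NUM_WORKERS = 8   # one instance per core, as in the 10Gbps demo

def worker_for(src_ip, src_port, dst_ip, dst_port, proto):
    """Map a packet's flow 5-tuple to a worker index (0..NUM_WORKERS-1)."""
    key = f"{src_ip}:{src_port}>{dst_ip}:{dst_port}/{proto}".encode()
    return zlib.crc32(key) % NUM_WORKERS

# Every packet of the same flow maps to the same worker queue:
a = worker_for("10.0.0.1", 40000, "10.0.0.2", 80, "tcp")
b = worker_for("10.0.0.1", 40000, "10.0.0.2", 80, "tcp")
assert a == b
print(f"flow assigned to worker {a}")
```

Because flows hash independently, adding cores (and instances) scales aggregate throughput roughly linearly, provided the hash spreads flows evenly.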
Indeed, it’s worth noting that CPU chipmakers, such as Intel and AMD, are increasing the performance of their chips by as much as 50% on an annualised basis.
Can the vendors of proprietary network security appliances keep up with this kind of performance roadmap? Does it even make sense to try? If you ask us at Napatech, we will tell you that the time has come to reconsider.
Dan-Joe Barry is vice president of marketing at IT security firm Napatech, based in Andover, MA, US.