Is bigger better when it comes to DDoS attacks?
- 07 December, 2012 15:51
DDoS attacks remained front-page news in 2012 for a very simple reason: they continue to take down some of the largest and most secure networks in the world, from government web properties to Wall Street. Is this simply a function of attacks growing large enough to overwhelm these websites? Yes and no.
Arbor’s ATLAS internet monitoring system shows that without question, DDoS attacks are getting bigger, much bigger.
The average attack in September 2012 was 1.67Gbps, up 72 per cent from September 2011. The number of mid-range attacks (2 to 10Gbps) is also up 14.35 per cent so far this year.
Furthermore, very large attacks (10Gbps and over) are up by 90 per cent this year over 2011 and the largest attack in 2012 was 100.84Gbps.
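As a rough sanity check on these figures, the reported 72 per cent growth rate implies a September 2011 average of just under 1Gbps. A back-of-the-envelope calculation (the variable names are illustrative, not from Arbor's data set):

```python
# Hypothetical check of the implied 2011 baseline from the ATLAS figures.
avg_2012_gbps = 1.67      # reported September 2012 average
growth = 0.72             # reported year-on-year growth

avg_2011_gbps = avg_2012_gbps / (1 + growth)
print(f"Implied September 2011 average: {avg_2011_gbps:.2f} Gbps")  # ~0.97 Gbps
```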
This increase in attack size has significant implications not only for service providers, but also for enterprises that continue to rely on firewalls and IPS to protect them from DDoS attacks.
Because these devices must keep state information for every session, they can easily be overwhelmed by botnet-based attacks. This often makes them among the first points of failure during a DDoS attack, and the larger the attack, the more likely these devices are to fail.
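The failure mode is easy to see in miniature. The sketch below is purely illustrative (not any vendor's implementation, and the table size is an assumption): a stateful device tracks every session in a fixed-size table, so a botnet opening enough half-open connections fills the table and locks out legitimate clients.

```python
# Illustrative model of state-table exhaustion on a stateful firewall/IPS.
MAX_STATE_ENTRIES = 1_000_000   # hypothetical device capacity

state_table = set()

def new_session(src_ip, src_port, dst_ip, dst_port):
    """Admit a session only while the state table has room."""
    if len(state_table) >= MAX_STATE_ENTRIES:
        return False                      # table full: connection dropped
    state_table.add((src_ip, src_port, dst_ip, dst_port))
    return True

# A 10,000-host botnet, each holding 100 half-open sessions, exhausts
# a million-entry table before any legitimate traffic is considered.
for bot in range(10_000):
    for port in range(100):
        new_session(f"10.0.{bot // 256}.{bot % 256}", 1024 + port,
                    "203.0.113.10", 443)

print(len(state_table))                                           # 1000000
print(new_session("198.51.100.7", 55000, "203.0.113.10", 443))    # False
```

The point of the sketch is that the device fails not because its links are saturated, but because its session table is a finite resource the attacker can consume directly.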
All of that said, when it comes to DDoS, size isn't everything. That is why a layered defence strategy is the best practice for all enterprises.
The most robust defence is achieved by combining a cloud-based DDoS managed service that protects the network from larger attacks, together with an on-premise DDoS solution.
This keeps services available and protects existing security infrastructure, such as the firewall and IPS, by detecting and mitigating application-layer attacks at the network perimeter.
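Application-layer detection at the perimeter is often rate-based at its simplest. The following is a minimal sketch of the idea, not a description of any particular product, and the window and threshold values are illustrative assumptions rather than recommended settings:

```python
# Minimal sketch of rate-based application-layer detection: flag sources
# whose HTTP request rate exceeds a threshold within a sliding window.
from collections import defaultdict

WINDOW_SECONDS = 10            # illustrative sliding window
MAX_REQUESTS_PER_WINDOW = 200  # illustrative per-source threshold

requests = defaultdict(list)   # src_ip -> recent request timestamps

def is_suspect(src_ip, now):
    """Record a request and report whether src_ip exceeds the rate limit."""
    requests[src_ip].append(now)
    # Keep only timestamps inside the sliding window.
    requests[src_ip] = [t for t in requests[src_ip]
                        if now - t <= WINDOW_SECONDS]
    return len(requests[src_ip]) > MAX_REQUESTS_PER_WINDOW

# A client sending 500 requests inside one second trips the detector.
flagged = any(is_suspect("192.0.2.50", t / 500) for t in range(500))
print(flagged)   # True
```

Real application-layer mitigation also inspects request content and behaviour, but a per-source rate check like this captures the basic principle of stopping floods before they reach stateful devices behind the perimeter.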
Recent attacks prove it’s not all about size
Recent attacks on banks in the United States show that DDoS is not all about size. These attacks are becoming increasingly complex, often combining multiple techniques and targets.
These attacks used a combination of attack tools, mixing application-layer attacks on HTTP, HTTPS and DNS with volumetric attack traffic across a variety of protocols, including TCP, UDP and ICMP.
Another notable and uncommon feature of this series of attacks was the simultaneous, high-bandwidth targeting of multiple companies in the same vertical.
Many of the compromised hosts used in these attacks were servers with significant upstream bandwidth at their disposal. The majority of these bots were exploited PHP web applications. Many WordPress sites, often running the out-of-date TimThumb plugin, were compromised around the same time; Joomla and other PHP-based applications were also used.
Often these were unmaintained servers; attackers uploaded PHP web shells to them and then used the shells to deploy further attack tools.
Attackers connected to these tools either directly or through intermediate servers, proxies or scripts, so the usual concept of command and control did not apply.
Without question, DDoS attacks are growing larger. More significantly, they are becoming increasingly complex, blending multiple attack tools, techniques and targets. One reason DDoS remains such an effective weapon is that too many enterprise networks still rely on solutions designed for other problems to combat it.
This complex, rapidly evolving threat requires purpose-built tools on premise as well as cloud-based security, providing comprehensive protection against both large attacks and those that target the application layer.
Until we see pervasive deployment of best-practice defences, we can expect to see DDoS in the headlines for many years to come.
Gary Sockrider is a network solutions architect at Arbor Networks.