As businesses continue to move critical operations online, distributed denial of service (DDoS) attacks are increasing in frequency, sophistication, and range of targets. In a 2011 Verisign study, 63 percent of respondents reported experiencing at least one attack that year, and 51 percent reported revenue loss from the resulting downtime. Those numbers are undoubtedly higher today as the size, frequency, and complexity of DDoS attacks continue to grow. Mitigating these attacks is challenging and generally requires layered solutions spanning data centers and the cloud. The success of these attacks, and their ability to damage a company’s infrastructure, revenue, and reputation, indicates that many IT managers still haven’t found the right formula for proactively mitigating them.
A DDoS attack occurs when a “botnet” (a group of compromised computers) is used to send an overwhelming amount of “bad traffic” to an intended target, such as a company’s website. Computers become “bots” when they’re infected with a virus or other malware through a compromised website or malicious email. This usually happens completely behind the scenes, with the user having no idea their PC is part of a botnet. The botnet is directed by a command-and-control server that tells all of the bots who, what, when, where, and how to attack. The target of the attack usually spends so much effort handling the bad traffic that legitimate visitors, or customers, are crowded out and unable to reach the site.
Let’s use an e-commerce site as an example. Nearly every e-commerce site has an “Add to Cart” button. If a DDoS attacker could script a thousand bots (some botnets have over a million bots) to simulate clicking that “Add to Cart” button and generate more traffic than the site could handle, legitimate shoppers would have no chance of getting their click in. The key to fighting a complex attack like this is being able to differentiate a real shopper from a bot, so the website can service one and ignore the other.
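One common building block for telling a shopper from a bot is rate limiting: no human clicks “Add to Cart” dozens of times a second, so an IP address doing so can be throttled. The sketch below is illustrative only, assuming a sliding window and threshold values (`MAX_REQUESTS`, `WINDOW_SECONDS`) I've chosen for the example; real mitigation combines many more signals than request rate alone.

```python
import time
from collections import defaultdict, deque

# Illustrative thresholds, not from the article: a human shopper is
# unlikely to exceed 5 clicks within a single second.
MAX_REQUESTS = 5
WINDOW_SECONDS = 1.0

_history = defaultdict(deque)  # ip -> timestamps of recent requests


def allow_request(ip, now=None):
    """Return True if this request looks like normal traffic, False if
    the IP is clicking faster than a human shopper plausibly could."""
    now = time.monotonic() if now is None else now
    window = _history[ip]
    # Drop timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False  # likely a bot hammering "Add to Cart"
    window.append(now)
    return True
```

With these numbers, a sixth request from the same IP inside one second is rejected, while requests that arrive after the window has aged out are served normally.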
With ubiquitous internet access and the prevalence of social media today, unsuspecting computer users are making it easier than ever for malicious actors to target them with malware. This trend has helped provide the perfect environment for DDoS attacks to grow both in size and complexity. In fact, attacks of 100 gigabits per second (Gbps) have been recorded. To put that into context, the largest recorded DDoS attack in 2002 was 2 Gbps. Considering that most websites have less than 1 Gbps of network bandwidth, even small attacks today can quickly prove devastating.
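A back-of-envelope calculation shows how easily a botnet reaches those numbers. The per-bot packet size and request rate below are assumptions chosen for illustration, not figures from the article; only the million-bot scale and the 1 Gbps site link come from the text above.

```python
# Illustrative sketch: traffic from a large botnet vs. a typical site's link.
BOTS = 1_000_000        # article: some botnets have over a million bots
PACKET_BYTES = 512      # assumed size of each request (illustrative)
REQUESTS_PER_SEC = 25   # assumed per-bot request rate (illustrative)

attack_bps = BOTS * PACKET_BYTES * 8 * REQUESTS_PER_SEC  # bits per second
attack_gbps = attack_bps / 1e9

SITE_LINK_GBPS = 1.0    # "most websites have less than 1 Gbps"
print(f"Attack traffic: {attack_gbps:.1f} Gbps "
      f"({attack_gbps / SITE_LINK_GBPS:.0f}x a 1 Gbps link)")
```

Even with each bot sending a modest 25 small packets per second, the aggregate tops 100 Gbps, two orders of magnitude more than the target's link can carry.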
In addition, it’s not just web infrastructure that attackers are targeting, but increasingly the Domain Name System (DNS) infrastructure as well. Arbor Networks’ 2012 Worldwide Infrastructure Security Report indicated that 41 percent of respondents experienced DDoS attacks against their DNS infrastructure. The DNS acts like the global phone book for the internet; it’s critical to the functioning of the web. Computers don’t understand domain names, only numeric IP addresses, so when someone wants to visit “www.example.com”, their computer asks, or queries, the DNS system, “What is the IP address of www.example.com?” The DNS replies with the IP address, and the web surfer is on their way. If the DNS system is unavailable, users are unable to find their desired sites, and those sites can experience significant damage to online revenue streams, reputation and brand.
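That name-to-address lookup is something any program can perform. A minimal sketch using only Python’s standard library, where the operating system’s resolver carries out the DNS query described above:

```python
import socket


def resolve(hostname):
    """Ask the DNS system (via the OS resolver) for the IPv4 address
    behind a domain name, e.g. 'www.example.com' -> '93.184.216.34'."""
    return socket.gethostbyname(hostname)
```

If the DNS infrastructure behind a name is down, a call like this fails with `socket.gaierror`, and from the user’s perspective the site simply doesn’t exist.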
We’ve seen this with several recent attacks against financial institutions and others that used new malicious code to attack the DNS sub-system of the victim organizations. This type of attack brought the targets down in two ways: bandwidth exhaustion and overwhelmed processing capacity.
Bandwidth exhaustion, the result of a botnet continuously querying the victim’s DNS, saturates the network pipes of the target’s DNS server so that no legitimate requests can make it through, leaving legitimate users with an error message. The attack is often compounded by floods of very large DNS packets, each of which the target DNS server must inspect and answer, overwhelming its transactional capabilities. So even if the attack doesn’t saturate the bandwidth, it still saturates the computing resources of the target. Pretty sneaky.
DDoS attacks, while previously a nuisance, are now a fact of life on the web. They aren’t going anywhere, so enterprises need to have a “battle plan” for combating this ever-evolving threat. In 2013, we expect to see more enterprises trying to block harmful traffic before it reaches the network or application to eliminate the many risks associated with cyber-attacks, like data breaches and network downtime. As the traditional solutions on which many enterprises have relied for this – like over-provisioning bandwidth and firewalls – have proved costly and ineffective, companies will turn to cloud-based DDoS protection and managed DNS services to enable rapid deployment, provide transactional capacity to handle proactive mitigation, and eliminate the need for significant investments in equipment, infrastructure and subject matter expertise. Taking the cloud approach will help businesses trim operational costs while hardening their defenses to thwart even the largest and most complex DDoS attacks.
These cloud-based solutions will be critical in the future (they arguably already are). As companies and individuals become more reliant on the internet for critical processes and everyday tasks, downtime, no matter the reason, will not be an option. Managed DNS and DDoS protection services are key to a frontline defense against cyber-attacks, helping ensure your customers can always reach your site and aren’t driven to the competition.