Why Firewalls Don’t Weaken a Website Against Attacks

A couple of days ago SecurityNewsDaily ran an article titled “Firewalls May Weaken Websites Against Attacks” (http://bit.ly/f54HWT ) written by that site’s managing editor Paul Wagenseil.  Let me start by saying that SecurityNewsDaily is a good site and I look at it almost every work day.  The article discusses Arbor Networks’ recently released 2010 Infrastructure Security Report, and more precisely some guidance in the Arbor report that “placing firewalls – software or physical devices that filter incoming and outgoing Internet traffic – in ‘front’ of Internet-facing servers just creates bottlenecks that make DDoS attacks more efficient.”  The article goes on to quote someone I’ve known for a long time, Arbor Networks’ Roland Dobbins, as stating that firewalls are placed in front of servers because “folks” are “programmed” to do that.  Needless to say, I don’t agree with this guidance.

I do agree with Roland that network and network security staff are often trained to position firewalls between servers and the Internet.  I think there is good reason for that.  Firewalls are network devices designed to examine network traffic – the protocols and connections being created, used, and torn down between network devices – and to apply policies to that traffic.  Those policies might be to allow, to deny, to route or rate limit, or to log particular identified connections and/or types of traffic.  Firewalls are specialized devices inserted into those designs to serve a specialized function.  Can we do this same type of filtering on a web or application server?  Probably, depending on the server operating system.  Should servers have policies to filter traffic?  Definitely.
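As a rough illustration of that kind of server-side policy, here is a minimal sketch of host-level filtering using Linux iptables.  The specific ports, rate limits, and log prefix are my own illustrative choices, not a production rule set or anything from the Arbor report:

```shell
# Illustrative host-level filtering sketch for a Linux web server.
# Rule values (ports, limits, prefix) are assumptions for the example.

# Allow replies on connections the host already has open.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Allow new HTTP connections, but rate-limit them per the limit module
# so a single flood source cannot consume the whole connection table.
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m limit --limit 50/second --limit-burst 100 -j ACCEPT

# Log, then drop, everything else -- the log feeds the monitoring
# that lets staff spot an attack quickly.
iptables -A INPUT -j LOG --log-prefix "DROPPED: "
iptables -A INPUT -j DROP
```

Whether rules like these live on the server or on a dedicated firewall in front of it, the point is the same: the policy exists and the dropped traffic is visible to whoever is watching.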

Roland goes on to make the case against putting a firewall in front of a server because the firewall itself becomes the target of a DDoS.  Can this happen?  Yes.  Definitely.  But again, using best practices for managing and monitoring firewalls, a typical admin will recognize that DDoS quickly.  Recognizing a DDoS quickly is important because the sooner the conditions are recognized, the sooner network staff can work with their upstream providers to try and mitigate the attack.
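The “recognize it quickly” part usually comes down to watching connection rates per source.  Here is a minimal Python sketch of that idea; the function name, the event format, and the thresholds are all assumptions for illustration, since real monitoring would be fed by firewall logs or flow data:

```python
from collections import Counter

def flag_ddos_sources(events, window_seconds, threshold):
    """Flag source IPs whose new-connection rate exceeds a threshold.

    events: list of (timestamp, source_ip) tuples covering one window.
    threshold: allowed new connections per second per source.
    Illustrative sketch only -- real tooling would read firewall
    logs or NetFlow records rather than an in-memory list.
    """
    counts = Counter(ip for _, ip in events)
    allowed = threshold * window_seconds
    return sorted(ip for ip, n in counts.items() if n > allowed)

# Example: one source (10.0.0.9) opens far more connections in the
# window than the handful of normal clients.
events = [(t, "10.0.0.9") for t in range(600)] + \
         [(t, "192.0.2.%d" % (t % 5)) for t in range(50)]
print(flag_ddos_sources(events, window_seconds=60, threshold=5))  # → ['10.0.0.9']
```

A report like this is exactly what an admin hands to the upstream provider: the sources and rates that need to be filtered closer to the attack.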

One of the best documents I’ve read about time-proven DDoS mitigation is “Closing the Floodgates: DDoS Mitigation Techniques” by Matthew Tanase, published back in 2002-2003.  A more recent (2009) paper, “A Survey of Botnet Technology and Defenses” (http://www.merit.edu/networkresearch/papers/pdf/2009/catch09_botnets_final.pdf), covers newer, cooperative techniques for combating the botnets that generate DDoS traffic.

As DDoS attacks grow more potent, it’s not reasonable to think that a server or a firewall alone can stand up to them.  Going forward there is much to learn and do to prepare for the next generation of DDoS attacks.  We’ll need all of these tools in order to combat these threats.