Over the years firewalls have become an integral part of any network looking to properly protect itself from the threats that lurk outside. Firewalls are successful because they sit at the edge of your network and are the first point of entry for any outside traffic. By standing guard there, they are able to inspect incoming traffic for anything suspicious. Since firewalls are usually based on a rule system, administrators can progressively add new rules over time to protect their networks from the latest threats. This ability to constantly update the system to handle the ever-growing landscape of new vulnerabilities is what makes firewalls irreplaceable in any enterprise's stack. Firewalls are commonly categorized into network firewalls and host-based firewalls. Network firewalls sit at the edge of a network and monitor all of the traffic entering and leaving that network. Host-based firewalls sit on individual machines and monitor only that particular machine's incoming and outgoing traffic. Another distinction is between dedicated hardware-based firewall systems and virtualized software-based firewall systems. Our goal isn't to declare one better than the other, but rather to provide enough information about each that you can decide which is better for your specific objective. Information security has been ever growing over the years, and the technology evolves with it. When researching security, even outside the realm of computers and information, we find that there is usually no one-size-fits-all solution, and that proper, well-executed security comes down to being able to identify the specific threats you face in your current environment. Firewalls not only protect us against these threats, but also allow us to gather the intelligence needed to better identify which threats we actually face.
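The rule system described above can be sketched in a few lines. This is a minimal illustration, not any real firewall's API: each rule matches on fields of a packet, and the first matching rule decides the packet's fate, with a default-deny fallback (the rule fields and sample packets here are made up for demonstration).

```python
from dataclasses import dataclass


@dataclass
class Rule:
    action: str          # "allow" or "drop"
    protocol: str = "*"  # "tcp", "udp", or "*" for any protocol
    dst_port: int = 0    # 0 matches any destination port

    def matches(self, packet: dict) -> bool:
        proto_ok = self.protocol in ("*", packet["protocol"])
        port_ok = self.dst_port in (0, packet["dst_port"])
        return proto_ok and port_ok


def filter_packet(rules: list, packet: dict) -> str:
    """Return the action of the first matching rule; default-deny otherwise."""
    for rule in rules:
        if rule.matches(packet):
            return rule.action
    return "drop"  # anything not explicitly allowed is dropped


# An administrator "adds rules over time" simply by extending this list.
rules = [
    Rule("allow", "tcp", 443),  # permit HTTPS
    Rule("allow", "tcp", 22),   # permit SSH
]

print(filter_packet(rules, {"protocol": "tcp", "dst_port": 443}))  # allow
print(filter_packet(rules, {"protocol": "udp", "dst_port": 53}))   # drop
```

The key property to notice is the default-deny last line: traffic the administrator never anticipated is rejected rather than silently admitted.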

Firewalls began to appear in the late 80s as simple packet filtering applications. These firewall systems would inspect all incoming and outgoing network packets and apply predetermined rules to filter out and drop suspicious packets. Before this, system administrators relied on routers to protect their networks, but as networks grew and became more complex a dedicated solution was needed, and thus firewalls were born. Firewall technologies continued to improve, with the next generation featuring stateful filtering. This kept all the features of the previous generation but added deeper packet inspection: stateful firewalls retain connection information and can identify whether a packet belongs to the start, middle, or end of a connection. This development once again allowed system administrators to gather more intelligence about what was coming in and out of their networks, giving them a better chance at combating the threats they face. The technology was improved again around the mid 90s to include what is called application layer filtering, which improved how firewall applications handle connections from various types of applications and once more gave system administrators additional information with which to protect their networks from outside attacks. Throughout the generations of firewall applications, the technology has improved more and more. One key thing to notice is that with each generation we see firewall applications perform deeper packet analysis and provide more information than before about the activity inside the network. If you look at any organization trying to secure either a network or a particular person, you will find that the side with the most information wins.
This makes sense: one cannot protect oneself from threats one knows little about. This is why there are whole agencies in the public and private sectors whose goal is simply to gather information. Sun Tzu is famously quoted as saying, "If you know the enemy and know yourself, you need not fear the result of a hundred battles." We can relate this to network security in the sense that if we know our own network and we know the threats we face, then we need not worry. Firewalls not only act as a mechanism to protect us, but also yield the crucial information we need to know our enemy and how they operate.
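The stateful filtering idea described earlier can be sketched as a small connection tracker. This is an illustrative simplification (real stateful firewalls track the full TCP handshake, sequence numbers, and timeouts; the addresses below are made up): outbound connections are recorded, and inbound packets are only accepted if they reply to a tracked connection.

```python
class StatefulFirewall:
    """Toy connection tracker: judges packets by connection state, not in isolation."""

    def __init__(self):
        self.connections = set()  # (src, dst) pairs we consider established

    def outbound(self, src: str, dst: str) -> None:
        # An outbound connection attempt opens a new tracked entry.
        self.connections.add((src, dst))

    def inbound(self, src: str, dst: str) -> str:
        # Inbound traffic is accepted only as a reply to a tracked connection;
        # unsolicited inbound packets are dropped.
        if (dst, src) in self.connections:
            return "accept (established)"
        return "drop (unsolicited)"

    def close(self, src: str, dst: str) -> None:
        # End of connection: stop accepting replies for it.
        self.connections.discard((src, dst))


fw = StatefulFirewall()
fw.outbound("10.0.0.5", "93.184.216.34")            # internal host reaches out
print(fw.inbound("93.184.216.34", "10.0.0.5"))      # accept (established)
print(fw.inbound("203.0.113.9", "10.0.0.5"))        # drop (unsolicited)
```

This is exactly the extra intelligence the text describes: a stateless filter sees two indistinguishable inbound packets, while the stateful one knows which connection each belongs to.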

Throughout the years system administrators have been choosing how best to apply firewalls to their networks. Computer operating systems have also incorporated firewall-like software natively to help protect against threats that might slip past the network firewall. But as we have seen over the last couple of years, host-based antivirus and firewall systems can often be bypassed by the advanced malware techniques that are becoming more common. In the beginning, hardware-based firewall systems ran on proprietary hardware, requiring administrators to maintain separate hardware for firewall applications instead of building them into the existing infrastructure. This isn't necessarily a bad thing; some might argue that the separation creates more security. It did, however, limit the ways in which system administrators were able to protect their networks. As firewalls evolved, newer solutions moved away from proprietary hardware, and with the growth of cloud computing we have seen some enterprises move their systems to these platforms, creating new types of firewalls that live in the cloud.

When we talk about hardware-based firewall applications, we are speaking of a computer whose main purpose is to exist as a firewall. It usually doesn't run a normal operating system, but a proprietary one built to best handle the task at hand. Not only is the software very specific, but the hardware is usually chosen to best accompany the software; in other words, you couldn't take one of these machines and install Windows on it. These hardware-based systems come in different forms: some are contained in one system, such as a Unified Threat Management appliance, while others are compartmentalized into different hardware devices performing specific functions. For example, one piece of hardware might watch over your incoming and outgoing emails and inspect those packets, while another might inspect packets coming from a certain application and work as a content filtering application.

With the growth of the Internet we have also seen the evolution of the cloud. Traditionally, companies owned all of their network equipment and either kept it in a server room on site or rented space in a data center to house it; in either case, the company owned the hardware. In a cloud environment, instead of owning the server you are renting a server that only exists in software. Companies like Amazon and Google have built systems where single computers can create what are called virtualized servers within themselves. So instead of one piece of hardware and one server, you now have one piece of hardware and any number of servers. These virtual servers are isolated from one another and all act as independent computers or servers. This is beneficial in a couple of ways, but the one I would highlight specifically is that it is potentially cost saving, especially for smaller companies that don't have the money to purchase their own equipment. It also allows for much more flexibility, since cloud providers usually offer many different hardware solutions and configurations. This gives system administrators and software engineers a lot of room to choose the systems and applications that will best suit their objectives. Another key point to mention about virtualization is the idea of isolation. As mentioned previously, one piece of hardware runs multiple virtualized servers that each think they are their own boss. This allows you to create individual environments that are all isolated from each other. It should be noted that this isolation, sometimes called a sandbox, can be and has been exploited. Thankfully we aren't at a point where that is commonplace, and with proper configuration and diligent monitoring, virtualized servers provide secure environments to host and run all sorts of applications.

This creates an interesting shift from how things operated in the past: companies can now choose virtualized, software-based security solutions. A company might move from an on-site solution to a cloud-based hosting solution. Once the company's network has been segmented into a virtualized environment, with the different servers all connected via a virtualized network, it makes sense to have a firewall application that also lives in that virtualized environment to watch over things. But this seemingly nifty technology doesn't come without potential downsides. Because virtual networks are similar to their physical counterparts, they tend to be susceptible to similar attacks. For example, if someone gets into one of Joe's company servers in the virtual network, he is also able to attack the other machines in that network. That virtualized environment might also be directly routed to a physical network, creating other holes that could be exploited by a potential attacker. You also have to take into consideration that some servers in that virtualized environment might only talk amongst themselves; you could label this virtual machine (VM) to virtual machine traffic. If your firewall sits at the edge of the network, it might not pick up this type of traffic at all, which in turn can leave the system administrator without the information he needs to identify a threat lurking within his own network. And finally, even though our virtualized servers are sandboxed, they are all still running on the same piece of hardware, which brings potential issues: if one VM uses a lot of hardware resources, it can cause the other VMs on that hardware to lag behind. While virtualization software is designed to handle resource management adequately, some issues still arise; as we all know, software is never perfect, and unforeseen problems do occur.
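The VM-to-VM blind spot described above can be made concrete with a small sketch (the subnet and addresses are hypothetical): an edge firewall only observes flows that cross the network boundary, so purely internal east-west traffic between VMs never reaches it.

```python
import ipaddress

# Hypothetical internal virtual network behind the edge firewall.
INTERNAL = ipaddress.ip_network("10.0.0.0/24")


def seen_by_edge_firewall(src: str, dst: str) -> bool:
    """A flow reaches the edge firewall only if exactly one endpoint is internal."""
    src_in = ipaddress.ip_address(src) in INTERNAL
    dst_in = ipaddress.ip_address(dst) in INTERNAL
    return src_in != dst_in  # the flow crosses the network boundary


# A VM talking to the outside world: the edge firewall inspects it.
print(seen_by_edge_firewall("10.0.0.7", "198.51.100.20"))  # True

# Two VMs talking to each other: invisible at the edge, so a compromised VM
# can probe its neighbors without the edge firewall ever logging a packet.
print(seen_by_edge_firewall("10.0.0.7", "10.0.0.9"))       # False
```

This is why the text suggests a firewall living inside the virtualized environment: only something positioned on (or between) the VMs themselves can observe the second kind of flow.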
Being aware of all the weaknesses in the network will only help us better protect ourselves from all sorts of malicious actors.

The state of security has been ever changing over the years, and I'd argue that with the recent rise in computer attacks happening around the world it has been changing even more rapidly in the last couple of years. Hacking has always existed, but hasn't always been in the spotlight like it is today, and that extra attention creates a rise in technologies produced to fight this ever-growing problem. Firewalls and intrusion detection systems have become our first line of defense against outside attackers, and with the evolving landscape of attacks we have seen firewalls evolve alongside them to handle the wide array of attack vectors we see today. While virtual environment solutions continue to grow, hardware-based firewall solutions aren't going anywhere. Some vendors do push software solutions, but I would say the majority are pushing hardware-based solutions. This isn't exactly surprising: while software solutions might give consumers the flexibility to choose their own hardware and environments, that doesn't necessarily translate into a more secure product. Companies in both the public and private sectors are holding more and more pieces of critical data, and with attacks becoming more advanced, and sometimes even state sponsored, firewall vendors feel most confident in their products when they have full control over the hardware inside the box as well as the software running on it. This gives vendors more control over their solutions and, hopefully, reduces the attack surface. A similar example is the iPhone: Apple controls not just the software but also the hardware of its phones, and that control allows it to reduce the attack surface of its devices, creating a more secure product. As recent headlines noted, the FBI reportedly paid around $900,000 to get into the iPhone that belonged to the San Bernardino terrorist.
I would wager it doesn't cost nearly that much to get into an Android phone. The coming years provide a unique and interesting opportunity for firewall vendors, with new and old technologies all battling to provide the most secure solutions. And while virtualized firewalls give us an idea of what the future might look like, they aren't yet sophisticated enough to take a large market share from hardware-based firewall applications. But that doesn't mean they don't have a place in your network stack. A good systems administrator needs to look at all the tools in his toolbox and decide what best meets his requirements, and I'd argue that many would suggest utilizing both solutions in some capacity to best equip yourself with the information and knowledge to fight the latest attacks lurking on the horizon.