Presence on the Internet means dealing with an ever-shifting landscape. New technologies emerge while others wither away. Protocols rise and fall. Traffic patterns change. Ten years ago, the File Transfer Protocol (FTP) accounted for most Internet traffic; today, the bulk of it is HyperText Transfer Protocol (HTTP). So dominant is the Web, in fact, that most people mistake the World Wide Web for the Internet itself, when in reality it is only a subset of it.
This is because today, almost all of the average user's interaction with the Internet happens through a browser. Even email, for many, is now reached through webmail. The Web 2.0 trend only makes this reliance on HTTP more pronounced.
Times Are Changing
As technologies evolve, so do attack methods. The proliferation of the Web as the medium of choice for communication means hackers concentrate their efforts there. Network firewalls, once built to block dangerous protocols, particularly remote procedure calls (which enable programs to communicate across machines), now find themselves helpless as those same protocols reemerge encapsulated inside HTTP. This opens up new avenues to explore and exploit. HTTP, once used merely for text, is now a generic protocol that can carry anything: not just text and images, but also complex AJAX-based Web 2.0 transactions containing sensitive data such as credit card numbers and other financial and personal information.
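The encapsulation problem described above can be sketched in a few lines. This is a hypothetical illustration, not any particular firewall's logic: the function names, the blocked-port list, and the `/gateway` endpoint are all invented for the example. A classic port-based rule blocks well-known RPC ports, yet the very same call sails through once it is wrapped in an ordinary HTTP POST to port 80.

```python
import json

def port_filter_allows(port, blocked_ports=frozenset({135, 593})):
    """Classic firewall rule: block well-known RPC ports, allow everything else.
    Ports 135/593 are commonly associated with RPC; the set is illustrative."""
    return port not in blocked_ports

def build_tunneled_request(method, params):
    """Wrap an RPC-style call in an HTTP POST body (JSON-RPC-like framing).
    The endpoint and host are hypothetical."""
    body = json.dumps({"method": method, "params": params})
    return (
        "POST /gateway HTTP/1.1\r\n"
        "Host: example.com\r\n"
        "Content-Type: application/json\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n" + body
    )

request = build_tunneled_request("transferFunds", {"amount": 100})

# A direct RPC connection to port 135 is blocked...
print(port_filter_allows(135))  # False
# ...but the identical call, tunneled over HTTP on port 80, is waved through.
print(port_filter_allows(80))   # True
```

The firewall never sees the `transferFunds` call at all; from layers three and four, the tunneled request is indistinguishable from a user loading a web page.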
Websites – The New Battleground
The evolution of the Web medium, and the escalation in attacks, calls for different methods of defense. A traditional network firewall cannot perform the deep inspection of traffic that is now necessary. Hackers have capitalized on this: the number of new HTTP-borne attacks uncovered virtually every day is staggering. From their hiding places, and through distributed networks of zombies and infected proxies, hackers unleash a spate of attacks that leave no Web server untouched in their quest to infiltrate, exploit and extort.
The average website is targeted anywhere from twice to 200 times a day by miscellaneous worms and crawlers attempting a slew of diverse attacks: some against well-known exploits, others against recently discovered, and therefore unpatched, flaws. Because these attacks are automated, their numbers only grow, and the attackers never tire. Meanwhile, network firewalls remain inert. Traditional firewalls effectively cover only the third and fourth of the seven layers of what computer experts call the OSI network model for communication (see figure below): the layers that deal with establishing the network channel. The content of the communication goes uninspected. This is much like knowing a phone call took place without being able to listen in.
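The gap between the two kinds of inspection can be made concrete with a minimal sketch. Everything here is hypothetical: the function names, the crude SQL-injection signature, and the sample requests are invented for illustration and bear no resemblance to a production inspection engine. A layer-3/4 check sees only addresses and ports, so a malicious request to port 80 passes; a layer-7 check reads the content and can flag it.

```python
import re
from urllib.parse import unquote

def layer34_allows(src_ip, dst_port):
    """Layers 3-4: only addresses and ports are visible.
    Any source reaching the web ports looks legitimate."""
    return dst_port in {80, 443}

def layer7_flags(http_request):
    """Layer 7: decode and inspect the actual content.
    A deliberately crude signature for the classic ' OR 1=1 injection."""
    decoded = unquote(http_request)
    return bool(re.search(r"'\s*or\s*1\s*=\s*1", decoded, re.IGNORECASE))

attack = "GET /login?user=admin'%20OR%201=1 HTTP/1.1"

# The network firewall waves the request through: it is just traffic to port 80.
print(layer34_allows("203.0.113.9", 80))  # True
# Content inspection, however, spots the injection attempt inside the URL.
print(layer7_flags(attack))               # True
```

Real Web application firewalls use far richer rule sets and parsing than a single regex, but the division of labor is the same: the port filter establishes the channel, while only content inspection can "listen in" on the call.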
Thanks to the Applicure team for a wonderful piece!