Here at the Akamai Edge conference in Washington D.C., we're talking to customers about the latest attack techniques and how we're staying ahead of the threats. One example of what we're watching: a technique in which attackers exploit vulnerabilities in PHP applications to manipulate superglobals (PHP's predefined variables, such as $_SESSION and $_REQUEST) and execute malicious code.
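To make the idea concrete, here's a minimal Python sketch of one common variant: spotting requests that try to smuggle PHP superglobal names in through query parameters. The heuristic and all the names in it are illustrative assumptions on my part, not Akamai's actual detection logic.

```python
import re
from urllib.parse import parse_qsl, urlsplit

# Names of PHP superglobals that rarely appear in legitimate request
# parameters; their presence is a common signal of a tampering attempt.
SUPERGLOBAL_NAMES = (
    "GLOBALS", "_SERVER", "_GET", "_POST", "_FILES",
    "_COOKIE", "_SESSION", "_REQUEST", "_ENV",
)

# Match any superglobal name followed by a word boundary, so
# "_SESSION[user]" matches but "_GETAWAY" does not.
SUPERGLOBAL_RE = re.compile(
    r"(?:" + "|".join(re.escape(n) for n in SUPERGLOBAL_NAMES) + r")\b"
)

def flag_superglobal_injection(url: str) -> list[str]:
    """Return the query-string parameters that mention a PHP superglobal."""
    suspicious = []
    for name, value in parse_qsl(urlsplit(url).query, keep_blank_values=True):
        if SUPERGLOBAL_RE.search(name) or SUPERGLOBAL_RE.search(value):
            suspicious.append(name)
    return suspicious

# A request trying to poison $_SESSION through a vulnerable parameter parser:
hits = flag_superglobal_injection(
    "http://example.com/index.php?_SESSION[user]=admin&page=home"
)
print(hits)  # ['_SESSION[user]']
```

A production rule would also inspect POST bodies, cookies, and headers, but the core signal is the same: user input should never need to name a PHP superglobal.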
A few months ago, Akamai Senior Enterprise Architect David Senecal wrote a post about ways to identify and mitigate unwanted bot traffic. Here at the Akamai Edge conference in Washington D.C., discussions on that topic continue, specifically on how to squeeze the maximum usefulness out of bots and other web crawlers. Yesterday, I continued a conversation I've been having about it with Matt Ringel (@ringel on Twitter), an enterprise architect on Akamai's Professional Services team.
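As a taste of what "identifying" bots involves in practice, here's a Python sketch of one widely documented verification trick: the reverse-then-forward DNS check that major search engines recommend for confirming their own crawlers. The crawler list and function names are illustrative, and this isn't necessarily the method Senecal's post describes.

```python
import socket

# Hostname suffixes that legitimate crawlers resolve to; a real deployment
# would maintain this list per crawler from each operator's documentation.
KNOWN_CRAWLER_SUFFIXES = {
    "Googlebot": (".googlebot.com", ".google.com"),
    "bingbot": (".search.msn.com",),
}

def verify_crawler(client_ip: str, claimed_bot: str) -> bool:
    """Verify a self-identified crawler with a reverse-then-forward DNS check.

    1. Reverse-resolve the client IP to a hostname.
    2. Check the hostname against the crawler's published domain suffixes.
    3. Forward-resolve that hostname and confirm it maps back to the IP.
    """
    suffixes = KNOWN_CRAWLER_SUFFIXES.get(claimed_bot)
    if not suffixes:
        return False
    try:
        hostname, _, _ = socket.gethostbyaddr(client_ip)
        if not hostname.endswith(suffixes):
            return False
        return client_ip in socket.gethostbyname_ex(hostname)[2]
    except socket.herror:   # no PTR record for the IP
        return False
    except socket.gaierror:  # forward lookup failed
        return False

# Example: a request whose User-Agent claims to be Googlebot.
print(verify_crawler("66.249.66.1", "Googlebot"))
```

The forward confirmation in step 3 matters: anyone who controls their own reverse DNS can make an IP claim a googlebot.com hostname, but only the crawler's operator can make that hostname resolve back to the IP.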
There was an interesting story in eWeek yesterday about "one of the largest attacks in the history of the Internet" taking place last week. It describes a nine-hour barrage against an unnamed entity that swelled to 100 Gbps of traffic at its peak. But does it really qualify as one of the biggest in Internet history? It's an impressive barrage, to be sure. Reading the article reminded me of a post Akamai CSO Andy Ellis wrote back in March about a 300 Gbps attack against Spamhaus.
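For a sense of scale, here's a quick back-of-envelope calculation in Python, assuming (generously) that the 100 Gbps peak was sustained for the entire nine hours, which it almost certainly wasn't. It puts an upper bound on the data the attack could have moved:

```python
# Rough upper bound on the attack's data volume: nine hours at the
# 100 Gbps peak rate. The true total is lower, since 100 Gbps was
# the peak, not the sustained rate.
peak_bps = 100e9           # 100 gigabits per second
duration_s = 9 * 3600      # nine hours in seconds
total_bits = peak_bps * duration_s
print(f"{total_bits / 8 / 1e12:.0f} TB")  # ~405 TB if peak were sustained
```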