Recently, I caught wind of an issue which was reported by the client as follows…
Customers are getting error screens stating that their request was blocked.
At first glance, it smelled like an issue at the WAF (web application firewall).
A quick call with our hosting provider later, we confirmed that requests were, indeed, violating the WAF’s “max header size” policy. Let’s take a look at the what and the why.
Digging in, we identified that some code was recently deployed that was maintaining a list of coupons “clipped” for a given user in a cookie. The cookie was later used as a cache key identifier for an area of dynamic content. Turns out this list could get pretty massive…some users had hundreds of coupons “clipped” to their profiles. The sheer size of this cookie, which was presented to the server in the headers of each request, was leading to legitimate customers being blocked by the WAF.
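To get a feel for the numbers, here's a hypothetical sketch of how quickly a "clipped coupons" cookie can blow past a 4 KB header budget. The cookie name, coupon code format, and JSON encoding are illustrative assumptions, not the actual application code.

```php
<?php

// Assume each coupon is tracked by a 12-character code...
$couponCodes = [];
for ($i = 0; $i < 300; $i++) {
    $couponCodes[] = sprintf('COUPON%06d', $i);
}

// ...and the cookie stores them as a JSON-encoded list.
$cookieValue = json_encode($couponCodes);

// The request header also carries the cookie name, plus URL encoding
// overhead for the quotes and commas in the JSON.
$headerLine = 'Cookie: clipped_coupons=' . urlencode($cookieValue);

printf(
    "Coupons: %d, header size: %d bytes\n",
    count($couponCodes),
    strlen($headerLine)
);
```

With 300 "clipped" coupons, this single cookie alone is already past a 4 KB header limit before any other cookies or headers are counted.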
From what I learned, oversized headers have been an attack vector before: for example, a critical buffer overflow remote code execution vulnerability in IIS 7.5. As such, a "max header size" rule is a common feature of many WAFs. In our case, the WAF was configured to block requests with more than 4 KB of headers.
As I dug into the issue further, I learned that max header size is not only a security concern. In fact, most web servers impose their own size limits on HTTP request headers. There are a few threads about this on Stack Overflow. This one lists the size limits for some of the most popular web servers.
I wrote a little script to test this out for myself…
```php
#!/usr/bin/env php
<?php

$url = $argv[1];
$bytes = (int) $argv[2];

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    // Each random byte becomes two hex characters, so halve the count.
    'Junk: ' . bin2hex(openssl_random_pseudo_bytes($bytes / 2)),
]);

$response = curl_exec($ch);
$info = curl_getinfo($ch);

echo 'Response Code: ' . $info['http_code'] . PHP_EOL;

curl_close($ch);
```
It takes two parameters: the first is the URL to send the request to, and the second is the number of bytes to send in a header.
Here are a few examples playing around with it…
```
$ ./send-request http://www.amazon.com 4000
Response Code: 200

$ ./send-request http://www.amazon.com 8000
Response Code: 400

$ ./send-request http://www.google.com 4000
Response Code: 200

$ ./send-request http://www.google.com 8000
Response Code: 200

$ ./send-request http://www.google.com 16000
Response Code: 413
```
We attacked the issue from several angles. First and foremost, we ran the value of this cookie through gzencode before saving (and gzdecode when reading) to drastically decrease its size. Additionally, we identified and eliminated a third-party service that was no longer in use and was further contributing to the bloat of the request headers via its cookies.
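Here's a minimal sketch of that compression approach, assuming the coupon list is stored as JSON. base64_encode keeps the binary gzip output safe for a cookie value; it adds roughly 33% overhead, but for repetitive data like a list of coupon codes the net result is still far smaller. The variable names and coupon format are illustrative, not the production code.

```php
<?php

// A repetitive coupon list, as in the real cookie.
$coupons = [];
for ($i = 0; $i < 300; $i++) {
    $coupons[] = sprintf('COUPON%06d', $i);
}
$raw = json_encode($coupons);

// Before writing the cookie: compress, then make it cookie-safe.
$encoded = base64_encode(gzencode($raw));

// When reading the cookie back: reverse the steps.
$decoded = json_decode(gzdecode(base64_decode($encoded)), true);

printf("raw: %d bytes, encoded: %d bytes\n", strlen($raw), strlen($encoded));
```

Even with the base64 overhead, the compressed value is a fraction of the original size, and the decode path round-trips the list losslessly.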
We didn’t put this in place, but I also became interested in learning what options are available for monitoring the size of the headers in the HTTP requests coming to your application. With Apache, it looks like the best option is the %I directive, which comes with mod_logio. Putting this directive in a LogFormat declaration, you can log the size of the request headers AND body for each request. Then you can use a log-processing solution such as the ELK stack to monitor this data over time.
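For reference, this is the standard "combinedio" format from the mod_logio documentation, where %I is total bytes received (headers plus body) and %O is total bytes sent:

```apache
# Requires mod_logio to be loaded:
LoadModule logio_module modules/mod_logio.so

LogFormat "%h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\" %I %O" combinedio
CustomLog logs/access_log combinedio
```

Note that %I reports the combined size of headers and body, so for requests with no body (typical GETs) it approximates the header size directly.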
While this isn’t a perfect solution, it at least gives some visibility into what’s going on.
The bottom line is: be careful about the limitations imposed by intermediaries, or even your own web server, when it comes to HTTP headers.
I hope that this article helps a few people avoid or diagnose issues with HTTP request header limits. If you have any comments, feel free to drop a note in the comments below. Of course, as always, you can reach me on Twitter as well.
Hi, I'm Max!
If you'd like to get in touch with me the best way is on Twitter.