For those of you who, like me, have to support old sites for your clients,
dealing with the vulnerabilities of old code can be quite a hassle,
especially now that the best-documented and best-known exploits can be completely
automated. One of our clients was recently the target of such an
attack. Unfortunately, when the site was originally developed, no real
security was built into the code. A single user account issued all SQL requests,
whether they came from the public side or the admin side. All queries
were built directly in the page, meaning that every page had its own database code, and every
one would have to be touched to really fix it.
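The core problem is worth spelling out. When the SQL lives in the page and is built by gluing user input into the query text, any input can rewrite the query itself. A minimal sketch (in Python with an in-memory SQLite database, purely illustrative; the old site obviously wasn't written this way) shows the vulnerable pattern next to the parameterized fix:

```python
import sqlite3

# In-memory database standing in for the site's backend (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "1 OR 1=1"  # a classic injection payload

# Vulnerable pattern: the query is assembled by string concatenation,
# so the payload becomes part of the SQL itself and matches every row.
rows = conn.execute("SELECT name FROM users WHERE id = " + payload).fetchall()
print(len(rows))  # 2 -- the OR 1=1 clause returns the whole table

# Parameterized pattern: the driver treats the input strictly as data,
# so the literal string "1 OR 1=1" matches no id.
rows = conn.execute("SELECT name FROM users WHERE id = ?", (payload,)).fetchall()
print(len(rows))  # 0
```

That per-query fix is exactly why a real repair meant touching every page, and why we needed stopgaps instead.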
As we were already redeveloping the modern replacement for the site, the client
wanted us to spend as little time as possible on the old one, so a true
security audit was out of the question. An audit, of course, is still the right
way to solve the problem, but right isn't always in the budget. That
leaves us with a couple of tools to hold off the attacks until we could release the
replacement site.
The first is a tool from Microsoft called URLScan.
URLScan has a lot of features, but what we used it for here was limiting the
length of query strings. Since the attack strings were almost always
longer than a regular POST or GET, capping the length was enough to make
most of those attacks fail. Take a look at it; there are lots of
neat tricks URLScan can do.
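URLScan is driven by a URLScan.ini file, and recent versions (3.x) expose request-size caps in a [RequestLimits] section. A minimal fragment along these lines does what we did; the value here is illustrative, not the one we shipped, and you'd want to size it above the longest legitimate query string your site actually sends:

```ini
[RequestLimits]
; Reject any request whose query string exceeds this many bytes.
; The automated injection payloads we saw were far longer than
; anything the site legitimately generates.
MaxQueryString=512
```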
The big gun we used was an ISAPI filter written by
Rodney
Viana. It's designed to scrub GET and POST requests of anything that
looks like an attack. It has been a lifesaver, especially when the
attacks were happening hourly.
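The filter itself is native IIS code, but the scrubbing idea is simple enough to sketch. This is not Viana's implementation or his rule set; it's a hypothetical illustration in Python of the approach: decode the incoming query string, then reject it if it matches patterns typical of the automated injection payloads of the time.

```python
import re
from urllib.parse import unquote_plus

# Patterns common in the automated SQL injection attacks of that era.
# Illustrative only -- not the actual rules the ISAPI filter applies.
SUSPICIOUS = re.compile(
    r"(;\s*declare\s+@|cast\s*\(|exec\s*\(|xp_cmdshell|union\s+select)",
    re.IGNORECASE,
)

def is_suspicious(raw_query: str) -> bool:
    """Return True if a decoded query string looks like an injection attempt."""
    decoded = unquote_plus(raw_query)  # attacks often arrive URL-encoded
    return bool(SUSPICIOUS.search(decoded))

print(is_suspicious("id=42&page=3"))                       # False
print(is_suspicious("id=1;DECLARE%20@s%20varchar(4000)"))  # True
```

A filter like this runs before any page code executes, which is what makes it such an effective stopgap: it protects every page at once without touching a single one of them.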