A few days ago, an article was published (and referenced in another thread) which made Perl look bad. (The problem had to do with processing CGI parameters, and list versus scalar context.) But, actually, this is a very common problem with many ... dare I say, most? ... web sites: when subjected to a barrage of nonsensical (or contrived) inputs, they fail disastrously and repeatedly.
Many web sites (and a growing number of mobile apps, with respect to their host interfaces) are simply untested against any path that deviates, however slightly, from what the site’s designers expected. Maybe they ran through all the buttons and screens. (Or maybe not.) But what they never did was seriously beat the thing up. Unfortunately, that’s exactly what rogue attackers will do. (Yes, one of the first things they will do is ... test your code.)
HTTP, let us not forget, is a completely stateless protocol. The server receives a request consisting of a URI (including any GET parameters), a set of cookies, and possibly POST parameters. It produces an HTML (or AJAX) response, and promptly forgets. Everything. The thing that many developers completely overlook is that these inputs can be anything. You may expect that they consist only of what the corresponding <form> would generate, but you must never assume it. The inputs to your web page ... can ... be ... anything. Therefore, attackers (or testers) will subject your site to URIs in unpredictable sequences, with unpredictable or deliberately altered parameters. They will look for the things that you assumed. And usually they will find them.
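To make that concrete, here is a minimal sketch (the host and parameter names are hypothetical) of how trivially a client can send a request your <form> would never produce:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Tiny;

    # Nothing obliges a client to use your <form>. Duplicated parameters,
    # extra parameters, absurd values: all of it is legal HTTP.
    my $ua  = HTTP::Tiny->new;
    my $url = 'http://www.example.com/account?user=alice&user=bob&is_admin=1';

    my $res = $ua->get($url);
    printf "%d %s\n", $res->{status}, $res->{reason};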
Consider what happens if the rogue simply duplicates a GET parameter. In many cases, the variable becomes multi-valued ... an array ... since there is now more than one value for it. Or, what if the rogue adds another variable to the string, such as &is_admin=1? (You’d be shocked how often that turns you into a god ...) Misspelling a variable name can provoke an error dump that might include literal SQL text and server IP addresses. And so on.
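Here is a simplified sketch of the list-context trap mentioned above, using CGI.pm (the parameter names are hypothetical):

    use strict;
    use warnings;
    use CGI;

    my $q = CGI->new;

    # DANGEROUS: param() is called in list context here. A rogue request
    # such as ?name=alice&name=is_admin&name=1 makes param('name') return
    # three values, and the extra pair silently overrides the default:
    my %user = (
        is_admin => 0,
        name     => $q->param('name'),   # list context!
    );

    # SAFER: force scalar context so at most one value can come through.
    my %safe_user = (
        is_admin => 0,
        name     => scalar $q->param('name'),
    );

(Recent versions of CGI.pm emit a warning when param() is called in list context, for exactly this reason.)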
One of the very first things we do at Sundial, with any new web site that we are asked to evaluate, is to put it into a test-bed and “hit it with both barrels.” I can count on the fingers of one hand the sites that have survived the onslaught entirely unscathed. And yet, automated test tools are readily available (or can be built in Perl). The builders of the site could have made the thing bulletproof ... they simply didn’t bother.
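As a sketch of what such a test-bed can look like in Perl (the staging URL, the parameter name, and the leak patterns are all hypothetical; a real harness would be far more thorough):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Test::More;
    use HTTP::Tiny;

    my $base = 'http://staging.example.com/search';
    my $ua   = HTTP::Tiny->new( timeout => 10 );

    # A handful of deliberately abusive values for a single parameter.
    my @abusive = (
        '',                            # empty
        'x' x 10_000,                  # oversized
        "1'; DROP TABLE users;--",     # SQL-ish
        '<script>alert(1)</script>',   # HTML-ish
        '../../../etc/passwd',         # path traversal
    );

    for my $value (@abusive) {
        my $url = $base . '?' . $ua->www_form_urlencode( { q => $value } );
        my $res = $ua->get($url);

        # The site should degrade gracefully: no 5xx, and no internal
        # error dump (SQL text, module paths) leaking to the client.
        cmp_ok( $res->{status}, '<', 500,
            'no server error for: ' . substr( $value, 0, 20 ) );
        unlike( $res->{content}, qr{DBD::|at \S+\.pm line \d+},
            'no error dump leaked to the client' );
    }

    done_testing();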
Perl has many unique features, such as “taint mode,” that, if properly used, can greatly strengthen your software. But they are only as good as your ruthless and unforgiving test plans. Don’t send your software out into the world unprepared.
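For instance, a minimal taint-mode sketch (the id parameter is hypothetical):

    #!/usr/bin/perl -T
    use strict;
    use warnings;

    # Under -T ("taint mode"), data that comes from outside the program
    # (CGI parameters, %ENV, file input) is marked tainted, and Perl will
    # refuse to let it reach the shell, open(), unlink(), and so on until
    # it has been untainted through an explicit regex capture.
    my $query = $ENV{QUERY_STRING} // '';    # tainted

    # Untaint by extracting only what we explicitly allow: a numeric id.
    # Anything that doesn't match is rejected outright.
    my ($id) = $query =~ /\bid=(\d{1,10})\b/
        or die "Bad or missing id parameter\n";

    print "Looking up record $id\n";    # $id is now untainted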