in reply to Untainted done right!

From the perlsec manpage (read it!):

Laundering data using regular expression [sic] is the only mechanism for untainting dirty data

(it goes on to qualify this statement, but the mechanism that provides the exception to this rule goes deeper than we need to)

And tr/// doesn't really count as a 'regular expression' here. You have to use Perl's facility for capturing parts of pattern matches: capture only those bits of the data that you want, using parentheses within the regular expression.

PERL DOES NOT KNOW WHICH DATA IS SAFE, it only knows when to no longer mark the data as tainted. So the programmer has to know what sort of input is safe, and what sort of input is not. What is 'dangerous'? Well, shell metacharacters, usually. So this might be a start (adapted from Programming Perl, 2nd ed. p. 358):

sub untaint {
    my $data = shift;
    if ($data =~ /^([-\@\w.]+)$/) {
        $data = $1;
        return $data;
    }
    else {
        die "somebody tried something nasty, I think: '$data' contains questionable characters.\n";
    }
}

The regex tests the string to see whether it contains anything other than @, -, a dot, or a word character. If it doesn't, the sub untaints the data (by assigning the text captured within the parens back to the variable); if it does, the sub dies with a warning. Depending on your needs you could do various things instead, of course; you'll need to be handy with regular expressions, though!
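To see the sub in action, here's a small self-contained sketch; the input strings are made up for illustration:

```perl
use strict;
use warnings;

# the untaint() sub from above
sub untaint {
    my $data = shift;
    if ($data =~ /^([-\@\w.]+)$/) {
        return $1;    # captured text is no longer tainted
    }
    else {
        die "somebody tried something nasty, I think: '$data' contains questionable characters.\n";
    }
}

# a string made only of word chars, '@', '-' and '.' passes through
print untaint('jdoe@example.com'), "\n";

# shell metacharacters make it die instead
eval { untaint('foo; rm -rf /') };
print "rejected: $@" if $@;
```

Run a real CGI script like this under perl -T so Perl actually enforces the tainting in the first place.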


Philosophy can be made out of anything. Or less -- Jerry A. Fodor

Re: Re: Untainted done right!
by Corion (Patriarch) on Feb 12, 2001 at 23:47 UTC

    There's one small caveat with your code. If you are using CGI::Carp _and_ malicious data fails the check, that data gets printed back to the browser, and it may have been constructed precisely so that it would be sent back to the browser. For example, the Slashdot cookie stealer worked that way: it gave you a URL containing something like <SCRIPT>... evil JavaScript ..., which Slashdot then printed back to the browser, where it ran.

    My strategy would be to log taint-failed data to a file instead, as you never know exactly how that data got to your machine. Of course, using HTML::Entities could also prevent such misuse, as the text would then come back literally instead of being interpreted ...
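    For illustration, here's a minimal hand-rolled version of what HTML::Entities::encode_entities does for the four basic metacharacters (the real module covers many more characters, so prefer it in real code):

```perl
use strict;
use warnings;

# escape the basic HTML metacharacters so the browser shows them
# literally instead of interpreting them (a toy stand-in for
# HTML::Entities::encode_entities)
sub escape_html {
    my $s   = shift;
    my %ent = ('&' => '&amp;', '<' => '&lt;', '>' => '&gt;', '"' => '&quot;');
    $s =~ s/([&<>"])/$ent{$1}/g;
    return $s;
}

my $evil = '<SCRIPT>alert("gotcha")</SCRIPT>';
print escape_html($evil), "\n";
# prints: &lt;SCRIPT&gt;alert(&quot;gotcha&quot;)&lt;/SCRIPT&gt;
```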

      Good point. I just want to mention that one shouldn't echo such messages back to the browser in a production environment; i.e., turn this off when you go live:

      use CGI::Carp qw(fatalsToBrowser);

      the likes of which are required for this to be a problem.
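      One way to arrange that is a conditional import. This is only a sketch: MYAPP_ENV and want_browser_errors are made-up names you'd replace with your own, and note that CGI::Carp is no longer in the perl core as of 5.22, hence the eval{} guard:

```perl
use strict;
use warnings;

# decide whether fatals may be echoed to the browser; never in production
sub want_browser_errors {
    my $env = shift // '';
    return $env ne 'production';
}

BEGIN {
    if ( want_browser_errors( $ENV{MYAPP_ENV} ) ) {
        # eval{} so a missing CGI::Carp doesn't itself kill the script
        eval { require CGI::Carp; CGI::Carp->import('fatalsToBrowser'); };
    }
}
```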

      Philosophy can be made out of anything. Or less -- Jerry A. Fodor