I need to randomize large integers (> 32K). I looked at some modules, but I just need to call $foo = rand $bar repeatedly, without setting up objects and parameters. So I came up with the following scheme to extend the range of rand to around 1G. Basically, it breaks $bar into blocks of nearly equal size (within +1/-0) and calls rand twice: once to select a block, and again to select a number within that block, then totals everything up. It appears to DWIM and seems a reasonable approximation of rand (the distribution of numbers generated compares closely to rand's), but I don't have the expertise to test it rigorously.
It's to be used for simulation, and doesn't need to be perfect. Does this seem like a reasonable approach? Am I overlooking any potential problems?
    sub bigrand {
        my $in    = shift;
        my $limit = 32768;

        # Small ranges: plain rand is fine.
        return int rand $in if $in <= $limit;

        # Split the range into int($in/$blockSize) blocks; the first
        # $leftOver of them hold one extra number each, so the block
        # sizes total exactly $in.
        my $blockSize   = int($in ** .5);
        my $blockSelect = int rand int($in / $blockSize);
        my $leftOver    = $in % $blockSize;

        if ($blockSelect < $leftOver) {
            # Landed in one of the larger blocks (size $blockSize + 1).
            return ($blockSelect * ($blockSize + 1))
                 + int rand($blockSize + 1);
        }
        else {
            # Landed in one of the regular blocks (size $blockSize).
            return ($leftOver * ($blockSize + 1))
                 + (($blockSelect - $leftOver) * $blockSize)
                 + int rand($blockSize);
        }
    }
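For what it's worth, here is the kind of rough sanity check I'd run before trusting it in a simulation: draw a large number of values from a range well above 32K and bucket them, then eyeball whether the bucket counts are close to equal. This is only a sketch, not a rigorous test; the range, draw count, and bucket count are arbitrary choices, and bigrand is repeated so the snippet runs standalone.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# bigrand() as posted above, repeated here so this check is self-contained.
sub bigrand {
    my $in    = shift;
    my $limit = 32768;
    return int rand $in if $in <= $limit;
    my $blockSize   = int($in ** .5);
    my $blockSelect = int rand int($in / $blockSize);
    my $leftOver    = $in % $blockSize;
    if ($blockSelect < $leftOver) {
        return $blockSelect * ($blockSize + 1) + int rand($blockSize + 1);
    }
    return $leftOver * ($blockSize + 1)
         + ($blockSelect - $leftOver) * $blockSize
         + int rand($blockSize);
}

my $range   = 1_000_000;   # well past 32K, so the two-call path is exercised
my $draws   = 500_000;
my $buckets = 10;
my @count   = (0) x $buckets;

for (1 .. $draws) {
    my $r = bigrand($range);
    die "out of range: $r\n" if $r < 0 || $r >= $range;
    $count[ int($r * $buckets / $range) ]++;
}

# Each bucket should land near $draws / $buckets (here, ~50,000).
printf "bucket %2d: %d\n", $_, $count[$_] for 0 .. $buckets - 1;
```

It also asserts that every value stays in [0, $range), which catches off-by-one mistakes in the block arithmetic.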
Update: It's been an education. Up to now, rand is one of those functions I've just used and taken for granted. Thanks especially to BrowserUK for the explanation and links to other nodes, and to tachyon-II whose module Math::Random::MT::Perl I ended up using.
In reply to Randomizing Large Integers by hangon