in reply to Increasing throughput of random data

Nice tricks, but if compressibility can serve as a measure of entropy, then your output is not very random. (Output here is a single buffer with newlines inserted after every 64 hex digits. Perhaps close enough for simulation. And ~50% compression of hex digits, as opposed to raw bytes, indicates pretty much "white noise", i.e. randomness.) Math::Prime::Util::GMP claims its random_bytes is the fastest of all; maybe it's not too slow for you. And I agree, the rand/pack combo of built-ins is indeed very slow for this case.
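
For reference, a hex digit carries 4 bits of entropy but occupies an 8-bit character, so a dump of truly random bytes shouldn't deflate much below ~50%. If you want that check on its own, here is a minimal sketch (using the one-shot deflate function rather than the stream object used in the benchmark below):

use strict;
use warnings;
use Math::Prime::Util::GMP 'random_bytes';
use IO::Compress::Deflate 'deflate';

# Hex-dump 32 KiB of random bytes, 64 digits per line, and see how
# well it deflates; close to 50% means the digits look like white noise.
my $dump = ( join "\n", unpack '(H64)*', random_bytes(32768) ) . "\n";
deflate( \$dump => \my $packed ) or die 'deflate failed';
printf "compressed to %.1f%% of original size\n",
    100 * length($packed) / length($dump);

And the benchmark comparing the two approaches: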

use strict;
use warnings;
use Benchmark 'cmpthese';
use Math::Prime::Util::GMP 'random_bytes';
use IO::Compress::Deflate;

sub original {
    my $buf = random_bytes(32);
    my $ret = '';
    # Negate bits
    for my $bytes ($buf, ~$buf) {
        # Reverse bits
        for my $bits ($bytes, pack('b*', unpack 'B*', $bytes)) {
            # Reverse nibbles
            for my $hex (unpack("H*", $bits), unpack("h*", $bits)) {
                # Reverse hex string
                for my $str ($hex, scalar reverse $hex) {
                    $ret .= $str . "\n";
                    # Rotate hex string
                    $ret .= substr($str, $_) . substr($str, 0, $_) . "\n"
                        for 1 .. 63;
                }
            }
        }
    }
    $ret;
}

# Same amount of output (1024 lines of 64 hex digits), but every line
# comes from fresh random bytes.
sub random {
    ( join "\n", unpack '(H64)*', random_bytes(32768) ) . "\n"
}

cmpthese -1, {
    original => \&original,
    random   => \&random,
};

print "\n";

# Compress 1000 outputs of each generator and report the ratio.
my $z = IO::Compress::Deflate->new( \my $s1 );
$z->print( original() ) for 1 .. 1000;
$z->close;
my $ratio = 100 * ( length $s1 ) / ( 65 * 1024 * 1000 );
printf "original output compressed to %.1f%%\n", $ratio;

$z = IO::Compress::Deflate->new( \my $s2 );
$z->print( random() ) for 1 .. 1000;
$z->close;
$ratio = 100 * ( length $s2 ) / ( 65 * 1024 * 1000 );
printf "random output compressed to %.1f%%\n", $ratio;

...

              Rate   random original
random      2589/s       --     -65%
original    7440/s     187%       --

original output compressed to 5.2%
random output compressed to 57.7%
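
If the point of the original thread is raw throughput rather than randomness quality, the same unpack trick streams fine in chunks instead of building one big string; a sketch (chunk size, chunk count, and STDOUT as destination are just my assumptions):

use strict;
use warnings;
use Math::Prime::Util::GMP 'random_bytes';

# Emit N chunks of 32 KiB of random bytes as a hex dump, 64 digits
# per line, without holding the whole dump in memory.
my $chunks = shift // 1000;
print join( "\n", unpack '(H64)*', random_bytes(32768) ), "\n"
    for 1 .. $chunks;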