in reply to Serialise to binary?

My binary data is already compressed, so when I compress JSON or Data::Dumper output, it still ends up substantially bigger than Storable's.

This is highly unlikely. If you serialize without whitespace and then compress, and it is still substantially bigger than Storable…? This is not my forte, but post your code and I'm sure someone can show you where it's gone sideways.
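For what it's worth, a minimal sketch of the "without whitespace" point, assuming JSON::XS and a made-up example structure (its default output is already compact; pretty(1) is what adds the whitespace):

use strict;
use warnings;
use JSON::XS;

my $data = { name => 'example', values => [ 1 .. 5 ] };

# JSON::XS emits compact JSON by default; pretty(1) adds the
# indentation and newlines you want to avoid before compressing.
my $compact = JSON::XS->new->encode($data);
my $pretty  = JSON::XS->new->pretty(1)->encode($data);

printf "compact: %d bytes\n", length $compact;
printf "pretty:  %d bytes\n", length $pretty;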

Re^2: Serialise to binary?
by Anonymous Monk on Oct 26, 2015 at 05:29 UTC
    An example is when there is a huge number of scalars with random contents. Here compressed Storable has 33% overhead, whereas compressed JSON has 70%+ overhead.
    use strict;
    use warnings;
    use Storable;
    use IO::Compress::Gzip qw(gzip);
    use JSON::XS;

    # 100,000 one-character scalars with random byte values (0..255)
    my @data;
    push @data, chr( int rand 256 ) for 1 .. 100_000;

    my $serial = Storable::nfreeze( \@data );
    my $json   = encode_json( \@data );

    my ( $gzserial, $gzjson );
    gzip \$serial => \$gzserial;
    gzip \$json   => \$gzjson;

    print scalar(@data),     "\n";   # element count
    print length($serial),   "\n";   # raw Storable
    print length($gzserial), "\n";   # gzipped Storable
    print length($json),     "\n";   # raw JSON
    print length($gzjson),   "\n";   # gzipped JSON

      Oh, nice! I was about to argue that one-character scalars, with quotation marks making up more than 60% of the JSON, rigged the test in favor of Storable, but I upped the "word" size and the difference remains at about 30% in favor of Storable. Sidebar: on my box at least, Storable sees *negative* change from zipping, i.e. the zipped Storable output is slightly bigger than the raw nfreeze output. That makes sense: random bytes are essentially incompressible, so gzip's framing overhead outweighs any savings.
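      For the curious, a sketch of the bigger-"word" variation, assuming 8-byte random strings in place of the single characters (the exact length is arbitrary):

      use strict;
      use warnings;
      use Storable qw(nfreeze);
      use IO::Compress::Gzip qw(gzip);
      use JSON::XS;

      # Same benchmark as above, but with 8-character random strings,
      # so quoting is a much smaller share of the JSON output.
      my @data;
      for ( 1 .. 100_000 ) {
          push @data, join '', map { chr( int rand 256 ) } 1 .. 8;
      }

      my ( $gzserial, $gzjson );
      my $serial = nfreeze( \@data );
      my $json   = encode_json( \@data );
      gzip \$serial => \$gzserial;
      gzip \$json   => \$gzjson;

      printf "gzipped Storable: %d bytes\n", length $gzserial;
      printf "gzipped JSON:     %d bytes\n", length $gzjson;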