in reply to Re: Serialise to binary?
in thread Serialise to binary?

One example is a huge number of scalars with random contents. Here compressed Storable has about 33% overhead, whereas compressed JSON has 70%+ overhead.
    use strict;
    use warnings;
    use Storable;
    use IO::Compress::Gzip qw(gzip);
    use JSON::XS;

    my (@data, $serial, $gzserial, $json, $gzjson);
    for my $i (0 .. 99_999) {
        push @data, chr(int(rand(256)));
    }
    $serial = Storable::nfreeze(\@data);
    $json   = encode_json(\@data);
    gzip \$json   => \$gzjson;
    gzip \$serial => \$gzserial;
    print scalar(@data), "\n";
    print length($serial), "\n";
    print length($gzserial), "\n";
    print length($json), "\n";
    print length($gzjson), "\n";

Re^3: Serialise to binary?
by Your Mother (Archbishop) on Oct 26, 2015 at 14:59 UTC

    Oh, nice! I was about to argue that one-character scalars, where quotation marks make up more than 60% of the data, rigged the test in favor of Storable, but I upped the "word" size and the difference remains at about 30% in favor of Storable. Sidebar: on my box at least, Storable sees *negative* change from zipping; i.e., the zipped Storable is slightly bigger than the raw nstore.
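
    The larger-"word" variant above wasn't shown; a minimal sketch of what it might look like (the word length of 8 is an arbitrary assumption, not what the poster used):

    use strict;
    use warnings;
    use Storable qw(nfreeze);
    use IO::Compress::Gzip qw(gzip);
    use JSON::XS qw(encode_json);

    # Same benchmark as the parent node, but each scalar is a
    # multi-character random "word" instead of a single character.
    my $word_len = 8;    # assumed; pick any length > 1
    my @data;
    for (1 .. 100_000) {
        push @data, join '', map { chr(int(rand(256))) } 1 .. $word_len;
    }

    my $serial = nfreeze(\@data);
    my $json   = encode_json(\@data);

    my ($gzserial, $gzjson);
    gzip \$serial => \$gzserial;
    gzip \$json   => \$gzjson;

    # Random bytes are essentially incompressible, so gzip's header and
    # framing overhead can leave the compressed Storable no smaller than
    # (or slightly larger than) the raw freeze, matching the sidebar note.
    printf "storable: %d  gzipped: %d\n", length $serial, length $gzserial;
    printf "json:     %d  gzipped: %d\n", length $json,   length $gzjson;

    JSON pays for quotes, commas, \u escapes for control characters, and two-byte UTF-8 for the high half of the byte range, so its raw size should still come out well above Storable's.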