in reply to Re^4: Serializing a large object
in thread Serializing a large object

Thanks. How long did that take using your current solution? A load time and size for the compressed nstore data, and the time taken to perform the search and produce the results, would be useful.


Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
RIP an inspiration; A true Folk's Guy

Re^6: Serializing a large object
by daverave (Scribe) on Oct 09, 2010 at 14:58 UTC
    use strict;
    use warnings;
    use 5.012;
    use FastBioRanges;
    # use PerlIO::gzip; # I just noticed I forgot this 'use' but it worked fine... how?
    use Storable qw(retrieve_fd);
    use Time::HiRes qw(gettimeofday tv_interval);

    my $time = [gettimeofday];
    open( my $fastbioranges_fh, "<:gzip", 'fastbioranges.store.gz' ) or die;
    my $fastbioranges = retrieve_fd($fastbioranges_fh) or die;
    close($fastbioranges_fh);
    say "loaded in ", tv_interval($time), " seconds";

    my $n = 1000;
    $time = [gettimeofday];
    # say "start\tend\tcover";
    for ( 1 .. $n ) {
        my $start = int rand(877878);
        my $size  = int rand(7000);
        my $end   = ( $start + $size ) % 877879 + 1;
        my $cover = $fastbioranges->num_ranges_containing( $start, $end );
        # say "$start\t$end\t$cover";
    }
    say "$n queries in ", tv_interval($time), " seconds";

    loaded in 0.385292 seconds
    1000 queries in 0.005204 seconds
    Recall that my objects are usually 5-10 times larger and the number of queries is in the millions. The querying loop is not optimized (all the 'my $...' declarations, the 'rand' calls, etc.), but it's still very fast.
      # use PerlIO::gzip; # I just noticed I forgot this 'use' but it worked fine... how?

      Because naming the ':gzip' layer in open() causes perl to look for and load the appropriate PerlIO module (here, PerlIO::gzip) on demand, so the explicit 'use' line is optional.
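      A minimal sketch of that autoload mechanism. Since PerlIO::gzip is not a core module, this example demonstrates the same behaviour with the core ':encoding' layer instead; the filename is arbitrary. Note there is no explicit 'use' for the layer module anywhere:

          use strict;
          use warnings;

          # No 'use PerlIO::encoding' here; naming the layer in open()
          # is enough to make perl find and load the module itself.
          my $file = 'layer_demo.txt';
          open( my $out, '>:encoding(UTF-8)', $file ) or die $!;
          print {$out} "caf\x{e9}\n";    # U+00E9 gets encoded on write
          close $out;

          open( my $in, '<:encoding(UTF-8)', $file ) or die $!;
          my $line = <$in>;              # decoded back to a character string
          close $in;
          unlink $file;

          # The layer module was pulled in implicitly:
          print 'loaded: ', ( $INC{'PerlIO/encoding.pm'} ? 'yes' : 'no' ), "\n";

      Running this prints "loaded: yes" even though the script never mentions PerlIO::encoding by name, which is exactly why the forgotten 'use PerlIO::gzip' line above went unnoticed.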