in reply to Re: Re: Re: Re: Re: Clustered Perl Applications?
in thread Clustered Perl Applications?

I'm passing around a lot of data because I have several stages of processing, and I'm pushing a few terabytes through small machines.

I have to send chunks of structured data, roughly 10 KB to 1 MB per chunk.

I'm beginning to really like the simple, lightweight idea of REST, even if I have to use POST.
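
A minimal sketch of what such a POST could look like, using LWP::UserAgent and Storable; the endpoint URL and the payload layout are made up for illustration, not part of this thread.

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use Storable qw(nfreeze);

# Hypothetical worker endpoint -- the real URL is not specified above.
my $url = 'http://worker.example.com/chunk';

# One chunk of structured data (10 KB - 1 MB in practice).
my $chunk = {
    testid1 => {
        yada1 => { foo1 => 1, foo2 => 1 },
        yada2 => { foo3 => 1 },
    },
};

my $ua  = LWP::UserAgent->new;
my $res = $ua->post(
    $url,
    'Content-Type' => 'application/octet-stream',
    Content        => nfreeze($chunk),   # portable binary serialization
);
die 'POST failed: ' . $res->status_line . "\n" unless $res->is_success;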

Re: Re: Re: Re: Re: Re: Re: Clustered Perl Applications?
by perrin (Chancellor) on Jul 05, 2003 at 23:11 UTC
    I still think you could simply fetch the data in chunks from MySQL and store the result there, avoiding the need to pass it around in your control protocol.

    You will have to use POST to pass any significant amount of data. That shouldn't be a problem. The HTTP modules handle POST just fine.
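
    A rough sketch of that fetch-and-store pattern with DBI; the database, table, and column names here are invented for illustration.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical schema: work_items(testid, yada, foo) holds the raw rows,
    # results(testid, result) receives the processed output.
    my $dbh = DBI->connect(
        'DBI:mysql:database=jobs;host=dbhost', 'user', 'password',
        { RaiseError => 1, AutoCommit => 1 },
    );

    my $fetch = $dbh->prepare('SELECT yada, foo FROM work_items WHERE testid = ?');
    my $store = $dbh->prepare('INSERT INTO results (testid, result) VALUES (?, ?)');

    $fetch->execute('testid1');
    while ( my ($yada, $foo) = $fetch->fetchrow_array ) {
        my $result = process($yada, $foo);          # stand-in for one stage's work
        $store->execute('testid1', $result);
    }
    $dbh->disconnect;

    sub process { my ($yada, $foo) = @_; return "$yada/$foo" }   # placeholder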

      I would really like to use MySQL directly, but most of the data can be condensed into a tree before sending, and at 1 MB of data it seems to be a good idea to compress it that way.

      Example:

      The table:
      testid1, yada1, foo1
      testid1, yada1, foo2
      testid1, yada2, foo3
      testid1, yada1, foo4
      testid1, yada2, foo4

      The structure:
      $data{'testid1'}{'yada1'}{'foo1'}
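
      A small sketch of the condensation itself, using the rows above; the point is that repeated values like 'testid1' end up stored once as a hash key instead of once per row.

      #!/usr/bin/perl
      use strict;
      use warnings;

      # Flat (testid, yada, foo) rows, as they come out of the table.
      my @rows = (
          [ 'testid1', 'yada1', 'foo1' ],
          [ 'testid1', 'yada1', 'foo2' ],
          [ 'testid1', 'yada2', 'foo3' ],
          [ 'testid1', 'yada1', 'foo4' ],
          [ 'testid1', 'yada2', 'foo4' ],
      );

      # Condense into the nested hash shown above.
      my %data;
      for my $row (@rows) {
          my ($testid, $yada, $foo) = @$row;
          $data{$testid}{$yada}{$foo} = 1;   # autovivification builds the tree
      }

      # The foo values under testid1/yada1:
      print join(', ', sort keys %{ $data{'testid1'}{'yada1'} }), "\n";   # foo1, foo2, foo4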