in reply to read a huge mysql Table into memory

Just allocate more memory for MySQL and don't complicate things. Are you really experiencing problems with database speed? If so, maybe you should review the structure of the database. If you provide more details about your data and database, there's a good chance you'll get a better answer.
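
A rough sketch of where that memory is allocated, for reference: the relevant settings live in my.cnf, and which ones matter depends on the storage engine (the key buffer for MyISAM, the buffer pool for InnoDB). The variable names below are real MySQL options; the values are arbitrary placeholders, not a recommendation:

    # my.cnf -- illustrative values only; tune to your RAM and engine
    [mysqld]
    # MyISAM: cache for index blocks
    key_buffer_size         = 256M
    # InnoDB: cache for data and index pages
    innodb_buffer_pool_size = 1G
    # per-connection buffers for large sorts and scans
    sort_buffer_size        = 4M
    read_buffer_size        = 2M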


Re^2: read a huge mysql Table into memory
by hiX0r (Acolyte) on May 30, 2009 at 17:52 UTC
    Thanks for the fast answer! The reason I did not post the source data structure is that I wanted to see whether Perl has a solution of its own...

    The data field I am concerned about is a MySQL "text" column called "description", which may contain long text. The other field is just the corresponding id. The rest is index data and is fetched via Sphinx.

    And regarding speed: it's never fast enough :) since "first come, first served" is the fixed sorting order and we have to be faster than the others :)

      Well, that's a pretty simple table, and I'm sure you have an index on the id column, so 200 queries per second shouldn't be a problem for MySQL if you're just selecting records by id. Anyway, as you already saw, Perl requires too much memory to store your data, so it seems you're tied to a database-backed solution.
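
      A minimal sketch of what those lookups could look like from Perl with DBI, assuming a hypothetical table name of `listings` and the `id`/`description` columns mentioned above (connection details and the list of ids are placeholders). Preparing the statement once and reusing the handle keeps the per-query overhead low:

          use strict;
          use warnings;
          use DBI;

          # Connection details are placeholders; adjust to your setup.
          my $dbh = DBI->connect(
              'DBI:mysql:database=mydb;host=localhost',
              'user', 'password',
              { RaiseError => 1, AutoCommit => 1 },
          );

          # Prepare once and reuse the statement handle for every lookup.
          # Table name 'listings' is hypothetical; use your real schema.
          my $sth = $dbh->prepare(
              'SELECT description FROM listings WHERE id = ?'
          );

          # Placeholder ids; in practice these would come from the Sphinx search.
          my @ids_from_sphinx = (1, 2, 3);

          for my $id (@ids_from_sphinx) {
              $sth->execute($id);
              my ($description) = $sth->fetchrow_array;
              # ... do something with $description ...
          }

          $sth->finish;
          $dbh->disconnect;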