in reply to Re: Out of memory!
in thread Out of memory!

Moving a ton of data from the database server to perl to do simple comparisons is generally the wrong way to go. Learn SQL enough to do your work in the database when possible.
That, especially the second sentence, is only true up to a point: it doesn't scale. Certainly, if you have just one process accessing the database, let the database do the work. But if you have 100 servers accessing the database, it may actually pay to do as little work as possible in the database.
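The trade-off both posts are circling can be made concrete with a simple comparison task. A minimal sketch follows, using Python's sqlite3 for a self-contained illustration (the thread's context is Perl/DBI, but the SQL is the point); the table names `master` and `staging` are hypothetical:

```python
import sqlite3

# Two tables of ids; the task: find ids present in staging but missing from master.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE master  (id INTEGER PRIMARY KEY);
    CREATE TABLE staging (id INTEGER PRIMARY KEY);
""")
con.executemany("INSERT INTO master  VALUES (?)", [(i,) for i in range(1000)])
con.executemany("INSERT INTO staging VALUES (?)", [(i,) for i in range(990, 1010)])

# In-database comparison: only the 10 differing rows cross the wire.
missing = [r[0] for r in con.execute(
    "SELECT id FROM staging EXCEPT SELECT id FROM master ORDER BY id")]

# Client-side comparison: all 2020 rows are shipped to the client
# before a single id is compared.
m = {r[0] for r in con.execute("SELECT id FROM master")}
s = {r[0] for r in con.execute("SELECT id FROM staging")}
missing_client = sorted(s - m)

assert missing == missing_client == list(range(1000, 1010))
```

Both approaches get the same answer; the difference is where the work happens and how many rows travel, which is exactly the axis the two replies disagree on.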

Re^3: Out of memory!
by roboticus (Chancellor) on Aug 12, 2010 at 12:07 UTC

    JavaFan:

    While I can imagine cases where it's better to do the job on a different machine in perl, I'll stand by my statement in this case. If I have 100 servers accessing the database, I certainly don't want one task sucking up all the network bandwidth by exporting and importing entire tables instead of just the rows it wants. That's why I put the weasel words "simple comparisons" in there. Index scans are normally cheap, especially on frequently-used tables where the index is mostly cached in RAM anyway.
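    The "just the rows it wants" point can be sketched too: with an index on the filter column, a selective WHERE clause returns only the matching rows, while exporting the whole table ships everything just to discard most of it client-side. A hypothetical `orders` table in Python/sqlite3 for illustration:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
    con.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [(i, "open" if i % 100 == 0 else "closed") for i in range(10_000)])
    con.execute("CREATE INDEX orders_status ON orders(status)")

    # Fetch only the rows the task wants: the index narrows this to 100 rows.
    wanted = sorted(r[0] for r in con.execute(
        "SELECT id FROM orders WHERE status = 'open'"))

    # Exporting the whole table ships 10,000 rows to pick out the same 100.
    everything = con.execute("SELECT id, status FROM orders").fetchall()
    also_wanted = [i for i, st in everything if st == "open"]

    assert len(everything) == 10_000
    assert wanted == also_wanted and len(wanted) == 100
    ```

    The same 100 rows come back either way; the indexed query just avoids putting the other 9,900 on the network, which is the bandwidth concern raised above.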

    Having said that, I totally agree with your sentiment that you need to be cognizant of other database users. We definitely don't want someone to be an inconsiderate database user, frivolously consuming resources with no regard to database performance. Large-scale comparisons, for example, shouldn't be a frequent operation anyway. Properly structuring the applications can normally keep mass comparisons down to a minimum, such as for periodic maintenance.

    ...roboticus