in reply to What is a "big job" in the industry?

Each character and position of every line of the first file has to be checked against each character and position of the second file.
I'm sceptical about that claim. I'm guessing you won't need to check the position if the characters don't match, and vice versa; that can make a LOT of difference. Especially since the 600-line file should easily fit in memory for fast lookups.
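To illustrate the point (a sketch in Python rather than Perl, purely for brevity; the filenames and function name are made up): instead of comparing every character and position of one file against every character and position of the other, you index the smaller file's lines in a hash/set once, then do a constant-time membership test per line of the bigger file.

```python
def lines_in_common(small_path, big_path):
    # Index the ~600-line file once; a set gives O(1) lookups,
    # so we never compare characters position-by-position ourselves.
    with open(small_path) as f:
        wanted = set(line.rstrip("\n") for line in f)

    # One sequential pass over the big file: each line is either
    # in the set or it isn't.
    matches = []
    with open(big_path) as f:
        for line in f:
            line = line.rstrip("\n")
            if line in wanted:
                matches.append(line)
    return matches
```

That turns an O(n*m) all-pairs comparison into roughly O(n+m), which is exactly why the "every character against every character" claim shouldn't be taken at face value. In Perl you'd do the same thing with a hash keyed on the lines.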

Anyway. The project I'm working on now has jobs that take from about a day to several weeks to run, maybe more. Not all of it is in production yet, though, and for some sub-systems we're mostly generating data on the fly. I guess we'll generate a couple of dozen terabytes of data. I think that's a pretty "big job" :-)

General tips:

1. buy a machine with plenty of RAM and load and parse, in memory, all the source data you'll need more than once. It's amazing how much you can do with 4 GB of RAM, especially if you can also use C/C++ (or existing C-based CPAN libraries) for tight storage and inner loops.

2. if that won't fit, but your algorithm is going to go more-or-less sequentially through the data, cache as much of the source data as will comfortably fit in memory.
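As a rough sketch of tip 2 (again in Python for brevity; the function name, file layout, and cache size are all hypothetical): if access is mostly sequential, a bounded LRU cache over parsed records keeps the recently used slice of the source data hot in memory without ever holding the whole dataset.

```python
from functools import lru_cache

@lru_cache(maxsize=100_000)  # tune to what comfortably fits in RAM
def parsed_record(path, offset, length):
    # Re-reading and re-parsing a record is the expensive part;
    # with mostly-sequential access, repeat requests for nearby
    # records hit the cache instead of the disk.
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length).decode().split("\t")
```

The same idea works in Perl with a hash of parsed records plus an eviction policy (or a CPAN caching module); the point is that "cache what comfortably fits" beats both "load everything" and "re-read everything".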