I work for a firm that does marketing for 90+ car dealerships around the US. A big part of what we do to prep for data mining is standardizing and importing data from assorted dealership systems, some archaic and some not (ADP, R&R, Advent, Arkona, Quorum, Scorekeeper, etc.). That sometimes means processing service files of up to 200-300 MB with hundreds of thousands of records, and the theoretical maximum could be even larger. The input format might be CSV or a more vertical key/value text layout, depending on how we're acquiring the data, but it's always text and never fixed-length. We use custom Perl scripts and MySQL for the most part, and we recently upgraded to a pretty fast server with 4 GB RAM (Cari.net; their pricing and service are pretty good, and we had our previous server there as well). The OS is, of course, some popular Unix variant that I forget.
EDIT: We also import sales, leases, and a variety of other data, but the service file is just the largest part of it. I imagine the databases in uncompressed form could each run upwards of 500 MB to a GB over time.
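To make the "vertical key/value" shape concrete, here is a minimal sketch of how one might stream such a file record-by-record in constant memory (important when files run to hundreds of MB). This is in Python rather than our Perl for illustration, and the field names, the `:` separator, and blank-line record delimiters are all assumptions; real exports vary by dealership system.

```python
import io

def parse_vertical(stream, sep=":"):
    """Yield records from a vertical key/value text stream one at a time,
    so a 200-300 MB file never has to fit in memory.
    Assumed layout: one 'key: value' pair per line, blank line between records."""
    record = {}
    for line in stream:
        line = line.strip()
        if not line:                 # blank line ends the current record
            if record:
                yield record
                record = {}
            continue
        key, _, value = line.partition(sep)
        record[key.strip()] = value.strip()
    if record:                       # final record may lack a trailing blank line
        yield record

# Hypothetical sample data (field names are made up for illustration)
sample = io.StringIO(
    "RO Number: 12345\nCustomer: Jane Doe\n\nRO Number: 12346\nCustomer: J Smith\n"
)
records = list(parse_vertical(sample))
```

In practice each yielded record would be normalized and handed off to a batched MySQL insert rather than accumulated in a list.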