Since you're sorting the file contents, you need enough memory to hold and sort the entire set of log files at once. If you can avoid sorting the data and holding the complete set of files in memory, that would head off any future memory issues. Perhaps read in just enough to detect a change of timestamp and sort only that slice of data; a sketch of that idea follows. I'm assuming the log files are written with timestamps and arrive more or less in sequence.
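Something like this minimal sketch, assuming each line begins with a fixed-width timestamp (the `ts_len` parameter and its 19-character default, matching e.g. `2024-01-15 10:32:07`, are illustrative, not taken from your logs):

```python
from typing import Iterator, TextIO

def sorted_slices(log: TextIO, ts_len: int = 19) -> Iterator[str]:
    """Yield log lines sorted within each timestamp group.

    Assumes each line starts with a fixed-width timestamp and the
    file is already roughly ordered, so only lines sharing one
    timestamp ever need reordering -- never the whole file.
    """
    buffer: list[str] = []
    current_ts = None
    for line in log:
        ts = line[:ts_len]
        if ts != current_ts and buffer:
            yield from sorted(buffer)   # sort just this slice
            buffer.clear()
        current_ts = ts
        buffer.append(line)
    yield from sorted(buffer)           # flush the final group
```

The memory high-water mark is then one timestamp group rather than the whole file set.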
When the data is written to the DB there is presumably a timestamp field on each row. That field could be used to find the latest records already stored. On restart of the script, the records carrying the latest timestamp could be deleted and re-inserted from the logs, guaranteeing a complete set of records for that timestamp; everything after it would simply come from the subsequent log data.
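For instance, with sqlite3 and a hypothetical `log_entries` table that has a `ts` column (swap in your own schema), the restart step might look like:

```python
import sqlite3

def resume_point(conn: sqlite3.Connection) -> str | None:
    """Drop the possibly-incomplete latest-timestamp rows and
    return that timestamp so ingestion can restart from it."""
    (latest,) = conn.execute(
        "SELECT MAX(ts) FROM log_entries"
    ).fetchone()
    if latest is not None:
        # Re-importing this timestamp from the logs replaces
        # whatever partial set of rows the interruption left.
        conn.execute("DELETE FROM log_entries WHERE ts = ?", (latest,))
        conn.commit()
    return latest
```

You'd then skip log lines until you reach `latest` and insert from there onward.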
Either that, or you've got to record how far through the files you have processed, e.g. a record count or byte offset. Perhaps the database updates could be used for that as well, so the checkpoint commits in the same transaction as the data.
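A sketch of that record-keeping route, again with sqlite3; the `ingest_progress` and `log_entries` names and columns are hypothetical:

```python
import sqlite3

def ingest(conn: sqlite3.Connection, filename: str) -> None:
    """Resume a partially processed log file from a stored byte offset."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS ingest_progress ("
        " filename TEXT PRIMARY KEY, byte_offset INTEGER NOT NULL)"
    )
    row = conn.execute(
        "SELECT byte_offset FROM ingest_progress WHERE filename = ?",
        (filename,),
    ).fetchone()
    with open(filename) as f:
        f.seek(row[0] if row else 0)  # resume where we left off
        # iter(f.readline, "") keeps f.tell() usable, unlike `for line in f`
        for line in iter(f.readline, ""):
            conn.execute(
                "INSERT INTO log_entries (ts, line) VALUES (?, ?)",
                (line[:19], line.rstrip("\n")),
            )
            conn.execute(
                "INSERT OR REPLACE INTO ingest_progress VALUES (?, ?)",
                (filename, f.tell()),
            )
    # One transaction: the rows and the offset commit together,
    # so a crash can never leave them out of step.
    conn.commit()
```

Because the offset only commits alongside the rows it describes, a restart can never double-insert or skip records.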