in reply to out of memory

Why do you need to keep the whole file in memory? Can't you process the file in chunks? This looks like an XY Problem: what are you actually trying to achieve?
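To make the "chunks" suggestion concrete, here is a minimal sketch of line-by-line processing, so only the current record is held in memory at any time. The file name and contents are invented for illustration (the sketch creates a tiny stand-in file so it can run on its own):

```perl
use strict;
use warnings;

# Stand-in for the real large file (name and contents are made up).
open my $out, '>', 'big_input.txt' or die "Cannot write: $!";
print {$out} "record$_\n" for 1 .. 3;
close $out;

# Read one line at a time: the whole file is never slurped into memory.
my $count = 0;
open my $fh, '<', 'big_input.txt' or die "Cannot open: $!";
while ( my $line = <$fh> ) {
    chomp $line;
    $count++;    # process the single record here
}
close $fh;
print "processed $count records\n";
```

The same pattern works for multi-gigabyte files, because the `while (<$fh>)` loop only ever keeps one line in `$line`.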
لսႽ† ᥲᥒ⚪⟊Ⴙᘓᖇ Ꮅᘓᖇ⎱ Ⴙᥲ𝇋ƙᘓᖇ

Replies are listed 'Best First'.
Re^2: out of memory
by shan_emails (Beadle) on Feb 14, 2013 at 12:09 UTC

    I need to put that content into a database table.

    Is there any other way to handle this scenario?

      When loading data into a database, I recommend using the bulk-loading tools that ship with the database. The easiest approach is to use Perl to write the data to a new file in a format the bulk loader accepts: fixed-width or delimited rows of text, or SQL statements.
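      As a sketch of the delimited-file approach, the snippet below writes tab-separated rows suitable for a bulk loader such as MySQL's LOAD DATA INFILE or PostgreSQL's COPY. The records and the output file name are hypothetical; in practice the rows would be parsed from the large source file one line at a time:

```perl
use strict;
use warnings;

# Hypothetical parsed records; real code would produce these while
# reading the big input file line by line.
my @records = ( [ 1, 'alpha' ], [ 2, 'beta' ] );

# Write tab-delimited rows that the database's bulk loader can ingest.
open my $out, '>', 'bulk_load.tsv' or die "Cannot write: $!";
for my $rec (@records) {
    print {$out} join( "\t", @$rec ), "\n";
}
close $out;
```

      This keeps memory flat regardless of input size, and the bulk loader is usually far faster than row-by-row INSERT statements.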

      If you are adding more than one row, you can probably process the file chunk by chunk and insert the corresponding rows into the database one by one (or create a file the database can bulk-load). See also DBI.
      لսႽ† ᥲᥒ⚪⟊Ⴙᘓᖇ Ꮅᘓᖇ⎱ Ⴙᥲ𝇋ƙᘓᖇ

      You haven't provided any example file content. Essentially, read the file one record at a time and insert each record into the database. This is of course a simplistic approach and may not be best suited to your system (number of records, database type, etc.), so creating a load file from your input may be a better idea.
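      The read-one-record-insert-one-row approach can be sketched with DBI as below. This is an assumption-laden illustration: an in-memory SQLite database (via DBD::SQLite) stands in for the real target, and the table, columns, and sample records are invented. A prepared statement plus batched commits keeps per-row overhead down:

```perl
use strict;
use warnings;
use DBI;

# Sketch only: in-memory SQLite stands in for the real database;
# table and column names are invented for illustration.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 0 } );
$dbh->do('CREATE TABLE records (id INTEGER, content TEXT)');

# Prepare once, execute per record.
my $sth = $dbh->prepare('INSERT INTO records (id, content) VALUES (?, ?)');

# In real code this loop would read the big file one record at a time.
my @sample = ( [ 1, 'first' ], [ 2, 'second' ] );
$sth->execute(@$_) for @sample;
$dbh->commit;    # commit in batches, not once per row

my ($n) = $dbh->selectrow_array('SELECT COUNT(*) FROM records');
print "$n rows inserted\n";
```

      For very large inputs, committing every few thousand rows (rather than per row or only once at the end) is a common compromise between speed and transaction-log size.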