in reply to Sharing a 300MB variable.

You may want to consider why you need to share 300MB of data in memory at all. Personally, I'd consider a file-based or DB-based sharing system for anything this large.

Depending on how you access the data, you can easily run out of memory and degrade performance.

How are you processing the file? You don't want to slurp the entire file into a variable and then process it:

my @data = <FILE>;   # slurps the entire file into memory at once

If you slurp the entire file, each process gets its own copy of the 300MB of data (i.e. 5 processes x 300MB = 1500MB, plus 5 copies of the Perl runtime's own memory)!
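Instead, read and process the file one line at a time, so only the current line is ever in memory. A minimal sketch (the filename "data.txt" and the sample contents are just placeholders for illustration):

```perl
use strict;
use warnings;

# Write a small sample file so this sketch is self-contained.
open my $out, '>', 'data.txt' or die "Can't write data.txt: $!";
print {$out} "line $_\n" for 1 .. 3;
close $out;

# Read one line at a time -- memory use stays constant no matter
# how large the file is, unlike @data = <FILE>.
open my $fh, '<', 'data.txt' or die "Can't open data.txt: $!";
my $count = 0;
while ( my $line = <$fh> ) {
    chomp $line;
    $count++;    # ... real per-line processing would go here ...
}
close $fh;
print "processed $count lines\n";
```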

Please note that the runtime memory required for large data structures can be many times the size of the data on disk. I once had a program that loaded a 100MB data set into a hash and ended up using over 2GB of memory and swap!

I would suggest you look into DB_File or BerkeleyDB and use a variable tied to a DB file.
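With DB_File, the tie looks something like this (a sketch, assuming DB_File is built into your Perl; "shared.db" is a placeholder filename). Each process that ties to the same file sees the same records, and only the keys you actually touch are read from disk:

```perl
use strict;
use warnings;
use Fcntl;      # O_CREAT, O_RDWR
use DB_File;    # core module, needs Berkeley DB at build time

# Tie a hash to an on-disk DB file.  Reads and writes on %data
# go through to the file instead of living in process memory.
my %data;
tie %data, 'DB_File', 'shared.db', O_CREAT | O_RDWR, 0644, $DB_HASH
    or die "Can't tie shared.db: $!";

$data{answer} = 42;    # written through to the DB file
print "answer is $data{answer}\n";

untie %data;
```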

Using a variable tied to a file isn't bad in terms of performance (especially compared with your system slowly thrashing itself to disk).