wdsaeghe has asked for the wisdom of the Perl Monks concerning the following question:
Hello monks,
a colleague of mine stumbled upon something curious: his script (let's call it main.pl) needs 56 subs.
In scenario 1 he put each sub in its own file, each named after the sub, e.g. getSynthesedata.pl, transformOrdToCSV.pl.
He then loads all the files using 56 separate "do" statements:
do 'getSynthesedata.pl'; # repeated once for each of the 56 files
In scenario 2, he catted all those files (and nothing more) into one script called all.pl, and loaded only that file:
do 'all.pl';
Now, loading the 56 separate files takes less than one second, while loading all.pl takes about 5 seconds. We can't really explain it.
Maybe it has to do with multicore?
Does 'do'-ing a small file have certain benefits?
Replies are listed 'Best First'.
Re: Loading one big file is slower than loading 56 different small files?
by BrowserUk (Patriarch) on Jul 30, 2010 at 10:07 UTC

Re: Loading one big file is slower than loading 56 different small files?
by JavaFan (Canon) on Jul 30, 2010 at 09:36 UTC

Re: Loading one big file is slower than loading 56 different small files?
by jethro (Monsignor) on Jul 30, 2010 at 09:54 UTC