xiaoyafeng has asked for the wisdom of the Perl Monks concerning the following question:
This question stems from When Perl is not applicable, which started a flame war on a Perl blog site. ;) Contrary to the author's opinion, I do believe Perl can handle large data, but I'm more interested in how Perl handles it, so I ran some tests:
Environment: Win XP, perl 5.12.3, 512M memory (about 350M available)

    C:\>perl -e "@a = (1..1_000_000_000_000);"
    Range iterator outside integer range at -e line 1.   # of course!

    C:\>perl -e "@a = (1..1_000_000_000);"
    Out of memory!   # fair enough ;)

    C:\>perl -e "@a = (1..9_000_000);"
    Terminating on signal SIGINT(2)   # took too long, and I had to kill it :(
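As an aside, there is a big difference between assigning a range to an array and iterating over one: @a = (1..N) forces Perl to flatten the whole list into memory at once, while foreach over a literal range is special-cased into a counting loop that never builds the list. A minimal sketch of the memory-friendly form, reusing the same 9_000_000 range from the test above:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # @a = (1 .. 9_000_000) would allocate one scalar per element,
    # hundreds of MB on a 32-bit perl.  A foreach over a literal
    # range, by contrast, keeps only the current value alive, so
    # memory use stays flat regardless of the range size.
    my $sum = 0;
    for my $i (1 .. 9_000_000) {
        $sum += $i;
    }
    print "sum = $sum\n";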
When I ran the third one, perl occupied almost 340M of memory at first. After 3 or 4 seconds, it seemed to realize there was not enough memory to allocate, and then the utilization decreased to about 60M and the process hung. :(
So how does Perl handle large memory allocations? It seems Perl swaps when memory is not enough, but that isn't productive. Is there a better way to make Perl smarter? (I'd rather Perl tell me "Out of memory!" than make me wait a long time.)
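One way to get the fast "Out of memory!" you want, at least on Unix-like systems (this won't help on Win XP), is to cap the process's address space before the big allocation, so the OS rejects it immediately instead of swapping. Here is a sketch using the CPAN module BSD::Resource; the 256 MB cap is an arbitrary value for illustration, and on some platforms the constant is spelled RLIMIT_VMEM or RLIMIT_DATA rather than RLIMIT_AS:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # BSD::Resource wraps setrlimit(2); not available on Windows.
    use BSD::Resource qw(setrlimit RLIMIT_AS);

    # Cap the address space at 256 MB (soft and hard limit alike),
    # so any allocation beyond it fails right away.
    my $cap = 256 * 1024 * 1024;
    setrlimit(RLIMIT_AS, $cap, $cap)
        or die "setrlimit failed: $!";

    # This now dies quickly with "Out of memory!" instead of
    # grinding the machine into swap.
    my @a = (1 .. 1_000_000_000);

From a shell you can get the same effect without any module by running ulimit -v before starting perl.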
Any cents?

I am trying to improve my English skills; if you see a mistake, please feel free to reply or /msg me a correction.