in reply to How to find a memory leak - appears to be "system" calls that are responsible

You don't post code, so I'll speak in generalities. It will almost inevitably be faster and more memory efficient to use system tools like gunzip, tar and rm directly than to do the same work in Perl. Highly optimised C should beat Perl every time, in both memory and speed terms. There is simply less overhead, and at the end of the day things like Compress::Zlib are just wrappers around the standard system libraries and calls. Tools like top and modules like Devel::DProf, Devel::Leak and Devel::LeakTrace are available to diagnose issues.
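
For example (a rough, untested sketch -- suspect_code() is just a stand-in for whatever block you suspect is leaking), Devel::Leak lets you bracket a chunk of code and see which SVs get created and never freed:

    use Devel::Leak;

    my $handle;
    my $before = Devel::Leak::NoteSV($handle);    # count live SVs now

    suspect_code();                               # the code you think leaks

    my $after = Devel::Leak::CheckSV($handle);    # dumps any new SVs to STDERR
    print "SVs: $before before, $after after\n";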

If you post example code I am sure you will get lively interest.

cheers

tachyon


Re: Re: How to find a memory leak - appears to be "system" calls that are responsible
by TilRMan (Friar) on May 15, 2004 at 07:56 UTC

    gzip and tar, almost certainly, but I'd bet unlink() is much faster than rm. As a shell command, rm means a fork() -- or two if it goes through a shell! And of course rm must ultimately do the unlink() as well.
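
    Something like this (filenames invented) shows the difference:

        my @files = glob '/tmp/scratch/*.log';   # hypothetical list of files

        # in-process: one unlink() syscall per file, no fork at all
        my $gone = unlink @files;

        # external: fork + exec of rm (plus a shell if you pass one string)
        system 'rm', '-f', @files;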

    On the flip side, my first impression on gzip vs Compress::Zlib was dead wrong. ++ for setting me straight.

    Ah, the trauma of learning Perl . . .

      That's a good point about rm. I tend to use rm -rf /dir/path/* for convenience. The first operation is a shell expansion of the files/dirs, which are then fed to rm as a single list, so the worst case is 2 operations. There are real limitations on shell expansion with long lists (yes, I do know the workarounds). Anyway, in this context I am sure (but untested, so I'm not THAT sure ;-) it would be way faster than unlink/rmdir with File::Find. For single files or known lists I do use unlink. Now I have a rationale! Thanks.
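
      If you do want to stay in Perl for recursive deletes, File::Path (core) is less work than rolling your own File::Find pass -- a rough sketch, path made up:

          use File::Path;

          rmtree('/dir/path/scratch');    # in-process recursive delete (removes the dir itself too)

          # versus what I usually do -- one fork, the shell does the expansion:
          # system 'rm -rf /dir/path/scratch/*';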

      cheers

      tachyon