in reply to Re^2: Use of undef slowing down perl?
in thread Use of undef slowing down perl?

No, wrong. You want to compare the average and total speed of insertion before and after that undef(%foobar) operation. This is clearly something I'd use Benchmark for. Perhaps there's some big time hit when the hash is first re-used, but it goes away as the hash is used more.
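A rough, untested sketch of that comparison (the count $n and the key scheme are invented for illustration, and the per-run undef inside the "reused" case is itself included in the timing):

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my $n = 100_000;    # invented count, purely for illustration

    my %reused;
    $reused{$_} = 1 for 1 .. $n;    # use the hash once...
    undef %reused;                  # ...then undef() it, as in the thread

    cmpthese( -3, {
        fresh  => sub { my %h; $h{$_} = 1 for 1 .. $n; },
        reused => sub {
            $reused{$_} = 1 for 1 .. $n;    # insertions after the undef
            undef %reused;                  # reset for the next run
        },
    });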

Re^4: Use of undef slowing down perl?
by ikegami (Patriarch) on Oct 12, 2005 at 17:53 UTC

    The problem is related to the amount of available memory. By my estimates, about available_MB_of_mem * 10000 iterations need to be run before the undef. Any other amount will change the outcome and hide the problem.

    The question isn't *if* there is something different about the first pass; the question is *what* is different.

    True, you could use Benchmark to repeat the bit after the undef, but that's not what you were suggesting initially. You were suggesting using it for *both* snippets.
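    A rough, untested sketch of timing each pass separately ($n is a placeholder that would need to be scaled to available memory, per the estimate above). Emptying with %h = () between the later fills avoids re-triggering undef():

        use strict;
        use warnings;
        use Time::HiRes qw(gettimeofday tv_interval);

        my $n = 1_000_000;    # placeholder; scale to available memory
        my %h;

        my $t0 = [gettimeofday];
        $h{$_} = 1 for 1 .. $n;    # fill a fresh hash
        printf "fresh fill: %.3fs\n", tv_interval($t0);

        undef %h;    # the operation under suspicion

        for my $pass (1 .. 4) {
            $t0 = [gettimeofday];
            $h{$_} = 1 for 1 .. $n;
            printf "fill %d after undef: %.3fs\n", $pass, tv_interval($t0);
            %h = () if $pass < 4;    # empty without undef()
        }

    If only the first fill after the undef stands out, that would point at a one-time cost rather than a persistent slowdown.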