in reply to nytprof Profiler gives diverse results

Perhaps you can process that output so that, for each sub called, its different run-times are recorded, and then see which vary and which don't. For example I can see that YAML::Tiny::* does not vary at all. (btw some subs are not present in all tests).
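A minimal sketch of that idea, assuming each run's per-sub totals have already been exported to simple tab-separated files (the filenames and file format here are hypothetical, not something nytprof produces directly):

use strict;
use warnings;
use List::Util qw(sum);

# One file per profiling run, each line: "subname<TAB>seconds" (hypothetical format).
my @runs = ('run1.tsv', 'run2.tsv', 'run3.tsv');

my %times;    # sub name => list of its run-times, one entry per run it appears in
for my $file (@runs) {
    open my $fh, '<', $file or die "$file: $!";
    while (<$fh>) {
        chomp;
        my ($sub, $t) = split /\t/;
        push @{ $times{$sub} }, $t;
    }
}

# Report each sub's mean and standard deviation, largest spread first.
for my $sub (sort { stddev($times{$b}) <=> stddev($times{$a}) } keys %times) {
    my @t    = @{ $times{$sub} };
    my $mean = sum(@t) / @t;
    printf "%-40s mean %.3fs  stddev %.3fs  (seen in %d runs)\n",
        $sub, $mean, stddev(\@t), scalar @t;
}

sub stddev {
    my @t = @{ $_[0] };
    return 0 if @t < 2;
    my $mean = sum(@t) / @t;
    return sqrt( sum( map { ($_ - $mean)**2 } @t ) / (@t - 1) );
}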

Re^2: nytprof Profiler gives diverse results
by stevieb (Canon) on Apr 04, 2020 at 16:54 UTC
    "btw some subs are not present in all tests"

    That's the first thing I noticed, and that definitely doesn't appear to be a fair profiling comparison.

Re^2: nytprof Profiler gives diverse results
by boleary (Scribe) on Apr 04, 2020 at 17:11 UTC

    Yeah, that makes sense. I'll put them in a spreadsheet.

    As for missing subs... some of the sub durations seem to vary slightly,
    so they trade places in the top 15 report,
    but I don't think I care about anything taking 100ms or less.

      If your subs can be broadly categorised as disk-intensive, memory-intensive, or cpu-intensive, you can see which of these three groups varies substantially against the others (standard deviation is an indication of variation). For example, that YAML* sub is invariant in all three runs while others with similar run-times are not. Perhaps perlperf can help you.

      Also, are you 100% sure that your program is deterministic? Are there no random choices? For example, even iterating over the keys of a hash is non-deterministic (see the sketch below). What if the algorithm benefits when the longer keys are processed first? Regexes also work internally with non-deterministic algorithms (that's my understanding of what happens when a regex contains an alternation) and therefore may show variability in how quickly they terminate (see https://stackoverflow.com/questions/36420517/is-it-faster-to-use-alternation-than-subsequent-replacements-in-regular-expressi).
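      A tiny illustration of the hash-key point (the data here is made up): since perl 5.18 the key order is randomised per process, so any code whose work depends on that order can take a different path on every run, and sorting the keys restores determinism.

      use strict;
      use warnings;

      my %h = map { $_ => length } qw(alpha bravo charlie delta echo);

      # The order below can change from one perl process to the next:
      print "unsorted: ", join(", ", keys %h), "\n";

      # Sorting makes the iteration order (and anything driven by it) repeatable:
      print "sorted:   ", join(", ", sort keys %h), "\n";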

      If you want to exclude the possibility of non-deterministic behaviour and other program-specific factors, start by doing some benchmarks yourself. See the Benchmark module for how to time your subs yourself easily.
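      Something along these lines, for example (the two subs here are hypothetical stand-ins for whatever pieces of your code you want to compare):

      use strict;
      use warnings;
      use Benchmark qw(cmpthese);

      my @data = map { "element_$_" } 1 .. 1_000;

      # A negative count means: run each candidate for at least 3 CPU seconds.
      cmpthese(-3, {
          with_index => sub { my $n = grep { index($_, '5') >= 0 } @data },
          with_regex => sub { my $n = grep { /5/ } @data },
      });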

      People from my social circle constantly harass me with their Windows 10 too-slow problems. Sometimes I find that they have background updates sucking up resources (sometimes it's a cryptic, generic Windows process name).

      bw, bliako

        Thanks for the questions!

        The loop that is taking all the time is mostly CPU-intensive with a decent memory footprint (but nowhere near my laptop's limits).
        It runs a large loop of loops performing string compares, index functions and regexes on each element.

        I am reasonably sure that the test code is deterministic.
        There is some usage of the keys function to order the way we go through the loop,
        but only on hashes with very few keys, and the order of the keys really doesn't affect anything.

        I think my data point about the effect of opening Gmail on a freshly rebooted laptop is the most revealing item.
        I've been playing with my power profile, but it doesn't seem to be doing any good... the fan goes up and down throughout the longer tests.