in reply to Re: Optimize my foreach loop / code
in thread Optimize my foreach loop / code

Thanks for your reply.

It is this loop that takes most of the time. Unfortunately, the input image array can grow to as many as 400k elements (files). I am hoping there is a way to improve the hash assignment and/or the regexes.

Best!

Re^3: Optimize my foreach loop / code
by davido (Cardinal) on Aug 26, 2016 at 07:17 UTC

    As I tried to illustrate, you will not find optimizations for this existing work-flow that attain an order-of-magnitude improvement. It is unlikely you could even cut the time in half.

    What if you build an index from each file as it comes in, rather than doing a huge chunk of files all at once? Gather whatever meta-data you need on each file as it arrives, and shove that data into a database that you can query as needed. This will spread the computational workload over a longer period of time, and make tallying of results very fast.
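    A minimal sketch of that idea in Perl, assuming DBI with DBD::SQLite is available; the table layout, the file-name pattern, and the index_file helper are placeholders for whatever meta-data the real files actually carry:

        use strict;
        use warnings;
        use DBI;

        # One small SQLite file holds the index; it persists between runs.
        my $dbh = DBI->connect('dbi:SQLite:dbname=images.db', '', '',
                               { RaiseError => 1, AutoCommit => 1 });

        $dbh->do(q{
            CREATE TABLE IF NOT EXISTS images (
                path     TEXT PRIMARY KEY,
                prefix   TEXT,
                sequence INTEGER,
                mtime    INTEGER
            )
        });

        my $insert = $dbh->prepare(
            'INSERT OR REPLACE INTO images (path, prefix, sequence, mtime)
             VALUES (?, ?, ?, ?)'
        );

        # Call once per file as it arrives, instead of re-scanning the
        # whole 400k-element list in one batch later.
        sub index_file {
            my ($path) = @_;
            my ($prefix, $seq) = $path =~ /^(\w+)_(\d+)\.\w+$/
                or return;                  # skip names that don't match
            $insert->execute($path, $prefix, $seq, (stat $path)[9]);
        }

        # Tallying later becomes a cheap query rather than a big loop:
        # my $n = $dbh->selectrow_array(
        #     'SELECT COUNT(*) FROM images WHERE prefix = ?', undef, 'cam01');

    Because each file is indexed at the moment it arrives, the later tally is a single query instead of another pass over 400k array elements.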


    Dave