in reply to Splitting large module

I would guess the slowdown is simply due to the increased disk IO from opening and reading 85 files instead of a single file.

How large is the performance difference in absolute terms? If it really is significant, maybe a different disk layout would make loading the files faster. If your script runs as a short-lived process, maybe you can turn it into a longer-lived process by passing it more work to handle in one go?
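One cheap way to sanity-check the disk IO hypothesis before touching the code is to time the raw file reads themselves. A minimal sketch, assuming a local filesystem; the file names, sizes, and contents below are made up for illustration and are not the poster's actual modules:

```shell
# Simulate the open/read cost of 85 small module files vs. one combined file.
workdir=$(mktemp -d)
for i in $(seq 1 85); do
  printf 'package Part%d; 1;\n' "$i" > "$workdir/Part$i.pm"
done
cat "$workdir"/Part*.pm > "$workdir/Combined.pm"

# Each cat below opens and reads files, roughly mimicking the IO portion of
# perl's use/require at startup (it leaves out compilation entirely).
time cat "$workdir"/Part*.pm > /dev/null      # 85 open+read calls
time cat "$workdir/Combined.pm" > /dev/null   # a single open+read
```

On a fast local disk with a warm cache the difference will likely be tiny; on NFS or a cold cache the per-file overhead tends to dominate.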

Re^2: Splitting large module
by Anonymous Monk on Jul 11, 2018 at 17:13 UTC
    I think all of the use statements are resolved at compile time, so that should not account for the perceived difference in performance.

      Perl's "compile time" is not a separate build step: the program is recompiled on every run, so all files still need to be located, opened, and read each time it starts.

      I can very well imagine a high-latency (or flaky) network connection (NFS, perhaps) that makes loading files quite slow. That would slow down program startup at least linearly in the number of files that need to be opened and read.