Re: Use of do() to run lots of perl scripts

by LanX (Saint)
on Mar 03, 2021 at 15:48 UTC ( [id://11129081] )


in reply to Use of do() to run lots of perl scripts

Let's suppose you are always forking and that the RAM consumption of all those forks (each of which means cloning the run-time engine, which needs 1.5MB and up on my system) is no issue.

Let's suppose further it's the start-up time that matters.

As already explained, do FILE still implies overhead for

  • A: loading the file from the FS
  • B: compiling that file
  • (B2: compiling all modules used in that file)
  • C: running that code

but do is just a glorified eval `cat FILE` mechanism.

So your MASTER process could just keep all those scripts and used modules° in memory, in a big hash %file with one entry $file{SCRIPT} per script.
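A minimal sketch of that caching step, assuming the scripts sit in one directory and the hash is called %file (both are just illustrative names, not your actual setup):

    use strict;
    use warnings;

    my %file;    # script source, keyed by script path

    # assumed location of the scripts - adjust to your setup
    for my $path (glob '/path/to/scripts/*.pl') {
        open my $fh, '<', $path or die "can't read $path: $!";
        local $/;                   # slurp mode: read the whole file at once
        $file{$path} = <$fh>;
    }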

Now, after forking to child THIS_SCRIPT_1, you only need to eval this script directly to compile it, and you have already eliminated point A, the FS overhead.
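A sketch of that fork-and-eval step, building on the %file hash from above; run_cached_script is just a hypothetical helper name:

    # hypothetical helper in MASTER: fork a child that compiles and runs
    # one of the cached scripts via string eval, never touching the FS
    sub run_cached_script {
        my ($name) = @_;
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {               # child, e.g. THIS_SCRIPT_1
            eval $file{$name};         # compile + run from memory, point A gone
            die $@ if $@;
            exit 0;
        }
        return $pid;                   # MASTER continues
    }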

(As a further optimization you could now empty the hash %file to release memory in the child, though I'm not sure if this would pay off.)

And now - provided that this script is started many times - you can fork again from THIS_SCRIPT_1 after compilation, and THIS_SCRIPT_2 is executed with a clean start-up context and terminates at the end.

Every time MASTER needs this particular script to be run again, it signals THIS_SCRIPT_1, which forks again into a grandchild that runs the already compiled code, eliminating the overhead of point B.
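A rough sketch of that two-level forking, assuming a simple pipe from MASTER to the level-1 child and the source wrapped in a sub so it is compiled only once (all names and the trivial demo script are illustrative):

    use strict;
    use warnings;
    use IO::Handle;

    # stand-in for one cached script (illustrative only)
    my %file = ( THIS_SCRIPT_1 => 'print "doing the actual work\n";' );

    pipe(my $from_master, my $to_child) or die "pipe: $!";
    $to_child->autoflush(1);

    my $pid = fork() // die "fork: $!";
    if ($pid == 0) {                                   # level-1 child
        close $to_child;
        # wrap the source in a sub, so point B (compilation) is paid once
        my $compiled = eval "sub {\n$file{THIS_SCRIPT_1}\n}";
        die $@ if $@;
        while (my $request = <$from_master>) {         # MASTER: "run it again"
            my $kid = fork() // die "fork: $!";
            if ($kid == 0) {                           # grandchild: THIS_SCRIPT_2
                $compiled->();                         # clean context, no recompile
                exit 0;
            }
            waitpid $kid, 0;
        }
        exit 0;
    }

    close $from_master;                                # MASTER side
    print {$to_child} "run\n" for 1 .. 3;              # request three runs
    close $to_child;                                   # child's loop then ends
    waitpid $pid, 0;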

Now it's up to you to decide if you want to keep all these 500 level-1 child forks constantly alive by reserving 1-2GB of RAM for them; I'd rather make that dependent on a frequency count kept by the MASTER. (And I still doubt that compile time is an issue nowadays, but YMMV.)

I still think you are most probably reinventing the wheel here, because such strategies have certainly already been discussed in the context of web servers.

But it could solve your issues by trading space for time. And you will still need to benchmark all of this.

Cheers Rolf
(addicted to the Perl Programming Language :)
Wikisyntax for the Monastery

update

°) Used modules (B2) are a bit more complicated; anything hooking into @INC could be used. You said these scripts are "simple" - do they always use the same modules? In that case, require the common ones in MASTER; this will do the compilation (hopefully without global side effects).
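A sketch of that preloading, with a purely illustrative module list:

    use strict;
    use warnings;

    # assumed list of modules the scripts have in common (illustrative)
    my @common_modules = qw(List::Util Data::Dumper);

    for my $mod (@common_modules) {
        (my $path = "$mod.pm") =~ s{::}{/}g;
        require $path;      # compiled once in MASTER, inherited by every fork
    }
    # note: require() does not call import(), so each script still decides
    # what it imports - fewer global side effects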

see also App::FatPacker et al...

Replies are listed 'Best First'.
Re^2: Use of do() to run lots of perl scripts
by afoken (Chancellor) on Mar 03, 2021 at 18:35 UTC
    do is just a glorified eval `cat FILE` mechanism.

    ... without creating a sub-process and running cat, of course.

    Alexander

    --
    Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
      Of course. That's why I said "mechanism" and linked to the docs.

      do "./stat.pl" is largely like

      eval `cat stat.pl`;

      except that it's more concise, runs no external processes, and keeps track of the current filename for error messages. It also differs in that code evaluated with do FILE cannot see lexicals in the enclosing scope; eval STRING does.

      Cheers Rolf
      (addicted to the Perl Programming Language :)
      Wikisyntax for the Monastery
