Reading between the lines of what you've told us: you have a pre-written perl script that you want to run on behalf of networked users on a single machine, with concurrent access but no sharing of data between the instances? And you aren't a perl programmer :)
Much will depend on how the pre-existing perl script runs, but I'll assume that it returns its results via stdout.
If that's the case, cloning an interpreter for each request, or building a pool of clones, would probably work okay. I haven't done enough with embedding -- nothing beyond the simple examples in perlembed -- to be able to predict the performance. Pre-cloning a pool and returning a "busy...try again" message if the pool is fully utilised ought to be fast enough, provided the loading isn't too extreme.
Personally, I would probably use a thread-pool design based on threads, or maybe a pre-forking design written in perl using its win32 pseudo-fork support, as I find perl so much more productive than C/C++.
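To give a flavour of the thread-pool idea, a rough (untested) sketch follows. The run_script() sub is just a stand-in for however the existing script gets wrapped, the pool size is arbitrary, and the requests are faked from @ARGV rather than read from a socket:

    #!/usr/bin/perl
    ## Untested sketch of a worker pool using threads + Thread::Queue.
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;

    my $POOL_SIZE = 4;                    ## arbitrary; tune to the box
    my $requests  = Thread::Queue->new;   ## work in
    my $results   = Thread::Queue->new;   ## results out

    ## Workers share nothing except the two queues, which matches the
    ## "no sharing of data between instances" requirement.
    my @pool = map {
        threads->create( sub {
            while( defined( my $req = $requests->dequeue ) ) {
                $results->enqueue( run_script( $req ) );
            }
        } );
    } 1 .. $POOL_SIZE;

    ## Stand-in for the network front end: queue some requests...
    $requests->enqueue( $_ ) for @ARGV;

    ## ...then shut the pool down (an undef tells a worker to exit).
    $requests->enqueue( undef ) for 1 .. $POOL_SIZE;
    $_->join for @pool;

    print while defined( $_ = $results->dequeue_nb );

    sub run_script {
        my( $req ) = @_;
        ## Stand-in: really this would do() or call into the existing script.
        return "processed: $req\n";
    }

A real version would have the front end pull requests off the network and hand each result back to the right client, but the queue-in/queue-out shape stays the same.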
In reply to Re: Re: Re: Externally managed threads using embedded Perl by BrowserUk
in thread Externally managed threads using embedded Perl by Anonymous Monk