distributed compiler
by vxp (Pilgrim)
on Jul 23, 2004 at 15:35 UTC ( #376904=perlquestion )
vxp has asked for the wisdom of the Perl Monks concerning the following question:
I have a 26-node cluster at my full disposal; each node has two 3.06 GHz CPUs. I thought it might be an interesting idea to write a client/server pair of scripts that would basically act as a distributed compiler.
Say you have a large project to compile (example: the Linux kernel). It's basically a collection of small(ish) .c files. The master node would run the preprocessor locally, so there would be no need to have identical header files across the compute nodes (never mind the fact that my compute nodes _are_ identical; why not do this for the people who don't have 26 identical machines at their disposal).
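The local-preprocessing step might look something like this; a minimal sketch, assuming gcc is the compiler and an `include/` directory holds the headers (both are illustrative choices, not a fixed interface). Running `gcc -E` expands all headers into the output, which is why the compute nodes no longer need any header tree:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build the local-preprocess command for one translation unit.
# "gcc -E" runs only the preprocessor, expanding every #include,
# so the resulting .i file is fully self-contained.
# The -I path below is a stand-in for the project's real include dirs.
sub preprocess_cmd {
    my ($src, $out) = @_;
    return "gcc -E -Iinclude $src -o $out";
}

# The master would run one of these per .c file, e.g.:
print preprocess_cmd("kernel/sched.c", "kernel/sched.i"), "\n";
```

Shipping the preprocessed `.i` files instead of raw `.c` files is exactly the trick that sidesteps the header-synchronization problem.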
Then it would take the preprocessed files and distribute them to the compute nodes, which would gladly compile them and return the object files to the master node for linking. That's it, in a very brief nutshell.
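The distribution step above could be sketched as a simple round-robin scheduler; the node names and file list here are hypothetical stand-ins, and the ssh/gcc invocation in the comment is one possible transport, not a prescription:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical cluster and file set, standing in for the real ones.
my @nodes   = map { "node$_" }     1 .. 26;
my @c_files = map { "src/f$_.c" }  1 .. 100;

# Round-robin assignment of preprocessed files to compute nodes.
# Returns a hashref: node name => arrayref of files for that node.
sub assign_jobs {
    my ($files, $nodes) = @_;
    my %jobs;
    for my $i (0 .. $#$files) {
        my $node = $nodes->[ $i % @$nodes ];
        push @{ $jobs{$node} }, $files->[$i];
    }
    return \%jobs;
}

my $jobs = assign_jobs(\@c_files, \@nodes);

# In the real script each batch would be shipped out and compiled
# remotely, e.g. "ssh $node gcc -c $file.i -o $file.o", with the .o
# copied back to the master for the final link.
for my $node (sort keys %$jobs) {
    printf "%s: %d files\n", $node, scalar @{ $jobs->{$node} };
}
```

Static round-robin is the simplest possible scheduler; a smarter version would hand each node a new file as soon as it finishes the last one, so a slow compile doesn't stall a whole batch.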
What do you, fellow monks, think about this idea? Good? Bad? Any CPAN modules you'd recommend to use for this task? Any wisdom to share at all, about the idea?
May perl be with you.