vxp has asked for the wisdom of the Perl Monks concerning the following question:
Say you have a large project to compile (example: the Linux kernel). It's basically a collection of small(ish) .c files. The master node would run the preprocessor locally, so there would be no need to have identical header files across the compute nodes (never mind the fact that my compute nodes _are_ identical; why not do this for the people who don't have 26 identical machines at their disposal?).
It would then take the preprocessed .c files and distribute them over to the compute nodes, which would gladly compile them and send the object files back to the master node for linking. That is it, in a very brief nutshell; a rough sketch follows.
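Roughly, the master's distribution loop could look something like this. This is only a sketch using Parallel::ForkManager from CPAN: it assumes passwordless ssh/scp to every node, the same gcc on every node, and that the sources were already preprocessed locally with `gcc -E` into .i files. The host names and /tmp paths are made up for illustration.

```perl
#!/usr/bin/perl
# Sketch of the master node: fan preprocessed units out to the
# compute nodes, compile remotely, pull the object files back.
use strict;
use warnings;
use Parallel::ForkManager;
use File::Basename qw(basename);

my @nodes   = map { "node$_" } 1 .. 26;   # hypothetical host names
my @sources = glob('*.i');                # preprocessed locally with gcc -E
my $pm      = Parallel::ForkManager->new(scalar @nodes);

for my $i (0 .. $#sources) {
    my $src  = $sources[$i];
    my $node = $nodes[$i % @nodes];       # naive round-robin assignment
    $pm->start and next;                  # fork; parent queues the next file

    (my $obj = basename($src)) =~ s/\.i$/.o/;
    system('scp', $src, "$node:/tmp/$src") == 0
        and system('ssh', $node, "gcc -c /tmp/$src -o /tmp/$obj") == 0
        and system('scp', "$node:/tmp/$obj", $obj) == 0
        or warn "compiling $src on $node failed\n";

    $pm->finish;
}
$pm->wait_all_children;

# Link locally once every object file is back, e.g.:
# system('gcc', '-o', 'myprog', glob('*.o'));
```

Forking one scp/ssh pair per file is the simplest thing that could work; a persistent worker process on each node would avoid the per-file connection overhead, at the cost of writing an actual protocol.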
What do you, fellow monks, think about this idea? Good? Bad? Any CPAN modules you'd recommend for the task? Any wisdom to share about the idea at all?
May perl be with you.
Replies are listed 'Best First'.

Re: distributed compiler
  by valdez (Monsignor) on Jul 23, 2004 at 15:41 UTC
  by elusion (Curate) on Jul 23, 2004 at 16:00 UTC

Re: distributed compiler
  by pbeckingham (Parson) on Jul 23, 2004 at 15:46 UTC
  by vxp (Pilgrim) on Jul 23, 2004 at 15:49 UTC
  by pbeckingham (Parson) on Jul 23, 2004 at 16:18 UTC