You can place all of your modules in the same file:
package SomePackage;
my $var = 1;
sub some_sub {
    ...
}

package main;
print "Main stuff here.\n";
&SomePackage::some_sub;
By including it in your source, you don't have to use it, since it's already there and all of your namespaces are set up appropriately. The only caveat: in the example above, $var is visible to your main code (in the 'main' package) as well. Lexical scoping is done by file, not by package, so things that one package assumes are private will not be if you do this. A package statement does not begin or end a block. As a result, variables in your program may clash with variables in another module.
If you're really stuck and need to include a module but can't get anyone to install it for you, perlfaq suggests keeping your own directory.
In cases where I've needed a module that was available to me under Linux but not yet packaged for Windows, I've used the 'use lib' pragma and kept the module in a directory under my program's parent directory.
I just bundled everything together and installed it in one directory on the customer's machine.
This only tends to work if the modules are pure Perl. XS and other compiled modules aren't so nice.
You could try playing with:

perl -MO=Deparse,[other options] yourscript.pl

With some luck you could even get something working...
There has already been one note that this is a 'bad idea'; check here: RE: modules out of the box.
This kind of approach becomes a maintenance nightmare, and almost defeats the purpose of modules, IMHO.
No doubt there are a lot of bad implications to this. Still, there are cases where it is useful: for example, when you cannot assume that all the modules exist on all machines.

For cases like this, I'd like to have a program (preferably a Perl script :) that reads a program and somehow inserts all the modules into it. This would result in a "static version" of the script that can be used in other environments.

Anyway, thanks for the link.
MakeMaker?
I have not delved deeply into it, though it or some other
packaging utility sounds like it could be useful.
I'd certainly like to know if there is a pure Perl
solution to installing a package of many files, one that
manages the process without depending on any Unix programs at all.
Thinking about it some more, I looked at these:
ExtUtils::Install - install files from here to there
ExtUtils::Command - utilities to replace common UNIX commands in Makefiles etc.
ExtUtils::MakeMaker - create an extension Makefile
ExtUtils::Installed - Inventory management of installed modules
ExtUtils::Manifest - utilities to write and check a MANIFEST file
ExtUtils::Packlist - manage .packlist files
The last one seems to use a .packlist file instead of a
MANIFEST file, but it easily lets you check that every file in the
.packlist exists. I suppose you could just do a file test
if it is simple. Personally, though, if there were an easy-to-use
module (I think MakeMaker is probably it; anybody?) that
would scan a bunch of directories on my local machine,
skip files with certain extensions, pack everything up into an
archive, and reinstall it on a target machine while letting
you do things like globally replacing IP numbers and
pathnames in Perl code, and hyperlinks in HTML files,
it would be pretty useful. I'd rather not get hip-deep into it,
but if anybody has experience with the above, maketool,
etc., I would also like to hear about it.
It is a recurring problem, and ultimately an interactive or
simply configured, platform-transparent utility would be super.
If it also compiled packages into a local folder for me, that
would be swell.
..Oh, maybe CPAN.pm could do this too.. erm. Last time I
worried about this, I went through the CPAN and MakeMaker
docs, and finally ended up mashing something together for the
moment. Still a need.