Yes, this can be done. The trick is to build the directory
structure that Perl expects, but to build it under a directory
that you have ftp access to. One handy way to
do this is to first do the install somewhere
on your local machine
and then use ftp to copy the whole tree under your
main directory on the remote machine.
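The local-install step might look roughly like this (paths and the staging directory are made up, and your module may ship a Build.PL instead of a Makefile.PL; also note that plain command-line ftp has no recursive put, so you need a client that does, such as ncftpput -R, or you upload the tree directory by directory):

```
# on your local machine: install into a private staging directory
perl Makefile.PL PREFIX=$HOME/perl-stage
make
make test
make install

# then recursively upload the staged tree to the remote machine,
# e.g. (hypothetical paths):
#   ncftpput -R remote.host /home/mydir/modules $HOME/perl-stage/*
```

The exact sub-directories MakeMaker creates under that PREFIX vary by Perl version and platform, so check what landed in the staging directory before you upload.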
Then all you need to do is
tell Perl where to look for your custom-installed modules.
You can mess directly with @INC, which
is an array of the places Perl looks when asked to
'do', 'require', or 'use' a file, but it is
better to use the 'use lib' pragma:
use lib "/home/mydir/modules"; # or whatever
Put this on a line before any 'use'
statements that call on your newly installed modules.
Perl will then include your custom directory
(and any appropriate
sub-directories) in the list of places it looks
to find modules.
You may find suggestions to simply extract the
needed .pm file locally, drop it into the same directory as
the Perl script that needs it, and then
say,
use lib ".";
...and this will work in many cases.
But do not do that. It works for simple,
single-module needs, but it quickly gets messy with
module hierarchies (the ones that have "::" in the name),
and you are
much better off setting up the directories right and
proper from the beginning.
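For reference, each "::" in a module name corresponds to a sub-directory under your lib root. A sketch with made-up module names:

```shell
# a lib root standing in for /home/mydir/modules
root=$(mktemp -d)
mkdir -p "$root/My/Widget"
touch "$root/My/Widget.pm"        # loaded by:  use My::Widget;
touch "$root/My/Widget/Utils.pm"  # loaded by:  use My::Widget::Utils;
find "$root" -name '*.pm' | sort
```

So My::Widget::Utils lives at My/Widget/Utils.pm relative to whatever directory you hand to use lib, which is exactly the layout a proper local install produces for you.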
Note that in some circumstances, building the module
on your machine and copying it to a remote one (particularly
if they run different operating systems) will not work,
because some modules have
machine-dependent parts (compiled C/XS code, for
example). The docs for some modules
mention that they are 'pure Perl', which is a good sign
for your purposes.
Here are some links to help with the task of
setting up modules without shell access:
Install Perl Modules Using FTP Without Having Shell Access?
(be sure to read the whole thread
and associated links).
rBuild hack
I found these and other links by typing
"install ftp" into the search box in the upper left of this
page.