Although hosting companies won’t let you install Perl modules into the system directories, you can simply create your own CPAN library directory, install your local CPAN modules into it, and then employ the use lib statement to put that directory at the front of your module search path.
See: perldoc FindBin.
There are copious prior threads on “how to install Perl as a non-root user,” etc., which will show you how to configure CPAN locally so that, when you install and update libraries, they go into your local directory.
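In practice, the local install usually boils down to telling CPAN to install under your home directory, then pointing each script at that tree with use lib. A minimal sketch, assuming modules were installed with INSTALL_BASE=~/perl5 (the ~/perl5 location is an assumption; use whatever directory you configured):

```perl
#!/usr/bin/perl
# Sketch only: assumes you have already told CPAN to install under
# your home directory, e.g. in the cpan shell:
#   o conf makepl_arg INSTALL_BASE=~/perl5
# That puts modules into ~/perl5/lib/perl5, which we add here.
use strict;
use warnings;
use lib "$ENV{HOME}/perl5/lib/perl5";   # search the private tree first

print "First \@INC entry: $INC[0]\n";
```

After this, a "use Some::Module" picks up the copy in your private tree before any system-wide copy.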
When I first signed up with my hosting provider, I was very startled to discover that they were running a very old version of Perl. Fortunately, they upgraded their systems.
I would also echo the notion that... you probably do want to get your own virtual-machine, instead of using shared-hosting. The reason for this is, “memory space.” Hosting companies generally set very small limits on how much memory a particular CGI-request can allocate. Not so bad for PHP (which puts most of its goodies into its code-segment, compiled into its executable), but it can be a problem for Perl. (I made this discovery, shall we say, “at a most inconvenient time.”)
When you have your own virtual-machine, you can do anything you need to do with it. Both the expense and the actual workload are usually similar to those of shared-hosting, but you get a lot more bang for your buck. Furthermore, it is much more difficult for the other subscribers to nose around in your stuff ... which, as it happens, they tend to do frequently. :-O
Incidentally, I employ the use lib trick on all of my sites, giving them their own CPAN-module space separate from “the system’s” space ... even on a virtual machine. I do the same thing on the development machines. The reason for this is: isolation. The Perl modules (and other accoutrements) associated with website-X are all stored in a directory-tree that is local to website-X. Thus, when it is time to update website-Z, I’m confident that I can do so without breaking website-X. “Global” changes, likewise, are much less likely to unexpectedly “bring all of my sites down.” (When several different sites really do use exactly the same library base, I use rsync (locally...) to equalize what the (still separated!!) directories contain.) You do have to be mindful of which “system-wide” packages might be being used, so that you do not inadvertently put new wine into old wineskins...
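One way to sketch that per-site isolation is with FindBin (see perldoc FindBin, mentioned earlier), so each script finds its own site's tree no matter what the webserver's working directory is. The /www/site-x layout here is a hypothetical example, not a prescription:

```perl
#!/usr/bin/perl
# Hypothetical layout (an assumption for illustration only):
#   /www/site-x/cgi-bin/page.cgi   <- this script
#   /www/site-x/lib/               <- site-X's private CPAN modules
use strict;
use warnings;
use FindBin qw($Bin);            # absolute directory of this script
use lib "$Bin/../lib";           # site-local modules shadow system ones

print "Site-local lib dir: $INC[0]\n";
```

Because the path is computed relative to the script itself, copying the whole site-X tree to a development machine (or rsync-ing it to a sibling site's directory) needs no edits to the scripts.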
I'd recommend a "Virtual Private Server" (VPS). This gives you (e.g.) root access to a Linux/Apache-based server that runs under a Xen share on a larger machine. You ssh in and do whatever you want. They run around $20-25 a month for a small slice, meaning you get something like 8 GB of storage and 128 MB of RAM, sufficient to run enough Apache threads to deal with mild traffic. The bandwidth will deliver to a few high-speed broadband connections simultaneously.

The storage is a complete, normal Linux filesystem that must include the OS (the experience is identical to logging in to a genuine dedicated server); for a mod_perl server this is under 2 GB. The provider will install a minimal base distro (Ubuntu, Debian, etc. -- there may be a choice) with an ssh server running; then you log in and use apt or yum (or whatever you want) to install Apache, mod_perl, ftp, and so on.

If you pay for an "unmanaged" VPS, do not expect them to help you with any configuration, so you should be comfortable and competent with Linux/Apache. Again, this is identical to working remotely with root access on a dedicated server: short of messing with the hardware, you can do everything you could do if the box were sitting in front of you. Since this is a 24/7 server, you get a dedicated permanent IP address.
I will not recommend anyone in particular but I will recommend against "VPS village/Grok.net" as I have had a very unpleasant experience with them in the past involving hardware failure and extreme negligence.
Although I'm not the Anon Monk who posted the question, thanks so much for the detailed reply, halfcountplus. Very helpful.
I'm guessing that since a VPS looks just like any other server, you get to choose your own fully-qualified hostname, and can set up TLS certs and everything as you wish.
VPSes really seem to be a good middle ground between shared hosting and running your own co-located hardware, though I have no idea how difficult they are for the provider to manage (I haven't used Xen before).
There are approximately eleventy billion web hosting companies out there, so it's hard to make a specific recommendation without knowing a bit more - for example, where you are, where you want your data to live (I would not, for example, want my server to be in China, because I wouldn't be able to reliably send email from it), and how much you can afford to spend.
There are also approximately eleventy million VPS companies out there, but again we can't make a good recommendation without knowing an awful lot more.
However, working on the assumption that you're like me (and I know that *I* am) then I recommend Bytemark.co.uk and Hetzner.de. I use Hetzner for CPANdeps and the cpXXXan, both of which involve a lot of unusual modules. The cpXXXan in particular involves running the sort of code that would make the sysadmins of a shared host have kittens.
I've been working with IntoVPS. I've never had any problems with the service, and it is well above average for the goodies on offer.
If you opt for a shared hosting solution: I use Dreamhost, and I can install any Perl modules into my home directory, even compiling C and XS code.
I recommend Pair Networks - click the ad banner at the top of this page.
I've used them for several years (the $29.95/month "Web Master" plan) to host an ecommerce site, and have never had any problems.
I use cpan to install modules to a local ~/libs directory, and all my cgi scripts point to that with the "use lib" statement.