Self Extracting Archives with Perl (almost)

by deprecated (Priest)
on Jan 21, 2001 at 07:00 UTC ( id://53280, CUFP )

I run a small network of servers running the OpenNap server. We've been having some issues lately with script kiddies attacking the servers (we have 7 nodes at present), and it has become difficult to maintain homogeneity among the servers. So I decided I needed a good way to get files from the hub server (mine) to the nodes on the network in a uniform fashion.

Since I can't assume that all the other server owners will be able to use rsync or cvs or bitkeeper or any of the other source-distribution systems, I began to wonder what the best way to get them these files was. Of course, I thought, use perl! My first thought was to make a tarball, uuencode it, and then send it out through a CGI program. But that leaves me with the problem that somebody may not be able to use uuencode/uudecode or tar very aptly. These things happen, unfortunately.

Since the tarball is uuencoded, though, I can actually stick it IN a perl program and use a heredoc to stuff it into a scalar, which can then be decoded and everything. So I have created a script that creates a new script with the encoded information in it. It also adds the relevant codelets to the new script so that it should be able to decode and untar its files. So I'll give you the code here, and then continue on below it:
#!/usr/bin/perl -w
use strict;
use IO::Scalar;
use Convert::UU qw(uuencode uudecode);
use Archive::Tar;

sub slurp {
    local $/;
    my $fh = shift;
    open FH, $fh;
    $fh = <FH>;
    close FH;
    return $fh;
}

my $textfile         = "text";
my $binfile          = "binary";
my $generated_code   = "archive.pl";
my $tarball_name     = "tb.tar";
my $uud_tarball_name = "new.tb.tar";

my $tarball = Archive::Tar->new($tarball_name);
$tarball->add_files($textfile, $binfile);
$tarball->write($tarball_name);

my $uu_data  = slurp($tarball_name);
my $uue_data = uuencode($uu_data);

#####
## here we are dealing with the generated file.
#
open CODE, ">$generated_code" or die "$!";
print CODE << 'CODE_LABEL';
#!/usr/bin/perl -w
# automatically created perl code including UUencoded tarball.
use strict;
use IO::Scalar;
use Convert::UU qw(uuencode uudecode);
use Archive::Tar;

my $encoded_tarball = << 'TARBALL_LABEL';
CODE_LABEL
close CODE or die "not proceeding because we couldnt close CODE\n$!";

#####
## here we have finished writing the initial code segments to
## the generated file.

{   # because we are localizing $/ we dont want this to escape
    local $/;
    open CODE, ">>$generated_code"
        or die "unable to append to $generated_code : $!";
    print CODE $uue_data;
    close CODE or die "error closing CODE\n$!";
}   # we're done being foolish

#####
## we are now re-opening our generated code to append more code
## to it so it can de-uu and de-tar its file.

open CODE, ">>$generated_code" or die "unable to re-open CODE : $!";
print CODE << 'CODE_LABEL';
TARBALL_LABEL

# begin decoding sequence...
my $decoded_tarball = uudecode($encoded_tarball);
tie *BALL, 'IO::Scalar', \$decoded_tarball;

my $uud_tarball_name = "tarball.decode.tar";
my $new_tarball      = Archive::Tar->new($uud_tarball_name);
$new_tarball->extract(\*BALL)
    or warn "error extract()ing : " . $new_tarball->error();
CODE_LABEL

__END__
Documentation for used modules:
IO::Scalar
Archive::Tar
Convert::UU

So. This actually does almost everything I want it to. It does choke, however, and doesn't extract the tarball. I think what I'm running into is a lack of understanding of the way Archive::Tar works. I'm actually pretty stunned that the resulting code goes through perl -c without issues. The problem I'm having is that Archive::Tar isn't reporting any errors, and it's not actually extracting the files. One thing of note here is that Archive::Tar lists an extract_files method in its documentation, but the extract method I found from looking through the source is much better. The files named at the top are the first 10 lines from /usr/dict/words, and a copy of `which cat`. Relatively small files. Where do I go from here?
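For what it's worth, one way around an extract() that silently does nothing is to avoid handing Archive::Tar a tied in-memory handle at all: write the decoded data to a temporary file and let the module parse the archive from disk before extracting. The sketch below is my own guess at a workaround, not the author's code; the helper name and the File::Temp detour are assumptions.

```perl
#!/usr/bin/perl -w
# Hedged sketch: dump the already-uudecoded tar data to a temp file,
# then let Archive::Tar read and extract it from there.
use strict;
use Archive::Tar;
use File::Temp qw(tempfile);

sub extract_decoded_tarball {
    my $decoded = shift;    # raw (already uudecoded) tar data

    # Write the decoded bytes out so Archive::Tar can seek in them.
    my ($fh, $tmpname) = tempfile(UNLINK => 1);
    binmode $fh;
    print $fh $decoded;
    close $fh or die "close: $!";

    # read() parses the archive; extract() writes its members to disk.
    my $tar = Archive::Tar->new;
    $tar->read($tmpname) or die "read failed: "    . Archive::Tar->error;
    $tar->extract        or die "extract failed: " . Archive::Tar->error;
    return $tar->list_files;    # names of the extracted members
}
```

The round trip through the filesystem is inelegant, but it sidesteps any disagreement between the module and a tied filehandle.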

I have posted this in CUFP because, gee, this is one damn cool use for perl IMHO. This should even be portable to MacOS and Windows. One final plan I have for this script is to include some weak encryption so that I don't have to worry about script kiddies downloading our users database and hacking us. But given the large number of encryption modules available for perl, that part shouldn't be very tough.

thanks, fellow monks.
dep.

Update:

I have stopped using Archive::Tar and Convert::UU. Both modules are fine examples of how to do very specific things; they just don't play very well together. Not using either of them means I don't really need IO::Scalar either (although that has to be one of the coolest modules I have ever played with). Our own japhy suggested pack, which does uuencoding all by itself and doesn't require an extra module.
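A minimal sketch of that pack-based approach, assuming perl's built-in 'u' template (the helper names and the 45-byte chunking are my own; 45 bytes is the classic uuencode line length):

```perl
#!/usr/bin/perl -w
# Hedged sketch: uuencode/uudecode with nothing but pack and unpack,
# so Convert::UU is no longer needed.
use strict;

sub uu_encode {
    my $data = shift;
    my $uu   = '';
    # pack 'u' turns each chunk into one uuencoded line (with newline).
    while (length $data) {
        $uu .= pack 'u', substr($data, 0, 45, '');
    }
    return $uu;
}

sub uu_decode {
    my $uu   = shift;
    my $data = '';
    # unpack 'u' decodes one uuencoded line at a time.
    $data .= unpack 'u', $_ for grep { length } split /\n/, $uu;
    return $data;
}
```

Note that this handles the raw data lines only; the traditional "begin"/"end" framing that uuencode(1) emits would have to be added (and skipped on decode) separately if compatibility with the command-line tools matters.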

An anonymous monk suggested using shar, which I'm embarrassed to admit I knew nothing about. It's a good idea, except that it's not as portable as I'd like. Also, well, at this point I'm pretty attached to my idea and would like to keep this 100% perl. :)

I will post the full code when I'm done with it. I'm moving along.

--
i am not cool enough to have a signature.

Re: Self Extracting Archives with Perl (almost)
by strredwolf (Chaplain) on Jan 21, 2001 at 11:35 UTC
    There are a few problems I see that you haven't accounted for. I'll list a few of them:

    First, you really can't assume they have all the modules you need. In fact, you can't really assume they even have the latest version of Perl! Whoops! Bugger, eh?

    But instead of thinking "I need modules", think "What would the user have?" Definitely tar, yes. That's a standard UNIX program (unless they're using MS-something or MacOS). But not gzip in older (read: ancient) UNIXen. Let's assume gzip anyway.

    If you check unpack, you'll find that it can do uudecoding of a line by itself, and has since version 4.036. Very interesting, eh?

    So all you need to do is take a line, decode it, and feed it to an open pipe to "gzip -d -c | tar xfv -". Not only does that remove a ton of complexity, it eases things up a bit for the end user (who may be a clueless moron for all we can tell).
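    That pipeline might be sketched like this (my own guess at the details: the helper name and the command-line handling are assumptions, and it presumes gzip and tar are on the PATH):

```perl
#!/usr/bin/perl -w
# Hedged sketch: decode uuencoded lines with unpack and stream the raw
# bytes straight into the end user's own gzip and tar.
use strict;

sub stream_uu_lines {
    my ($out, @lines) = @_;
    for my $line (@lines) {
        chomp $line;
        # each line decodes independently with the 'u' template
        print {$out} unpack 'u', $line if length $line;
    }
}

if (@ARGV) {    # e.g. perl unpack.pl archive.uu
    open my $in, '<', $ARGV[0] or die "$ARGV[0]: $!";
    open my $pipe, '| gzip -d -c | tar xfv -' or die "pipe: $!";
    binmode $pipe;
    stream_uu_lines($pipe, <$in>);
    close $pipe or die "pipe failed: $!";
}
```

    Nothing beyond perl itself and the two standard tools is required on the receiving end.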

    --
    $Stalag99{"URL"}="http://stalag99.keenspace.com";

      Let me say 'me too!'

      More specifically:

    • What are you doing to ensure that the Kiddies are not going to compromise your comms link?
      Using an email saying "execute this attachment" is just asking for trouble, IMO.
    • What is your common code base?
      Personally, I'd insist on a real compression engine and a decent encryption program (eg. pgp).
    • What is available out of the box?

      While this is a Really Cool Piece O'Code (I'm impressed), this is not a new problem. And various specialised (hopefully debugged) and compiled (hopefully faster and self-contained) programs such as PGP and PKZIP exist to solve exactly this sort of problem.

      Perl (as a script) is a neat way of ensuring that the correct commands are run.

      The question you should be asking (as the admin) is: what is the life time cost of each of the possible solutions?
      Put it another way: 'How much time can I spend supporting my scripts?'

      It's an interesting Perl exercise, but is it really a nail?

      Finally, "never underestimate the bandwidth of a stationwagon full of magnetic media."
      A self extracting CD deals with a lot of Bozo user problems, and alleviates most of the security issues.


      Butlerian Jihad now!
Re: Self Extracting Archives with Perl (almost)
by Anonymous Monk on Jan 23, 2001 at 22:45 UTC

    Not to be a heretic, but you could step outside of Perl and just use the "shar" program to create a shell archive. This utility already does near-exactly what you're trying to do, and is (somewhat) common.

    Here's the HP/UX "shar:"

    usage: shar [-AabCcDdefhmorstuvZ] <file|dir> ...
        -A: suppress warning messages for optional acl entries
        -a: assume files are shippable, don't uuencode
        -b: use basenames instead of full pathnames
        -c: use wc(1) to check integrity of transfer
        -C: include a "cut here" line
        -d: don't recurse on contents of directories
        -D <dir>: must be in <dir> to unpack
        -e: don't overwrite existing files
        -f <file>: file containing list of directories and files,
                   or - to read filenames from standard input
        -h: follow symbolic links instead of archiving them
        -m: retain modification/access times on files
        -o: retain user/group owner information
        -r: must be "root" to unpack
        -s: use sum(1) to check integrity of transfer
        -t: verbose messages to /dev/tty instead of stderr
        -u: assume remote site has uudecode(1) for unpacking
        -v: verbose mode
        -Z: shrink files using compress(1)

    The GNU version is substantially similar.

    If your "targets" don't have access to a POSIX-like shell, though, they'd need one. (i.e. are they stuck on Windows?) The Bourne Again Shell (bash) from Cygwin comes to mind...

    For example:

    (hpux) $ shar myfile.pl > myfile.shar
    (nt) $ ftp (hpux)
    Connected to (hpux)
    220 (hpux) FTP server (Version 1.1.214.4 Mon Feb 15 08:48:46 GMT 1999) ready.
    Name ((hpux):(user)):
    331 Password required for (user).
    Password:
    230 User (user) logged in.
    Remote system type is UNIX.
    Using binary mode to transfer files.
    ftp> ascii
    200 Type set to A.
    ftp> get sample.shar
    200 PORT command successful.
    150 Opening ASCII mode data connection for sample.shar (138630 bytes).
    226 Transfer complete.
    140877 bytes received in 0.14 seconds (1006264 bytes/s)
    ftp> quit
    221 Goodbye.
    administrator@(nt) ~
    $ head -3 sample.shar
    # This is a shell archive.  Remove anything before this line,
    # then unpack it by saving it in a file and typing "sh file".
    administrator@(nt) ~
    $ sh sample.shar
    Compiling unpacker for non-ascii files
    x - myfile.pl

    I've assumed a binary file here (ergo the line about non-ascii), but transferred it in ASCII mode, just to make sure that the translation between a 64-bit UNIX host and a 32-bit Win32 host would work OK even over an "unclean" link, like one contained in an email message. Something like this should work for you, then:

    #!/usr/bin/perl -w
    my $recipients = 'admin@site1 admin@site2';
    open MAIL, "|mailx -s 'updated files' $recipients";
    print MAIL <<HEADER_END;
    Here are today's updated files.

    Save the rest of this message to a file (for example, "updated.shar")
    and then run it in a shell (for example, "sh updated.shar") to extract.
    HEADER_END
    print MAIL `shar @ARGV`;
    print MAIL `cat ~/.signature`;
    close MAIL;

      (same anonymous, firewalled silly author here...)

      ...all of which is not to say that your use isn't exceedingly kewl! I'd love to see your final version posted up here...

Re: Self Extracting Archives with Perl (almost)
by Anonymous Monk on Jan 26, 2001 at 01:05 UTC
    I like the idea of a Perl archive (even if shar already exists :). One thing I'd do just for the sake of aesthetics is to spit out the UUencoded text after a __DATA__ token in the extractor. That is, let all the extraction machinery sit at the top as a human-readable script and read from its own DATA handle...
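    That layout might look something like the sketch below: a generator whose output script keeps all the machinery readable at the top and parks the payload after __DATA__. This is my own illustrative code, not CJW's; the file names and helper are assumptions.

```perl
#!/usr/bin/perl -w
# Hedged sketch: write a self-extractor whose uuencoded payload sits
# after a __DATA__ token, so the extraction logic stays human-readable.
use strict;

sub write_self_extractor {
    my ($outfile, $uuencoded) = @_;
    open my $out, '>', $outfile or die "$outfile: $!";
    print {$out} <<'EXTRACTOR';
#!/usr/bin/perl -w
use strict;
my $decoded = '';
while (my $line = <DATA>) {
    chomp $line;
    $decoded .= unpack 'u', $line if length $line;
}
open my $fh, '>', 'payload.tar' or die "payload.tar: $!";
binmode $fh;
print $fh $decoded;
close $fh;
# ...then untar payload.tar however you like
EXTRACTOR
    print {$out} "__DATA__\n", $uuencoded;
    close $out or die "close: $!";
}
```

    Running the generated script then recreates payload.tar next to it, and anyone who opens the file sees working code first and opaque encoded text last.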

    CJW
