in reply to Re: Copy folders
in thread Copy folders

#!/usr/bin/env perl
use strict;
use warnings;

dircopy("$ARGV[0]","$ARGV[1]");

sub dircopy {
    my @dirlist=("$_[0]");
    my @dircopy=("$_[1]");
    until (scalar(@dirlist)==0) {
        mkdir "$dircopy[0]";
        opendir my($dh),"$dirlist[0]";
        my @filelist=grep {!/^\.\.?$/} readdir $dh;
        for my $i (0..scalar(@filelist)-1) {
            if ( -f "$dirlist[0]/$filelist[$i]" ) {
                fcopy("$dirlist[0]/$filelist[$i]","$dircopy[0]/$filelist[$i]");
            }
            if ( -d "$dirlist[0]/$filelist[$i]" ) {
                push @dirlist,"$dirlist[0]/$filelist[$i]";
                push @dircopy,"$dircopy[0]/$filelist[$i]";
            }
        }
        closedir $dh;
        shift @dirlist;shift @dircopy;
    }
}

sub fcopy {
    my ($i,$data,$cpo,$cpn);
    open($cpo,"<","$_[0]") or die $!;
    binmode($cpo);
    open($cpn,">","$_[1]") or die $!;
    binmode($cpn);
    while (($i=sysread $cpo,$data,4096)!=0){print $cpn $data};
    close($cpn);close($cpo);
}

#!/usr/bin/env perl
use strict;
use warnings;

dir_del("$ARGV[0]");

sub dir_del {
    my @dirlist=("$_[0]");
    until (scalar(@dirlist)==0) {
        opendir my($dh),"$dirlist[0]";
        my @filelist=grep {!/^\.\.?$/} readdir $dh;
        for my $i (0..scalar(@filelist)-1) {
            if ( -f "$dirlist[0]/$filelist[$i]" ) {
                unlink("$dirlist[0]/$filelist[$i]");
            }
            if ( -d "$dirlist[0]/$filelist[$i]" ) {
                push @dirlist,"$dirlist[0]/$filelist[$i]";
            }
        }
        closedir $dh;rmdir "$dirlist[0]";shift @dirlist;
    }
    rmdir "$_[0]";
}

sub fmove {
    my ($i,$data,$mvo,$mvn);
    open($mvo,"<","$_[0]") or die $!;
    binmode($mvo);
    open($mvn,">","$_[1]") or die $!;
    binmode($mvn);
    while (($i=sysread $mvo,$data,4096)!=0){print $mvn $data};
    close($mvn);close($mvo);unlink("$_[0]");
}

Sorry, this is junk. Here is why:

Reinventing the wheel, poorly:
  • File::Copy implements copying of files. It has been in core (part of the perl distribution) since perl 5.002, about 20 years ago.
  • File::Path implements removing directory trees. It has been in core since perl 5.001.
  • File::Copy::Recursive has implemented copying of directory trees since at least 2008. It is not a core module, but it is pure perl: everybody can use it, even without a working C compiler, just by copying it to some directory where perl can find it. (A sketch using these modules follows this list.)
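
For comparison, here is a minimal sketch of the same two tree operations built on those modules. It assumes File::Copy::Recursive is installed; $src and $dst come from @ARGV just like in the original:

#!/usr/bin/env perl
use strict;
use warnings;
use File::Path qw(remove_tree);         # core (the older interface is rmtree)
use File::Copy::Recursive qw(dircopy);  # CPAN, pure perl

my ($src, $dst) = @ARGV;

# replaces all of sub dircopy and sub fcopy
dircopy($src, $dst) or die "Cannot copy $src to $dst: $!";

# replaces all of sub dir_del; errors are collected instead of ignored
remove_tree($src, { error => \my $errors });
die "Cannot remove $src" if $errors && @$errors;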
Misbehaving code:
  • Missing error checks on mkdir, opendir, closedir, readdir, sysread, print, close, unlink, rmdir. All of these functions can fail, and any program should check if they failed.
  • Missing symlink handling. -f and -d use stat, not lstat, so symlinks to files and symlinks to directories are treated like regular files and directories. This (a) breaks any symlink "copied" by your code and (b) makes your code run into an infinite loop as soon as a directory contains a symlink to a directory above the current directory.
  • Missing special file handling. Your code completely ignores device files, sockets, and FIFOs (named pipes).
  • Missing permission handling. Your code completely ignores file permissions, so all files and directories "copied" by your code will have default permissions. With the usual umask of 022, all executables will lose their execute permissions, and, to make things worse, the copies of all private files that had their group and world read permissions removed will be group and world readable.
  • No attempts to make sure files aren't group and world readable while copying.
  • Repeated (implicit) stat calls in the -f and -d tests. Better call lstat once, then use the special filehandle _ as the argument for -f, -d, and the other required tests (see the classification sketch after this list).
  • sub fmove does not even attempt to use rename to move the file, which usually works as long as source and destination reside on the same filesystem. Instead it stupidly copies the file content around, breaks permissions and timestamps, and ignores almost all possible errors (see the move sketch after this list).
  • The "recursive directory deleter" won't work reliable, as it attempts to rmdir directories at the end of the until loop before the next round of the loop deletes the subdirectories. Of course you could not see that because you did not check for errors from rmdir, and I guess you did not try to remove a whole directory tree.
  • To make things worse, the final rmdir "$_[0]"; hides this problem for the case of a directory with only one level of subdirectories.
  • And once again, the "recursive directory deleter" does not handle symlinks properly: Symlinks to directories elsewhere will be followed, so random directories outside the directory passed to sub dir_del will be deleted. The symlink itself will stay, because rmdir can't remove symlinks. Symlinks to special files are ignored, they won't be deleted. Symlinks to plain files are accidentally handled properly, they will be deleted due to the -f test.
  • Speaking of special files: everything that is not a plain file, a directory, or a symlink to a plain file won't be removed either, so the code can't clean out subdirectories that contain device files, sockets, or FIFOs. Symlinks to plain files are only handled "accidentally": -f treats them as files, and unlink can remove them.
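
To illustrate the lstat points (and the symlink trouble above), here is a sketch of classifying a directory entry with exactly one lstat per entry; sub classify and its labels are just an illustration, not a drop-in fix:

#!/usr/bin/env perl
use strict;
use warnings;

# One lstat per entry; the special filehandle _ reuses its result
# for all following file tests, and lstat (unlike stat) reports the
# symlink itself instead of its target.
sub classify {
    my ($path) = @_;
    lstat $path or die "Cannot lstat $path: $!";
    return 'symlink' if -l _;   # recreate with readlink/symlink, never follow
    return 'dir'     if -d _;   # a real subdirectory, safe to queue
    return 'file'    if -f _;   # a plain file, safe to copy
    return 'special';           # device file, socket, FIFO: handle explicitly
}

# A symlink to a directory is reported as 'symlink', not 'dir',
# so a tree walk based on this will not run into a loop.
print classify($ARGV[0]), "\n";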
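
And for sub fmove: File::Copy's move already does the rename-first dance, so a sketch needs little more than this ($from and $to are placeholders taken from @ARGV):

#!/usr/bin/env perl
use strict;
use warnings;
use File::Copy qw(move);

my ($from, $to) = @ARGV;

# move tries rename first: atomic, cheap, and it keeps permissions
# and timestamps. Only across filesystems does it fall back to a
# copy followed by unlink.
move($from, $to) or die "Cannot move $from to $to: $!";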
Ugly code:
  • Inconsistent indenting. If your editor is too stupid and you are too lazy for proper indenting, use an indenting program such as perltidy to make your code readable.
  • More than one command per line. Why?
  • Useless quoted variables almost everywhere: dircopy("$ARGV[0]","$ARGV[1]");, my @dirlist=("$_[0]");, my @dircopy=("$_[1]");, mkdir "$dircopy[0]";, ... Just omit the quotes.
  • Two parallel arrays in sub dircopy (@dirlist and @dircopy) where a single array of source/destination pairs would be sufficient.
  • Iterating over array indices instead of array elements in sub dircopy and sub dir_del: for my $i (0..scalar(@filelist)-1) is better written as for my $name (@filelist), using $name instead of $filelist[$i] inside the loop.
  • Reading the entire directory listing into RAM may be problematic with huge directories and low memory conditions. while (defined(my $fn=readdir $dh)) reads one filename at a time; the defined() matters, because a file named "0" must not end the loop (see the sketch after this list).
  • Missing filename in the error messages of open: die $! tells you what went wrong, but not for which file.
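
A sketch combining the last two points; $dir is a placeholder directory taken from @ARGV:

#!/usr/bin/env perl
use strict;
use warnings;

my $dir = shift @ARGV;

# Stream the directory instead of slurping it into a list.
opendir my $dh, $dir or die "Cannot opendir $dir: $!";
while (defined(my $name = readdir $dh)) {
    next if $name eq '.' or $name eq '..';
    my $path = "$dir/$name";
    next unless -f $path;   # plain files only, just for this sketch

    # The error message names the file, not just the reason.
    open my $in, '<', $path or die "Cannot open $path: $!";
    # ... process $path here ...
    close $in or die "Cannot close $path: $!";
}
closedir $dh or die "Cannot closedir $dir: $!";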

Alexander

--
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
