in reply to Re: request criticism web page with images in dir (File::Find File::Spec)
in thread request criticism web page with images in dir (File::Find File::Spec)

Thanks for your suggestions.

I switched to file globbing. It seems simpler, and glob() is built into Perl 5.6.0 and later.

I eliminated one of the map lines, as the glob now returns the full path. I think this sped things up a bit.
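For reference, a minimal sketch of the glob-based approach (the scratch directory and file names here are made up for the demonstration):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Build a scratch directory so the glob result is deterministic
my $dir = tempdir(CLEANUP => 1);
for my $name (qw(a.jpg b.png c.txt)) {
    open my $fh, '>', "$dir/$name" or die "open: $!";
    close $fh;
}

# Brace alternation matches several extensions in one pattern;
# glob returns full paths, so no extra map is needed
my @pics = sort glob("$dir/*.{jpg,jpeg,png}");
print scalar(@pics), "\n";    # 2 (c.txt is excluded)
```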

I switched to mod-ing by three as you suggested. The % operator is probably 100 times more expensive than an addition, but eliminating it would save on the order of 1/100,000 of a second rather than 1/1,000, and the modulo is easier to read.
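For comparison, the increment-and-reset alternative to the modulo looks like this (a sketch with stand-in data, not code from the script below):

```perl
use strict;
use warnings;

my @pics = ('a' .. 'g');    # stand-ins for seven thumbnail names
my ($col, $rows) = (0, 0);
for my $pic (@pics) {
    # emit one <TD> cell here ...
    if (++$col == 3) {      # increment-and-compare instead of $count % 3
        $rows++;            # row boundary: close </TR>, open a new <TR>
        $col = 0;
    }
}
print "$rows full rows\n";  # 7 cells -> 2 full rows, 1 cell left over
```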

I left the error checking in; a few times it was handy to confirm that stat() actually succeeded.

I left the sort in; I do want the most recent images first. I also suspect that the $a and $b variables supplied by sort are quicker than $_[0] and $_[1].
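If the repeated stat() calls in the comparator ever became a bottleneck, a Schwartzian transform would stat each file exactly once instead of twice per comparison. A sketch under made-up file names and times, not a drop-in replacement for the script:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Create two files with known modification times (utime sets atime, mtime)
my $dir = tempdir(CLEANUP => 1);
my $now = time;
for ([ 'old.jpg', $now - 100 ], [ 'new.jpg', $now ]) {
    my ($name, $mtime) = @$_;
    open my $fh, '>', "$dir/$name" or die "open: $!";
    close $fh;
    utime $mtime, $mtime, "$dir/$name" or die "utime: $!";
}

# Schwartzian transform: stat each file once, then sort newest first
my @sorted =
    map  { $_->[0] }                  # unwrap the path
    sort { $b->[1] <=> $a->[1] }      # newest mtime first
    map  { [ $_, (stat $_)[9] ] }     # pair each path with its mtime
    glob("$dir/*.jpg");
print "$sorted[0]\n";                 # the newer file sorts first
```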

I left the printed comments and the print "content-type: text/html\n\n"; in, as the former help me figure out problems when looking at the HTML source and the latter :-> is needed for the HTML to display at all.

#!/usr/local/bin/perl
require 5.6.0;
use warnings;
use strict;
use File::Spec::Functions;
use File::Basename;

my $gHTMLPath  = '/go/pics/thumbs/';
my $gthumb_dir = catdir('..','go','pics','thumbs');

###################################
#                                 #
# no user serviceable parts below #
#                                 #
###################################
if ($gthumb_dir =~ m|/|) {
    $gthumb_dir .= '/';
} elsif ($gthumb_dir =~ m|\\|) {
    $gthumb_dir .= '\\';
} elsif ($gthumb_dir =~ m|:|) {
    $gthumb_dir .= ':';    # we're on a Mac ??? not tested
} else {
    die "on unknown OS or couldn't get path at start \n";
}

# FILE: listthumbs.pl  REVISION DATE: 09-13-2001
#
# CGI to list/display the jpeg & png files in a directory

#####################
#   Declarations    #
#####################
sub get_files($);    # get a list of image files
sub size_cmp();      # used in sort; compares two files' mtimes
sub main();

######################
#  Run our program   #
######################
main();

######################
#    Definitions     #
######################
sub size_cmp() {
    my @a_stat = stat($a) or die "couldn't stat\n $a \n$!\n";
    my @b_stat = stat($b) or die "couldn't stat \n $b\n$!\n";
    # [7] is size; sorting by size would put duplicates next to each other
    # [9] is modification time
    # sort in reverse order (newest first)
    return $b_stat[9] <=> $a_stat[9];
}

sub get_files($) {
    my @result = glob("$_[0]*.{jpg,jpeg,png}");
    if (0 == @result) {
        die "couldn't glob in get_files()";
    }
    return @result;
}

sub main() {
    my @pics = get_files($gthumb_dir);
    @pics = sort size_cmp @pics;
    for (my $i = 0; $i < @pics; $i++) {
        $pics[$i] = basename($pics[$i]);
    }

    print "content-type: text/html\n\n";
    print "<!-- start code by $0 -->\n";
    my $count = 0;
    print '<TABLE>';
    print '<TR>';
    foreach (@pics) {
        $count++;
        print "<TD width=200 align=left>\n";
        print '<IMG SRC=' . $gHTMLPath . $_ . '><BR>' . "\n";
        print;    # print the file name ($_) under the image
        print "</TD>\n";
        if (0 == $count % 3) {
            print "</TR>\n";
            print '<TR>';
        }
    }    # end loop
    print "\n</TR>" if 0 != ($count % 3);    # close a partial final row
    print '</TABLE>';
    print "there are $count thumbs<BR>\n";
    print "<!-- end code by $0 -->\n";
}


--mandog

Re: Re: request criticism web page with images in dir (File::Find File::Spec File::glob)
by broquaint (Abbot) on Sep 14, 2001 at 15:23 UTC
    Perhaps even -
    sub get_files {
        return sort { (stat($a))[9] <=> (stat($b))[9] }
               glob($_[0] . "/*.{jpg,jpeg,png}")
                   or die($_[0] . " contained no images\n");
    }
    And if you're worried about speed, then the modulo operator is probably the least of your worries, as pretty much all operations are going to be costly (but if you were really looking for speed, then you'd be using C ;o). This should eliminate a lot of unnecessary operations, i.e. saving an array into memory but only using one element, and having to enter the size_cmp function for every comparison.
    HTH

    broquaint

      I suspect that my code is I/O-bound rather than CPU-bound. Eliminating the unneeded map operation seemed to drop a wall-clock second off the runtime. Moving from a 330MHz PC with IDE drives to a 200MHz machine with SCSI drives dropped another wall-clock second.

      I think I could probably eke out at least a few precious milliseconds by implementing your suggestions, but if I really were concerned about speed I'd probably do more detailed benchmarking and look for the bottlenecks....

      Since my site gets maybe 20 visitors on a busy day, I'm moving to a database-backed approach, and I need to spend time on other things (sigh).....

      btw, I'm not sure C is a speed win, even purely in terms of execution time. Programmer skill is much more highly correlated with speed than language choice (see the paper by Lutz Prechelt).

      Thanks again for all your help.



      --mandog