arw01 has asked for the wisdom of the Perl Monks concerning the following question:

I am reading a dBase file sequentially with the XBase.pm module in order to pick out a few items I want placed into a table of available items and published on the web. The dBase file is Approach-based, with an .adx index that XBase cannot handle yet. With approximately 8,000 records, this bit of code takes 56 seconds to complete, running on the web server itself. As I say, the index is not available to speed up the reading by being selective.
# This excerpt assumes:  use POSIX qw(strftime);  use XBase;
# and that $outfilename and $dbasefiletoread are set earlier in the script.

my $now_string = strftime "%a %b %e %H:%M:%S %Y", localtime;
print "starting $now_string\n";

# ************* Open the files, read the entire database, write the HTML file
open OUTFILENAME, ">$outfilename"
    or die "Cannot open $outfilename for write :$!";   # open file or report why we cannot open the file

print OUTFILENAME "<HTML><HEAD><META HTTP-EQUIV = 'Pragma' CONTENT='no-cache'> <TITLE>UTS Available for Sale</title></HEAD><BODY><CENTER><H1>UTS Trucks Available for Sale</h1> generated $now_string</center><HR>\n";
print OUTFILENAME "<TABLE BORDER=0><TR><TD>Unit</td><td>Model</td><td>Make</td><td>Year</td>";
print OUTFILENAME "<td>Spec</td><td>Engine/HP</td><td>Trans</td><td>Rear Axle / Ratio / Susp.</td><TD>Miles</td><TD>WB</td>";
print OUTFILENAME "<TD>Color</td><TD>Vin</td><TD>Location</td><TD>Cust/Desc</td><td>Price</td></TR>\n";

my $table = XBase->new('/mnt/fdrive/UNITS.dbf') or die XBase->errstr;
print "File $dbasefiletoread ",$outfilename," opened successfully\n";

for (0 .. $table->last_record) {
    # print "inside loop\n";
    # print "$dropit\n";

    # get_record returns the 1/0 deleted flag as the first element,
    # followed by the field values for this record
    my @data = $table->get_record($_);

    # keep only non-deleted (!$data[0]) UTS/WPI records that have $data[22] set
    if ($data[22] && $data[91] eq "UTS" && $data[95] eq "WPI" && !$data[0]) {
        no warnings 'uninitialized';
        print OUTFILENAME "<TR><TD><a href=\"specs/",$data[55],".html\">$data[55]</a>","</TD><TD>",$data[1],"-",$data[26],"</TD><TD>",$data[2],"</TD><TD>",$data[3];
        print OUTFILENAME "</TD><TD>",$data[86],"</TD><TD>",$data[9]," / ",$data[61],"</TD><TD>",$data[21],"</TD><TD>";
        print OUTFILENAME $data[15]," / ",$data[17],"</TD><TD>",$data[81],"</TD><TD>",$data[16],"</TD><TD>";
        print OUTFILENAME $data[20],"</TD><TD>",$data[94],"</td><TD>",$data[64],"</td><TD>",$data[89],"</td>";
        print OUTFILENAME "<td>",$data[37],"</td></TR>\n";
    }
}

print OUTFILENAME "</TABLE></BODY>\n";
close OUTFILENAME;
print "File $outfilename written successfully\n";

$now_string = strftime "%a %b %e %H:%M:%S %Y", localtime;
print "finished $now_string\n";

Re: Need to speed up read of dbase file to html page
by Joost (Canon) on May 31, 2002 at 14:41 UTC
    I don't know anything about XBase.pm, but my question would be "why?". You are already writing to a file, so you need not run the job every time a request comes in. Anyway, here are some pointers:

    • If the data changes a lot but your HTML need not be absolutely up to date, you could try regenerating the HTML file every 10 minutes or so with a cron job.
    • I noticed that you have a LOT of columns in your data and you don't use all of them; maybe you could try retrieving only the columns (and rows) you need. You will probably need DBI and the DBD::XBase module (supplied with XBase.pm) - see the sketch after this list. It might be faster, but then again it might not - I've never tried any of them.
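
    To make that suggestion concrete, here is a minimal sketch of the DBI/DBD::XBase route. The column names (UNIT_NO, MAKE, MODEL, YEAR, PRICE, STATUS, BRANCH) are placeholders I made up - substitute the real field names from UNITS.dbf (XBase.pm's field_names method will list them) - and it assumes the query stays inside the small SQL subset DBD::XBase understands, including "?" bind parameters.

    use strict;
    use warnings;
    use DBI;

    # The data source is the directory holding the .dbf files; the table
    # name below is the file name without the .dbf extension.
    my $dbh = DBI->connect('dbi:XBase:/mnt/fdrive', undef, undef,
                           { RaiseError => 1 }) or die $DBI::errstr;

    # Fetch only the columns and rows we actually need.  UNIT_NO, MAKE,
    # MODEL, YEAR, PRICE, STATUS and BRANCH are hypothetical field names.
    my $sth = $dbh->prepare(
        'select UNIT_NO, MAKE, MODEL, YEAR, PRICE
         from UNITS
         where STATUS = ? and BRANCH = ?');
    $sth->execute('UTS', 'WPI');

    # Emit one table row per matching record.
    while (my ($unit, $make, $model, $year, $price) = $sth->fetchrow_array) {
        print "<TR><TD>$unit</TD><TD>$make</TD><TD>$model</TD>",
              "<TD>$year</TD><TD>$price</TD></TR>\n";
    }

    $sth->finish;
    $dbh->disconnect;

    Whether this beats the plain XBase.pm loop is something only a benchmark will tell: DBD::XBase still walks the file record by record, so any saving comes from not decoding the dozens of columns you never print. Either way, as the first point says, the generator only needs to run from cron every few minutes, not on every page hit.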
    -- Joost
    downtime n. The period during which a system is error-free and immune from user input.