Here's the new code that's making it run a lot faster now. The script reads a category descriptions file with 4000 lines and retrieves the description for each of the categories to be displayed on a page.
open(FILE, '<', $catdesc) or die "Can't open $catdesc: $!";
my @desc = <FILE>;
close(FILE);
chomp @desc;

my %category_descriptions = ();   # was "= {}", which stores a stray hashref, not an empty hash

## Create a hash with the category names to display.
foreach my $entry (@subdirectories) {
    my ($date_a, $directory_name) = split /\t/, $entry;
    if (defined $directory_name && $directory_name ne '') {
        $category_descriptions{"$FORM{'direct'}/$directory_name"} = 1;
    }
}

## Set the description for each category.
foreach my $line (@desc) {
    my ($catname, $catdescription) = split /\t/, $line, 2;
    next unless defined $catdescription;
    $catdescription =~ s/^\s+//;        # trim leading blanks
    $catdescription =~ s/\s+$//;        # trim trailing blanks
    next if !$catdescription;           # skip line if no description
    if (exists $category_descriptions{$catname}
        and $category_descriptions{$catname} eq '1') {   # first match wins, no numeric-compare warning
        $category_descriptions{$catname} = "<br><$font>$catdescription</font><br>";
    }
}
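One variation I've been considering is streaming the file line by line instead of slurping all 4000 lines into memory, and bailing out as soon as every displayed category has its description. Just a rough sketch, assuming the same tab-separated catname/description format:

## Streaming variant: stop as soon as every displayed category is filled in.
my %wanted = %category_descriptions;   # categories still missing a description

open(FILE, '<', $catdesc) or die "Can't open $catdesc: $!";
while (my $line = <FILE>) {
    chomp $line;
    my ($catname, $catdescription) = split /\t/, $line, 2;
    next unless exists $wanted{$catname};   # not on this page, skip cheaply
    next unless defined $catdescription;
    $catdescription =~ s/^\s+//;            # trim leading blanks
    $catdescription =~ s/\s+$//;            # trim trailing blanks
    next unless $catdescription;            # skip line if no description
    $category_descriptions{$catname} = "<br><$font>$catdescription</font><br>";
    delete $wanted{$catname};
    last unless %wanted;                    # all categories done -- stop reading
}
close(FILE);

On average that should scan only part of the file, though in the worst case it still reads all of it.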
I'd really appreciate knowing if there's an even faster way of doing this. Migrating it all to MySQL would be great, but the system would need so many modifications that it would be just as easy to rewrite the whole thing.
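If MySQL stays off the table, a halfway step might be tying the descriptions to a DBM file with the core DB_File module, so each lookup is a direct fetch instead of a file scan. A rough sketch (the .db filename is just an example, and the build step would rerun whenever the flat file changes):

use DB_File;
use Fcntl;

## One-time build step: load the flat file into a DBM database.
tie my %db, 'DB_File', "$catdesc.db", O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "Can't tie $catdesc.db: $!";
open(FILE, '<', $catdesc) or die "Can't open $catdesc: $!";
while (my $line = <FILE>) {
    chomp $line;
    my ($catname, $catdescription) = split /\t/, $line, 2;
    $db{$catname} = $catdescription if defined $catdescription;
}
close(FILE);
untie %db;

## At page time: direct lookups, no scanning.
tie my %db_read, 'DB_File', "$catdesc.db", O_RDONLY, 0644, $DB_HASH
    or die "Can't tie $catdesc.db: $!";
foreach my $catname (keys %category_descriptions) {
    my $d = $db_read{$catname};
    $category_descriptions{$catname} = "<br><$font>$d</font><br>"
        if defined $d && $d ne '';
}
untie %db_read;

That would keep the flat file as the source of truth while making the per-page lookups effectively constant time.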
Thanks,
Ralph