in reply to Speeding up large file processing
Here's the new code, which is running a lot faster now. The script reads a category-descriptions file of 4,000 lines and retrieves the description for each category to be displayed on a page.
open(FILE, "$catdesc"); my @desc = <FILE>; close(FILE); chomp @desc; %category_descriptions = {}; ## Create a hash with the category names to display. foreach $directory_name (@subdirectories) { my($date_a, $directory_name) = split(/\t/,$directory_name); if($directory_name ne '') { $category_descriptions{"$FORM{'direct'}/$d +irectory_name"} = 1; } } ## Set the description for each category foreach $line (@desc) { my ($catname, $catdescription) = split(/\t/, $line); $catdescription =~ s/^\s+//g; # trim leading blanks... $catdescription =~ s/\s+$//g; # trim trailing blanks... next if (!$catdescription); # skip line if no description if($category_descriptions{$catname} == 1) { $category_descriptions{$ca +tname} = "<br><$font>$catdescription</font><br>"; } }
I'd really appreciate knowing if there's an even faster way of doing this. Migrating it all to MySQL would be great, but the system would need so many modifications that it would be just as easy to rewrite the whole thing.
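One lighter-weight idea I've been toying with, short of a full MySQL migration, is indexing the descriptions file into a DBM file once, so each page load does per-key lookups instead of scanning all 4,000 lines. A rough, untested sketch (the "$catdesc.db" filename is my own invention, and it assumes the same variables as above):

```perl
use DB_File;
use Fcntl;

# One-time build: index the tab-delimited descriptions file into a
# DBM file. Rerun this only when the descriptions file changes.
tie my %desc_db, 'DB_File', "$catdesc.db", O_CREAT|O_RDWR, 0644, $DB_HASH
    or die "Cannot tie $catdesc.db: $!";

open(FILE, $catdesc) or die "Cannot open $catdesc: $!";
while (my $line = <FILE>) {
    chomp $line;
    my ($catname, $catdescription) = split /\t/, $line;
    $desc_db{$catname} = $catdescription
        if defined $catname && defined $catdescription;
}
close(FILE);

# Per page load: look up only the categories actually being displayed.
foreach my $catname (keys %category_descriptions) {
    $category_descriptions{$catname} =
        "<br><$font>$desc_db{$catname}</font><br>"
        if defined $desc_db{$catname};
}

untie %desc_db;
```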
Thanks,
Ralph
Replies are listed 'Best First'.
Re^2: Speeding up large file processing
by BrowserUk (Patriarch) on Jul 15, 2005 at 14:26 UTC
by ralphch (Sexton) on Jul 15, 2005 at 16:35 UTC
by BrowserUk (Patriarch) on Jul 15, 2005 at 17:11 UTC