I'm trying to go through some large (2.8 GB) log files from IIS. Each line looks like:
80.129.152.192 - - [09/Dec/2001:00:23:26 0100] "GET /informationRoot/foo/bar/foo.gif HTTP/1.1" 200 631
I want to find out the number of unique visits for each log file. My code takes the IP from each line, counts it, and adds it to a list (if it's already there, the line is just skipped). This is not a good solution and it's extremely time consuming.
Here's the code anyway:
#!/usr/bin/perl
use strict;
use warnings;

die "\nUsage: stats.pl [filename] [filename] ...\n\n" unless @ARGV;

my @list;   # shared by go(), addToList() and notInList()

go();

sub go {
    foreach my $file (@ARGV) {
        open my $fh, '<', $file or die "Can't open $file: $!";
        my $i = 0;
        @list = ();                      # reset for each log file
        while (<$fh>) {
            next unless /^(\S+)\s-\s-/;  # the IP is the first field
            my $ip = $1;
            if (notInList($ip)) {
                $i++;
                addToList($ip);
            }
        }
        close $fh;
        print "\nVisits for $file is $i\n";
    }
}

# Subfunctions
sub addToList {
    my ($ip) = @_;
    push @list, $ip;
}

sub notInList {
    my ($ip) = @_;
    foreach my $tmpip (@list) {
        return 0 if $tmpip eq $ip;
    }
    return 1;
}
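For comparison, here is a minimal sketch of the same counting done with a hash instead of a linear list scan; the function name `count_unique_ips` is made up for illustration. A hash gives constant-time membership tests, so each file is processed in a single O(n) pass rather than O(n²):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: count unique IPs with a hash (%seen).
# Hash key lookup is O(1), so duplicates cost nothing extra.
sub count_unique_ips {
    my ($fh) = @_;
    my %seen;
    while (my $line = <$fh>) {
        # The IP is the first whitespace-delimited field before " - -".
        next unless $line =~ /^(\S+)\s-\s-/;
        $seen{$1} = 1;
    }
    return scalar keys %seen;
}

# Example usage on in-memory data (a real call would open the log file):
my $log = <<'END';
80.129.152.192 - - [09/Dec/2001:00:23:26 0100] "GET /a HTTP/1.1" 200 631
80.129.152.192 - - [09/Dec/2001:00:23:27 0100] "GET /b HTTP/1.1" 200 631
10.0.0.1 - - [09/Dec/2001:00:23:28 0100] "GET /c HTTP/1.1" 200 631
END
open my $fh, '<', \$log or die $!;
print count_unique_ips($fh), "\n";   # prints 2
```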
In reply to Unique visits - Webserver log parser by ciryon