doh, I was referencing in scalar context, not enough coffee yet today I guess. I tried the following but I am still not getting what I expected, not sure why this is occurring, I copied all my logfiles which are in html format to a webdirectory, I tried storing the file list in an array and then using www.mechanize to grab and parse the files. I expected this code would print out the url but instead I get a Hash address. I removed the rest of the files as retrieving one would be enough if I can get that to work properly I can easily add the rest to the arraylist.
use OLE; use LWP; use WWW::Mechanize; use Mail::Sender; $url="http://operations/idslogs/latest.html"; # Create a new instance of WWW::Mechanize # enabling autoheck checks each request to ensure it was successful, # producing an error if not. my $mechanize = WWW::Mechanize->new(autocheck => 1); # Retrieve the page @pagelist={'aps_2009-11-10_11-30-44-204.html'}; foreach $page (@pagelist){ $url="http://operations/idslogs/$pagelist[0]"; #$mechanize->get($url); print $url; # Assign the page content to $page my $page = $mechanize->content; my $ipcount=0; my $match_count=()=($page=~/Memphis/g);{ print "Logins for Memphis $match_count\t"; } my $actris_count=()=($page=~/ACTRIS/g);{ print "Logins for Austin $actris_count\t"; } my $sef_count=()=($page=~/South East Florida/g);{ print "Logins for SEF $sef_count\t"; } }
Are Updates supposed to go in the top or in the bottom? anyway I tried to break down my example to something easier for me to understand and I am still having trouble with it. here is what I have now, which should open the files and print their lines,its not doing that either. strict bitches about everything and basically annoys the be-jesus out of me. I know I should get used to it and love it and all, but its frigging annoying when you make mistakes like I do to be constantly reminded of them
use IO::File; $dir= shift || '.'; opendir DIR, $dir or die "Can't open directory $dir: $!\n"; while ($file= readdir DIR) { next if $file=~/^\./; open (FH, ">", "$file") or die "Can't open file $file: $!\n"; my @file_lines=<FH>; foreach my $line (@file_lines) { print $line "\n"; } }
I have some logfiles, I would like to process but its not working as expected, I probably missing something simple.. here is my attempt.
use:OLE; use IO::File; $dir= shift || '.'; opendir DIR, $dir or die "Can't open directory $dir: $!\n"; while ($file= readdir DIR) { next if $file=~/^\./; print $file; open (FH, "$file") or die "Can't open file $file: $!\n"; my @file_lines=<FH>; #close FH; foreach my $line (@file_lines) { chomp(); my $match_count=()=($line=~/Memphis/g); my $actris_count=()=($line=~/ACTRIS/g); my $sef_count=()=($line=~/South East Florida/g);{ } } } print "Logins for Memphis $match_count \t", "Logins for Actris $actris +_count \t", "Logins for SEF $sef_count\t";

In reply to parsing a directory of log files. by learn2earn

Title:
Use:  <p> text here (a paragraph) </p>
and:  <code> code here </code>
to format your post, it's "PerlMonks-approved HTML":



  • Posts are HTML formatted. Put <p> </p> tags around your paragraphs. Put <code> </code> tags around your code and data!
  • Titles consisting of a single word are discouraged, and in most cases are disallowed outright.
  • Read Where should I post X? if you're not absolutely sure you're posting in the right place.
  • Please read these before you post! —
  • Posts may use any of the Perl Monks Approved HTML tags:
    a, abbr, b, big, blockquote, br, caption, center, col, colgroup, dd, del, details, div, dl, dt, em, font, h1, h2, h3, h4, h5, h6, hr, i, ins, li, ol, p, pre, readmore, small, span, spoiler, strike, strong, sub, summary, sup, table, tbody, td, tfoot, th, thead, tr, tt, u, ul, wbr
  • You may need to use entities for some characters, as follows. (Exception: Within code tags, you can put the characters literally.)
            For:     Use:
    & &amp;
    < &lt;
    > &gt;
    [ &#91;
    ] &#93;
  • Link using PerlMonks shortcuts! What shortcuts can I use for linking?
  • See Writeup Formatting Tips and other pages linked from there for more info.