jeff061 has asked for the wisdom of the Perl Monks concerning the following question:

Hopefully I don't get too long-winded here.
Right now I have a script that does the following:

1: Opens the "|"-delimited syslog.log and parses out the events I've listed in an external text file
2: Exports this data into an SQLite database (while it's doing this, a 0 is entered in a "logged" column for all new entries)
3: Exports all rows with a 0 in the logged column to a |-delimited file (eventually some of the 0s will become 1s, more on that later)

There is also an autonumbered column that serves as the key. Now I'm pretty happy with this so far. However, I have a deadline coming up, and I'm still not knowledgeable enough to finish this by myself. Here is what I have left to do; I'd be grateful for any help.

The final output from the DB is in this format:
rowid (autonumber)|Hostname|Date Down|Time Down|Error Message|Error Type (warning or error)

Now some of these events will be for when a critical service goes down, and some will be for when a critical service comes back up. What I need to be able to do is produce a file in the following format:
Hostname|Date Down|Time Down|Error Message|Error Type|Date Up|Time Up

Basically, I need to take a copy of each downtime event and append the date and time from the corresponding uptime event. I currently have nothing that defines which events are downtime events, which are uptime events, and which ones go together.
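Roughly, what I picture is something like this minimal sketch. The /down/i and /up|restored/i patterns, the file names, and the field layout (rowid|Host|Date|Time|Message|Type, as in my export) are just placeholders for whatever actually distinguishes the events:

use strict;
use warnings;

my %open_down;       # hostname => [ rowid, host, date, time, msg, type ]
my @matched_rowids;  # rowids of both halves of every matched pair

open my $in,  '<', 'ParsedDTM.log'  or die "Can't read ParsedDTM.log: $!";
open my $out, '>', 'MatchedDTM.log' or die "Can't write MatchedDTM.log: $!";

while (my $line = <$in>) {
    chomp $line;
    my ($rowid, $host, $date, $time, $msg, $type) = split /\|/, $line;

    if ($msg =~ /down/i) {                                  # service went down
        $open_down{$host} = [ $rowid, $host, $date, $time, $msg, $type ];
    }
    elsif ($msg =~ /up|restored/i and exists $open_down{$host}) {
        my $d = delete $open_down{$host};                   # the matching downtime event
        # Hostname|Date Down|Time Down|Error Message|Error Type|Date Up|Time Up
        print $out join('|', @$d[1..5], $date, $time), "\n";
        push @matched_rowids, $d->[0], $rowid;
    }
}
close $in;
close $out;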

That is the first part I need help with. After a downtime and uptime have been logged, I need to be able to mark in the DB that the record is no longer current. What I plan on doing is, while the downtime and uptime are being matched, also take the rowids and export them all to a CSV file. I can then import this data into a standalone table within the DB. I have a handy DBA here who can help me use this table and run an update statement that sets the Logged column to 1 for all matching rowids. That data will then never be exported again, since only rows with 0s are. However, I still need help getting these rowids into a file.
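For the rowid piece, this is the kind of thing I have in mind, as a rough sketch. It assumes a @matched_rowids list collected while pairing the events (as in the sketch above); the sample values are only for illustration, and the direct UPDATE at the end is just an alternative shortcut to the DBA route:

use strict;
use warnings;
use DBI;

my @matched_rowids = (12, 15, 33);   # placeholder data for illustration

# 1) Dump the rowids to a one-column CSV for loading into a standalone table.
open my $csv, '>', 'matched_rowids.csv' or die "Can't write matched_rowids.csv: $!";
print $csv "$_\n" for @matched_rowids;
close $csv;

# 2) Or skip the file entirely and flip Logged with a placeholder query.
my $dbh = DBI->connect('dbi:SQLite:dtm.db', '', '', { RaiseError => 1 });
my $sth = $dbh->prepare('UPDATE DTM_Log SET Logged = 1 WHERE rowid = ?');
$sth->execute($_) for @matched_rowids;
$dbh->disconnect;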

And finally, here is the current script. It's probably messy, I'm sure, but it does what I said it can:

use DBI;    # DBI loads DBD::SQLite when we connect below

$logdir  = "d:/logs/";
$dbdir   = "d:/maint/syslog/";
$logfile = "sysmonitor.log";
$errlst  = "errorlist.txt";
&FltErr;

sub FltErr {
    chdir($logdir);
    open(FH, "< $errlst") or die "Can't open $errlst: $!";
    my @errors = <FH>;
    close(FH);
    open(MSG, $logfile)            or die "Can't open $logfile: $!";
    open(FILEHANDLE, ">NTDTM.txt") or die "Can't write NTDTM.txt: $!";
    while (<MSG>) {
        chomp($_);
        foreach my $error (@errors) {
            chomp($error);
            if ($_ =~ $error) {
                print FILEHANDLE "$_\n";
            }
        }
    }
    close(FILEHANDLE);
    close(MSG);
    &sql;
}

sub sql {
    chdir($dbdir);

    ### Imports NTDTM.txt into the dtm.db SQLite database, sets all new rows' "Logged" field to 0
    my $dbh = DBI->connect("dbi:SQLite:dtm.db", "", "");    # connects to the dtm.db database

    my $sql       = "copy DTM_Log from 'd:\\logs\\ntdtm.txt' using delimiters '|'";  # imports parsed log file into the SQLite DB
    my $sqlupdate = "update DTM_Log SET Logged = 0 WHERE Logged is null";            # marks all new records with 0 in the Logged field

    my $sth       = $dbh->prepare($sql);
    my $sthupdate = $dbh->prepare($sqlupdate);
    #print "$sql $sth \n";    # debug
    $sth->execute();
    $sth->finish();
    $sthupdate->execute();
    $sthupdate->finish();

    ### Select statement that returns records that have not been sent to zeus
    my $sqlout = "SELECT rowid, Host, Date_Down, Time_Down, Event_MSG, EVT_Type FROM DTM_Log WHERE Logged = '0'";    # filters out already-logged events
    my $sthout = $dbh->prepare($sqlout);
    $sthout->execute();

    my ($rowid, $Host, $Date_Down, $Time_Down, $Event_MSG, $EVT_Type);
    $sthout->bind_columns(\$rowid, \$Host, \$Date_Down, \$Time_Down, \$Event_MSG, \$EVT_Type);

    ### Outputs the previous select statement to a file, using | as the delimiter
    open(OUT, ">>ParsedDTM.log") or die "Can't append to ParsedDTM.log: $!";
    while ($sthout->fetch()) {
        print OUT "$rowid|$Host|$Date_Down|$Time_Down|$Event_MSG|$EVT_Type\n";    # outputs fields from the select statement
    }
    close(OUT);

    $sthout->finish();
    $dbh->disconnect;
}

Got some good help (thanks, dragonchild) last time I posted here. Always grateful for any more I can get.

Replies are listed 'Best First'.
Re: Log file manipulation and SQLite
by jZed (Prior) on Oct 29, 2003 at 03:11 UTC
    I'm afraid your description is a bit too tangled for me to follow; I'm not sure which of the various CSV files is needed or how the SQLite database fits in. It seems to me that your best bet is to treat *all* of the data as a database. Use DBD::CSV to read the text files and write the pipe-delimited files. You can even do joins of the two input text files so you aren't reading them in tandem. This method would allow you to define the tables and the relations between them, even if some of the tables consist of only a single field. If you can come back with a schema - a list of the columns in the various data sources and the relations between them - I (and probably others) can help get that into a form that will have DBD::CSV and DBD::SQLite talking to each other.
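    As a minimal sketch of what that could look like: the table name, file name, and column names below are only guesses at your setup, and it assumes each file's first line names its columns (DBD::CSV's default).

    use strict;
    use warnings;
    use DBI;

    # Treat the pipe-delimited text files as database tables via DBD::CSV.
    my $dbh = DBI->connect('dbi:CSV:', undef, undef, {
        f_dir        => 'd:/logs',   # directory holding the text files
        csv_sep_char => '|',         # the files are pipe-delimited
        RaiseError   => 1,
    });

    # Map an SQL table name onto the actual file (names here are guesses).
    $dbh->{csv_tables}{dtm} = { file => 'NTDTM.txt' };

    # Query the flat file exactly as if it were a database table.
    my $sth = $dbh->prepare(
        'SELECT Host, Date_Down, Time_Down FROM dtm WHERE EVT_Type = ?'
    );
    $sth->execute('error');
    while (my @row = $sth->fetchrow_array) {
        print join('|', @row), "\n";
    }
    $dbh->disconnect;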