in reply to Opening multiple log files

G'day hahazeeq,

Welcome to the Monastery.

"From this I am getting nothing."

You've added checks throughout for failed I/O - this is good. If files weren't created, those checks would have produced error messages. Not posting those messages is less good: without them we can't see what went wrong.

The first problem that leaps out at me is @output.

This should probably be a scalar, not an array. A lexical variable (my) would probably be better than an alias to a package variable (our).

Then later in the code

@output = $timestamp . '.sql';
open(my $fh, '>', @output) or die "Could not create file '@output': $!";

becomes

$output = $timestamp . '.sql';
open(my $fh, '>', $output) or die "Could not create file '$output': $!";

There are also potential problems with

open(my $fn, '<', $filename) ...

which possibly needs to be

open(my $fn, '<', "$directory/$filename") ...

Add temporary print statements to your code to perform your own troubleshooting. Check whether you're reading from and writing to the correct pathname, as opposed to just the correct filename.
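For example, a quick diagnostic like this (a sketch, assuming the $output, $directory and $filename variables discussed above) will show exactly which paths your script is using:

use Cwd;    # core module, provides getcwd()

print "Current directory: ", getcwd(), "\n";
print "Writing to:        $output\n";
print "Reading from:      $directory/$filename\n";

Remove the prints once the paths look right.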

Without seeing your error messages, I'm really just guessing at problems. If you need further assistance, please see these guidelines for what to post such that we can help you most effectively.

-- Ken

Re^2: Opening multiple log files
by hahazeeq (Novice) on Jun 15, 2015 at 07:10 UTC

    Thank you Ken. I've corrected the errors as suggested and confirmed that it is writing to the correct pathname, which is the current directory.

    I'm trying to find out how I can run this exact same script for all of the files (many) in the same directory. The files that I want to include are those with the `.log` extension. To be specific, the files are log files and I am trying to capture data from each of them. I have managed to do it for one file; now I am trying to do it for every file with the `.log` extension in the current folder.

      The builtin glob function will probably do what you want. Just make sure you check the "Portability issues" noted in that document.

      Something like:

      my @logfiles = glob "*.log";

      then just iterate @logfiles, processing each in turn.
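
      For example (untested; process_log() is just a stand-in for whatever you do with each file):

          my @logfiles = glob "*.log";
          for my $logfile (@logfiles) {
              process_log($logfile);    # your per-file handling goes here
          }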

      Because you said "... all of the files (many) in the same directory ..." you may exceed glob's maximum result size. See GLOB_LIMIT in File::Glob for details.

      If you are likely to approach that limit, a better option (although a bit more coding) would be to use readdir.

      You'll need to skip everything except normal files with a .log extension. Something like this (untested):

      opendir my $dh, '.' or die "Can't open current directory: $!";
      while (readdir $dh) {            # readdir in a while condition sets $_ (Perl 5.12+)
          next unless -f and /\.log$/; # skip anything that isn't a plain file ending in .log
          # Process log file here
      }
      closedir $dh;

      See File Test Operators if you're unfamiliar with -f.

      -- Ken

        I see. Alright, I won't be using modules because I'm still learning, so I want to do more of the coding myself first. I understand the code you have provided would read every file in the directory which has the .log extension.

        However, is there any way to get the file name as well? If you look at the second part of my code, I am trying to get the name of the file that I have opened (which is formatted as a timestamp) and use it when printing to the file I'm going to create. I'm not sure how to do that with your code.
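
        One minimal sketch of how this could work (untested; it assumes the readdir loop above, and that the timestamp you want is simply the .log filename with its extension removed):

            while (readdir $dh) {
                next unless -f and /\.log$/;
                my $logname = $_;                         # e.g. '20150615.log' (hypothetical name)
                (my $timestamp = $logname) =~ s/\.log$//; # strip the extension to recover the timestamp
                # ... open "$timestamp.sql" for output and process $logname here ...
            }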