in reply to hash referencing...best approach?

sub make_session {
    my ($u,$a,$s)=@_;
    %session_hash={ user=>$u, dept=>$a };
    $s=\%session_hash;
}
You've muddled several things here. The hash should be assigned with a list, not a block. There's no reason to create $s, since you're just throwing it away. You can return the reference by just saying
\%session_hash;
as the last line. You don't say where the result of make_session goes, and you don't declare %session_hash as a lexical in the sub, which means you're going to be returning a reference to the same thing every time.
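Putting those points together, a corrected version might look like this (a sketch; the variable names are kept from the original post):

```perl
sub make_session {
    my ($u, $a) = @_;                                # no throwaway $s
    my %session_hash = ( user => $u, dept => $a );   # list assignment, lexical
    return \%session_hash;                           # a fresh reference each call
}
```

Because %session_hash is declared with my inside the sub, each call returns a reference to a new hash rather than the same one every time.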

while (my $line2=<HTTP>) { chomp;
I'm guessing that here you want to chomp $line2.

I think I know what you want to do, but I'm not sure: Create a hash entry for each unique session, and associate a filename with it. Then, when you're reading the log files, you'll parse out the session name, and write (append) the line onto the associated file. Is that right?


The PerlMonk tr/// Advocate

Replies are listed 'Best First'.
Re: Re: hash referencing...best approach?
by Anonymous Monk on Nov 24, 2003 at 21:32 UTC
    "I think I know what you want to do, but I'm not sure: Create a hash entry for each unique session, and associate a filename with it. Then, when you're reading the log files, you'll parse out the session name, and write (append) the line onto the associated file. Is that right? "

    This assessment is pretty accurate. What I want to have at the end of processing is a series of dept.-specific access files:

    tools_access.log
    help_access.log
    ...

    Even better would be the full line from the access log file with the user_code and dept_code added to the end of it, so we could track not only dept usage but user usage within each dept.

    I don't know if that was any clearer or not...
      Ok, let's see if this gets you started. I'm going to write this as mostly pseudocode comments. You get to fill in the code.
      my %session_hash;
      my %departments;
      # for each of the 6 session files,
      #   open and read line-by-line
      #   parse out user_id, session_id, dept_code
            $session_hash{$session_id} = $dept_code;
            $departments{$dept_code}++;
      #
      # for each apache log
      #   open and read line-by-line
      #   parse out session_id
      #   append the line to the file associated with $session_hash{$session_id}
      If you have a relatively small number of departments (keys %departments), you can keep all the output files open for writing. Otherwise, you'll need to open for append each time you want to write a line of output. (You could also hold some number of lines in memory and write them out every so often, for a little less open and closing action.)
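      The "keep the output files open" idea could be sketched like this (my own sketch, not from the thread; the %fh cache and fh_for name are my invention, and I'm using the three-argument form of open):

```perl
my %fh;   # dept_code => open filehandle, cached so each file opens once

sub fh_for {
    my ($dept) = @_;
    unless ($fh{$dept}) {
        open($fh{$dept}, '>>', $dept . "_access.log")
            or die "unable to open ${dept}_access.log: $!\n";
    }
    return $fh{$dept};
}

# per log line, something like:
#   print { fh_for($session_hash{$session_id}) } $line, "\n";
```

With around 40 departments this stays well under typical open-file limits, and it avoids an open/close pair per output line.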

      HTH.


      The PerlMonk tr/// Advocate
        I have about 40 possible departments, so I just wrote a small subroutine:

        sub write_it {
            my ($dept,$output)=@_;
            my $output_file=$dept."_access.log";
            open (DATA,">> $output_file")||die ("unable to open $output_file $!\n");
            print DATA $output."\n";
            close DATA;
        }


        This works pretty well; I was able to read through one of the access logs and create the specific files in about 4 minutes. The one problem I have now is appending the user_code to the end of the access log line.

        I tried to change the way I 'write' to the session_hash while reading the session logs to:
        $session_hash{$session_id}{$user}=$dept_code;
        But this just screwed me up later down the line when I was reading through the access logs. For this part I currently have:
        open (HTTP,$access_log)||die ("unable to open $access_log $!\n");
        while (my $line2=<HTTP>) {
            chomp $line2;
            my @fields=split /\s+/, $line2;
            my $session=$fields[6];
            $session=substr($session,(index($session,"?")+12),
                            (index($session,"|"))-(index($session,"?")+12));
            if (length($session)==52) {
                &write_it($session_hash{$session},$line2);
            }
        }
        I tried to incorporate the user_code part into this and ended up getting the hash address everywhere. In other words, my file names became hash addresses and my user_code values were null. Surely I am missing something minor here. Thanks for all the help thus far, it has proven most superb.
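        A likely cause of those "hash address" filenames: once you write $session_hash{$session_id}{$user}=..., the value at $session_hash{$session} is a hash reference, and stringifying it gives HASH(0x...). One fix is to keep the hash keyed by session only and store both codes in a small hashref (a sketch; the dept/user field names and sample values are my own, not from the thread):

```perl
my %session_hash;

# while reading the session logs (sample values stand in for parsed fields):
my ($session_id, $dept_code, $user_id) = ('abc123', 'tools', 'jdoe');
$session_hash{$session_id} = { dept => $dept_code, user => $user_id };

# later, while reading the access logs:
my $info = $session_hash{$session_id};
# pass the dept *string* to write_it, and tack the user code onto the line:
#   write_it($info->{dept}, "$line2 $info->{user}");
```

Because write_it still receives a plain department string, the filenames stay as before, and the user_code travels alongside in the same hash entry.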