in reply to opening a file in a subroutine

I see no reason this should hang. You said "a program like this". Does this exact code hang on the file in question? How big is that file? Have you somehow used up all your memory and the OS is thrashing?

--- demerphq
my friends call me, usually because I'm late....

Re: Re: opening a file in a subroutine
by abhishes (Friar) on Feb 09, 2003 at 14:46 UTC
    thanks for your reply.

    The file is only 1.5 MB. The first function call completes within a matter of seconds, but then the second function call takes a very long time. I was wrong when I said that the program hangs... the second and the third function calls take 5 minutes each to complete. This doesn't make sense to me: if the first call took just 5-6 seconds, why did the second and third take so much longer?
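    A minimal sketch of how the timings could be confirmed, using the core Time::HiRes module (the reader sub is essentially the slurping one from my original post, reproduced here so the sketch is self-contained):

    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval);

    for my $n (1 .. 3) {
        my $t0    = [gettimeofday];
        my $lines = read_file("xml.log");
        printf "call %d: %.2f seconds, %d lines\n",
            $n, tv_interval($t0), scalar @$lines;
    }

    sub read_file {
        my $file = shift;
        open my $fh, '<', $file or die "$file: $!";
        my @lines = <$fh>;          # slurp the whole file into memory
        return \@lines;
    }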

      My guess is that the code that you are actually using looks more like
      sub read_file {
          my $file = shift;
          open my $fh, $file or die "$file : $!";
          my @lines = <$fh>;
          return \@lines;
      }
      In which case it's storing the whole file in memory, which is perhaps overflowing your available physical RAM. When that happens the OS starts swapping memory out to disk (obviously a slow operation), and if it happens often enough you get a condition called thrashing, where the OS is basically just moving memory back and forth from the disk, and the time taken for the repeated swapping completely overwhelms the time your code takes, making your code look like it hangs. Try a memory monitoring tool, or do some program analysis, to see exactly how much memory you are actually using. Hashes, for instance, consume far more memory than they appear to, as does careless array manipulation. For instance, if you have
      my @array;
      $array[100_000_000] = 1;
      then Perl will have to allocate sufficient RAM for all 100 million slots, even though they aren't used.
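      To see this kind of overhead directly, the CPAN module Devel::Size can report how many bytes a structure really occupies. A minimal sketch (assuming you can install the module; the exact numbers vary by perl build):

      use strict;
      use warnings;
      use Devel::Size qw(total_size);   # CPAN module, not in the core distribution

      my @list = (1) x 10;
      my %hash = map { $_ => 1 } 1 .. 10;

      # total_size() walks the structure and reports the bytes actually allocated
      printf "array of 10 elements: %d bytes\n", total_size(\@list);
      printf "hash of 10 keys:      %d bytes\n", total_size(\%hash);

      my @sparse;
      $sparse[1_000_000] = 1;           # one value, but a million slots allocated
      printf "sparse array:         %d bytes\n", total_size(\@sparse);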

      Without seeing the real code (as I said, I don't think what you posted properly reflects what you are doing), it's hard to say for certain.

      Regards,

      UPDATE: Since you are on Win2k you should take advantage of the Task Manager and the Performance monitor that come with the OS. (Start -> Settings -> Control Panel -> Administrative Tools -> Performance)
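      On the Perl side, the stock Devel::DProf profiler can break the runtime down per subroutine, which would show whether the time is really going into the reads. A sketch of the usual invocation (yourscript.pl is a placeholder):

      # profiling run; Devel::DProf writes its data to tmon.out
      perl -d:DProf yourscript.pl

      # summarize the profile, slowest subroutines first
      dprofpp tmon.out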

      --- demerphq
      my friends call me, usually because I'm late....

        Hello Demerphq,

        The code which I have posted is quite close to what I am doing. My application has a couple of log files, and I tried to write a generic function which will open a log file and return all its contents. However, I realized my application slows down a lot when the second call to the log-reader function is made.

        Then I wrote this dummy program. The function here and the one in my application are the same.

        I tried to optimize the code by passing a reference to the @lines array to the function and using the same reference everywhere, but that did not help. My machine has 512 MB of physical RAM and a 756 MB swap pagefile, so I don't think that hardware or swap is an issue. Just to check if the OS is s*wing up, I wrote the same function in C#; there, all three function calls took 5 seconds each. So why is it that in Perl the second and third calls take so long? I am still confused.

        My updated code looks like this (note that I now clear the array with @lines = () between calls and check the open for errors):

        use strict;
        use warnings;

        my @lines;

        print "opening file 1\n";
        myfile(\@lines);
        @lines = ();

        print "opening file 2\n";
        myfile(\@lines);
        @lines = ();

        print "opening file 3\n";
        myfile(\@lines);
        @lines = ();

        sub myfile {
            my ($line) = @_;
            open my $fh, "<", "xml.log" or die "xml.log: $!";
            @{$line} = <$fh>;
            close($fh);
        }
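        For comparison, here is a sketch of a line-by-line variant that never holds the whole file in memory ($process_line is a hypothetical stand-in for whatever I do with each line):

        sub process_file {
            my ($file, $process_line) = @_;
            open my $fh, "<", $file or die "$file: $!";
            while (my $line = <$fh>) {
                $process_line->($line);   # handle one line at a time
            }
            close($fh);
        }

        # usage: count the lines without keeping them all around
        my $count = 0;
        process_file("xml.log", sub { $count++ });
        print "$count lines\n";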
        regards, Abhishek.