txfun has asked for the wisdom of the Perl Monks concerning the following question:

Monks, I need help figuring out where my Perl script stops writing data to a file. The script splits a 700 MB file into files of 30K lines each in a directory.
The files in the directory are read one after another; each line in each file is processed, converted, and output to another file of approximately 13 MB (reason: the 13 MB file is fed to a system which works optimally at 13 MB).
The script stops processing after 138 files; specifically, it fails at
print FileHndlr , $formattedline or warn 'unable to write!';
after the 138th file. Based on feedback from Perl tutorials, $|++ was done before the while loop in order to enable autoflush.
It always fails at 138 files of 13 MB each. Please advise on how I should approach this issue and how to resolve it. My script looks something like:
$|+;;
my @InpFiles = glob (*.dat");
foreach my $inpfile (@InpFiles) {
    my $outfilename;
    my $outputfile = "";
    open(HNDL, "$inpfile");
    open $outfilename, ">", $outputfile;
    $|++;
    while (my $tmpline = <HNDL>) {
        print $outputfile "$parsedstring\n" or warn 'couldnot write to file';
        print $outputfile "$parsedsecondstring\n" or warn 'couldnot write to file';
    }
}
$|++;
close( HNDL );
close( $outfilename );
undef $outfilename;
}

Replies are listed 'Best First'.
Re: Perl - Linux - Unable to Write Files
by graff (Chancellor) on Mar 12, 2015 at 03:38 UTC
    If you have a version of this that runs at all, it must be different from what you've posted here, because the OP script has syntax errors and won't compile.

    If the script as posted were runnable, it would not create any output files at all, because you can't open an output file with an empty string as the file name. Apart from that, if the output file name were not an empty string, the script as posted would simply be creating one output file for each input, and would just copy all the input file content to the output file.

    If you can post a version that actually compiles, and does something like what you describe as your problem, maybe we can help you.

Re: Perl - Linux - Unable to Write Files
by trippledubs (Deacon) on Mar 12, 2015 at 03:41 UTC
    change
    print $outputfile "$parsedstring\n" or warn 'couldnot write to file';
    to
    print $outputfile "$parsedstring\n" or warn "couldnot write to $inpfile: $!";

    When you operate on a bunch of things and one of them breaks, you need to know which thing. In this case, something may be learned from knowing what it broke on, and whether $! is set to something that helps diagnose the problem.
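    A minimal sketch of that pattern, with the file name and $! included in the warning (the file names here are hypothetical):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical file names, for illustration only
    my $outfile = 'output.txt';

    open my $out, '>', $outfile
        or die "could not open $outfile: $!";

    # Include the file name and $! so a failure tells you *which*
    # file broke and *why* (e.g. "No space left on device").
    print {$out} "some line\n"
        or warn "could not write to $outfile: $!";

    # close can also report a deferred write error, since buffered
    # data is flushed here.
    close $out or warn "could not close $outfile: $!";
    ```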

    $| is a flag variable, so all that matters is whether it is zero or non-zero. $|++ is a cool way to make it non-zero; in a loop it is not cool at all, because you are making it non-zero over and over and over. Why do you need autoflush anyway? I assume buffering has some benefit, because it is the default behavior.

    $|=20; print $|;
    Still says 1, so Perl is actively refusing to obey... subverting... kind of makes me feel uncertain about the balance of power here.
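    Also worth noting: $| only affects the currently selected handle (usually STDOUT). To autoflush a specific output handle, the usual idiom is IO::Handle's autoflush method. A sketch, with a hypothetical file name:

    ```perl
    use strict;
    use warnings;
    use IO::Handle;   # provides the autoflush() method on filehandles

    # 'demo.txt' is a hypothetical name for illustration
    open my $out, '>', 'demo.txt' or die "open: $!";
    $out->autoflush(1);   # flush after every print on *this* handle only

    print {$out} "flushed immediately\n";
    close $out or die "close: $!";
    ```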
Re: Perl - Linux - Unable to Write Files
by Anonymous Monk on Mar 12, 2015 at 03:56 UTC

    $outputfile is always empty ... that's not good ... and you're not checking whether open failed (autodie can check for you)
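    A sketch of what that checking looks like with autodie (the file names are hypothetical, and the block writes its own input so it is self-contained):

    ```perl
    use strict;
    use warnings;
    use autodie;   # open and close now die with a descriptive message on failure

    # Create a small input file so the example runs on its own
    open my $mk, '>', 'input.dat';
    print {$mk} "line 1\nline 2\n";
    close $mk;

    open my $in,  '<', 'input.dat';    # no "or die" needed; autodie handles it
    open my $out, '>', 'output.dat';

    while (my $line = <$in>) {
        # print is NOT covered by autodie, so check it explicitly
        print {$out} $line or die "write failed: $!";
    }

    close $out;   # dies if the final buffered flush fails (e.g. disk full)
    close $in;
    ```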

    Try this, it comes with error checking

    #!/usr/bin/perl --
    use strict;
    use warnings;
    use Path::Tiny qw/ path /;
    for my $infile ( glob "*.dat" ) {
        my $outfile = "$infile.bak";
        path( $infile )->copy( $outfile );
    }
Re: Perl - Linux - Unable to Write Files
by pvaldes (Chaplain) on Mar 12, 2015 at 22:27 UTC

    Script stops processing after 138 files

    If this is a fixed number, it sounds like you may be running out of filehandles (though 138 is not that many), or perhaps running out of memory. Just my two cents. Check your ulimit.

    Updated: You might be able to keep more files open, if needed, by using a cache. This should not be necessary, but take a look at Cache::Cache or CHI
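    If the real code does need many output files at once, core Perl also ships FileCache, which transparently closes and reopens handles so you stay under the ulimit. A sketch, following the documented symbolic-handle interface (the output names are hypothetical):

    ```perl
    use strict;
    use warnings;
    use FileCache maxopen => 16;   # keep at most 16 handles open at a time

    for my $i (1 .. 50) {
        my $path = "out_$i.txt";   # hypothetical output names
        no strict 'refs';          # FileCache uses the path as a symbolic filehandle
        cacheout $path;            # opens with '>' the first time, '>>' on reopen
        print $path "record $i\n";
    }
    ```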

      Sorry, I could not post the exact compiling code. The issue was that the disk was full after 138 files. However, I also found an issue in my code: when I split the file, I am not doing close FILEHNDL on each chunk.
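      For that failure mode, the close is exactly where "disk full" often surfaces, because the last buffered chunk is flushed there. A sketch of a split loop that closes and checks every output handle (the chunk size, file names, and input lines are hypothetical stand-ins):

      ```perl
      use strict;
      use warnings;

      # Hypothetical split: N lines per output file, closing each handle
      # as soon as its chunk is done so handles don't accumulate.
      my $lines_per_file = 3;
      my @lines = map { "line $_\n" } 1 .. 7;   # stand-in for the real input

      my ($count, $chunk, $out) = (0, 0, undef);
      for my $line (@lines) {
          if ($count % $lines_per_file == 0) {
              if ($out) {
                  # A full disk is frequently reported here, at the flush
                  close $out or die "close failed (disk full?): $!";
              }
              $chunk++;
              open $out, '>', "chunk_$chunk.dat"
                  or die "cannot open chunk_$chunk.dat: $!";
          }
          print {$out} $line or die "write failed: $!";
          $count++;
      }
      close $out or die "close failed (disk full?): $!" if $out;
      ```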