Your "if ... elsif ... elsif ..." structure is really not sustainable if you ever need to adapt to input files of arbitrary length. Personally, given a stable (but potentially changeable) csv file (or other database-like source), I would be inclined to use CGI and have a process that delivers a user-specified quantity of data rows starting at a user-specified point, with buttons to page back and forth -- similar to what is done here at the Monastery ("Nodes You Wrote" and Perl Monks User Search being very good examples).
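The paging arithmetic behind such a CGI front end is small: given a total row count, a starting row, and a page size, you just clamp a window and compute where the "Prev" and "Next" links should start. A minimal sketch (the sub name page_window and its parameters are my own choices for illustration, not part of any module):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Given the total number of data rows, a 0-based starting row, and a
# page size, return the 0-based index range to display plus the start
# rows for the "Prev" and "Next" links (undef when at either end).
sub page_window {
    my ( $total, $start, $count ) = @_;
    $start = 0 if $start < 0;
    $start = $total - 1 if $total and $start >= $total;
    my $last = $start + $count - 1;
    $last = $total - 1 if $last > $total - 1;
    my $prev = $start > 0
        ? ( $start - $count < 0 ? 0 : $start - $count )
        : undef;
    my $next = $last < $total - 1 ? $last + 1 : undef;
    return ( $start, $last, $prev, $next );
}

# e.g. 23 rows, 5 per page, currently at row 20:
my ( $first, $last, $prev, $next ) = page_window( 23, 20, 5 );
print "show rows $first..$last; prev page starts at $prev\n";
```

The CGI script would then read start and count from the query string, print rows $first..$last, and emit links carrying $prev and $next back as the new start value.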
But if you just want to generate a set of static html files from your data, I'd do that like this:
    #!/usr/bin/perl
    use strict;
    use warnings;

    my $input = ( @ARGV and -f $ARGV[0] ) ? shift : "C:/csv_in/test.csv";
    open( my $ifh, "<", $input ) or die "$input: open failed: $!\n";

    my $out_num = my $out_count = 0;
    my $ofh;

    while ( <$ifh> ) {
        if ( ! defined( $ofh ) or $out_count == 5 ) {
            if ( $ofh ) {
                # print closing html stuff (page trailer, etc)...
                close $ofh;
            }
            my $out_name = sprintf( "C:/html_out/output_%d.html", ++$out_num );
            open( $ofh, ">", $out_name ) or die "$out_name: cannot open: $!\n";
            # print opening html stuff...
            $out_count = 0;
        }
        # extract fields from data row and stuff it into html...
        print $ofh ...;
        $out_count++;
    }
    # print closing html content to current $ofh ...

That should avoid the problem you were having with so much of the data being absent from the output files.
(Update: I had forgotten that "Nodes You Wrote" was a personal tweak to one's personal "nodelet" set -- which is easy to set up, just follow the original link above -- whereas "Perl Monks User Search" is the direct link to the facility.)
In reply to Re: Only last record is written to the output file instead of all records
by graff
in thread Only last record is written to the output file instead of all records
by valavanp