in reply to string gets front truncated


I don't know what you are trying to achieve, but I once had a similar requirement: for an XML file, strip all the XML tags, bring the data onto one line, and calculate a CRC32 checksum on it. This was a requirement from one of our clients.

Why don't you try something like the below?
# call as: ConvertFindLaw.pl PilotLifeFindLaw.htm
use 5.010;
use strict;
use warnings;

my $wholeFile;
my $INPUT_FILE = shift;
open(my $FH, '<', $INPUT_FILE) or die "Bad $INPUT_FILE: $!";
while (<$FH>) {
    # print $_;             # uncomment to print every line as it is read
    chomp $_;               # this makes the difference if you want everything on one line
    $wholeFile .= qq~$_~;   # just append the quoted line, building one long string
}
$wholeFile .= qq~\n~;       # one trailing newline after the loop, so wc -l on the result still reports one line
close($FH);
print $wholeFile;           # front truncated here
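For the XML/CRC32 part mentioned above, a minimal sketch of the checksum step, appended after the close($FH) in the script above, might look like this. It assumes the String::CRC32 module from CPAN is installed, and it uses a naive regex for tag stripping rather than a real XML parser:

use String::CRC32;          # assumed CPAN module providing crc32()

my $data = $wholeFile;      # the one-line string built above
$data =~ s/<[^>]*>//g;      # crude tag stripping; only adequate for simple markup
my $crc = crc32($data);     # 32-bit CRC of the remaining data
printf "crc32 = %08x\n", $crc;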

* You should remember: if you are putting the resultant string into a file and want to open it to look at it, some editors have a limit on the number of characters they can display on a single line. For example, on AIX 5 my vi editor cannot display more than 2048 characters on a single line.

As some of the monks have said, unless you explain what you are trying to achieve, it is difficult to suggest a solution.

Re^2: string gets front truncated
by hsfrey (Beadle) on Jul 30, 2008 at 04:08 UTC
    > i guess some editors have a limit on the number of characters they can display on a single line <

    I'm not reading it in an editor - I'm just printing it out in the DOS box. And again, whatever I'm going to use it for, shouldn't this be allowed in Perl?

    BTW, it doesn't seem to be caused by a memory shortage - I killed about 6 programs that were running simultaneously, and the front-truncation happened in exactly the same place.
      And again, whatever I'm going to use it for, shouldn't this be allowed in perl?
      Yes, it should be, and it is. I think everyone responding is in no doubt that the problem is something other than the front of the string magically disappearing.

      Try putting this in place of the print (in the chomp-less version):

      printf "total of %d bytes read (%d including carriage returns)\n", length($wholeString), $. + length($wholeString);
      and comparing that to the size of the input file as reported by the dir command.
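      If you would rather have the script do the comparison itself, a rough variation on that idea could look like the following. It belongs right after the while loop, before close(FH), so that $. still holds the line count; $wholeString and $INPUT_FILE are the variable names from the posted scripts, and it assumes a CRLF-terminated file that ends with a newline:

      my $on_disk = -s $INPUT_FILE;               # file size in bytes, as dir reports it
      my $in_mem  = length($wholeString) + $.;    # string length plus one CR per line (CRLF file)
      printf "disk: %d bytes, in memory: %d bytes%s\n",
          $on_disk, $in_mem, $on_disk == $in_mem ? '' : '  <-- mismatch';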
      I'm just printing it out in the DOS box.
      I've seen console windows omit bits of the output when flooded with data. Have you tried doing ConvertFindLaw.pl PilotLifeFindLaw.htm >tempfile and looking to see what's in tempfile?
      In agreement with ysth, the problem is most likely with your DOS window.

      DOS windows have a limit on the screen buffer size they can display; try increasing it, but even the maximum value might not be enough. So either use Windows PowerShell or print the string to a file.
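      As a rough illustration of the "print it to a file" option (the output file name here is just an example):

      open(my $OUT, '>', 'wholefile.txt') or die "Cannot write wholefile.txt: $!";
      print $OUT $wholeFile;    # the long string built by the script above
      close($OUT);
      # then inspect wholefile.txt with a pager or an editor that copes with very long lines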