Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi monks, I wondered if you could help. How can I get the following bit of code to read the file contents into a scalar rather than the @data array shown?
while (<$file>) {
    push( @data, "$_\n" );
}
print @data;
I need to preserve the entire "$_\n" and need it in a scalar, as that is what is used throughout the program. I have also tried:
$data = do { local $/; $_ };
with no success. Thanks.

Replies are listed 'Best First'.
Re: slurping into scalars
by rob_au (Abbot) on Dec 16, 2002 at 10:07 UTC
    You are *so* close on this one - I think the following is what you are after ...

    $data = do { local $/ = undef; <$file> };

    This line clears the input record separator, $/, which is a newline by default and defines Perl's concept of what a "line" is. With $/ undefined, the first read of the file handle, <$file>, slurps the entire file into the variable. This behaviour is documented in perlvar.
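    For instance, a minimal self-contained sketch of the whole pattern (the filename data.txt and the error message are only illustrative):

    # Slurp an entire file into one scalar; $/ is restored when the do-block exits.
    open( my $file, '<', 'data.txt' ) or die "Cannot open data.txt: $!";
    my $data = do { local $/ = undef; <$file> };
    close $file;

    print $data;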

     

    Update

    See particle's post later in this thread for an alternate method which also harnesses the magic of @ARGV, allowing entire files to be slurped into either an array or a scalar with a single line of code. For example:

    [@data|$data] = do { local( $/ ) = wantarray ? $/ : undef; local( @ARGV ) = @_; <> };
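    As a rough illustration (not particle's code verbatim), that one-liner wraps naturally into a small sub whose calling context picks the behaviour; the sub name and filename below are my own choices:

    # Hypothetical wrapper: list context gives lines, scalar context gives the whole file.
    sub slurp {
        local( $/ )    = wantarray ? $/ : undef;   # keep "\n" for lines, undef to slurp
        local( @ARGV ) = @_;                       # let <> read from the named file(s)
        return <>;
    }

    my $whole = slurp( 'data.txt' );    # entire file in one string
    my @lines = slurp( 'data.txt' );    # one element per line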

     

    perl -le 'print+unpack("N",pack("B32","00000000000000000000000111110111"))'

      Is it possible to alter this so that every newline is preserved and each line prints on its own line? For some reason this merges all the lines together. ;-)
        This is not normal behaviour, and we would need to see some code to understand what is happening. Altering the value of the input record separator, $/, in the manner described has no effect on the substance of the data read into the scalar variable.
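        To convince yourself that the newlines survive the slurp, a quick check along these lines (the filename is hypothetical) can help:

        # The slurped scalar still contains every embedded newline.
        open( my $fh, '<', 'data.txt' ) or die "Cannot open data.txt: $!";
        my $data = do { local $/ = undef; <$fh> };
        close $fh;

        my $newlines = () = $data =~ /\n/g;
        print "read ", length($data), " characters containing $newlines newlines\n";
        print $data;    # prints with the original line breaks intact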

         

        perl -le 'print+unpack("N",pack("B32","00000000000000000000000111111000"))'

        Is it possible to alter this so that every newline is preserved and each line prints on its own line? For some reason this merges all the lines together.
        $_ = join("\n", <FILE>);
        Adds a second \n to every line. This may be what you are looking for. Seems a bit odd though.

        I have done this kind of thing (with some differences) when doing quick and dirty text->html output. Say you want to insert <p> tags where you have blank lines but otherwise leave the text untouched; then the following code works:

        do {
            local $/ = $/ . $/;              # records are now separated by blank lines ("\n\n")
            print join( '<p>', <FILE> );
        };
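        For comparison, a sketch using Perl's built-in paragraph mode, $/ = "", which treats any run of blank lines as a single separator (the filename and lexical filehandle here are illustrative):

        # Paragraph mode: each read returns one blank-line-delimited chunk.
        open( my $fh, '<', 'some.txt' ) or die "Cannot open some.txt: $!";
        print join( '<p>', do { local $/ = ""; <$fh> } );
        close $fh;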

        Dingus


        Enter any 47-digit prime number to continue.
Re: slurping into scalars
by thinker (Parson) on Dec 16, 2002 at 10:15 UTC
    Hi,

    You want something like
    #!/usr/bin/perl -w
    use strict;

    open FILE, "DATA.DAT" or die $!;

    my $scalar;
    {
        local $/ = undef;   # slurp mode on
        $scalar = <FILE>;
    }

    print $scalar;
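    The same idea reads a little more safely with a lexical filehandle and the three-argument open; a sketch, keeping the DATA.DAT name from above:

    #!/usr/bin/perl
    use strict;
    use warnings;

    open( my $fh, '<', 'DATA.DAT' ) or die "Cannot open DATA.DAT: $!";
    my $scalar = do { local $/; <$fh> };    # slurp mode for this block only
    close $fh;

    print $scalar;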
    hope this helps

    thinker
Re: slurping into scalars
by particle (Vicar) on Dec 16, 2002 at 18:53 UTC

    although rob_au and thinker have provided valid responses, i prefer to use a subroutine and stick it in a utility module. besides providing reusability, it does not pollute your namespace with filehandle names. here's a sample from my Utl module:

    package Utl;

    require 5.006_001;

    use strict;
    use warnings;

    ## a library of categorized utilities and functions

    our $VERSION = 0.03;

    our %slurp = (

        ## slurp a file to a scalar
        ## pass filename (as scalar)
        ## returns contents of file (scalar context)
        to_scalar => sub { local( *ARGV, $/ ); @ARGV = @_; <> },

        ## slurp one or more files to an array
        ## pass filename(s) (as scalars)
        ## returns contents of file(s) (list context)
        ## returns the next line only (scalar context)
        to_array => sub { local *ARGV; @ARGV = @_; <> },

    );

    ### more utilities follow...

    you'd use it like so:

    my $file = $Utl::slurp{to_scalar}->( $filename );

    ## or for an array...
    my @file = $Utl::slurp{to_array}->( $filename );
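    And because to_array takes one or more names, several files can be gathered in a single call; the filenames here are only examples:

    ## slurp two files into one array of lines
    my @lines = $Utl::slurp{to_array}->( 'first.log', 'second.log' );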

    use it in good health.

    ~Particle *accelerates*

Re: slurping into scalars
by Juerd (Abbot) on Dec 30, 2002 at 19:54 UTC

    use File::Slurp;

    my $data = read_file $filename;
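    read_file is context-sensitive, so the same module covers the array case too; a short sketch, assuming File::Slurp is installed from CPAN and $filename is defined:

    use File::Slurp;

    my $data  = read_file $filename;    # whole file as one string (scalar context)
    my @lines = read_file $filename;    # one element per line (list context)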