jjohhn has asked for the wisdom of the Perl Monks concerning the following question:

Given a file of strings, one string per line, I want to write a script to make the file usable by SQL: each string should lie within single quotes, a comma should follow each quoted string (except the last), and the whole bunch should be enclosed by parens. I'd prefer to read the file in line by line, but I am wondering if that will let me accomplish what I want to do. For a file:

one
two
three

I want:

('one',
'two',
'three')

This sed script seems to work:

sed -e "s/\(.*\)/'\1',/g" -e "1s/\(.*\)/(\1/" -e "\$s/\(.*\),/\1)/"
but I was hoping perl could do it more readably or simply. The last line, where the comma should be missing, is hard to achieve. This is what I have so far:

use strict;
my $line;
while (<>) {
    chomp;
    s/^/'/g;
    s/$/',/g;
    if ( $. == 1 ) { s/^/(/; }
    print $_, "\n";
    $line = $_;
    # this is where I am stuck; I need to test if we are at the last line.
}
but you can see that the last line isn't quite right. Any thoughts?

Replies are listed 'Best First'.
Re: wanting to make a file of strings usable by SQL
by BrowserUk (Patriarch) on Dec 27, 2005 at 03:24 UTC
    this is where I am stuck; I need to test if we are at the last line.

    Why not do your last line processing after the while loop?
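    For instance (not BrowserUk's code, just a sketch of the idea: hold each line over to the next iteration, so the line still in hand when the loop ends is the last one and gets its closing paren instead of a comma; the DATA block is made up for the example):

```perl
use strict;
use warnings;

# Terminate the *previous* line inside the loop; the line held over
# when the loop ends is the last one, so finish it after the loop
# with the closing paren and no comma.
my $prev;
print "(";
while ( my $line = <DATA> ) {
    chomp $line;
    print "'$prev',\n" if defined $prev;
    $prev = $line;
}
print "'$prev')\n";

__DATA__
one
two
three
```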


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      Within the while loop I believe I need to test if I am at the last line of the file; if I am, I exit the loop and process the last line outside the loop. sed avoids that difficulty because it lets me overwrite my last line, removing the trailing comma and adding the closing paren.

      I am thinking that reading the whole file into an array might turn out to be the simplest way to do this after all.

        Try this

        #! perl -slw
        use strict;

        chomp( $_ = <DATA> );
        print "('$_',";

        while( <DATA> ) {
            chomp;
            $_ = "'$_'";
            last if eof( DATA );
            print "$_,";
        }
        print "$_)";

        __DATA__
        one
        two
        three
        four


        No, you just leave the line unterminated at the end of the loop and then check at the next iteration whether there was a previous line that needs terminating. The only time when that isn’t true is on the first line, which is a very simple condition. This leads to my trivial code below.

        Makeshifts last the longest.

        I am thinking that reading the whole file into an array might turn out to be the simplest way to do this after all.

        Well, if your input files are not huge and you don't make this into a (bad) habit, that should be fine:

        my @lines = map { chomp; "'$_'" } <>;
        $" = ",\n";
        print "(@lines)\n";

        Mind you: it can be done in less obscure ways, and it is certainly not the best one. Just wanted to show it to add one entry to the TMTOWTDI bag...

        PS: even more obscurely:

        $"=",\n"; print "(@{[map {chomp; qq|'$_'|} <>]})\n";
Re: wanting to make a file of strings usable by SQL
by helphand (Pilgrim) on Dec 27, 2005 at 03:44 UTC
    use strict;

    my @lines = <>;
    chomp @lines;
    my $sql = '(' . join( ",\n", map { "'$_'" } @lines ) . ")\n";
    print $sql;
      ++helphand for using join.
Re: wanting to make a file of strings usable by SQL
by Aristotle (Chancellor) on Dec 27, 2005 at 04:41 UTC
    print '(';
    while ( <> ) {
        chomp;
        print ",\n" unless $. == 1;
        print "'$_'";
    }
    print ")\n";

    Update: $. == 1 of course, not $. == 0

    Makeshifts last the longest.

Re: wanting to make a file of strings usable by SQL
by pKai (Priest) on Dec 27, 2005 at 11:59 UTC
    I'd prefer to read the file in line by line, but I am wondering if that will let me accomplish what I want to do.

    As previous posters showed, it is possible and comfortably even.

    If you want to stick to your original approach, you could even use eof to go on from where you were stuck.
    Though it's unnecessarily complicated to stick too close to the sed emulation.
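    A sketch of that eof approach, picking up from the OP's loop (a DATA block stands in for the input file; with the OP's <> you would use bare eof instead of eof(DATA)):

```perl
use strict;
use warnings;

while ( <DATA> ) {
    chomp;
    $_ = "'$_'";              # quote the string
    $_ = "($_" if $. == 1;    # opening paren on the first line
    if ( eof( DATA ) ) {      # true once the last line has been read
        print "$_)\n";        # last line: closing paren, no comma
    }
    else {
        print "$_,\n";
    }
}

__DATA__
one
two
three
```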

    As (I assume) such SQL argument lists are typically short (comparably), it shouldn't be a problem to handle the whole list in memory, as demonstrated by Skeeve.

    Personally, I'm a map maniac, so my typical solution to this looks like:

    print '(', join(',' => map {chomp; qq('$_')} <>), ')';

    You also might want to meditate on what will happen when your input data contains single quotes, which could invalidate your SQL syntax.
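    One common remedy, sketched here, is the standard SQL convention of doubling embedded single quotes (the DATA block is made up for the example):

```perl
use strict;
use warnings;

# Escape embedded single quotes by doubling them, per standard SQL:
#   O'Brien  ->  'O''Brien'
my @lines = map { chomp; ( my $s = $_ ) =~ s/'/''/g; "'$s'" } <DATA>;
print "(", join( ",\n", @lines ), ")\n";

__DATA__
one
O'Brien
three
```

    If a database handle is available, DBI's $dbh->quote method, or better yet placeholders, handles this and other quoting corner cases for you.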

Re: wanting to make a file of strings usable by SQL
by Skeeve (Parson) on Dec 27, 2005 at 07:00 UTC
    {
        local $/;
        $_ = <>;
    }
    chomp;
    s/\n/',\n'/g;
    print q"('", $_, "')\n";

    s$$([},&%#}/&/]+}%&{})*;#$&&s&&$^X.($'^"%]=\&(|?*{%
    +.+=%;.#_}\&"^"-+%*).}%:##%}={~=~:.")&e&&s""`$''`"e