in reply to Re^3: How to optimise " csv into mysql " using Text:CSV_XS and DBD::Mysql
in thread How to optimise " csv into mysql " using Text:CSV_XS and DBD::Mysql

Ah, quite right... seems to be trusted data. But still quite right.


Re^5: How to optimise " csv into mysql " using Text:CSV_XS and DBD::Mysql
by chacham (Prior) on Aug 06, 2015 at 16:29 UTC

    seems to be trusted data

    Trusted data does make a good case for a quick and dirty solution, certainly if it is an ad hoc, temporary solution. However, it may engender the use of dynamic sql elsewhere. Or worse, stick around for a while and even get copied into other scripts because "it works."

    Personally, i often find that taking the time (and pain) of doing it correctly teaches me a few things.

      To be honest, I don't exactly get the point. What is actually wrong with the way I'm using placeholders (building the list)? The thing is that I'm running this piece of code to import multiple files with different column names and column counts, so building the placeholder list looked like a good idea to me (and the only one I could think of). I just can't find any better solution.
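Varying column counts and dynamically built placeholders can coexist safely: the '?' list is harmless to generate, and the column names themselves can be quoted and checked before they reach the statement text. A minimal sketch (not the poster's actual code — the %known_columns allowlist and the quote_ident helper are illustrative; with a live handle, DBI's $dbh->quote_identifier is the better choice, as the backtick quoting here is MySQL-specific):

```perl
use strict;
use warnings;

# Hypothetical allowlist of columns the importer may touch; CSV header
# names are checked against it before being used in SQL.
my %known_columns = map { $_ => 1 } qw(id name email);

# MySQL-style identifier quoting: double embedded backticks, then wrap.
sub quote_ident {
    my ($name) = @_;
    $name =~ s/`/``/g;
    return "`$name`";
}

sub build_insert_sql {
    my ($table, @columns) = @_;

    # Refuse any column name not on the allowlist -- this is what keeps
    # a hostile CSV header out of the SQL text.
    $known_columns{$_} or die "unexpected column '$_'\n" for @columns;

    # Identifiers are quoted; values stay as '?' placeholders, so only
    # vetted identifiers ever become part of the statement string.
    my $fields = join ', ', map { quote_ident($_) } @columns;
    my $marks  = join ', ', ('?') x @columns;

    return sprintf 'INSERT INTO %s (%s) VALUES (%s)',
        quote_ident($table), $fields, $marks;
}

print build_insert_sql('people', qw(id name email)), "\n";
# INSERT INTO `people` (`id`, `name`, `email`) VALUES (?, ?, ?)
```

The generated string is then handed to prepare() once per file, and execute() is called per row with the CSV fields as bind values.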

        The (major) reason placeholders are better than variables in a dynamic SQL statement is that placeholders are strictly typed and cannot (generally) do anything unintended.

        In the script, $fieldList defines the list of fields. Since a variable still ends up in the statement text, little Bobby Tables lives on.
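To make the Bobby Tables point concrete, here is a sketch (with an invented hostile header, not data from the original thread) of what happens when a CSV header is interpolated straight into the statement text the way $fieldList is:

```perl
use strict;
use warnings;

# A hostile CSV header line: the "column name" carries SQL of its own.
my @headers = ('name', 'email) VALUES (0, 0); DROP TABLE users; --');

# Interpolating the headers directly, as the script does with $fieldList,
# puts that SQL verbatim into the statement handed to the server.
my $fieldList = join ', ', @headers;
my $sql = "INSERT INTO users ($fieldList) VALUES (?, ?)";

print "$sql\n";
# INSERT INTO users (name, email) VALUES (0, 0); DROP TABLE users; --) VALUES (?, ?)
```

Whether the server actually runs the smuggled statement depends on driver settings (DBD::mysql disables multi-statements by default), but the variable has already crossed the line from data into SQL text — placeholders, by contrast, keep row values on the data side no matter what they contain.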

        The idea is to never have a variable anywhere in your SQL statements.

        Ultimately, it's just an FYI. Keep it in mind and make the best decision for your particular situation.