in reply to Automation for pulling huge data from greenplum database in perl.

sarf13:

It would be hard to say whether it would be faster or slower since you've given such a high-level description. However, I find that shell programming is *quite* a bit harder than perl. (In fact, one of the reasons I began coding in perl was to avoid shell scripts.)

If your spreadsheets are going to be large, you might consider Excel::Writer::XLSX instead of Spreadsheet::WriteExcel. The chief advantage is that it uses the .xlsx format which allows for more rows and columns in your spreadsheets.

What you're wanting to do looks like a perfect task for perl. In fact, just a couple of days ago I posted a script to create a spreadsheet from a database query.
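
For context, here's a minimal sketch of that sort of script (not the one I posted, just an illustration), assuming Greenplum is reached through DBD::Pg since it speaks the PostgreSQL protocol; the DSN, credentials, table and file names below are placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use Excel::Writer::XLSX;

    # Placeholder connection details: adjust the DSN, user and password
    # for your own Greenplum instance.
    my $dbh = DBI->connect(
        'dbi:Pg:dbname=mydb;host=gp-master;port=5432',
        'user', 'password',
        { RaiseError => 1 },
    );

    my $workbook  = Excel::Writer::XLSX->new('report.xlsx');
    my $worksheet = $workbook->add_worksheet();

    # Placeholder query: substitute the real one.
    my $sth = $dbh->prepare('SELECT id, name, amount FROM some_table');
    $sth->execute;

    # Column headings on row 0, then stream the results one row at a
    # time so the whole result set never sits in memory at once.
    $worksheet->write_row( 0, 0, $sth->{NAME} );

    my $row = 1;
    while ( my $aref = $sth->fetchrow_arrayref ) {
        $worksheet->write_row( $row++, 0, $aref );
    }

    $workbook->close;
    $dbh->disconnect;

Since the .xlsx format allows 1,048,576 rows per worksheet, that loop can run well past the 65,536-row ceiling of the older .xls format before you'd need to think about a second worksheet.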

...roboticus

When your only tool is a hammer, all problems look like your thumb.


Re^2: Automation for pulling huge data from greenplum database in perl.
by sarf13 (Beadle) on Sep 22, 2012 at 20:02 UTC

    Hi roboticus, thanks for your reply.

    I agree with your point that shell scripting is a bit tough, but it's the best current option for us.

    You also mentioned the Excel::Writer::XLSX module; I actually haven't done any R&D on it yet. The Spreadsheet::WriteExcel module generates Excel files for Office 2003, and as far as I know it can accommodate fewer than three lakh (about 300,000) records. Let's see how it works.

      sarf13:

      Excel::Writer::XLSX is written by John McNamara (he's here frequently), the same author responsible for Spreadsheet::WriteExcel, and both modules are quite handy. If you need Office 2003 compatibility, then you're probably better off with Spreadsheet::WriteExcel. But you'll have to keep the row limit in mind (not really all *that* big a deal, you can just start a new worksheet once you hit a certain row number).
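
      As an untested sketch of that rollover trick (the 65,536 figure is the per-worksheet row limit of the .xls format, and the data loop below is only a stand-in for a real DBI fetch):

          #!/usr/bin/perl
          use strict;
          use warnings;
          use Spreadsheet::WriteExcel;

          # The legacy .xls format tops out at 65,536 rows per worksheet.
          my $MAX_ROWS = 65_536;

          my $workbook    = Spreadsheet::WriteExcel->new('report.xls');
          my $sheet_count = 0;
          my $row         = $MAX_ROWS;    # forces a worksheet on the first pass
          my $worksheet;

          # Dummy data loop for illustration; in the real script this would
          # be a DBI fetch loop over the database result set.
          for my $i ( 1 .. 100_000 ) {
              my $data = [ $i, "record_$i" ];

              # Roll over to a fresh worksheet whenever the limit is reached.
              if ( $row >= $MAX_ROWS ) {
                  $worksheet = $workbook->add_worksheet( 'Sheet' . ++$sheet_count );
                  $row = 0;
              }

              $worksheet->write_row( $row++, 0, $data );
          }

          $workbook->close;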

      ...roboticus

      When your only tool is a hammer, all problems look like your thumb.