in reply to Reading and Writing from / to XLSX file (optimization)

I am trying to optimize the script not only in terms of speed but also regarding resource management. I am reading a file (primary file) that contains almost 4500 lines. (Emphasis mine)
So? I'd start worrying at four and a half million lines. If Excel itself doesn't choke on the file, I doubt ParseExcel will. Why don't you just mock up a really large dataset, say 100,000 lines, and see how your program does?
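As a rough sketch of that crash test: a mock workbook of that size can be generated with Excel::Writer::XLSX (the usual companion to Spreadsheet::ParseXLSX on CPAN). The filename, row count, and column layout below are made up for illustration; swap in whatever shape your real data has:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Excel::Writer::XLSX;

# Hypothetical output file -- adjust name and columns to match your real data.
my $workbook  = Excel::Writer::XLSX->new('mock_data.xlsx');
my $worksheet = $workbook->add_worksheet();

for my $row ( 0 .. 99_999 ) {
    # Fill a few columns with predictable dummy values.
    $worksheet->write( $row, 0, "id_$row" );
    $worksheet->write( $row, 1, $row * 2 );
    $worksheet->write( $row, 2, sprintf( "value_%06d", $row ) );
}

$workbook->close();
```

Then point the existing script at mock_data.xlsx and time it, e.g. with your shell's time builtin or the core Benchmark module, to see whether the optimization effort is actually needed.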


holli

You can lead your users to water, but alas, you cannot drown them.

Re^2: Reading and Writing from / to XLSX file (optimization)
by thanos1983 (Parson) on Nov 01, 2017 at 10:22 UTC

    Hello holli,

    Thanks for the time and effort reading and replying to my question.

    As I said, "I have never worked with so big files so maybe the size in reality is really small and the optimization is not necessary." I will try to run some crash tests as you suggested: "mock up a really large dataset, say 100,000 lines and see how your program does," and see how the program responds.

    BR / Thanos

    Seeking for Perl wisdom...on the process of learning...not there...yet!