An excellent question. I'm not really doing anything algorithmic.
I suppose I should have been more specific. I want to translate a few million dates into Unix seconds, so I'm using: $us = &UnixDate( &ParseDate($_), '%s' );
Perhaps that is the slow part?
OK... just out of curiosity: why aren't you batching this up and letting it run over a weekend?
The Perfect is the Enemy of the Good.
(Several months late to the discussion, but for the benefit of other monks who might wander this way...)
Date::Manip is slow even for parsing dates that are in a regular (that is, consistent) format. It's slow because it handles many, many date/time formats (and handles them well, in my experience). It'll handle "2005-11-03T10:00:00" and "Nov 3, 2005 at 10AM" and "today at 10AM". (At the time of writing, these are all interpreted as the same date-time.) Even the author of the module suggests that Date::Manip is probably not the module you should be using.
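For illustration, a minimal sketch of Date::Manip's flexibility (the sample strings are assumptions; the epoch values printed depend on your local timezone, which Date::Manip uses by default):

```perl
use strict;
use warnings;
use Date::Manip;   # exports ParseDate, UnixDate, etc.

# Wildly different input formats, same parsing call
for my $d ('2005-11-03T10:00:00', 'Nov 3, 2005 at 10AM') {
    my $parsed = ParseDate($d) or die "can't parse '$d'";
    print UnixDate($parsed, '%s'), "\n";   # seconds since the epoch
}
```

That generality is exactly what costs time: ParseDate tries many patterns before settling on one.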
I came upon this thread after a log-parsing script I'd written was taking forever to finish. The log files I had listed dates in the form DD/Mon/YEAR HH:MI:SS. With Date::Manip, parsing a 10,000-line file took 98 seconds. Using DateTime took 27. Using Time::Local (plus a hash for the month names) reduced the total time to under 5 seconds.
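A minimal sketch of the Time::Local approach described above (the regex, the sample timestamp, and the choice of timegm — i.e. treating the log times as UTC rather than local time — are assumptions; swap in timelocal if your logs are in local time):

```perl
use strict;
use warnings;
use Time::Local 'timegm';

# Month-name lookup, 0-based as timegm expects
my %mon = do {
    my $i = 0;
    map { $_ => $i++ } qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
};

sub to_epoch {
    my ($stamp) = @_;                     # e.g. "03/Nov/2005 10:00:00"
    my ($mday, $mname, $year, $h, $m, $s) =
        $stamp =~ m{^(\d{2})/(\w{3})/(\d{4}) (\d{2}):(\d{2}):(\d{2})$}
        or return;                        # undef on lines that don't match
    return timegm($s, $m, $h, $mday, $mon{$mname}, $year);
}

print to_epoch('03/Nov/2005 10:00:00'), "\n";   # 1131012000 (as UTC)
```

One regex match and one integer-arithmetic call per line, with no format guessing, is why this runs an order of magnitude faster than Date::Manip on a fixed-format log.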
YMMV, but I wanted to point out that despite all its goodness, Date::Manip is slow and is often overkill.