in reply to Re^4: Our perl/xs/c app is 30% slower with 64bit 5.24.0, than with 32bit 5.8.9. Why?
in thread Our perl/xs/c app is 30% slower with 64bit 5.24.0, than with 32bit 5.8.9. Why?
First: I did say "Beyond those guesses,". The information provided by the OP so far consists solely of the build parameters for the two builds. I compared those two sets and attempted to reason about possibilities.
A non-problem that allows you to trivially DoS any web server where input from the client
Hm. That problem was addressed way back in 2003, in 5.8.1. From the 5.8.1 delta (perl581delta):
Mainly due to security reasons, the "random ordering" of hashes has been made even more random. Previously while the order of hash elements from keys(), values(), and each() was essentially random, it was still repeatable. Now, however, the order varies between different runs of Perl.
Perl has never guaranteed any ordering of the hash keys, and the ordering has already changed several times during the lifetime of Perl 5. Also, the ordering of hash keys has always been, and continues to be, affected by the insertion order.
The added randomness may affect applications.
One possible scenario is when output of an application has included hash data. For example, if you have used the Data::Dumper module to dump data into different files, and then compared the files to see whether the data has changed, now you will have false positives since the order in which hashes are dumped will vary. In general the cure is to sort the keys (or the values); in particular for Data::Dumper to use the Sortkeys option. If some particular order is really important, use tied hashes: for example the Tie::IxHash module which by default preserves the order in which the hash elements were added.
More subtle problem is reliance on the order of "global destruction". That is what happens at the end of execution: Perl destroys all data structures, including user data. If your destructors (the DESTROY subroutines) have assumed any particular ordering to the global destruction, there might be problems ahead. For example, in a destructor of one object you cannot assume that objects of any other class are still available, unless you hold a reference to them. If the environment variable PERL_DESTRUCT_LEVEL is set to a non-zero value, or if Perl is exiting a spawned thread, it will also destruct the ordinary references and the symbol tables that are no longer in use. You can't call a class method or an ordinary function on a class that has been collected that way.
The hash randomisation is certain to reveal hidden assumptions about some particular ordering of hash elements, and outright bugs: it revealed a few bugs in the Perl core and core modules.
To disable the hash randomisation in runtime, set the environment variable PERL_HASH_SEED to 0 (zero) before running Perl (for more information see PERL_HASH_SEED in the perlrun manpage), or to disable the feature completely in compile time, compile with -DNO_HASH_SEED (see INSTALL).
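To make that concrete, here is a minimal sketch of the cures the delta describes; the hash contents are invented purely for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Data::Dumper;

    # Made-up data, purely for illustration.
    my %config = ( host => 'localhost', port => 8080, user => 'app', debug => 0 );

    # Naive dump: the key order can differ from one run of perl to the next,
    # so diffing dump files between runs gives false positives.
    print Dumper( \%config );

    # The cure the delta recommends: have Data::Dumper sort the keys,
    # which makes the output identical across runs.
    $Data::Dumper::Sortkeys = 1;
    print Dumper( \%config );

    # If insertion order really matters, use a tied hash instead
    # (Tie::IxHash is a CPAN module, not core):
    #   use Tie::IxHash;
    #   tie my %ordered, 'Tie::IxHash';
    #   %ordered = ( host => 'localhost', port => 8080 );
    #
    # And, as the delta says, the 5.8.1-era randomisation can be switched off
    # for a single run via the environment:
    #   PERL_HASH_SEED=0 perl yourscript.pl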
So what new problem was addressed by the 5.17 changes? (And has anyone ever seen a plausible demonstration of that "new problem"? Has there ever been a reported sighting of anyone exploiting that new problem in the field? If the change is so critical, why wasn't it back-ported to 5.10 and other earlier versions that are still being shipped with 95% of *nix distributions?)
Anyway, perl's hash handling has been getting faster, not slower, in recent years.
Agreed. Not just hash handling, but just about every aspect of Perl (save maybe string handling) has gotten faster in recent builds. Congratulations.
However, over the years there have been some weird behaviours that only affected Windows builds.
Once again I'll remind you that I was attempting to help the OP on the basis of the minimal information supplied; whilst asking him to provide more.