in reply to Re^2: cygwin ATE CPAN!!!
in thread cygwin ATE CPAN!!!

The (Open_Source) IM gang only supports the msvc compiler on windoz and my research suggests that you have to use the same (32bit) compiler and the same (i386??) optimizations to link to their binary distributions.

But doesn't your Strawberry build link to their binary distributions?
Mine used their dlls, though I (perhaps unnecessarily) rebuilt the import libs from those dlls using gendef and dlltool.
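For the record, regenerating an import lib from a dll takes two commands with the mingw-w64 gendef tool and binutils' dlltool. Something like this (CORE_RL_magick_.dll is just one of the IM dlls, shown as an example; the rest are done the same way):

C:\> gendef CORE_RL_magick_.dll
C:\> dlltool --def CORE_RL_magick_.def --dllname CORE_RL_magick_.dll --output-lib libCORE_RL_magick_.dll.a

gendef writes the .def file next to the dll, and dlltool turns it into an import lib that gcc can then link with -lCORE_RL_magick_.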

I get the same test failures as you ... though, occasionally, the montage tests succeed.

Do you know how significant those failures are? (I don't.)
They're not *obviously* of far-reaching significance:
C:\sisyphusion\Image-Magick-6.89>perl t/montage.t
1..19
.....
Test 12, signatures do not match.
  Expected: bcd96dabb454c5d25091422763b1cdecb6a69a9b02b84a5b7fa0a70f150b957c
  Computed: f1ed563cf9a446dd4516945b475bc21692037d0c12a83405a837abcbdf37bcf7
  Depth: 16
not ok 12
Test 13, signatures do not match.
  Expected: 9209b2db884fa4730eeab6c410b90e094fa305635baab7ede17270c13f6e80ad
  Computed: c09fb4e6033eb562b277dc2d8d5b61e4255b56ee9c666011d18e4fe7fbee994b
  Depth: 16
not ok 13
.....

C:\sisyphusion\Image-Magick-6.89>perl t/wmf/read.t
1..2
mean-error=1.00003051827663, maximum-error=1
not ok 1
mean-error=1.00003051827663, maximum-error=1
not ok 2
Even the slightest and most insignificant variation would result in a failed signature match.
And 1.00003051827663 is only just outside the "maximum-error".
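For anyone wondering what those signature checks compare: Image::Magick's 'signature' attribute is a digest of the decoded pixel data, so the comparison is all-or-nothing. The tests boil down to something like this (a sketch; 'input.miff' is a hypothetical stand-in for the real test image):

use strict;
use warnings;
use Image::Magick;

# Expected digest, copied from the failing montage test above.
my $expected = 'bcd96dabb454c5d25091422763b1cdecb6a69a9b02b84a5b7fa0a70f150b957c';

my $image = Image::Magick->new;
my $err   = $image->Read('input.miff');   # hypothetical stand-in image
die "$err" if "$err";

# 'signature' is the Image::Magick attribute holding the digest of the
# image's pixel data; a single differing pixel changes it completely.
my $computed = $image->Get('signature');

print $computed eq $expected ? "ok\n" : "not ok - signatures do not match\n";

That's why a depth or rounding difference anywhere in the image fails the whole test, even if the rendered output looks identical to the eye.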

Cheers,
Rob

Re^4: cygwin ATE CPAN!!!
by BrianP (Acolyte) on Dec 13, 2015 at 18:41 UTC

    Rob, >> Mine used their dlls, though I (perhaps unnecessarily) rebuilt the import libs from those dlls using gendef and dlltool.

    I finally got everything to compile and link, but I could not make it work with static linking; I had to use --enable-shared. Why would anybody prefer shared over static for, say, ImageMagick?

    When you are dashing off to work in the morning, who would want to scour the neighborhood for 4 matching tires, pressurize them uniformly, jack up the car, torque them to spec, and put the tools away, so that after half an hour of needless rigmarole you're finally off? Why?

    In the horrible old DOS days, when you had to operate in a megabyte or less, this made sense. Today, breaking find.exe into a 64k .exe and 3 DLLs totaling 4.26MB seems anachronistic.

    With disk space going for $150 per 4 TERABYTES, a 100% disk saving on 4 MB is worth:
    4E6B * $150 / 4E12B = $0.00015.
    If you have 7000 DLLs, your savings could approach a whole DOLLAR! All of cygwin that I used had 331 DLLs totaling 75MB, so shared linking saved me less than a third of a cent. The space-saving angle is PREPOSTEROUS on the face of it!
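    The sums are easy to check from the command line:

    C:\> perl -e "printf qq{4 MB: \$%.5f\n75 MB: \$%.4f\n}, 4e6*150/4e12, 75e6*150/4e12"
    4 MB: $0.00015
    75 MB: $0.0028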

    How much memory are you saving by loading a DLL containing 100 functions when you will only use 3 of them, each costing a page fault? Caching code you will never use is a 100% waste. Seeking your disk heads all over town scavenging scattered DLLs, while reading ahead MBs of stuff you will never use, is another titanic cache miss! How can you beat 1 seek, 1 read, a tiny amount of excess read-ahead, and being immediately ready for action?

    But if you already have a module loaded into memory, you can share the same image? It's like having 5 kids and one set of toys and arbitrating who gets which toy and when. Buy each kid their own toy for $5 and forget about the critical regions, setting and checking semaphores, locking and unlocking, rounding of robins, mutual assured exclusions, mutating mutexes, etc.

    There is a case, when using a library provided by a third party, where a DLL is efficient; but when you have all of the code right there, chopping it into tiny chunks and reassembling it just before you need it seems like the pinnacle of inefficiency.

    Time is $$. My main use case is crunching 219MB of UINT48, raw, RGB data AFAP. Paying a 5% - 20% performance penalty to save a MB of memory seems absurd unless you are on a tiny device with extremely limited resources. I have 32GB already paid for, so saving a meg or 2 gains me absolutely nothing.

    I must be missing something here.
    B

      I must be missing something here.

      Yeah, the part where the ecosystem (cygwin, cpan, OS, everything) was created and tailored for 32GB RAM/4TB HD systems (it wasn't).

      I finally got everything to compile and link, but I could not make it work with static linking; I had to use --enable-shared. Why would anybody prefer shared over static
      I must be missing something here.

      Yes:

      #!/usr/bin/perl
      # Count the executables in every directory on PATH and sum their sizes.
      use v5.12;
      use warnings;

      my $n     = 0;
      my $bytes = 0;
      for my $dirname (split /:/, $ENV{'PATH'}) {
          opendir my $dir, $dirname or die "Can't open $dirname: $!";
          # Plain, executable files only (-f -x is a stacked file test).
          my @exe = grep { -f -x $_ } map { "$dirname/$_" } readdir $dir;
          closedir $dir;
          say "$dirname: ", 0 + @exe;
          $n     += @exe;
          $bytes += -s $_ for @exe;
      }
      say "total: $n, $bytes";

      On my Slackware server, I count 3780 executables, most of them dynamically linked, using 765_220_787 bytes.

      Imagine a bug in a common library, like libc.so. How do you fix it?

      On a dynamically linked system, replace the broken libc.so and reboot. ld.so will link the repaired libc.so into all dynamically linked executables. Problem solved, by replacing one file of about 2 MBytes in size. (We could discuss whether rebooting is really needed or whether going through single-user mode is sufficient, but a reboot eliminates all traces of the broken library in memory.)

      On a statically linked system, ALL executables linked against the broken library have to be replaced. In the case of libc, that means all executables, except for one or two specially crafted executables that work without libc, and except for scripts. On my server, that would be about 2500 to 3000 executables (estimating that most of the 3780 executables are binaries), using most of the 700+ MBytes.

      I prefer downloading 2 MBytes to fix a bug in a single place over downloading 700 MBytes and scanning each and every executable to see if it was linked against a broken static library.
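      To put a number on it, the script above can be extended to ask ldd(1) which executables are dynamically linked against libc. A rough sketch (Linux-specific, and the match on ldd's output is approximate):

      #!/usr/bin/perl
      # Sketch: count executables on PATH that would pick up a repaired
      # libc.so automatically, i.e. those dynamically linked against it.
      use v5.12;
      use warnings;

      my $dynamic = 0;
      my $total   = 0;
      for my $dirname (split /:/, $ENV{'PATH'}) {
          opendir my $dir, $dirname or next;
          for my $exe (grep { -f -x $_ } map { "$dirname/$_" } readdir $dir) {
              $total++;
              # ldd lists the shared libraries an executable needs; it reports
              # "not a dynamic executable" for statically linked binaries and
              # scripts, so those simply won't match below.
              my $out = qx(ldd "$exe" 2>/dev/null) // '';
              $dynamic++ if $out =~ /libc\.so/;
          }
          closedir $dir;
      }
      say "$dynamic of $total executables get the fix by replacing libc.so once";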

      Also, the code of shared libraries is (ideally) kept in memory only once, whereas statically linked executables don't share identical code, wasting memory that could be used for other purposes (filesystem caching, application data).
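      On Linux you can even watch that sharing: every process maps the same libc.so file, and the kernel backs the read-only code segments with the same physical pages. A Linux-only sketch:

      #!/usr/bin/perl
      # Print this process's libc mappings from /proc/self/maps; the r-xp
      # code segment comes from the same file (and the same page-cache
      # pages) in every process that uses libc.
      use v5.12;
      open my $maps, '<', '/proc/self/maps' or die "Can't read /proc/self/maps: $!";
      print grep { /libc/ } <$maps>;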

      See also https://en.wikipedia.org/wiki/Library_%28computing%29

      Alexander

      --
      Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)