Bod has asked for the wisdom of the Perl Monks concerning the following question:
I've written Image::Square to make square images...
One of the tests I've carried out is to visually check that the new square image is taken from the correct part of the original image.
How can I convert that visual test into code?
Should I manually verify the image it is supposed to create, then take a hash of that image and compare the hash from the same process during testing? If I do it this way, I will need to use a hashing module and I'm reluctant to create a dependency that is only used in the tests. Perhaps I need to skip the test if the hashing module isn't installed.
If I do the comparison using a hash, will it work cross-platform, or can I expect tests to fail on some OSes?
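The hash-comparison idea the question describes could be sketched like this, using only the core Digest::MD5 module (the path and expected digest would be your own; nothing here is specific to Image::Square):

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);   # core module since perl 5.7.3

# Minimal sketch: digest the generated file's raw bytes so the result
# can be compared against a manually verified digest in a test.
sub file_md5 {
    my ($path) = @_;
    open my $fh, '<:raw', $path or die "Cannot open $path: $!";
    local $/;                  # slurp mode: read the whole file at once
    return md5_hex(scalar <$fh>);
}
```

MD5 itself is platform-independent: identical bytes hash identically everywhere, so any cross-platform test failure with this approach means the image bytes themselves differ between platforms, not the hashing.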
Replies are listed 'Best First'.
Re: Testing image output
by hippo (Archbishop) on Sep 07, 2023 at 10:27 UTC
"Should I manually verify the image it is supposed to create, then take a hash of that image and compare the hash from the same process during testing?"

Sounds good to me.

"If I do it this way, I will need to use a hashing module and I'm reluctant to create a dependency that is only used in the tests."

Digest::MD5 is a core module and has been since 5.7.3. You should have no qualms about relying on that one being present. Still declare it as a test dependency, just in case, for the 5.6.0 hold-outs.

🦛
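The "still skip gracefully on ancient perls" idea could look like this in a test file (a sketch, not Bod's actual test; the test body is illustrative):

```perl
use strict;
use warnings;
use Test::More;

# Treat core Digest::MD5 as a safe test dependency, but skip the whole
# file cleanly on very old perls (pre-5.7.3) where it may be absent.
BEGIN {
    eval { require Digest::MD5; Digest::MD5->import('md5_hex'); 1 }
        or plan skip_all => 'Digest::MD5 not available';
}

plan tests => 1;
is md5_hex('abc'), md5_hex('abc'), 'digest function is usable and stable';
```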
by Bod (Parson) on Sep 07, 2023 at 23:17 UTC
Thanks hippo. I've used md5_hex from Digest::MD5 and checked that it is installed, as I require Perl 5.010. Could I please have some feedback on this test file, as testing is not my strongest skill...
by eyepopslikeamosquito (Archbishop) on Sep 08, 2023 at 14:37 UTC
"Could I please have some feedback on this test file as testing is not my strongest skill"

I pulled a face the instant I saw all those ok functions! From the Basic Testing Tutorial by hippo:

"While the ok function is useful, the output is a simple pass/fail - it doesn't say how it failed ... Let's use Test::More and its handy cmp_ok function"

To convince yourself this is a worthwhile change, try running some failing test cases with your original ok and compare with cmp_ok.
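The difference is easy to see side by side. A small sketch (the width value is a made-up stand-in for whatever the test actually measures):

```perl
use strict;
use warnings;
use Test::More tests => 2;

my $got  = 500;   # e.g. the width of the generated square image
my $want = 500;

# ok() reports only pass/fail...
ok $got == $want, 'square width (ok)';

# ...while cmp_ok() also prints the got/expected values when it fails,
# which makes a wrong image size much easier to diagnose.
cmp_ok $got, '==', $want, 'square width (cmp_ok)';
```

On failure, ok prints only "not ok 1", whereas cmp_ok adds a diagnostic showing what it got and what it expected.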
by Bod (Parson) on Sep 08, 2023 at 23:22 UTC
by Bod (Parson) on Sep 13, 2023 at 21:55 UTC
by hippo (Archbishop) on Sep 13, 2023 at 23:05 UTC
by Anonymous Monk on Sep 08, 2023 at 10:48 UTC
It won't work as written. Neither coders nor decoders are stable. Does your module (I can't find it on CPAN) inherit from GD (judging by the width/height/jpeg methods)? It doesn't matter in the end; I only hope you force it to treat JPEGs as truecolor on open, because it doesn't, despite whatever its doco says. If GD converts a JPEG to a palette, it will add even more mess to the description below -- it looks like GD tunes this algorithm (truecolor-to-palette quantization) more frequently, so there's no need to go as far back as 5.010 to demonstrate. Frog is frog
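The "force truecolor on open" advice could be sketched like this; both calls are documented GD API, and the path argument is a placeholder:

```perl
use strict;
use warnings;

# Hedged sketch: open a JPEG in truecolor mode so GD does not quantize
# it to a palette image behind your back.
sub open_jpeg_truecolor {
    my ($path) = @_;
    require GD;
    GD::Image->trueColor(1);                    # make truecolor the default
    my $img = GD::Image->newFromJpeg($path, 1)  # 1 = truecolor, explicitly
        or die "cannot read $path as JPEG";
    die "GD still produced a palette image" unless $img->isTrueColor;
    return $img;
}
```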
Blame an ancient GD version? But
I'd say lossy codecs are murky waters -- avoid them in tests:
Eh? What's the matter now? My first thought was that zlib tunes its compression algorithm (despite the same "level", 0..9), but that's not the reason for the difference above -- though I strongly suspect it can influence the result for some input other than our puny frog. Above, it's just the pHYs chunk that libpng decides to include from some version on. So, what it all amounts to: try a stable (read: "obsolete") lossless coder with uncompressed output (GD can't dump raw pixels, unfortunately), short of enumerating the pixels one by one and appending their RGB values to a string. Hopefully it's OK now:
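The pixel-by-pixel fallback mentioned in passing could look like this (a sketch assuming a GD::Image object; getPixel and rgb are real GD methods):

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Build a digest from raw RGB triples, so no encoder or compressor is
# involved at all -- only GD's in-memory pixel data.
sub pixel_digest {
    my ($img) = @_;
    my ($w, $h) = $img->getBounds;
    my $raw = '';
    for my $y (0 .. $h - 1) {
        for my $x (0 .. $w - 1) {
            # getPixel returns a colour value; rgb turns it into (r, g, b)
            $raw .= pack 'C3', $img->rgb($img->getPixel($x, $y));
        }
    }
    return md5_hex($raw);
}
```

This is slow for large images, but for the small test images discussed in this thread it sidesteps codec instability entirely.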
by Bod (Parson) on Sep 08, 2023 at 23:17 UTC
by Anonymous Monk on Sep 09, 2023 at 11:23 UTC
by Bod (Parson) on Sep 08, 2023 at 09:49 UTC
"Sounds good to me"

It might sound like a good approach... but... it's failing under testing 😕 I'm not sure what could be producing this failure, other than different builds of GD producing slightly different output, or the hashing being subtly different on Linux (where it is failing) compared to Windows (where I am developing). I specified the image quality with $new->jpeg(50) to try to keep GD consistent across builds.
by hippo (Archbishop) on Sep 11, 2023 at 09:45 UTC
So, it's JPEG. :-) I agree with our Anonymous friend who wrote "no JPGs in t folder", because the same image file can't be expected to decode to the same data. Maybe use a lossless format instead for this level of testing, and then separately just confirm that using JPEGs doesn't error out? Or else see how other JPEG modules handle it in their test suites.

🦛
by Anonymous Monk on Sep 14, 2023 at 18:10 UTC
omg, ain't GD so very difficult. I'm looking at the Image-Square 0.01_4 testers matrix; what was supposed to be a walk in the park looks like a blood-covered battlefield. Half of the failures are because GD's native output format is unsupported -- who could have expected that? I'm sorry. It isn't really a problem, though, because ".gd" is just an 11-byte header plus raw data:
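Taking the post at its word about the header, hashing everything past those 11 bytes could be sketched like this (the gd method is real GD API; the 11-byte offset is the post's claim for truecolor images, not something verified here):

```perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Hash only the raw pixel payload of GD's native ".gd" dump, skipping
# the header (11 bytes for a truecolor image, per the post above),
# so no compressor or lossy codec can disturb the checksum.
sub gd_payload_digest {
    my ($img) = @_;
    my $data = $img->gd;              # uncompressed native format
    return md5_hex(substr $data, 11); # drop the fixed-size header
}
```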
Note, one checksum is exactly what "t/02-image.t line 41" was expecting, but the other is what many (but not all) failures actually "got". It appears that copyResampled (and interpolation in general, see further) is unstable between versions and plagued with bugs. So even generating a synthetic gradient, and checking just a couple of pixels (e.g. the lower-left and upper-right points), is NOT a reliable way to test anything with GD, let alone calculating a checksum over a whole re-sampled image. No CoventryCathedral for the tests below; simply a red 8 by 8 square to reduce to smaller squares:
Oh, I thought, but I'm copying red pixels to another (smaller) canvas filled with default black. Maybe, instead, a plain simple resize would preserve the pure red colour? Note, plain "resize" was not implemented in old versions anyway.
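The plain-resize experiment described above could be sketched as follows, using only classic GD calls and skipping quietly when GD is not installed (the 8x8 red square matches the post's setup):

```perl
use strict;
use warnings;

# Skip quietly if the non-core GD module is absent.
BEGIN {
    eval { require GD; 1 }
        or do { print "GD not installed, skipping\n"; exit 0 };
}

# Shrink a uniform red square with plain copyResized (no resampling)
# and inspect a corner pixel of the result.
my $src = GD::Image->new(8, 8, 1);                 # 1 = truecolor
my $red = $src->colorAllocate(255, 0, 0);
$src->filledRectangle(0, 0, 7, 7, $red);

my $dst = GD::Image->new(4, 4, 1);
$dst->copyResized($src, 0, 0, 0, 0, 4, 4, 8, 8);   # plain pixel copy

my ($r, $g, $b) = $dst->rgb($dst->getPixel(0, 0));
printf "corner pixel: (%d, %d, %d)\n", $r, $g, $b; # expect pure red
```

Because copyResized picks source pixels directly rather than interpolating, a uniform fill should survive it even where copyResampled's interpolation does not.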
Wait, but there are a few dozen interpolation methods:
I have no idea why 3, 4 and 5, i.e. GD_BILINEAR_FIXED, GD_BICUBIC and GD_BICUBIC_FIXED, are not OK, i.e. don't preserve a dumb uniform fill of a dumb square canvas. I'd laugh out loud if asked whether this list will stay stable for the near future. I have much sympathy for GD, but the above was a little bit too much.
by Bod (Parson) on Sep 14, 2023 at 22:53 UTC
by pryrt (Abbot) on Sep 15, 2023 at 14:13 UTC
Re: Testing image output
by bliako (Abbot) on Sep 08, 2023 at 07:54 UTC | |
Here is another idea, though the hash looks to me to be the simplest solution: make the created square images small (say 10x10). Then it is practical to hardcode the expected pixel values and compare them (edit: compare pixel-by-pixel, not their hash) to the created square image's. Producing square images as small as 2x2 or 3x3 (edit: the smaller the square image, the greater the probability it occurs multiple times within the mother image; see the RGB gradient as a solution to this) is enough for testing your module. For testing behaviour at the borders, just push the square image's coordinates a pixel short of the borders. There are modules to get you the raw pixel values; you probably know that, but I can't inspect your module.

In order to avoid bundling testing images with your module, you can create on-the-fly images entirely in-memory from a random array, crop, and check the pixel values against the corresponding segment of the array you started with. Being lazy, I would create the big image, crop it to a square, then stitch as many of the squares together to make another big image, and compare the two big images pixel-by-pixel in a simple for-loop, thus avoiding the mental torture of indexing an array of pixels at the square image's crop coordinates.

In order to save resources for the testers and installers, create a mother image on the fly with incrementing pixel values (an RGB gradient). Then, to check the integrity of any square image, it is enough to check the pixel values of its four corners and verify arithmetically that they are indeed the correct ones. No bandwidth spent on transferring bundled test images, no disk space for storing them, no memory for storing and manipulating pixel values, no CPU for hashing them.

And a word of caution: if you do bundle test images in your module, make sure you remove all metadata with, say, exiftool. That is good practice for all images on a website.

bw, bliako
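The gradient idea can be demonstrated without any image library at all. A pure-Perl sketch, where the dimensions, crop coordinates and gradient formula are all illustrative choices:

```perl
use strict;
use warnings;

# Build a synthetic "image" in memory where each pixel's RGB value is a
# function of its coordinates, so any crop can be verified arithmetically
# from its corners alone.
my ($W, $H) = (16, 16);
my @img;
for my $y (0 .. $H - 1) {
    for my $x (0 .. $W - 1) {
        $img[$y][$x] = [ $x * 10, $y * 10, ($x + $y) * 5 ];  # R, G, B gradient
    }
}

# "Crop" a square of side $s at ($x0, $y0) and check its four corners
# against the gradient formula, no pixel dump or hash required.
my ($x0, $y0, $s) = (3, 5, 4);
for my $corner ([0, 0], [$s - 1, 0], [0, $s - 1], [$s - 1, $s - 1]) {
    my ($cx, $cy) = ($x0 + $corner->[0], $y0 + $corner->[1]);
    my ($r, $g, $b) = @{ $img[$cy][$cx] };
    die "corner mismatch at ($cx, $cy)"
        unless $r == $cx * 10
            && $g == $cy * 10
            && $b == ($cx + $cy) * 5;
}
print "all four corners check out arithmetically\n";
```

In a real test the crop would come from the module under test rather than from slicing the same array, but the corner check stays the same.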
by Bod (Parson) on Sep 11, 2023 at 22:40 UTC | |
"And a word of caution, if you do bundle test images in your module make sure you remove all metadata with, say, exiftool. And that is good practice for all images in a website."

Are you suggesting this for security reasons, payload reduction, or some other reason?
by bliako (Abbot) on Sep 12, 2023 at 10:15 UTC | |
security and personal data