Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Is there a module that will let me check the size of an image that a member of my site uploads, to make sure it fits the limits we have set? For example, if we want to restrict images to 75 x 75 or 100 x 100, is there something that will let us do that?

I know how to verify it is a .jpg, .gif, or .png, but not how to check the size.

I'd really appreciate any info.

Junior

Re: image upload size detection
by atemon (Chaplain) on Aug 07, 2007 at 05:16 UTC

    You can check all the attributes of an image with Image::Magick. It can do almost everything you need: reduce the image to your standard size, make thumbnails if required, and so on.
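    For example, a minimal sketch of a dimension check, assuming the upload has already been saved to a temporary file (the path and the 100x100 limit are just placeholders):

    use strict;
    use warnings;
    use Image::Magick;

    my $file  = '/tmp/upload.jpg';   # wherever your CGI code saved the upload
    my $image = Image::Magick->new;
    my $err   = $image->Read($file);
    die "Cannot read $file: $err" if $err;

    # Get() accepts a list of attribute names and returns their values
    my ($w, $h) = $image->Get('width', 'height');
    die "Image is ${w}x${h}, larger than 100x100\n" if $w > 100 or $h > 100;

    # Or, instead of rejecting the image, shrink it to fit:
    # $image->Resize(geometry => '100x100');
    # $image->Write($file);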

    Cheers !

    --VC



    There are three sides to any argument.....
    your side, my side and the right side.

Re: image upload size detection
by Samy_rio (Vicar) on Aug 07, 2007 at 05:34 UTC

    TIMTOWTDI

    use strict;
    use warnings;
    use Image::Info qw(image_info dim);

    opendir(FF, 'D:\Logos');
    my @image = grep /\.(jpg|gif|png)$/, readdir(FF);
    closedir(FF);

    open(OUT, ">D:\\Logos\\Dim.txt") || die($!);
    for (@image) {
        my $info = image_info("D:\\Logos\\$_");
        if (my $error = $info->{error}) {
            die "Can't parse image info: $error\n";
        }
        my ($w, $h) = dim($info);
        print OUT "$_\t$h\t$w\n";
    }
    close(OUT);

    Regards,
    Velusamy R.


    eval"print uc\"\\c$_\""for split'','j)@,/6%@0%2,`e@3!-9v2)/@|6%,53!-9@2~j';

Re: image upload size detection
by Anonymous Monk on Aug 07, 2007 at 05:36 UTC
    OK, let me ask you this: would it be better to store images in directories, or as binary data in MySQL, either in the member's record or in another table tied to it by record ID?

    I would think a whole bunch of image binaries in a database would make it slow eventually. We just launched the site and already have over 1000 members, and foresee over 100k by the end of the year, and it will continue to double in size every 3 to 6 months thereafter for the next few years.

    So I am thinking the best approach is to just put them in a directory called members inside the images directory, with .htaccess forbidding a listing of all the pics.

    I could then name the image after their username (which is unique) plus the extension, store the filename in their record, and just pull it from the directory.

    I don't know; what do you think would be the better way to do it?

    I don't have to use their images, but these are replicated websites with their name and phone number, so I thought that if there is an image, I could add a field: if it is set to 1 and the image exists, show the image; otherwise, show nothing. I thought that would be a very nice touch.
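    Roughly what I have in mind, as a sketch (the show_photo field, the username field, and the image directory here are just examples):

    use strict;
    use warnings;

    my $img_dir = '/var/www/html/images/members';

    # $member is a hashref pulled from the members table
    sub photo_html {
        my ($member) = @_;
        return '' unless $member->{show_photo};                # flag not set to 1
        my $file = "$img_dir/$member->{username}.jpg";
        return '' unless -e $file;                             # no image uploaded
        return qq{<img src="/images/members/$member->{username}.jpg">};
    }

    print photo_html({ username => 'junior', show_photo => 1 });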

    Anyhow thank you for any wisdom you can impart, monks.

    Junior
      Storing 100K files in a directory is sure to tax any filesystem. You would have to subdivide the directory into multiple levels, à la CPAN, in order to make it manageable.

      Putting the images into a database would free you of that burden, at the price of serving the files somewhat more slowly. After all, serving files is what a filesystem is made to do best, but sometimes you have to take it by the hand and guide it.

      CountZero

      "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James

      I don't know how much usage you're going to be seeing (how many requests per sec, etc.), but based on the general information you've mentioned, I'd probably do the following:

      1. Keep the images in the filesystem
      2. Store the images in a series of hashed directories (see the sketch at the end of this reply)
      3. Maintain a table of the files' locations and other metadata.
      4. Turn off .htaccess
      5. Turn off directory listings for the whole server.

      .htaccess is useful for when there's distributed control of the system (different people control different directories), but it's an unnecessary overhead on a dedicated server. It's especially problematic for deep directory structures, which hashing would create.
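      To illustrate item 2, here is a minimal sketch of one possible hashing scheme (the base path and helper name are made up); with two hex characters per level you get 256 subdirectories at each level, so no single directory ever holds more than a small slice of the files:

      use strict;
      use warnings;
      use Digest::MD5 qw(md5_hex);
      use File::Path qw(make_path);
      use File::Spec;

      # Hash the (unique) username into a two-level directory tree,
      # e.g. /var/www/images/members/ab/cd/username.jpg
      sub hashed_image_path {
          my ($base_dir, $username, $ext) = @_;
          my $digest = md5_hex($username);
          my $dir    = File::Spec->catdir(
              $base_dir, substr($digest, 0, 2), substr($digest, 2, 2)
          );
          make_path($dir) unless -d $dir;
          return File::Spec->catfile($dir, "$username.$ext");
      }

      my $path = hashed_image_path('/var/www/images/members', 'junior', 'jpg');
      print "$path\n";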

Re: image upload size detection
by zentara (Cardinal) on Aug 07, 2007 at 11:57 UTC
    If you use Image::Magick, use its Ping method for efficiency.
    #!/usr/bin/perl -w
    use Image::Magick;

    my $x = $ARGV[0];
    my $image = Image::Magick->new;

    #$image->Read($x);
    #my ($w, $h) = $image->Get('width', 'height');
    #print $x, ' is ', $w . 'x' . $h, "\n";
    #reading the whole image is very inefficient memory-wise; use Ping instead

    my ($width, $height, $size, $format) = $image->Ping($x);
    print $width, "\n", $height, "\n", $size, "\n", $format, "\n";

    I'm not really a human, but I play one on earth. Cogito ergo sum a bum