Braindead_One has asked for the wisdom of the Perl Monks concerning the following question:

Update: I found my first mistake and changed the post to address a different error in the same code snippet, sorry :/

Hello fellow Monks!
I'm currently writing a small CGI script that, among other things, tries to feed images from a MySQL DB to the browser. But for some strange reason I can't read both the content type and the data from within the same SQL statement:

my $sql = "select type,data from bilder where id = ?";
my $sth = $dbh->prepare($sql);
$sth->execute( $q->param('id') );
my $error = 0;
( my $type, my $data ) = $sth->fetchrow() || ( $error = 1 );
binmode(STDOUT);
if ( $error == 1 ) {
    print $q->header( -type => "image/gif" );
    open( IN, "templates/keinbild.gif" );
    binmode(IN);
    while (<IN>) { print; }
    close(IN);
}
else {
    print $q->header( -type => $type );
    print $data;
}
$type should be 'image/gif', 'image/png' or 'image/jpeg', depending on the output of Image::Info.

Adding a warn($type) gives:

[Sun Dec 18 11:24:34 2005] [error] [client 192.168.x.y] (\xa2\x80, referer: http://server.lan/cgi-bin/index.pl?action=picman
DB is as follows:
CREATE TABLE `bilder` (
  `id` int(11) NOT NULL auto_increment,
  `data` longblob NOT NULL,
  `type` enum('image/jpeg','image/gif','image/png') NOT NULL default 'image/jpeg',
  `resx` int(11) NOT NULL default '0',
  `resy` int(11) NOT NULL default '0',
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
Does anyone know what's wrong here?

Thank you in advance
Braindead_One
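
For readers hitting the same symptom: the binary garbage in $type is consistent with fetchrow() being called in scalar context, because the || after the list assignment forces scalar context on its left operand, so only the last column comes back. A minimal sketch of the effect and a safer pattern, using a stand-in sub instead of DBI so it runs on its own:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in for $sth->fetchrow(): returns a two-element list,
# just as "select type,data ..." would.
sub fetchrow_like { return ( 'image/gif', "\xa2\x80...blob..." ) }

# Buggy pattern from the post: || imposes SCALAR context on the
# call, so the parenthesized list collapses to its last value.
my ( $type, $data ) = fetchrow_like() || 1;
# $type now holds the blob, $data is undef -- hence the binary
# garbage in the warn() output.

# Safer pattern: fetch in list context, then test for failure.
my ( $type2, $data2 ) = fetchrow_like();
my $error = defined $type2 ? 0 : 1;

print "buggy type starts with: ", unpack( 'H*', substr( $type, 0, 2 ) ), "\n";
print "fixed type: $type2, error: $error\n";
```

With DBI itself, the equivalent fix is to assign from fetchrow in plain list context and set the error flag in a separate statement, e.g. with `unless defined $type`.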

Replies are listed 'Best First'.
Re: Mysql and Images
by gloryhack (Deacon) on Dec 19, 2005 at 09:01 UTC
    It's generally best for several well documented reasons to store images, PDF files, etc. somewhere in the file system, and include only the path to the file in the database. I'm averse to helping someone do a wrong thing, so I'll leave the correction of the code as an exercise to anyone else who wishes to take it up.

    If there's a compelling reason to store images in the database, please set me straight and I'll take another look.
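
    For reference, the path-in-the-database approach the parent describes might look like the sketch below. The column names and helper are hypothetical, not from the thread; the point is that the row carries only metadata and a relative path, while the image bytes stay in the file system:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical schema for the path-in-the-DB approach:
#
#   CREATE TABLE bilder (
#     id   INT AUTO_INCREMENT PRIMARY KEY,
#     path VARCHAR(255) NOT NULL,   -- e.g. 'images/42.png'
#     type ENUM('image/jpeg','image/gif','image/png') NOT NULL
#   );
#
# Serving then reduces to a header plus a binary-safe file read.
sub image_response {
    my ( $type, $path ) = @_;
    open( my $in, '<', $path ) or return undef;
    binmode($in);
    local $/;                      # slurp mode: no line semantics on binary data
    my $bytes = <$in>;
    close($in);
    return "Content-Type: $type\r\n\r\n" . $bytes;
}

# Demo with a throwaway file standing in for a stored image:
my $tmp = 'demo_image.bin';
open( my $out, '>', $tmp ) or die $!;
binmode($out);
print {$out} "\x89PNG\x0d\x0a";
close($out);

my $resp = image_response( 'image/png', $tmp );
unlink $tmp;
print length($resp), " bytes ready to send\n";
```

    In a real CGI script the header would come from $q->header and the path from the SELECT, but the shape is the same: the database answers a small metadata query, and the heavy bytes never pass through it.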

      If there's a compelling reason to store images in the database, please set me straight

      I sympathize with your point of view, which often leads to the assumption that all databases run on those few Unix-like operating systems and machines blessed with near-perfect disk caching.

      In the old days, that wasn't the case. Actually, even today, it often isn't. As it happens, the caching and disk operations of some databases outperform those of the operating system... yes, that's hard to believe.

      Imagine you find out it's true. Would you still rule out storing large binary things in the database if you were asked to deliver them as quickly as possible?

      That said, I'll always support your opinion if that database runs on Linux or on one of the Unices with great disk caching.

      Cheers, Sören

        As it happens, the caching and disk operations of some databases outperform those of the operating system... yes, that's hard to believe.

        Imagine you find out it's true. Would you still rule out storing large binary things in the database if you were asked to deliver them as quickly as possible?

        In the OP's case, we're looking at an HTTP server running on the same host, so any gain in image delivery is going to have to contend with disk seeks initiated by the HTTP server which is probably either forking or multithreaded and reliant upon the file system provided by the OS. Under any appreciable load, huge gains in the service of images will be very small gains in system performance -- tortoises have an optimally low coefficient of drag but they still don't go very fast. :)

        If the server in question is on an ethernet (plain old 10Mbps) link, then the bottleneck is almost certain to be the network and just about any relatively recent/decent machine will be able to keep it swamped 24/7 without much optimization outside of the HTTP server itself. If it's on fast ethernet (100Mbps) or faster, then it's conceivable that the file system is the bottleneck -- but we're approaching the hardware limitations of most commonly deployed disk systems and cannot afford a lame operating system anyway.

        I'd consider overcoming a lame OS by shifting the burden to the database server if:

        1. The file system implementation (not the hardware itself) is proven to be the content delivery bottleneck;
        2. The database server is significantly faster *and* is capable of supporting the greatest anticipated production load *with headroom* for the inevitable invocations of Murphy's Law;
        3. The task of image service cannot be offloaded to another host; and
        4. There is an insurmountable obstacle between the current reality and a more suitable solution, such as installing a free Linux OS to eliminate the lame OS.

        Most often, the simpler solution is the best one. Unless/until benchmarking (in this case, from a remote host) proved that the performance of the system as a whole was lacking, I'd just let the HTTP server deal with file service, and do my best to get out of its way as quickly as possible.

        Caching aside, if the insert-whatever-binary-data lives in the filesystem then serving the content to the end user can be handled entirely at the HTTP layer without any interaction with the RDBMS. It could possibly be handed off to separate HTTP servers dedicated to that purpose which never even talk to the RDBMS, lowering the load on the database and freeing up its resources to do what it does best (servicing more queries rather than shuffling bits).