I'm writing a small Ajax responder that needs to drop into a webserver over which I have absolutely no control (no shell access), which is a different architecture (Solaris on SPARC) than I develop on, and which has a very default Perl install. Execution under CGI does not allow me to write to a file.
I've got a very simple data requirement. I need indexed access to a specific row (not range) of data. I've got 50,000 rows of simple, small text data with a small (3-5 character) text key. The data never changes except when I want to upload a new data file.
I built a nice little application using DB_File (DB_HASH). I created the hash.db on my Linux x86_64 box and uploaded it along with a Perl CGI to the SPARC. It didn't work :(. I'm not sure why; either the file format isn't binary-portable or the SPARC has a funky Berkeley DB.
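For what it's worth, the two boxes differ in endianness, which Berkeley DB's on-disk format may be sensitive to (the usual advice for moving a .db between architectures is a text dump/reload rather than copying the binary file). A quick way to see the difference is Perl's %Config, which is core on any install:

```perl
# Print this platform's byte order; a little-endian x86_64 box and a
# big-endian SPARC box will report different values, which is one
# plausible reason the copied hash.db was unreadable.
use strict;
use warnings;
use Config;

print "byteorder: $Config{byteorder}\n";   # e.g. 12345678 on x86_64
```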
Is there a native perl implementation of DB_File? Is there a way to create a more generic Berkeley hash file? Should I look towards other packages such as MLDBM? Should I go with something huge like sqlite? Should I write my own file structure and b-tree?
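On the "write my own file structure" option: since the data is read-only at runtime and the keys are tiny, one portable sketch (pure core Perl, no modules beyond strict/warnings, no writes under CGI) is a plain-text file of sorted fixed-width records, binary-searched with seek. The 5-byte key field and 16-byte record length below are assumptions for illustration; the demo uses an in-memory filehandle standing in for the uploaded data file:

```perl
use strict;
use warnings;

use constant KEY_LEN => 5;    # assumed max key width, space-padded
use constant REC_LEN => 16;   # assumed fixed record length, newline included

# Binary-search $nrecs sorted fixed-width records for $key; returns the
# value, or undef if the key is absent.
sub lookup {
    my ($fh, $key, $nrecs) = @_;
    my ($lo, $hi) = (0, $nrecs - 1);
    while ($lo <= $hi) {
        my $mid = int( ($lo + $hi) / 2 );
        seek $fh, $mid * REC_LEN, 0 or return;
        read $fh, my $rec, REC_LEN;
        (my $k = substr $rec, 0, KEY_LEN) =~ s/\s+$//;   # strip key padding
        if    ($k lt $key) { $lo = $mid + 1 }
        elsif ($k gt $key) { $hi = $mid - 1 }
        else {
            (my $v = substr $rec, KEY_LEN) =~ s/\s+$//;  # strip padding/newline
            return $v;
        }
    }
    return;   # not found
}

# Demo data: three records, pre-sorted by key, padded to REC_LEN bytes.
# In real use you'd open the plain-text file you FTP'd to the server.
my $data = join '', map { sprintf "%-*s\n", REC_LEN - 1, $_ }
    'abc  apple', 'def  banana', 'xyz  cherry';
open my $fh, '<', \$data or die $!;
print lookup($fh, 'def', 3) // 'not found', "\n";   # prints "banana"
```

Since it's just sorted text, the file survives FTP and any architecture, and rebuilding it for an upload is a one-line sort on the dev box.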
Further complicating my choices is the fact that my client is a mere pawn in an enormous government bureaucracy and had to grovel just to get execute permissions for CGI. I had to painfully guide him through an FTP chmod 0755 just to get the script to run. I need a solution that is guaranteed (or at least likely) to work under any environment.
Any help is welcome!
In reply to Super-Portable DB_File Solution by rokadave