We do this here at work, and we've run into a ton of problems handling Excel spreadsheets. What we finally did was have our users export the spreadsheet as a CSV file (one of the options from "File -> Save As"). Then the users would zip the file and upload the zipped version, which is considerably smaller than uploading the spreadsheet itself.
You do not want to use GET, as you will almost certainly get a "414 Request-URI Too Long" error. That limit is configurable on your Web server, but it would still be ridiculous (and I'd just love to see your Web access logs with that one :). You want to use CGI.pm to handle the upload. To unzip it, Archive::Zip works really well. If you have problems with Archive::Zip truncating data, make sure you have the latest version of Compress::Zlib installed, as an outdated one is a common cause of that problem.
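Off the cuff, the receiving end might look something like this. The form field name, the spool directory, and the filenames are just placeholders for the sketch:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use Archive::Zip qw( :ERROR_CODES );

    # Field name and spool path are made up for this example.
    my $q  = CGI->new;
    my $fh = $q->upload('zipped_csv') or die "No file uploaded\n";

    my $zip_path = "/var/spool/csv_uploads/upload_$$.zip";
    open my $out, '>', $zip_path or die "Can't write $zip_path: $!";
    binmode $out;
    binmode $fh;
    while ( read( $fh, my $buf, 64 * 1024 ) ) {
        print {$out} $buf;    # save the raw upload to disk
    }
    close $out;

    # If Archive::Zip truncates data here, upgrade Compress::Zlib first.
    my $zip = Archive::Zip->new;
    die "Can't read $zip_path\n" unless $zip->read($zip_path) == AZ_OK;
    $zip->extractMember( $_, "/var/spool/csv_uploads/$_" ) for $zip->memberNames;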
To process the contents of the CSV file, you can try Text::CSV or any of a handful of other modules that handle this.
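For instance, a minimal Text::CSV loop (the path and column names are guesses; substitute your own):

    use strict;
    use warnings;
    use Text::CSV;

    my $csv = Text::CSV->new( { binary => 1 } )
        or die "Cannot use Text::CSV: " . Text::CSV->error_diag;

    open my $in, '<', '/var/spool/csv_uploads/data.csv' or die "Can't open CSV: $!";
    while ( my $row = $csv->getline($in) ) {
        my ( $name, $entry_date, $amount ) = @$row;    # columns are assumptions
        # validate here, then queue the row for the database insert
    }
    $csv->eof or $csv->error_diag;
    close $in;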
I would consider uploading the files to the server and then having another process deal with them. You're dealing with large files, and the script might time out on the user. This is Not Good. Be sure you have the user's email address so that you can send them feedback on the success or failure of incorporating the data into the database. Another reason to have a separate process add the data to the database is that it makes the program flow much easier to control. You never know when someone is going to click on the "Stumbit" button.
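For the feedback step, piping to sendmail is about as simple as it gets (the address, counts, and sendmail path below are placeholders):

    use strict;
    use warnings;

    # Stand-ins for values the back-end process would actually have.
    my $user_email = 'someone@example.com';
    my ( $rows_ok, $rows_bad ) = ( 5983, 17 );

    open my $mail, '|-', '/usr/lib/sendmail -t' or die "Can't fork sendmail: $!";
    print {$mail} "To: $user_email\n";
    print {$mail} "Subject: Your spreadsheet upload\n\n";
    print {$mail} "Processed your file: $rows_ok rows loaded, $rows_bad rows rejected.\n";
    close $mail or warn "sendmail exited with status $?";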
With uploads this large, I would seriously rethink the decision to use MySQL unless you're running a version that supports atomic transactions. If you need to roll back the upload after processing the 6000th line, you're going to hate life if you can't issue a simple $dbh->rollback.
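A bare-bones version of that transactional load, reusing the table and columns guessed at in the CSV sketch above (DSN and credentials are placeholders too):

    use strict;
    use warnings;
    use DBI;

    # RaiseError lets the eval catch any failed statement;
    # AutoCommit off wraps the whole load in one transaction.
    my $dbh = DBI->connect( 'dbi:mysql:database=uploads', 'user', 'password',
        { RaiseError => 1, AutoCommit => 0 } );

    my $sth = $dbh->prepare(
        'INSERT INTO ledger (name, entry_date, amount) VALUES (?, ?, ?)' );

    my @rows = ( [ 'Alice', '2001-05-01', 42.50 ] );    # stand-in for parsed CSV rows

    eval {
        $sth->execute(@$_) for @rows;
        $dbh->commit;
    };
    if ($@) {
        warn "Load failed, rolling back: $@";
        $dbh->rollback;
    }
    $dbh->disconnect;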
Be sure to test the heck out of this: insert gobs of dummy data into the system. Use bad dates and quoted numbers, stuff in an extra field or 17, short some of your records by a few fields, and so on.
Also, I would consider having them FTP the data instead. Periodically check the FTP directory and process (and subsequently delete) whatever files are in there. Users wouldn't have to sit there waiting for their browser, and it's an easier problem for you to code for.
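A rough cut of that periodic processor (the directory is a placeholder, and process_file() stands in for the parse-and-insert work above):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Meant to be run from cron every few minutes.
    my $spool = '/var/spool/csv_uploads';

    opendir my $dh, $spool or die "Can't open $spool: $!";
    for my $file ( grep { /\.csv$/ && -f "$spool/$_" } readdir $dh ) {
        my $path = "$spool/$file";
        if ( process_file($path) ) {
            unlink $path or warn "Couldn't delete $path: $!";
        }
        else {
            warn "Leaving $path in place for another try\n";
        }
    }
    closedir $dh;

    sub process_file {
        my ($path) = @_;
        # parse the CSV, load the database, mail the feedback ...
        return 1;    # true on success
    }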
Basic security: make sure you use placeholders with your DBI statements. Since you'll likely be running the same statement over and over, prepared statements with placeholders will also make your script faster. They likewise ensure that people can't "accidentally" slip dangerous characters into the spreadsheet that might break your script. Make sure you delete the files after you are done with them (you don't want to fill up the hard drive with files this large) or, better yet, work out an archival scheme. Also, limit the maximum total size of the directory these files are uploaded to. No sense in having 30 people upload 50 meg files and having your server crash from lack of disk space.
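Placeholders already show up in the transaction sketch above; for the disk space point, a crude guard before accepting a new upload might look like this (the directory and the 500 MB cap are made-up numbers):

    use strict;
    use warnings;

    my $spool    = '/var/spool/csv_uploads';
    my $max_size = 500 * 1024 * 1024;    # 500 MB cap -- pick your own

    opendir my $dh, $spool or die "Can't open $spool: $!";
    my $used = 0;
    for my $file ( readdir $dh ) {
        my $path = "$spool/$file";
        $used += -s $path if -f $path;
    }
    closedir $dh;

    die "Upload directory is full; refusing new uploads\n" if $used >= $max_size;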
Cheers,
Ovid
Vote for paco!
Join the Perlmonks Setiathome Group or just click on the link and check out our stats.