Re: Convert script to use MySQL
by ikegami (Patriarch) on Nov 26, 2004 at 20:49 UTC
For the snippet you presented, assuming you've already created the table in your database, you could use the following to move the data from the flatfile db to the MySQL db:
use DBI;

open(ORGDB, "<$database")
  or die("Error opening database file.\n");
$dbh = DBI->connect(
  'dbi:mysql:...',
  '...',
  '...',
  { AutoCommit => 1, PrintError => 0 }
) or die("Unable to connect to database: $DBI::errstr\n");
$sth = $dbh->prepare("INSERT INTO Agents (Name, Phone, Cell, Email) VALUES (?, ?, ?, ?)")
  or die("Unable to prepare SQL statement: $DBI::errstr\n");
while (<ORGDB>) {
  chomp;
  ($agent_name, $agent_phone, $agent_cell, $agent_email) = split(/\|/);
  $sth->execute($agent_name, $agent_phone, $agent_cell, $agent_email)
    or die("Unable to insert record: $DBI::errstr\n");
}
close(ORGDB);
$sth->finish();
$dbh->disconnect();
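One gotcha worth noting with the split in the loop above: by default, split(/\|/) drops trailing empty fields, so a record with an empty Email field would hand execute() only three values. A negative limit preserves them (a standalone sketch):

```perl
use strict;
use warnings;

my $record = "Lisa Smith|555-1234|555-9876|";   # email field empty

# Default split drops trailing empty fields...
my @short = split /\|/, $record;        # 3 elements

# ...while a limit of -1 keeps them, so execute() always gets 4 values.
my @full  = split /\|/, $record, -1;    # 4 elements

printf "default: %d fields, limit -1: %d fields\n",
       scalar(@short), scalar(@full);
```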
Re: Convert script to use MySQL
by jfroebe (Parson) on Nov 26, 2004 at 20:42 UTC
Hi Lisa
There are a few CGI examples in the Code Catacombs, but I would hold off on those for a little while.
First, read up on MySQL itself, namely by learning a bit of SQL (if you don't already know it). Next, write a few simple DBI/DBD::mysql scripts until you are comfortable with making queries. Then, and only then, tackle a few CGI scripts with MySQL.
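A minimal "practice" script of the kind described above might look like this. It is a sketch using DBD::SQLite's in-memory database so it runs without a server; the DBI calls are identical for MySQL, where only the DSN and credentials change:

```perl
use strict;
use warnings;
use DBI;

# Practice sketch: DBD::SQLite in-memory database, no server needed.
# For MySQL you would instead connect with something like
#   DBI->connect('dbi:mysql:mydb', $user, $pass, { RaiseError => 1 })
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do('CREATE TABLE Agents (Name TEXT, Phone TEXT)');

# Placeholders (?) keep quoting problems and SQL injection at bay.
my $sth = $dbh->prepare('INSERT INTO Agents (Name, Phone) VALUES (?, ?)');
$sth->execute('Lisa',  '555-1234');
$sth->execute('Jason', '555-9876');

my ($n) = $dbh->selectrow_array('SELECT COUNT(*) FROM Agents');
print "rows: $n\n";

$dbh->disconnect;
```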
That's just my opinion, though ;-)
Jason L. Froebe
Team Sybase member
"No one has seen what you have seen, and until that happens, we're all going to think that you're nuts." - Jack O'Neil, Stargate SG-1
Re: Convert script to use MySQL
by erix (Prior) on Nov 26, 2004 at 21:04 UTC
mysqlimport is an easy way to load a file like the one you describe into a MySQL table. Alternatively, you can use LOAD DATA INFILE from the mysql shell, or from a script.
Another way of getting your work done is via phpMyAdmin, but of course you would need to install PHP first. phpMyAdmin lets you create tables and import data from files.
Once you have your data in MySQL, see DBI recipes, or just post another question ;)
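For the pipe-delimited flatfile in question, the LOAD DATA route might look like this (a sketch; the file name and the Agents table are assumptions borrowed from the other replies, so adjust to your schema):

```sql
-- Hypothetical file and table names; columns as in the INSERT example above.
LOAD DATA LOCAL INFILE 'agents.db'
INTO TABLE Agents
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(Name, Phone, Cell, Email);
```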
Re: Convert script to use MySQL
by radiantmatrix (Parson) on Nov 26, 2004 at 21:03 UTC
Rather than splitting out all of your pipe-separated records and dealing with the matches the way you do, check out Text::xSV. That module will allow you to work with files of any delimiter in a much more intelligent manner.
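A minimal sketch of what that might look like for the agents file, assuming Text::xSV's set_sep/bind_fields/get_row/extract interface (file and field names here are hypothetical; check the module's docs before relying on this):

```perl
use strict;
use warnings;
use Text::xSV;

# Write a sample pipe-delimited file to parse (hypothetical data).
open my $out, '>', 'agents.db' or die "Can't write agents.db: $!";
print {$out} "Lisa|555-1234|555-9876|lisa\@example.com\n";
close $out;

# The agents file has no header row, so bind field names by hand.
my $xsv = Text::xSV->new();
$xsv->set_sep('|');
$xsv->open_file('agents.db');
$xsv->bind_fields(qw(name phone cell email));

my @seen;
while ($xsv->get_row) {
    my ($name, $email) = $xsv->extract(qw(name email));
    push @seen, $name;
    print "$name <$email>\n";
}
```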
•Re: Convert script to use MySQL
by merlyn (Sage) on Nov 26, 2004 at 21:27 UTC
The real question for me is why MySQL, and not something more complete, like PostgreSQL? Are you not aware of PostgreSQL? Is it not installed on your system?
MySQL had its day. Now it's being left in the dust by PostgreSQL. There's no reason in my mind to continue to use MySQL except for legacy (software or people).
Update:
Take a look at the bullet list at the end of http://advocacy.postgresql.org/advantages/ if you're not sure why I'm saying what I'm saying. Most of what you see there is not mature (if available at all) in MySQL. But the most particularly troublesome part of MySQL is the licensing. Once you are making money with MySQL somehow, you're almost certainly in the "commercial" category, and will have to pay hefty fees to "license" the non-personal version. PostgreSQL is BSD-style licensed, and is therefore truly free, to embed or adapt as you see fit.
There's no reason in my mind to continue to use MySQL except for legacy (software or people).
How about performance?
I'm working with a system that essentially does OLAP with a lot of updates. PostgreSQL (7.3.4) handles it great in theory, but in practice the system bogs down over a relatively short period (a few hours), and slows way down over the course of a day. We have to stop and vacuum the database nightly. A vaguely similar system I've worked on that uses MySQL (4.0.something with InnoDB tables) didn't suffer the same degradation.
Could you provide some documentation on differences between the two for us still in the (apparent) Dark Ages of MySQL?
Assuming always that PostgreSQL is available on your OS. And, since the OP is talking about CGI, assuming that his hosting service supports it, or even allows it!
Speaking personally, I have tried three implementations of PostgreSQL on Windows. Using Cygwin is a performance nightmare. The two other commercial products, both based on pretty old (7.2.1-era) PostgreSQL, are expensive (about double the price of a commercially licensed MySQL) and are not yet really suitable for end-user applications.
When PostgreSQL finally has native Windows support, it will be interesting. Until then it is nice to have on a *nix box, but not much use in my day-to-day real world. I am sure that one day it will be able to compete, but they have a little way to go yet.
jdtoronto
1.4) What non-Unix ports are available?
Starting with version 8.0, PostgreSQL now runs natively on Microsoft Windows NT-based operating systems like Win2000, WinXP, and Win2003. A prepackaged installer is available at http://pgfoundry.org/projects/pginstaller.
MSDOS-based versions of Windows (Win95, Win98, WinMe) can run PostgreSQL using Cygwin.
Score -1: Flamebait, heh :)
Anyway, why not keep using the flatfile? If it is a read-only small data store, just keep using that, with proper file locks. Or maybe use a BerkeleyDB tied hash. I'm sure there are other options to look at as well...
Again, I don't know exactly what the requirements are, but what I am pointing out is that an SQL engine can be overkill sometimes.
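For the flatfile-with-locks route, a minimal sketch using core flock (file and field names are hypothetical):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $database = 'agents.flat';   # hypothetical path

# Appending a record: take an exclusive lock so two CGI requests
# can't interleave their writes.
sub add_agent {
    my ($name, $phone, $cell, $email) = @_;
    open my $fh, '>>', $database or die "Can't open $database: $!";
    flock $fh, LOCK_EX or die "Can't lock $database: $!";
    print {$fh} join('|', $name, $phone, $cell, $email), "\n";
    close $fh;   # closing the handle releases the lock
}

# Reading: a shared lock lets many readers coexist but blocks writers.
sub all_agents {
    open my $fh, '<', $database or die "Can't open $database: $!";
    flock $fh, LOCK_SH or die "Can't lock $database: $!";
    my @records = map { chomp; [ split /\|/, $_, -1 ] } <$fh>;
    close $fh;
    return @records;
}

add_agent('Lisa', '555-1234', '555-9876', 'lisa@example.com');
my @agents = all_agents();
print scalar(@agents), " record(s); first name: $agents[0][0]\n";
```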
Re: Convert script to use MySQL
by TedPride (Priest) on Nov 28, 2004 at 18:03 UTC
Unless you're working with quite a large number of records, your current system is just fine. A few modifications wouldn't hurt, though:
open (DATABASE,">$database");
@DB=<DATABASE>;
If you open a file for write, it's automatically emptied. What exactly is the point of the second line above, especially considering you never use @DB anywhere else in your script?
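To spell out the fix: ">" truncates the file the instant it is opened, so reading from that handle can only ever yield an empty list. Read first, then reopen for writing (a standalone sketch with a hypothetical file name):

```perl
use strict;
use warnings;

my $database = 'agents.db';   # hypothetical path

# Seed the file so there is something to read.
open my $out, '>', $database or die "Can't write $database: $!";
print {$out} "Lisa|555-1234\nJason|555-9876\n";
close $out;

# Read first, with the handle opened read-only...
open my $in, '<', $database or die "Can't read $database: $!";
my @DB = <$in>;
close $in;

# ...and only reopen for writing once the contents are safely in memory.
open $out, '>', $database or die "Can't write $database: $!";
print {$out} @DB;
close $out;

print scalar(@DB), " lines preserved\n";   # 2 lines preserved
```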
if ($agent_name eq $input{'nagent_name'} && $agent_phone eq $input{'nagent_phone'} ){
nagent? This can't be intentional, considering this is the only line that uses nagent.
open (ORGDB,"<$database");
@ODB=<ORGDB>;
close (ORGDB);
...
foreach $rec (@ODB){
chomp($rec);
($agent_name,$agent_phone,$agent_cell,$agent_email)=split(/\|/,$rec);
You don't have to chomp each line if you're only doing matches on the first two fields. You also don't need the variable $rec:
for (@ODB) {
($agent_name,$agent_phone,$agent_cell,$agent_email)=split(/\|/);
In addition, you can increase the efficiency of your file access by using read / split instead of assigning the contents of the file handle to an array. You may also be able to increase efficiency by modifying the found line inside the array and then writing the whole array at once (using join), but I haven't tested that yet and don't know if there's a time gain.
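The modify-in-place-then-join idea from the paragraph above might look like this (a sketch with hypothetical data; as the post itself says, whether it is actually faster is untested):

```perl
use strict;
use warnings;

my $database = 'agents.db';   # hypothetical path

# Seed a sample file to modify.
open my $out, '>', $database or die "Can't write $database: $!";
print {$out} "Lisa|555-1234|555-9876|lisa\@example.com\n",
             "Jason|555-0000|555-1111|jason\@example.com\n";
close $out;

# Slurp once, edit the matching line via the $_ alias, write once.
open my $in, '<', $database or die "Can't read $database: $!";
my @ODB = <$in>;
close $in;

for (@ODB) {
    my ($name, $phone) = split /\|/;
    if ($name eq 'Jason' && $phone eq '555-0000') {
        $_ = "Jason|555-2222|555-1111|jason\@example.com\n";
    }
}

open $out, '>', $database or die "Can't write $database: $!";
print {$out} join('', @ODB);   # one write for the whole file
close $out;

print "updated: $ODB[1]";
```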
Bottom line, you don't need a database unless you have at least several thousand records. Just modify your script for higher efficiency. Beyond that, I'd use MySQL. :)