It started giving me trouble when I copied all the files over from the old account to the new one.
This makes me suspicious. If the two servers were different in any of a variety of ways (different architectures, especially big-endian (PPC) vs. little-endian (x86); 32-bit vs. 64-bit; different perl versions; different versions of Storable.pm...) then anything saved with Storable on one machine will likely not be readable with Storable on another machine.
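(If you want to confirm that the two machines actually differ in one of those ways, a quick check like the following, run on each machine and compared by eye, shows the relevant bits: perl version, Storable version, byte order, and pointer size.)

# Run on both machines and compare the output.
use Config;
use Storable;

print "perl:      $]\n";
print "Storable:  $Storable::VERSION\n";
print "byteorder: $Config{byteorder}\n";   # e.g. "12345678" on little-endian 64-bit
print "ptrsize:   $Config{ptrsize}\n";     # 4 = 32-bit, 8 = 64-bit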
The actual script should not have a problem; any data you wrote with Storable is where the issue is. And by "wrote", I'm including anything you put into a database from the first server as well. Are you hitting the same tables (same database and same schema) as you were before, or have you set up a new database?
The one thing that still worries me about your code in particular is that you seem to be setting a value into that column directly, then pulling it out; since that is all done on the same machine, the compatibility issues above should not come up. So I'm still not convinced that the column method is safe... but if it works fine on one machine, it likely is.
(On the other hand, if the data you are putting into the column now is not exactly the same, then there still could be reasons that it is failing that have nothing to do with it being on a new machine. In particular, if the freeze representation of the data is too wide to fit into your column, and you do not check for width overflow, then that could be corrupting your data by truncating it. You should be able to use the second test script in my original post to determine whether this is the case; instead of setting $orig to a generated string, find the actual data you were trying to store when your code blew up. The idea is to test just your column method, without complicating it by trying freeze and thaw on it.)
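(A rough way to check for that kind of truncation, if you want one: freeze the problem data yourself and compare its length against the column width. The names $data and $column_width below are placeholders for your actual structure and your schema's declared width.)

use Storable qw( freeze );

my $column_width = 255;          # placeholder: your column's declared width
my $data = [ 1, 2, 3 ];          # placeholder: substitute the structure that blew up

my $frozen = freeze( $data );    # freeze() wants a reference
if ( length( $frozen ) > $column_width ) {
    warn sprintf "frozen form is %d bytes; column only holds %d\n",
        length( $frozen ), $column_width;
}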
So, how to fix it? You will need to export those values in a portable way on the first machine, then import them on the new machine. Data::Dumper is probably good enough for you here; it is slower than Storable, but you should only have to do it once. (Note that just doing a database-level export/import won't help you, if the problem is where I think it is.)
I can't give you code for how to do this without knowing how your database is set up. If you have fewer than about 10K rows, though, you can probably get away with using Data::Dumper to spit out each table row into a file named by its primary id number:
my $dumper = Data::Dumper->new( [] );
$dumper->Terse(1);
$dumper->Indent(0);

my $sth = $dbh->prepare( 'SELECT * FROM table' );
$sth->execute();

# open a file to track all ids
my $id_file_name = 'all-ids.txt';
open my $id_fh, "> $id_file_name"
  or die "opening $id_file_name: $!";

# array position of primary key
my $id_ix = 0;

while ( my $cur = $sth->fetch() ) {
    my $id = $cur->[$id_ix];
    open my $fh, "> $id"
      or do { warn "opening $id: $!"; next };
    print $fh $dumper->Values( [ $cur ] )->Dump();
    $dumper->Reset();
    close $fh or warn "closing $id: $!";
    print $id_fh "$id\n";
}

close $id_fh or die "closing $id_file_name: $!";
$sth->finish();
And you can then re-import them like so:
# read back the list of all ids
my $id_file_name = 'all-ids.txt';
open my $id_fh, $id_file_name
  or die "opening $id_file_name: $!";
my @ids = map { s/\s+$//; $_ } <$id_fh>;
close $id_fh or die "closing $id_file_name: $!";

my $n_cols = XXX;   # how many columns in "table"?
my $places = join ',', ('?') x $n_cols;
my $ins = $dbh->prepare( "INSERT INTO table"
                         . " VALUES ( $places )" );

local $/ = undef;   # slurp each file whole

foreach my $id ( @ids ) {
    open my $fh, $id
      or do { warn "opening $id: $!"; next };
    my $str = <$fh>;
    close $fh or do { warn "closing $id: $!"; next };

    my $aref = eval $str;
    if ( !$aref || $@ ) {
        warn "$id: eval failed (str='$str')";
        next;
    }

    $ins->execute( @$aref );
}
If you have up to about a million records, you can scale the above by making subdirs out of the last few digits of the primary id (which I'm assuming is an integer; if not, write out an integer counter to the "all-ids.txt" file instead, and use that to name the files). Something like:
my $subdir = sprintf '%03d', $id % 1000;
unless ( -d $subdir ) {
    mkdir $subdir, 0777;
}
my $file = "$subdir/$id";
Up to a million rows, this will keep any subdirectory to about 1000 files, which should still be manageable.
In reply to Re^5: Storable perl module error? by tkil
in thread Storable perl module error? by johnsaunders