dump_results is not meant to fetch data!
    use strict;
    use warnings;
    use autodie;
    use DBI;
    use Data::Peek;   # For debugging only

    my $dbh = DBI->connect ("dbi:CSV:", undef, undef, {
        f_dir      => ".",
        f_ext      => ".csv/r",
        RaiseError => 1,
        PrintError => 1,
        });
    my $sth = $dbh->prepare ("select * from viewmanifest");
    my %rec;
    $sth->execute;
    $sth->bind_columns (\@rec{@{$sth->{NAME_lc}}});
    while ($sth->fetch) {
        DDumper \%rec;   # For debugging only
        print $rec{dahandle}, "\n";
        }
    $dbh->disconnect;
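To see why dump_results is the wrong tool here, a minimal sketch (assuming the same viewmanifest table as above): dump_results is a DBI debugging aid that prints each row to a filehandle, truncating every field to $maxlen characters, and returns only the number of rows. The data itself never reaches your program.

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect ("dbi:CSV:", undef, undef, {
    f_dir => ".", f_ext => ".csv/r", RaiseError => 1 });
my $sth = $dbh->prepare ("select * from viewmanifest");
$sth->execute;

# dump_results ($maxlen, $lsep, $fsep, $fh) prints every row
# (fields truncated to $maxlen chars) and returns the row count.
# Nothing is returned that you can process further.
my $rows = $sth->dump_results (35, "\n", ", ", \*STDOUT);
print "-- $rows row(s) dumped --\n";
$dbh->disconnect;
```

That truncation to $maxlen is also why long fields appear cut off when dumped: it is a display limit in dump_results, not a fetch limit in DBD::CSV.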
FWIW, DBD::CSV doesn't need to support LongReadLen, as the only limit on field size is the available memory and the remaining disk space (DBD::CSV is always AutoCommit, so the data in the table must also fit on disk). The other database I know of that ignores LongReadLen completely, because it binds directly to the Perl variable, is DBD::Unify.
In reply to Re^3: DBD::CSV and long fields (LongReadLen not supported?)
by Tux
in thread DBD::CSV and long fields (LongReadLen not supported?)
by zyzzogeton