I'm trying to dump records from a txt file into a table, with some validation and substring extraction along the way. The file has 170568 records.
Reading the file normally, taking substrings out of each line, and preparing an insert statement per record took around ~14400 sec, i.e. more than 4 hours.
Is there any way to optimize this process, for example with multithreading or something similar? I have never used it before. (The sketches after the sample code show the kind of batching I am wondering about.)
For the time being I have not put in my real data; instead I built a small scenario.
The input file looks like this:
00032463000000001000000010000010014110662004071664629003128182 O#
00032463000000002000000010000010014110662004071664729003070909 O#
00032463000000003000000010000010014110662004071664829003861044 O#
00032463000000004000000010000010014110662004071664929003005555 O#
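Each record is fixed-width. For reference, this is how I slice one record into the two fields I need (same offsets as in the sample code below; I believe unpack is an equivalent alternative to substr):

use strict;
use warnings;

# one record from the file (fixed-width fields, then a trailing marker)
my $record = "00032463000000001000000010000010014110662004071664629003128182 O#";

# the two fields, using the same offsets as the sample code below
my $version   = substr($record, 0, 3);   # "000"
my $jobnumber = substr($record, 4, 5);   # "24630"

# unpack should do the same: A3 = take 3 chars, x = skip 1, A5 = take 5 chars
my ($v, $j) = unpack 'A3 x A5', $record;
print "$version $jobnumber / $v $j\n";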
Sample Code
use strict;
use warnings;
use DBI;

my $start = time();

my $file = "C:/temp/dump.txt";
open my $fh, '<', $file or die "Cannot open $file: $!";

my $databaseName = "basedb";
my $connectHost  = "127.0.0.1";
my $DSN          = "DBI:mysql:dbname=$databaseName;host=$connectHost";
my $db_username  = 'root';
my $db_password  = 'root';

my $dbh = DBI->connect($DSN, $db_username, $db_password,
    { RaiseError => 1, PrintWarn => 0, AutoCommit => 0 })
    or die "error connecting: $DBI::errstr";

my $dump_sql = "INSERT INTO temp (version, jobnumber) VALUES (?, ?)";
my $dump_sth = $dbh->prepare($dump_sql) or die $DBI::errstr;

eval {
    # one INSERT per record, all inside a single transaction
    while (my $line = <$fh>) {
        my $version   = substr($line, 0, 3);
        my $jobnumber = substr($line, 4, 5);
        $dump_sth->execute($version, $jobnumber);
    }
    $dbh->commit;
};
if ($@) {
    $dbh->rollback();
    print "$@\n";
}
close $fh;

my $end     = time();
my $elapsed = $end - $start;
print "Job took $elapsed sec to execute.\n";
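Would committing in chunks help, instead of committing per row or holding one giant transaction? A rough sketch of what I mean, reusing the table and offsets from above (the chunk size of 10_000 is just a guess; error handling left out for brevity):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("DBI:mysql:dbname=basedb;host=127.0.0.1",
    'root', 'root', { RaiseError => 1, AutoCommit => 0 })
    or die $DBI::errstr;

my $sth = $dbh->prepare("INSERT INTO temp (version, jobnumber) VALUES (?, ?)");

open my $fh, '<', "C:/temp/dump.txt" or die "Cannot open dump.txt: $!";

my $count = 0;
while (my $line = <$fh>) {
    my $version   = substr($line, 0, 3);
    my $jobnumber = substr($line, 4, 5);
    $sth->execute($version, $jobnumber);

    # commit every 10_000 rows instead of once per row
    $dbh->commit unless ++$count % 10_000;
}
$dbh->commit;    # commit the final partial chunk
close $fh;

I have also seen MySQL's LOAD DATA LOCAL INFILE mentioned for bulk loading; would that be the better route here?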
I think my question is clear now. Please give me suggestions. One more idea I came across is below.
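That idea is packing many rows into a single multi-row INSERT so the server does fewer round trips. A rough sketch (insert_batch is just a helper name I made up; the batch size of 1000 is arbitrary):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("DBI:mysql:dbname=basedb;host=127.0.0.1",
    'root', 'root', { RaiseError => 1, AutoCommit => 0 })
    or die $DBI::errstr;

open my $fh, '<', "C:/temp/dump.txt" or die "Cannot open dump.txt: $!";

my @rows;
while (my $line = <$fh>) {
    push @rows, [ substr($line, 0, 3), substr($line, 4, 5) ];
    if (@rows == 1000) {          # flush every 1000 rows
        insert_batch($dbh, \@rows);
        @rows = ();
    }
}
insert_batch($dbh, \@rows);       # flush the final partial batch
$dbh->commit;
close $fh;

sub insert_batch {
    my ($dbh, $rows) = @_;
    return unless @$rows;
    # one "(?, ?)" placeholder pair per row
    my $placeholders = join ',', ('(?, ?)') x @$rows;
    my $sql = "INSERT INTO temp (version, jobnumber) VALUES $placeholders";
    # flatten the rows into one bind-value list
    $dbh->do($sql, undef, map { @$_ } @$rows);
}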