PerlMonks  

Re: Improving SQL speed

by gmargo (Hermit)
on Nov 01, 2009 at 16:40 UTC ( [id://804358] )


in reply to Improving SQL speed

One alternative is to process all the mp3 files first and then write all the data out in a single call to execute_array, instead of one call to execute for each mp3 file. It's more work to set up (you must accumulate the data for each column in its own array and use bind_param_array), but for a large number of entries it will probably be faster.
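
As a rough illustration of that approach, here is a minimal sketch using DBD::SQLite; the "tracks" table, its columns, and the stand-in tag values are invented for the example, so adapt them to your real schema:

    use strict;
    use warnings;
    use DBI;

    # Sketch only: the "tracks" table and its columns are made up here.
    # AutoCommit is off so the whole batch goes in one transaction,
    # which also helps insert speed with SQLite.
    my $dbh = DBI->connect("dbi:SQLite:dbname=mp3.db", "", "",
                           { RaiseError => 1, AutoCommit => 0 });
    $dbh->do("CREATE TABLE IF NOT EXISTS tracks (path TEXT, title TEXT)");

    # Accumulate one array per column while scanning the files.
    my (@paths, @titles);
    for my $file (glob "*.mp3") {
        push @paths,  $file;
        push @titles, $file;    # stand-in for a real tag lookup
    }

    my $sth = $dbh->prepare(
        "INSERT INTO tracks (path, title) VALUES (?, ?)");

    # Bind each array to its placeholder, then insert every row at once.
    $sth->bind_param_array(1, \@paths);
    $sth->bind_param_array(2, \@titles);
    $sth->execute_array({ ArrayTupleStatus => \my @status });

    $dbh->commit;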

Replies are listed 'Best First'.
Re^2: Improving SQL speed
by doom (Deacon) on Nov 01, 2009 at 17:59 UTC

    And if ganging up the inserts still seems too slow, there's apparently a bulk-load command named ".import" that will load a table from a file in one shot.

    The fact that there's a ".separator" command makes it sound like it's better suited to tsv files than csv, but you'd need to play around with it to see. Most database bulk-loaders seem to work with some version of csv, but frequently you'll need to massage the csv first to get it into the form they expect: csv is a standard without a standard.
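
    To give a feel for the syntax, here's a rough sketch (not tested against any particular sqlite3 version): write the rows out as a tab-separated file, then pipe the dot-commands to the sqlite3 command-line shell. The file, database, and table names are placeholders, and how .separator parses its argument varies a bit between sqlite3 releases, so check your version's docs.

        use strict;
        use warnings;

        # Placeholders: tracks.tsv, mp3.db and the "tracks" table.
        # Tab-separated output sidesteps most CSV quoting headaches.
        open my $out, '>', 'tracks.tsv' or die "tracks.tsv: $!";
        for my $file (glob "*.mp3") {
            print {$out} join("\t", $file, $file), "\n";  # path, stand-in title
        }
        close $out or die $!;

        # Drive the sqlite3 shell: make sure the table exists, set the
        # column separator to a tab, then bulk-load the file.
        open my $sqlite, '|-', 'sqlite3', 'mp3.db' or die "sqlite3: $!";
        print {$sqlite} "CREATE TABLE IF NOT EXISTS tracks (path TEXT, title TEXT);\n";
        print {$sqlite} qq{.separator "\\t"\n};
        print {$sqlite} qq{.import tracks.tsv tracks\n};
        close $sqlite or die "sqlite3 exited with status $?";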

      Thanks. I'd skimmed some other pages suggesting that for this kind of thing it might be faster to write everything out to a CSV and then do a bulk upload, but I hadn't found a good example of the necessary syntax yet. I'll check out those links.
