A new connection to the database is opened once per file. SQLite is supposed to handle multiple connections, but I don't know whether any buffering is in place or how well it copes with multithreading.
[Update: The parser is used for profiling files; the results are stored in the database. I'm using an existing framework and, honestly, the database seems to make sense.]
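For context, the current shape is roughly the following sketch. The database file, table, and column names are invented for illustration; the point is just that each worker opens and closes its own handle per file:

use DBI;

# Rough shape of what each worker does today: open its own SQLite handle,
# do the insert for that file, disconnect.
sub store_result {
    my ($file, $time_ms) = @_;
    my $dbh = DBI->connect('dbi:SQLite:dbname=profile.db', '', '',
                           { RaiseError => 1, AutoCommit => 1 });
    $dbh->do('INSERT INTO results (file, time_ms) VALUES (?, ?)',
             undef, $file, $time_ms);
    $dbh->disconnect;
}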
I had debated using a queue to bring results back to the parent thread, so the parent could handle all the database requests. That requires a lot more code restructuring, so I didn't take that approach initially. Right now, given the 60% drop in performance from adding threading, queuing the database insertions doesn't seem worth pursuing. I suppose I could profile the code to check whether doing the db commits in the parent would pay off, but since I don't have a profiling framework in place, I expect that would be more work than just trying the queue.
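For the record, what I had in mind was roughly the sketch below, not actual code from my parser. parse_file(), the table, and the column names are made up, and a real version would keep the Thread::Pool::Simple pool rather than spawning one thread per file; the point is that workers only enqueue rows and the parent owns the lone SQLite handle:

use strict;
use warnings;
use threads;
use Thread::Queue;
use DBI;

# Stand-in for the real parser; returns one result row per file.
sub parse_file {
    my ($file) = @_;
    return { file => $file, time_ms => 0 };   # dummy values
}

my @files     = @ARGV;
my $results_q = Thread::Queue->new;

# Workers only parse and enqueue; they never touch the database.
my @workers = map {
    my $file = $_;
    threads->create(sub {
        $results_q->enqueue(parse_file($file));
    });
} @files;

# The parent owns the single SQLite handle and does every insert and commit.
my $dbh = DBI->connect('dbi:SQLite:dbname=profile.db', '', '',
                       { RaiseError => 1, AutoCommit => 0 });
my $sth = $dbh->prepare('INSERT INTO results (file, time_ms) VALUES (?, ?)');

for (1 .. @files) {
    my $row = $results_q->dequeue;    # blocks until a worker has enqueued
    $sth->execute($row->{file}, $row->{time_ms});
}
$dbh->commit;
$dbh->disconnect;

$_->join for @workers;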
Just to clarify: if I put "use forks" before "use Thread::Pool::Simple", then I'll get multiprocessing instead of multithreading?
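That is, something like this load order is what I'm asking about:

# The ordering in question: load forks before any module that pulls in threads.
use forks;
use Thread::Pool::Simple;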