in reply to Re: Multithreading leading to Out of Memory error
in thread Multithreading leading to Out of Memory error
It's about 3000 files. I can't post any of it because it's on a classified network. All I can say is that they are mostly ASCII files, plus one .xls file and one hex-formatted file that I read through and run checks on. For each file I append any errors to a running log file, and at the end I insert a record into a database saying whether or not the file was bad. If anything, I guess I'll have to look at whether one of the modules I'm using could be part of the problem as well. They are:
use threads(...);
use Thread::Queue;
use File::Find;
use File::Basename;
use DBI;
use DBD::ODBC;
use Spreadsheet::ParseExcel;
use Switch;
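Roughly, the pieces fit together like this (I can't post the real code, so sub names like process_file, the directory path, and the worker count are placeholders for illustration): File::Find fills a queue with paths, and a fixed pool of worker threads drains it.

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;
use File::Find;

my $queue    = Thread::Queue->new;
my $nworkers = 4;                         # placeholder pool size

# Fixed pool of workers; each one pulls paths until it sees undef.
my @workers = map {
    threads->create(sub {
        while (defined(my $path = $queue->dequeue)) {
            process_file($path);          # placeholder: validate the file,
                                          # write the log, insert a pass/fail
                                          # row via DBI
        }
    });
} 1 .. $nworkers;

# Walk the input tree and queue every plain file (path is a placeholder).
find(sub { $queue->enqueue($File::Find::name) if -f }, '/data/incoming');

$queue->enqueue(undef) for @workers;      # one undef per worker to stop it
$_->join for @workers;

sub process_file { my ($path) = @_; }     # stub, real work not shown
```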
There is definitely a gradual increase in memory usage over time.
I don't have any references, so there shouldn't be any circular references that the Perl garbage collector fails to pick up.
Are there any tools I can use for this? For example, something that checks memory usage before and after a method call, to confirm that everything allocated during the call is freed again once it returns.
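Something along these lines is what I have in mind, if a ready-made tool doesn't exist (a Linux-only sketch that reads VmRSS from /proc; rss_kb, check_leak and process_file are made-up names for illustration, and a module like Devel::Size could be used instead to size individual data structures):

```perl
use strict;
use warnings;

# Return the process's resident set size in kB (Linux-only).
sub rss_kb {
    open my $fh, '<', "/proc/$$/status" or return undef;
    while (<$fh>) {
        return $1 if /^VmRSS:\s+(\d+)\s+kB/;
    }
    return undef;
}

# Run a code block and report how much the RSS changed across the call.
sub check_leak {
    my ($name, $code) = @_;
    my $before = rss_kb();
    $code->();
    my $after = rss_kb();
    if (defined $before && defined $after) {
        printf "%s: RSS %d kB -> %d kB (%+d kB)\n",
            $name, $before, $after, $after - $before;
    }
    else {
        warn "couldn't read /proc/$$/status (Linux-only sketch)\n";
    }
}

# Usage: wrap the per-file sub and watch whether the delta keeps growing.
# check_leak('process_file', sub { process_file($some_path) });
```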
What happens in Perl when two threads call the same method? I assume each thread just gets its own copy, which should be fine since, again, there are no shared resources.
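To illustrate what I'm assuming (a toy example, not my real code): each thread that calls the same sub works on its own copy of the sub's lexicals, unless something is explicitly marked :shared via threads::shared.

```perl
use strict;
use warnings;
use threads;

sub count_up {
    my ($label) = @_;
    my $n = 0;                     # lexical: private to each thread's call
    $n++ for 1 .. 5;
    print "$label saw n = $n\n";   # both threads print 5, no interference
}

my @workers = map { threads->create(\&count_up, "thread $_") } 1 .. 2;
$_->join for @workers;
```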
Replies are listed 'Best First'.

Re^3: Multithreading leading to Out of Memory error
  by BrowserUk (Patriarch) on Jun 07, 2013 at 19:58 UTC
  by joemaniaci (Sexton) on Jun 07, 2013 at 21:30 UTC
  by BrowserUk (Patriarch) on Jun 08, 2013 at 06:25 UTC