Sewi has asked for the wisdom of the Perl Monks concerning the following question:
I need to store a huge amount of data with a fixed structure.
I considered MongoDB, but it becomes slow beyond 15+ million items and has a 16 MB limit per document. MySQL can't handle this amount either. I'd like to store the data in files, but avoid one file per item, because that many files are hard for filesystems to handle.
I considered tie with GDBM_File, which is rock solid for reading: I could store many items in one file, delete them, and append/insert text blocks as they arrive (a minimal sketch follows below). But GDBM is fragile when more than one process writes to the same file, and I can't guarantee that two processes will never write to the same file, since new text blocks keep arriving for different messages.
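A minimal sketch of the tie + GDBM_File approach just described, covering the single-writer case only; the filename and the key scheme are hypothetical:

```perl
use strict;
use warnings;
use GDBM_File;

# One file holds many items; GDBM_WRCREAT creates the file if missing.
my %store;
tie %store, 'GDBM_File', 'messages.gdbm', &GDBM_WRCREAT, 0640
    or die "Cannot tie messages.gdbm: $!";

# Hypothetical key scheme: message ID plus block index.
$store{'msg:42:block:0'} = 'first text block for message 42';
$store{'msg:42:block:1'} = 'a later block, appended as it arrives';

# Reads and deletes work like a normal hash.
print $store{'msg:42:block:0'}, "\n";
delete $store{'msg:42:block:1'};

untie %store;
```

Note that this does nothing about the concurrency concern: GDBM itself provides no multi-writer safety, so a real deployment would still need external serialization, for example flock on a separate lock file or funnelling all writes through a single process.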
Any suggestions?
Replies are listed 'Best First'.

Re: Store a huge amount of data on disk
by erix (Prior) on Oct 18, 2011 at 15:43 UTC
by Sewi (Friar) on Oct 18, 2011 at 15:52 UTC
by erix (Prior) on Oct 18, 2011 at 17:08 UTC
by Sewi (Friar) on Oct 18, 2011 at 18:41 UTC

Re: Store a huge amount of data on disk
by BrowserUk (Patriarch) on Oct 18, 2011 at 15:37 UTC
by Sewi (Friar) on Oct 18, 2011 at 15:48 UTC
by BrowserUk (Patriarch) on Oct 19, 2011 at 00:16 UTC
by Sewi (Friar) on Oct 19, 2011 at 05:13 UTC
by BrowserUk (Patriarch) on Oct 19, 2011 at 14:51 UTC
by zentara (Cardinal) on Oct 19, 2011 at 16:34 UTC