The typical query is "one item by id"; no queries other than by-id lookups are required.
The deletion cronjob may crawl through all objects to find deletion candidates.
Do you think Postgres would handle that amount of data? I recently used it for an analysis of a few million shorter records (MySQL profiling data :-) ) and felt like it slowed down when importing and working with a large number of rows. I'll try...
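Just to make the access pattern concrete, here's a minimal DBI/DBD::Pg sketch of what the Postgres attempt might look like. The table and column names, the batch-commit import and the 30-day deletion cutoff are only assumptions for the sake of the example, not anything decided:

    use strict;
    use warnings;
    use DBI;

    # Connection parameters are made up -- adjust to the real setup.
    my $dbh = DBI->connect('dbi:Pg:dbname=itemstore', 'user', 'pass',
        { RaiseError => 1, AutoCommit => 0 });

    # One table keyed by id; "data" holds the serialized object as text
    # (raw binary would need BYTEA plus type-aware parameter binding).
    $dbh->do(<<'SQL');
    CREATE TABLE IF NOT EXISTS items (
        id         TEXT PRIMARY KEY,
        data       TEXT NOT NULL,
        created_at TIMESTAMPTZ NOT NULL DEFAULT now()
    )
    SQL
    $dbh->commit;

    # Bulk import: one prepared INSERT and one transaction for the whole
    # batch instead of a commit per row.
    my $ins = $dbh->prepare('INSERT INTO items (id, data) VALUES (?, ?)');
    my @sample = ( [ 'id-1', 'first object' ], [ 'id-2', 'second object' ] );
    $ins->execute(@$_) for @sample;
    $dbh->commit;

    # The only query the application needs: one item by id.
    my $get = $dbh->prepare('SELECT data FROM items WHERE id = ?');
    $get->execute('id-1');
    my ($data) = $get->fetchrow_array;

    # Deletion cronjob: if candidates can be picked out by age, an index on
    # created_at lets Postgres find them without crawling every row
    # (the 30 days are an assumed cutoff). If the decision really needs to
    # look at each object, the cronjob would have to scan the table instead.
    $dbh->do('CREATE INDEX IF NOT EXISTS items_created_idx ON items (created_at)');
    $dbh->do(q{DELETE FROM items WHERE created_at < now() - interval '30 days'});
    $dbh->commit;

    $dbh->disconnect;

Committing the import in one batch (or using COPY) rather than once per row is usually what makes the difference between "slow" and "fine" when loading millions of rows.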
In reply to Re^2: Store a huge amount of data on disk by Sewi
in thread Store a huge amount of data on disk by Sewi