I can't be too specific about how the whole thing works, unfortunately, but basically the problem is this: the system works with documents, and one installation was very slow with 9000 documents. I deleted some of the documents (and their related objects) to get it down to about 6000, and performance improved, but only marginally.
I am considering locks so that the various processes tread on each other as little as possible. All parts of the big system are too slow, especially considering that a smaller installation with only hundreds of documents is acceptably fast.
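The locking idea can at least be sketched concretely. The thread never says what language the system is written in, so this is only a hypothetical sketch in Python using advisory `flock` locks; the lock path, function name, and document-update stand-in are all invented for illustration:

```python
import fcntl

def update_documents(lock_path, doc_ids):
    """Take an exclusive advisory lock before touching shared documents.

    Any cooperating process that takes the same lock will block here
    until the current holder releases it, so workers can't tread on
    each other's document updates.
    """
    with open(lock_path, "w") as lockfile:
        fcntl.flock(lockfile, fcntl.LOCK_EX)   # blocks until the lock is free
        processed = list(doc_ids)              # stand-in for the real per-document work
        fcntl.flock(lockfile, fcntl.LOCK_UN)   # release explicitly (close would also do it)
    return processed
```

Note that advisory locks only help if every process that touches the documents actually takes them, and coarse locking like this serializes the workers; it prevents corruption, but by itself it is unlikely to make anything faster.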
Alex / talexb / Toronto
"Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds
Perhaps it's just a function of the vagueness required by whatever job you happen to be in, but everything you say sounds like a severe case of premature optimization. You *sound* like you have no idea what is slowing down the system, so you'll just implement some kind of locking system and hope it goes faster?! But maybe you do have a very good idea of what is going wrong and just can't tell us; in that case, how are we supposed to help you?
From here I just see "Well I've got this system, and it works fine with 100 documents but it sucks with 6000 documents, so I'm going to do X", but I, at least, have no idea if X will help or not because you've told me nothing about the system.
Or possibly I'm just confused because I don't understand enough of what you've said about what the system actually is or does.
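The reply's underlying point, that you can't pick a fix without knowing where the time actually goes, suggests profiling before optimizing. A minimal sketch (Python assumed, since the thread never names the language; `slow_document_pass` is an invented stand-in for the real per-document work) using the standard-library profiler:

```python
import cProfile
import io
import pstats

def slow_document_pass(docs):
    # Hypothetical stand-in for whatever the real system does per document.
    return sum(len(str(d)) for d in docs)

def profile_run():
    """Profile one pass over 6000 fake documents and return the report text."""
    pr = cProfile.Profile()
    pr.enable()
    slow_document_pass(range(6000))
    pr.disable()
    buf = io.StringIO()
    # Sort by cumulative time so the biggest offenders appear first.
    pstats.Stats(pr, stream=buf).sort_stats("cumulative").print_stats(5)
    return buf.getvalue()
```

A report like this points at the functions where the time is spent, which is the information needed before deciding whether locking, caching, or something else entirely is the right fix.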