"It simply means you've got a choice between, for example: O(N) memory with a huge constant, O(N log N) with a modest constant, or O(N²) with a small constant. Which one you should use depends on the size of N."

That doesn't make any sense.
"When you bring in different independent costs, you could also be considering O(N) CPU and O(N log N) memory"

I presume you mean "consider a program that manages to fill supra-linear memory in linear time". Which means that, even with the big-Oh rubbish removed, you still don't make sense: since you can only write a fixed amount of memory in a fixed amount of time, you cannot use more than a linear amount of memory if your program runs in linear time.
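To spell that argument out, here is a minimal sketch under the usual assumption that each step of the machine writes at most a constant number c of memory cells (my notation, not anything quoted from the thread):

\[
\mathrm{mem}(N) \;\le\; c \cdot \mathrm{time}(N),
\]

so if \(\mathrm{time}(N) \in O(N)\) then \(\mathrm{mem}(N) \in O(N)\); in particular, a program that runs in linear time cannot use \(\Theta(N \log N)\) memory.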
"I don't know of anybody who bothers to be technical enough to use 'Omega' instead of just saying Big-Oh. Except you, I suppose."

The difference between "Omega" and "Big-Oh" isn't an obscure technicality. It's as big as the difference between < and >. Or is that a difference you usually cannot be bothered about either?
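For reference (these are the standard textbook definitions, not something quoted from the thread), Big-Oh is an upper bound and Big-Omega is a lower bound, which is exactly the ≤ versus ≥ distinction:

\[
f \in O(g) \iff \exists\, c > 0,\ n_0 :\ f(n) \le c\, g(n) \ \text{for all } n \ge n_0,
\]
\[
f \in \Omega(g) \iff \exists\, c > 0,\ n_0 :\ f(n) \ge c\, g(n) \ \text{for all } n \ge n_0.
\]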