in reply to Re^2: Gay Dating; or, Thrifty Set Intersections with an Accumulator
in thread Gay Dating; or, Thrifty Set Intersections with an Accumulator

It simply means you've got a choice between, for example: O(N) memory with a huge constant, O(N log N) with a modest constant, or O(N²) with a small constant. Which one you should use depends on the size of N.
That doesn't make any sense.
When you bring in different independent costs, you could also be considering O(N) CPU and O(N log N) memory
I presume you mean "consider a program that manages to fill supra-linear memory in linear time". Which means, even setting the big-Oh rubbish aside, you still don't make sense. Since you can only write a fixed amount of memory in a fixed amount of time, you cannot use more than a linear amount of memory if your program runs in linear time.
I don't know of anybody who bothers to be technical enough to use "Omega" instead of just saying Big-Oh. Except you I suppose.
The difference between "Omega" and "Big-Oh" isn't an obscure technicality. It's as big as the difference between < and >. Or is that a difference you usually cannot be bothered about either?

Replies are listed 'Best First'.
Re^4: Gay Dating; or, Thrifty Set Intersections with an Accumulator
by SuicideJunkie (Vicar) on Aug 12, 2010 at 18:17 UTC

    s/Omega/Theta/, my bad.

    It seems that you are refusing to acknowledge some basic reality checks in favor of theoretical purity.

    N does not actually get to go to infinity, and neither does your budget of time, money, ram, disk space, network bandwidth, etc.

    If you can't imagine when one network request and N log N CPU cycles (e.g., query a central server, then figure out the sources from the data patterns) could be better than N network requests and N CPU cycles (e.g., asking all the sources directly), then I am truly sorry for you.

    Addendum: Consider also that you may want to not just think about, but actually implement both an O(N log N) algorithm and an O(N²) algorithm for the same task. Then choose which one to run with an if ($N > $breakEvenPoint) { $result = doNlogN() } else { $result = doNsquared() }
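    A minimal sketch of that break-even dispatch, using set intersection (the topic of this thread) as the task. The names ($BREAK_EVEN, intersect_nlogn, intersect_nsquared) and the cutoff value are illustrative assumptions, not from the post; in practice you'd pick the cutoff by benchmarking both routines.

```perl
use strict;
use warnings;

# Illustrative cutoff; a real value would come from benchmarking.
my $BREAK_EVEN = 64;

# O(N log N): sort both lists, then walk them in lockstep.
# Assumes each input arrayref holds unique numbers (a set).
sub intersect_nlogn {
    my ($x, $y) = @_;
    my @s = sort { $a <=> $b } @$x;
    my @t = sort { $a <=> $b } @$y;
    my @out;
    my ($i, $j) = (0, 0);
    while ($i < @s && $j < @t) {
        if    ($s[$i] < $t[$j]) { $i++ }
        elsif ($s[$i] > $t[$j]) { $j++ }
        else  { push @out, $s[$i]; $i++; $j++ }
    }
    return @out;
}

# O(N^2): brute-force nested scan; small constant, no sorting overhead.
sub intersect_nsquared {
    my ($x, $y) = @_;
    my @out;
    for my $e (@$x) {
        push @out, $e if grep { $_ == $e } @$y;
    }
    return @out;
}

# Dispatch on input size, exactly as the addendum suggests.
sub intersect {
    my ($x, $y) = @_;
    return @$x + @$y > $BREAK_EVEN
         ? intersect_nlogn($x, $y)
         : intersect_nsquared($x, $y);
}
```

    Small inputs take the brute-force path; large ones take the sort-and-merge path. Note the two branches can return elements in different orders (the brute-force one preserves first-list order), which is fine for sets but worth knowing if you compare outputs.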

      N does not actually get to go to infinity
      Actually, the entire point of "big-Oh" is to talk about behaviours of functions when taking limits.

      If one doesn't want to do so, don't mention "big-Oh".

        Getting Joe Watercooler to talk about big-Oh and the like correctly is about as likely as getting everybody to use proper code indentation...