in reply to Re: Gay Dating; or, Thrifty Set Intersections with an Accumulator
in thread Gay Dating; or, Thrifty Set Intersections with an Accumulator

A range of options between O(N) and O(N²) makes perfect sense to me. As long as you're not trying to mentally apply Perl's range operator, that is. :)

It simply means you've got a choice between, for example: O(N) memory with a huge constant, O(N log N) with a modest constant, or O(N²) with a small constant. Which one you should use depends on the size of N.
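To make the trade-off concrete, here is a small sketch that compares three hypothetical cost models (the constants 1000, 50, and 1 are made up purely for illustration — real constants come from measurement, not theory) and reports which one is cheapest at a few sizes of N:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical cost models for the same task, differing only in constants.
sub cost_linear   { my $n = shift; 1000 * $n }          # O(N), huge constant
sub cost_nlogn    { my $n = shift; 50 * $n * log($n) }  # O(N log N), modest constant
sub cost_nsquared { my $n = shift; 1 * $n * $n }        # O(N^2), small constant

for my $n ( 10, 1_000, 1_000_000_000 ) {
    my %cost = (
        'O(N)'       => cost_linear($n),
        'O(N log N)' => cost_nlogn($n),
        'O(N^2)'     => cost_nsquared($n),
    );
    # Pick the model with the lowest cost at this N.
    my ($best) = sort { $cost{$a} <=> $cost{$b} } keys %cost;
    printf "N = %-13d cheapest: %s\n", $n, $best;
}
```

With these made-up constants, O(N²) wins for small N, O(N log N) for mid-range N, and O(N) only once N gets enormous — which is the whole point: asymptotic order alone doesn't pick the winner at any particular N.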

When you bring in different independent costs, you could also be considering O(N) CPU and O(N log N) memory vs. O(N log N) CPU and O(N) memory. If you care to think of it this way, there is a multidimensional continuum of efficiencies between O(N) and O(N²) to consider.


BTW, I don't know of anybody who bothers to be technical enough to use "Omega" instead of just saying Big-Oh. Except you, I suppose.

In practice, I've found it to always be a "Big-Crossed-Fingers", a.k.a. "Best Guess at Omega, But Not Worth Strictly Proving; Nobody Wants to Pay the Cost of Making It Rigorous."

Few people would know what you're talking about if you said Big-Omega instead of Big-Oh. Big-Oh gets used as a generic term whenever there are non-mathies around (which is always, unless you're still a math student). It's a bit of a self-sustaining "problem", but as far as I can tell, nobody really cares.


Re^3: Gay Dating; or, Thrifty Set Intersections with an Accumulator
by JavaFan (Canon) on Aug 11, 2010 at 21:43 UTC
    It simply means you've got a choice between, for example: O(N) memory with a huge constant, O(N log N) with a modest constant, or O(N²) with a small constant. Which one you should use depends on the size of N.
    That doesn't make any sense.
    When you bring in different independent costs, you could also be considering O(N) CPU and O(N log N) memory
    I presume you mean "consider a program that manages to fill supra-linear memory in linear time". Which means that, even after removing the big-Oh rubbish, you still don't make sense. Since you can only write a fixed amount of memory in a fixed amount of time, you cannot use more than a linear amount of memory if your program runs in linear time.
    I don't know of anybody who bothers to be technical enough to use "Omega" instead of just saying Big-Oh. Except you I suppose.
    The difference between "Omega" and "Big-Oh" isn't an obscure technicality. It's as big as the difference between < and >. Or is that a difference you usually cannot be bothered about either?

      s/Omega/Theta/, my bad.

      It seems that you are refusing to acknowledge some basic reality checks in favor of theoretical purity.

      N does not actually get to go to infinity, and neither does your budget of time, money, RAM, disk space, network bandwidth, etc.

      If you can't imagine when one network request and N log N CPU cycles (e.g., query a central server, then figure out the sources from the data patterns) could be better than N network requests and N CPU cycles (e.g., asking all the sources directly), then I am truly sorry for you.

      Addendum: Consider also that you may want not just to think about, but to actually implement both an O(N log N) algorithm and an O(N²) one for the same task, then choose which one to run with if ($N > $breakEvenPoint) { $result = doNlogN() } else { $result = doNsquared() }
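A minimal runnable sketch of that dispatch. The value of $breakEvenPoint and the two do* subs are placeholders — a real break-even point has to be found by benchmarking both implementations on your actual data:

```perl
use strict;
use warnings;

# Hypothetical break-even point; in practice, measure to find it.
my $breakEvenPoint = 1_000;

# Stand-ins for the two real implementations of the same task.
sub doNlogN    { my $N = shift; "ran O(N log N) path for N=$N" }
sub doNsquared { my $N = shift; "ran O(N^2) path for N=$N" }

sub solve {
    my $N = shift;
    # Small N: the O(N^2) code with its small constant wins.
    # Large N: the O(N log N) code wins despite its bigger constant.
    return $N > $breakEvenPoint ? doNlogN($N) : doNsquared($N);
}

print solve(10),     "\n";
print solve(50_000), "\n";
```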

        N does not actually get to go to infinity
        Actually, the entire point of "big-Oh" is to talk about behaviours of functions when taking limits.

        If one doesn't want to do that, one shouldn't mention "big-Oh".