
What time complexity would you classify the following as having?

int n = 100;
for(int x = 0; x < n; x++)
    for(int y = 0; y < n; y++)
        for(int z = 0; z < n; z++)
            DoWork(x,y,z);

I don't think anyone would dispute that it's O(n^3).

Now consider a scenario where the bounds for each dimension are provided as 3 separate inputs:

int bx = 10, by = 1000, bz = 1000;
for(int x = 0; x < bx; x++)
    for(int y = 0; y < by; y++)
        for(int z = 0; z < bz; z++)
            DoWork(x,y,z);

How would you describe the complexity of the above? I would have intuitively described this as still being O(n^3) as you still need to iterate in all 3 dimensions.

A friend suggested that the magnitude of the inputs comes into play, and since bx is several orders of magnitude less than by or bz, you would instead define it as O(n^2).

Which is it?

Edit

Just to provide a little more context, as people have been voting to close the question.

This came out of a discussion around the AdventOfCode 2018 - Puzzle 6 (https://adventofcode.com/2018/day/6)

The "bounds" of this puzzle, were a 50 line input file where each input defined a point. So every competitor was working on a solution that was bounded by

  • number of inputs: 50 (constant)
  • n = max-x coord: different per competitor (unique input values were generated for each user)
  • m = max-y coord: different per competitor (unique input values were generated for each user)

I think, based on the feedback below, that makes the best case O(n*m), as you'd just ignore the constant 50 input values.
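For what it's worth, here's a sketch of the grid scan most solutions boiled down to (ClosestPoint and points are placeholder names of my own, not anyone's actual code):

for(int x = 0; x < n; x++)          // n = max-x coord
    for(int y = 0; y < m; y++)      // m = max-y coord
        ClosestPoint(x, y, points); // scans all 50 input points: a constant factor

The inner scan over the 50 points is a constant, so the total work is 50 * n * m comparisons, i.e. O(n*m).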

3 Answers


Personally I would put it as O(bx*by*bz). The reason is that I assume the parameters can all vary arbitrarily relative to one another; for example, you could get the input bx = 30000, by = 30, bz = 40. See how the big-O is now dominated by bx, due to it being many orders of magnitude higher? However, if those values are always the same constants, say bx = 10, by = 1000, bz = 1000, then it would be O(1).
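One way to convince yourself (a sketch that replaces DoWork with a counter, purely for illustration):

long count = 0;
for(int x = 0; x < bx; x++)
    for(int y = 0; y < by; y++)
        for(int z = 0; z < bz; z++)
            count++;                // ends with count == (long)bx * by * bz

The body runs exactly bx*by*bz times, no matter which of the three bounds happens to dominate.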


The first thing you need to do is study asymptotic analysis and amortized analysis in detail.

If we consider asymptotic analysis for your scenario with bx = 10, by = 1000, bz = 1000: as your friend suggested, relative to the inputs by and bz, bx has a tiny value that could be treated as a constant. However, in your case all three values are themselves constants, so the running time can be considered O(1).

While calculating the asymptotic upper bound (big-O), we ignore constants.

However, if you instead use values like bx = 1000000000, by = 100000000000, bz = 100000000000, where bx is also relatively large, then by the definition of big-O

That is, f(x) = O(g(x)) if and only if there exist a positive real number c and a real number x' such that
f(x) <= c*g(x) for all x > x'

you can state the complexity as O(bx*by*bz).
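As a quick worked instance of that definition (numbers chosen purely for illustration): take f(x) = 2x^2 + 3x. With c = 3 and x' = 3,

2x^2 + 3x <= 3*x^2 for all x > 3

(since 3x <= x^2 once x >= 3), so f(x) = O(x^2).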

To summarize: in the first case you used the same value n to bound all three loops, but in the second case you used three different input sizes, so your complexity will be O(bx*by*bz); and if any of bx, by, or bz is a constant, you can omit it.


There's a big, important detail missing in the other answers to the question here. This:

int n = 100;
for(int x = 0; x < n; x++)
    for(int y = 0; y < n; y++)
        for(int z = 0; z < n; z++)
            DoWork(x,y,z);

is not a cubic-time O(n^3) algorithm. Its runtime depends on the value of the single input n, not on the size of that input. This program runs in pseudo-polynomial time: it is polynomial in the numeric value of its input, but exponential in the size of its input (which is log_2(n), the number of bits needed to represent n).
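To see the gap concretely (a sketch; the bit-counting loop is mine, not part of the question):

int n = 100;                    // the input's value
int bits = 0;                   // the input's size in bits
for(int v = n; v > 0; v >>= 1)
    bits++;                     // bits == 7 for n == 100

The triple loop calls DoWork n^3 times, and n is roughly 2^bits, so the call count is roughly 2^(3*bits): polynomial in the value, exponential in the size.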

The other answers are therefore correct that your algorithm runs in O(bx * by * bz), but your statement that

I would have intuitively described this as still being O(n^3) as you still need to iterate in all 3 dimensions.

is wrong. Big-O counts the number of operations, not the depth of a loop nest.
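For instance (a made-up variant of the question's loops), three levels of nesting where two of the bounds are constants still gives only linear growth:

for(int x = 0; x < n; x++)
    for(int y = 0; y < 3; y++)
        for(int z = 0; z < 5; z++)
            DoWork(x,y,z);      // 3 * 5 * n = 15n calls in total: O(n), not O(n^3)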