Interview questions - big-o

This is an interview question:
Given: f(n) = O(n)
g(n) = O(n²)
find the orders of f(n) + g(n) and f(n)⋅g(n).
What would be the answer to this question?

When this answer was prepared, f(n) was shown as Ω(n) and g(n) as Θ(n²).
From f(n) = Ω(n) and g(n) = Θ(n²) you get a lower bound of Ω(n²) for f(n) + g(n), but you don't get an upper bound on f(n) + g(n) because no upper bound was given on f(n).
For f(n)·g(n), you get a lower bound of Ω(n³) because Θ(n²) implies lower and upper bounds of Ω(n²) and O(n²) for g(n). Again, no upper bound on f(n)·g(n) is available, because f(n) can be arbitrarily large; for f(n), we only have the Ω(n) lower bound.
With the question modified to give only upper bounds on f and g, as f(n) = O(n) and g(n) = O(n²), we have that f(n)+g(n) is O(n²) and f(n)·g(n) is O(n³).
To show this rigorously is a bit tedious, but quite straightforward. E.g., for the f(n)·g(n) case: by the definitions of O(n) and O(n²) we are given C, X, K, Y such that n > X ⇒ C·n > f(n) and n > Y ⇒ K·n² > g(n). Let J = C·K and Z = max(X, Y). Then (assuming f and g are non-negative, as is usual for running times) n > Z ⇒ J·n³ > f(n)·g(n), which proves that f(n)·g(n) is O(n³).
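As a numeric sanity check of this argument, here is a small Python sketch of my own; the functions f and g and the witnesses C, X, K, Y are made up to satisfy the given bounds:

# Hypothetical f(n) = O(n) and g(n) = O(n^2), made up for this check.
def f(n): return 3 * n + 7       # C = 4, X = 7:  n > 7  =>  4n   > f(n)
def g(n): return 2 * n * n + 5   # K = 3, Y = 2:  n > 2  =>  3n^2 > g(n)

C, X, K, Y = 4, 7, 3, 2
J, Z = C * K, max(X, Y)          # J = 12, Z = 7, as in the proof

for n in range(Z + 1, 10000):
    assert C * n > f(n)
    assert K * n * n > g(n)
    assert J * n ** 3 > f(n) * g(n)   # the claimed O(n^3) bound
print("J*n^3 > f(n)*g(n) held for every tested n >", Z)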

O(f(n) + g(n)) = O(max{f(n), g(n)}), since the larger term dominates the sum: f(n) + g(n) ≤ 2⋅max{f(n), g(n)}.
So for the sum:
f(n) + g(n) = O(max{n, n^2}) = O(n^2)
And for the product:
f(n) ⋅ g(n) = O(n ⋅ n^2) = O(n^3)
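For completeness, the sum case can be spelled out with the same style of constants as the product proof above (my own sketch, not from the original answer): suppose n > X ⇒ C·n > f(n) and n > Y ⇒ K·n² > g(n). Then for n > max(X, Y, 1),
f(n) + g(n) < C·n + K·n² ≤ (C + K)·n²
which proves that f(n) + g(n) is O(n²).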

Think about it this way: treat f and g as if they were polynomials that actually attain their bounds, say
f(n) = c⋅n + d
g(n) = a⋅n² + b⋅n + p
Then,
f(n) + g(n) = a⋅n² + (lower powers of n)
And,
f(n)⋅g(n) = (a⋅c)⋅n³ + (lower powers of n)
It follows that f(n) + g(n) = O(n²)
and f(n)⋅g(n) = O(n³)
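This polynomial algebra can be checked with sympy; a sketch of my own, where the coefficients a, b, c, d, p are arbitrary symbols:

from sympy import symbols, expand, degree

n, a, b, c, d, p = symbols('n a b c d p', positive=True)
f = c*n + d
g = a*n**2 + b*n + p

s = expand(f + g)   # highest power: a*n**2
q = expand(f * g)   # highest power: a*c*n**3
print(degree(s, n), degree(q, n))   # prints: 2 3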

This question can be understood like this: f(n) = O(n) means it takes O(n) time to compute f(n), and similarly g(n) requires O(n²) time. So P(n) = f(n) + g(n) would take O(n) + O(n²) + O(1) time, the O(1) being for the addition once you know the values of both f and g. Hence, this new function P(n) would require O(n²) time. The same is the case for Q(n) = f(n)*g(n), which also requires O(n²) time, since the final multiplication is again O(1). (Note that this reads the given bounds as running times, rather than as bounds on the values of f and g themselves.)
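A minimal sketch of this reading in Python (my own toy functions, chosen so that f takes linear time and g quadratic time):

def f(n):          # value computed in O(n) time: a single pass
    total = 0
    for i in range(n):
        total += i
    return total

def g(n):          # value computed in O(n^2) time: nested passes
    total = 0
    for i in range(n):
        for j in range(n):
            total += 1
    return total

def P(n):          # O(n) + O(n^2) + O(1) = O(n^2) time overall
    return f(n) + g(n)

def Q(n):          # also O(n^2) time: the final * costs O(1)
    return f(n) * g(n)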

Related

How to operate on Asymptotic Notation Function Sets, i.e. Big-O + Big-Omega?

I'm trying to determine if the following statement is true or false.
If f(n) ∈ O(n) and g(n) ∈ Ω(n), then f(n) + g(n) ∈ Θ(n).
I think I understand adding the same asymptotic big-O. O(n) + O(n) = O(n)
However, I am unsure about adding or operating on the others combined.
For example:
If f(n) ∈ Θ(n log n), then f(n) * n = ?
Could this answer be both O(n^2 log n) and Θ(n^2 log n)?
Thank you in advance!
You can use the definitions of these symbols and try to find a proof or a counterexample.
If f(n) = O(n) and g(n) = Ω(n), then f(n) + g(n) is not necessarily in Θ(n)! As a counterexample, if f(n) = n and g(n) = n², then f(n) + g(n) = Θ(n²). On the other hand, if f(n) = n and g(n) = n, then f(n) + g(n) = Θ(n). Hence, you can only say f(n) + g(n) = Ω(n) and nothing more.
As for your second question: yes, both are right. If f(n) ∈ Θ(n log n), then f(n)·n ∈ Θ(n² log n), and anything in Θ(n² log n) is in particular in O(n² log n).
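The counterexample is easy to verify with sympy limits (my own sketch):

from sympy import symbols, limit, oo

n = symbols('n', positive=True)
f, g = n, n**2   # f = O(n), g = Omega(n)
# (f + g)/n grows without bound, so f + g is not O(n), hence not Theta(n):
print(limit((f + g) / n, n, oo))     # oo
# but (f + g)/n**2 tends to 1, so here f + g = Theta(n**2):
print(limit((f + g) / n**2, n, oo))  # 1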

The Mathematical Relationship Between Big-Oh Classes

My textbook describes the relationship as follows:
There is a very nice mathematical intuition which describes these classes too. Suppose we have an algorithm which has running time N0 when given an input of size n, and a running time of N1 on an input of size 2n. We can characterize the rates of growth in terms of the relationship between N0 and N1:
Big-Oh     Relationship
O(log n)   N1 ≈ N0 + c
O(n)       N1 ≈ 2N0
O(n²)      N1 ≈ 4N0
O(2ⁿ)      N1 ≈ (N0)²
Why is this?
That is because if f(n) is in O(g(n)) then it can be thought of as acting like k * g(n) for some k.
So for example if f(n) = O(log(n)) then it acts like k log(n), and now f(2n) ≈ k log(2n) = k (log(2) + log(n)) = k log(2) + k log(n) ≈ k log(2) + f(n) and that is your desired equation with c = k log(2).
Note that this is a rough intuition only. An example of where it breaks down is that f(n) = (2 + sin(n)) log(n) = O(log(n)). The oscillating 2 + sin(n) bit means that f(2n)-f(n) can be basically anything.
I personally find this kind of rough intuition to be misleading and therefore worse than useless. Others find it very helpful. Decide for yourself how much weight you give it.
Basically what they are trying to show is just basic algebra after substituting 2n for n in the functions.
O(log n)
log(2n) = log(2) + log(n)
N1 ≈ c + N0
O(n)
2n = 2(n)
N1 ≈ 2N0
O(n²)
(2n)^2 = 4n^2 = 4(n^2)
N1 ≈ 4N0
O(2ⁿ)
2^(2n) = 2^(n*2) = (2^n)^2
N1 ≈ (N0)²
Since O(f(n)) ~ k * f(n) (almost by definition), you want to look at what happens when you put 2n in for n. In each case:
N1 ≈ k*log 2n = k*(log 2 + log n) = k*log n + k*log 2 ≈ N0 + c where c = k*log 2
N1 ≈ k*(2n) = 2*k*n ≈ 2N0
N1 ≈ k*(2n)^2 = 4*k*n^2 ≈ 4N0
N1 ≈ k*2^(2n) = k*(2^n)^2 ≈ N0*2^n ≈ N0^2/k
So the last one is not quite right, anyway. Keep in mind that these relationships are only true asymptotically, so the approximations will be more accurate as n gets larger. Also, f(n) = O(g(n)) only means g(n) is an upper bound for f(n) for large enough n. So f(n) = O(g(n)) does not necessarily mean f(n) ~ k*g(n). Ideally, you want that to be true, since your big-O bound will be tight when that is the case.
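The doubling heuristic is easy to observe numerically; here is a small sketch of my own that evaluates the growth functions directly rather than timing a real algorithm:

import math

def ratios(T, sizes=(1000, 2000, 4000, 8000)):
    # Report T(2n)/T(n) for a few doublings of the input size n.
    return [T(2 * n) / T(n) for n in sizes]

print(ratios(math.log))         # tends to 1   (N1 ≈ N0 + c)
print(ratios(lambda n: n))      # exactly 2.0  (N1 ≈ 2N0)
print(ratios(lambda n: n * n))  # exactly 4.0  (N1 ≈ 4N0)
# O(2^n) is omitted: 2**n at these sizes is astronomically large,
# and as derived above the relation is N1 ≈ N0**2/k rather than a ratio.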

Complexity analysis after multiplication of two functions

Given F(n) = Θ(n), H(n) = O(n), and G(n) = Ω(n), what will be the order of F(n) + [G(n)·H(n)]?
There isn't enough information to say anything about the function P(n) = G(n)*H(n). All we know is that G grows at least linearly; it could be growing quadratically, cubically, even exponentially. Likewise, we only know that H grows at most linearly; it could only be growing logarithmically, or be constant, or even be decreasing. As a result, P(n) itself could be decreasing or increasing without bound, which means the sum F(n) + P(n) could also be decreasing or increasing without bound.
Suppose, though, that we could assume that H(n) = Ω(1) (i.e., that it is bounded below by a positive constant, so at least not vanishing). Now we can say the following about P(n):
P(n) = H(n) * G(n)
     >= C1 * G(n)       (since H(n) = Ω(1): H(n) >= C1 for large n)
     = Ω(G(n)) = Ω(n)
P(n) <= C2*n * G(n)     (since H(n) = O(n): H(n) <= C2*n for large n)
     = O(n*G(n))
Thus F(n) + P(n) = Ω(n) and F(n) + P(n) = O(n*G(n)), but nothing more can be said; both bounds are as tight as we can make them without more information about H or G.
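To see concretely why some assumption on H is needed, here is a counterexample of my own in sympy: H(n) = 1/n² is O(n), yet the product with G(n) = n decreases.

from sympy import symbols, limit, oo

n = symbols('n', positive=True)
H = 1 / n**2     # O(n) holds: 1/n**2 <= n for all n >= 1
G = n            # Omega(n) holds trivially
P = H * G        # P(n) = 1/n, which is decreasing
print(limit(P, n, oo))             # 0: P vanishes despite G = Omega(n)

F = n                              # any Theta(n) function
print(limit((F + P) / n, n, oo))   # 1: here F + P is Theta(n)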

Comparing complexities

I have these three questions for an exam review:
If f(n) = 2n - 3 give two different functions g(n) and h(n) (so g(n) doesn't equal h(n)) such that f(n) = O(g(n)) and f(n) = O(h(n))
Now do the same again with functions g'(n) and h'(n), but this time the functions should satisfy
g'(n) = Θ(f(n)) and f(n) = o(h'(n))
Is it possible for a function f(n) = O(g(n)) and f(n) = Ω(g(n))?
I know that a function is O of another if it is less than or equal to the other function (up to a constant, for large enough n). So I think 1. could be g(n) = 2n² - 3 and h(n) = 2n² - 10.
I also know that a function is Θ of another if it is basically equal to the other function (we can ignore constants), and o of another if it is strictly less than the function, so for 2. I think you could have g'(n) = 2n - 15 and h'(n) = 2n.
To 3.: It is possible for a function to be both O(g(n)) and Ω(g(n)), because both notations allow the function to be the same as the given function, so you could have a function g(n) that equals f(n) and satisfies the rules for being both O and Ω.
Can someone please tell me if this is correct?
Your answers are mostly right. But I would like to add some points:
Given is f(n) = 2n - 3
With g(n) = 2n² - 3 and h(n) = 2n² - 10, f(n) is in O(g(n)) and in O(h(n)). But your g(n) and h(n) are basically the same; at least, they are both in Θ(n²). There exist many other functions that would also work. E.g.:
f(n) ∈ O(n) ⇒ g(n) = n
f(n) ∈ O(nᵏ) ⇒ g(n) = nᵏ ∀ k ≥ 1
f(n) ∈ O(2ⁿ) ⇒ g(n) = 2ⁿ
g'(n) = 2n - 15 reduces to g'(n) = n if we think in complexities, and this is right. In fact, up to constant factors it is the only possible answer, since g'(n) must be in Θ(f(n)) = Θ(n).
But f(n) ∈ o(h'(n)) does not hold for h'(n) = 2n. Little-o means that
lim (n → ∞) |f(n)/g(n)| = 0 ⇔ f(n) ∈ o(g(n))
and here (2n - 3)/(2n) tends to 1, not 0. So you can choose h'(n) = n², or more generally h'(n) = nᵏ for any k > 1, or h'(n) = cⁿ for a constant c > 1.
Yes, it is possible, and you can take it as a definition of Θ(g(n)):
f(n) ∈ Θ(g(n)) ⇔ f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
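The little-o condition from this answer is easy to check with sympy limits (my own sketch, using the functions discussed above):

from sympy import symbols, limit, oo

n = symbols('n', positive=True)
f = 2*n - 3
for h in (2*n, n**2, 2**n):
    # f ∈ o(h)  <=>  lim |f/h| = 0
    print(h, limit(f / h, n, oo))
# 2*n  -> 1   (so f is NOT in o(2n))
# n**2 -> 0   (f ∈ o(n²))
# 2**n -> 0   (f ∈ o(2ⁿ))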

Examples of Asymptotic analysis

Here I am given pairs of functions f(n) and g(n), and my aim is to decide whether f(n) is in theta, omega, big O, little o, or little omega of g(n).
Please provide a detailed proof if you are confident with such problems.
Problem 1: f(n) = (1/2)n^2 - 3n, g(n) = n^2
Problem 2: f(n) = 6n^3, g(n) = n^2
Problem 3: f(n) = 3n+5, g(n) = n^2
Problem 4: f(n)= n ceiling(lg n^2), g(n)= n^2 log n
Problem 5: f(n) = [10^(n+4)(n)]+6, g(n)=10^(n+3)
Polynomial functions are easy. Just compare the highest order of each.
Problem 1: f(n) is order n², g(n) is order n², thus f(n) is Θ(g(n))
Problem 2: f(n) is order n³, g(n) is order n², thus f(n) is Ω(g(n)) (in fact ω(g(n)))
Problem 3: f(n) is order n, g(n) is order n², thus f(n) is O(g(n)) (in fact o(g(n)))
A proof would involve computing the limits.
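For the polynomial cases, the limits are easy to compute with sympy; a sketch of my own (problems 4 and 5 are omitted because of the ceiling and the ambiguous bracketing):

from sympy import symbols, limit, oo

n = symbols('n', positive=True)
cases = [
    (n**2 / 2 - 3*n, n**2),  # Problem 1: limit 1/2  -> f = Theta(g)
    (6 * n**3,       n**2),  # Problem 2: limit oo   -> f = omega(g)
    (3*n + 5,        n**2),  # Problem 3: limit 0    -> f = o(g)
]
for f, g in cases:
    print(limit(f / g, n, oo))
# A finite nonzero limit gives Theta; oo gives little omega; 0 gives little o.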
