Examples of asymptotic analysis - algorithms

Here I will be giving pairs of functions f(n) and g(n), and my aim is to decide whether f(n) is in Θ, Ω, O, o, or ω of g(n).
Please provide a detailed proof if you are confident with such problems.
Problem 1: f(n) = (1/2)n^2 - 3n, g(n) = n^2
Problem 2: f(n) = 6n^3, g(n) = n^2
Problem 3: f(n) = 3n+5, g(n) = n^2
Problem 4: f(n) = n·ceiling(lg n^2), g(n) = n^2 log n
Problem 5: f(n) = [10^(n+4)(n)] + 6, g(n) = 10^(n+3)

Polynomial functions are easy: just compare the highest-order term of each.
Problem 1: f(n) is of order n^2 and g(n) is n^2, thus f(n) is Θ(g(n)).
Problem 2: f(n) is of order n^3 and g(n) is n^2, thus f(n) is Ω(g(n)) (in fact ω(g(n))).
Problem 3: f(n) is of order n and g(n) is n^2, thus f(n) is O(g(n)) (in fact o(g(n))).
A formal proof would compute the limit of f(n)/g(n) as n → ∞.
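As a quick sanity check (a minimal sketch, not a proof; the functions below are the polynomial pairs from Problems 1-3 above), you can look at the ratio f(n)/g(n) for increasingly large n and see whether it settles at a positive constant (Θ), goes to 0 (o), or diverges (ω):

    # Heuristic check: inspect f(n)/g(n) for growing n (Problems 1-3 above).
    problems = {
        "Problem 1": (lambda n: 0.5 * n**2 - 3 * n, lambda n: n**2),
        "Problem 2": (lambda n: 6 * n**3,           lambda n: n**2),
        "Problem 3": (lambda n: 3 * n + 5,          lambda n: n**2),
    }

    for name, (f, g) in problems.items():
        ratios = [f(n) / g(n) for n in (10, 1_000, 100_000)]
        print(name, ["%.4g" % r for r in ratios])
    # Problem 1 tends to 1/2  -> Theta(n^2)
    # Problem 2 diverges      -> omega(n^2), hence Omega(n^2)
    # Problem 3 tends to 0    -> o(n^2), hence O(n^2)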

Related

If f(n) = 2n^2 and g(n) = 1.01^n, is f(n) = O(g(n))? Is f(n) = Ω(g(n))?

Let f(n) = 2n^2 and g(n) = 1.01^n. Is f(n) = O(g(n))? Is f(n) = Ω(g(n))? Justify your answers with a proof.
Think about what the graphs of those functions look like for very large n. Which one grows faster (i.e. overtakes the other in the long run)? Time complexities denote the asymptotic running-time of an algorithm.
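As a rough numerical illustration (not a proof; it just scans n upward with the same f and g as in the question), an exponential with base 1.01 eventually overtakes any polynomial:

    # Find roughly where 1.01^n first exceeds 2*n^2 (illustrative only).
    f = lambda n: 2 * n**2
    g = lambda n: 1.01 ** n

    n = 1
    while f(n) >= g(n):
        n += 1
    print("1.01^n first exceeds 2n^2 at n =", n)
    # From that point on g keeps pulling ahead, so f(n) = O(g(n)) but f(n) is not Omega(g(n)).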

How to operate on Asymptotic Notation Function Sets, i.e. Big-O + Big-Omega?

I'm trying to determine if the following statement is true or false.
If f(n) ∈ O(n) and g(n) ∈ Ω(n), then f(n) + g(n) ∈ Θ(n).
I think I understand adding the same asymptotic big-O. O(n) + O(n) = O(n)
However, I am unsure about adding or operating on the others combined.
For example:
If f(n) ∈ Θ(n log n), then f(n) * n = ?
Could this answer be both O(n^2 log n) and Θ(n^2 log n)?
Thank you in advance!
You can use the definitions of these symbols and try to find either a proof or a counterexample.
If f(n) = O(n) and g(n) = Ω(n), then f(n) + g(n) is not necessarily in Θ(n)! As a counterexample, if f(n) = n and g(n) = n^2, then f(n) + g(n) = Θ(n^2). On the other hand, if f(n) = n and g(n) = n, then f(n) + g(n) = Θ(n). Hence, you can only say f(n) + g(n) = Ω(n) and nothing more.
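A quick numerical check of that counterexample (a sketch, taking f(n) = n and g(n) = n^2 as above):

    # With f(n) = n and g(n) = n^2, the sum behaves like n^2, not n.
    for n in (10, 1_000, 100_000):
        s = n + n**2                  # f(n) + g(n)
        print(n, s / n, s / n**2)     # ratio to n grows without bound; ratio to n^2 tends to 1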

Asymptotic Growth: Understanding the specific proof of f(n) + o(f(n)) = Θ(f(n))

I'm working through the proof that f(n) + o(f(n)) = Θ(f(n)) and I came across a part of the proof that I am having trouble understanding.
We let f(n) and g(n) be asymptotically positive functions and assume g(n) = o(f(n)).
In the proof, it states that since we know that f(n) + g(n) ≥ f(n) for all n, we can conclude that f(n) + g(n) = Ω(f(n)).
We can also conclude similarly that f(n) + g(n) ≤ 2 f(n) for all sufficiently large n. Therefore f(n) + g(n) = O(f(n)).
I am having trouble understanding why f(n) + g(n) = Ω(f(n)) and f(n) + g(n) = O(f(n)) hold. How exactly do we establish these bounds when we add g(n) to f(n)? What are we actually using about g(n)?
One way of proving that f(n) is Θ(g(n)) is to prove two separate statements: that f(n) is Ω(g(n)), and that f(n) is O(g(n)). It's clear from the definitions of those notations that this way of proving it is correct.
In this exact problem, if we choose the constant c to be 1, we have, for every n, that f(n) + g(n) ≥ c · f(n), which by definition shows that f(n) + g(n) is Ω(f(n)). For the O(f(n)) part, choose the constant c to be 2: we then need to show that there exists some n0 such that f(n) + g(n) ≤ c · f(n) for every n > n0, which is equivalent to g(n) ≤ f(n) for every n > n0, and that follows directly from the definition of g(n) = o(f(n)) given in the problem statement (take the little-o constant to be 1).
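For reference, the same argument written out compactly (a sketch, assuming f and g are asymptotically positive and g(n) = o(f(n))):

    % Lower bound (take c = 1):
    f(n) + g(n) \ge 1 \cdot f(n) \ \text{for all sufficiently large } n
        \;\Longrightarrow\; f(n) + g(n) = \Omega(f(n)).
    % Upper bound (take c = 2):
    g(n) = o(f(n)) \;\Longrightarrow\; \exists\, n_0 :\ g(n) \le f(n) \ \text{for all } n > n_0
        \;\Longrightarrow\; f(n) + g(n) \le 2\,f(n) \ \text{for all } n > n_0
        \;\Longrightarrow\; f(n) + g(n) = O(f(n)).
    % Together:
    f(n) + g(n) = \Theta(f(n)).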
Hope this helps.

Interview questions

This is an interview question:
Given: f(n) = O(n)
g(n) = O(n²)
find f(n) + g(n) and f(n)⋅g(n)?
What would be the answer for this question?
When this answer was prepared, the question gave f(n) as Ω(n) and g(n) as Θ(n²).
From f(n) = Ω(n) and g(n) = Θ(n²) you get a lower bound of Ω(n²) for f(n) + g(n), but you don't get an upper bound on f(n) + g(n), because no upper bound was given on f(n).
For f(n)·g(n), you get a lower bound of Ω(n³), because Θ(n²) implies lower and upper bounds of Ω(n²) and O(n²) for g(n). Again, no upper bound on f(n)·g(n) is available, because f(n) can be arbitrarily large; for f(n), we only have an Ω(n) lower bound.
With the question modified to give only upper bounds on f and g, as f(n) = O(n) and g(n) = O(n²), we have that f(n)+g(n) is O(n²) and f(n)·g(n) is O(n³).
To show this rigorously is a bit tedious, but quite straightforward. E.g., for the f(n)·g(n) case (taking f and g to be nonnegative), suppose that by the definitions of O(n) and O(n²) we are given C, X, K, Y such that n > X ⇒ C·n > f(n) and n > Y ⇒ K·n² > g(n). Let J = C·K and Z = max(X, Y). Then n > Z ⇒ J·n³ > f(n)·g(n), which proves that f(n)·g(n) is O(n³).
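As an illustration of the constant-juggling (with made-up example functions f(n) = 3n + 7, which is O(n), and g(n) = 2n² + 5, which is O(n²); the constants C, X, K, Y below are chosen by hand for these examples):

    # Check the product bound J*n^3 > f(n)*g(n) for n > Z with concrete constants.
    f = lambda n: 3 * n + 7
    g = lambda n: 2 * n**2 + 5

    C, X = 4, 7              # n > 7 implies C*n   = 4n   > 3n + 7   = f(n)
    K, Y = 3, 3              # n > 3 implies K*n^2 = 3n^2 > 2n^2 + 5 = g(n)
    J, Z = C * K, max(X, Y)  # J = 12, Z = 7

    assert all(J * n**3 > f(n) * g(n) for n in range(Z + 1, 10_000))
    print("J*n^3 bounds f(n)*g(n) for every tested n >", Z)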
O(f(n) + g(n)) = O(max{f(n), g(n)}),
so for the first:
f(n) + g(n) = O(max{n, n^2}) = O(n^2)
and for f(n) ⋅ g(n) we will have:
O(f(n) ⋅ g(n)) = O(n ⋅ n^2) = O(n^3)
Think about it this way, treating f and g as if they were polynomials of the stated degrees:
f(n) = c·n + d
g(n) = a·n^2 + b·n + p
Then
f(n) + g(n) = a·n^2 + (lower powers of n)
and
f(n)·g(n) = c·a·n^3 + (lower powers of n)
It follows that O(f(n) + g(n)) = O(n^2) and O(f(n)·g(n)) = O(n^3).
This question can be understood like this:
f(n) = O(n) means it takes O(n) time to compute f(n). Similarly, g(n) requires O(n^2) time.
So P(n) = f(n) + g(n) would take O(n) + O(n^2) + O(1) time (the O(1) is for the addition, once you know the values of both f and g). Hence, this new function P(n) would require O(n^2) time.
The same is the case for Q(n) = f(n)·g(n), which also requires O(n^2) time.
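Under this reading, a small sketch (compute_f and compute_g are placeholder functions invented here, one with a linear loop and one with a quadratic loop, standing in for whatever actually computes f and g):

    def compute_f(n):        # stand-in for an O(n)-time computation of f(n)
        total = 0
        for _ in range(n):
            total += 1
        return total

    def compute_g(n):        # stand-in for an O(n^2)-time computation of g(n)
        total = 0
        for _ in range(n):
            for _ in range(n):
                total += 1
        return total

    n = 200
    P = compute_f(n) + compute_g(n)   # O(n) + O(n^2) + O(1) work -> O(n^2) time overall
    Q = compute_f(n) * compute_g(n)   # same accounting -> O(n^2) time overall
    print(P, Q)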

Functions such that f(n) is not O(g(n)) and g(n) is not O(f(n))

Are there any functions f(n) and g(n) such that both
f(n) != O(g(n)) and
g(n) != O(f(n))?
Are there any functions that fulfill the requirements above?
f(n) = n and g(n) = n^(1 + sin(n)).
f(n) is not O(g(n)) and g(n) is not O(f(n)).
Refer http://c2.com/cgi/wiki?BigOh
Consider:
f(n) = 0 if n is odd, else n*n
g(n) = n
Then for odd values g(n) is more than any constant factor times f(n) (and so g(n) is not O(f(n))), while for even values f(n) is eventually more than any constant factor times g(n) (and so f(n) is not O(g(n))).
Observe that f(n) does not tend to a limit (not even infinity) as n approaches infinity, so in some sense this is a cheap example. But you could fix that by replacing 0, n, n·n with n, n·n, n·n·n.
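A brief numerical illustration of that example (f and g as defined above):

    # f(n) = 0 for odd n and n^2 for even n; g(n) = n.
    f = lambda n: 0 if n % 2 == 1 else n * n
    g = lambda n: n

    for n in (9, 10, 999, 1000, 99999, 100000):
        print(n, f(n), g(n))
    # On odd n, f(n) = 0, so no constant c gives g(n) <= c*f(n): g is not O(f).
    # On even n, f(n)/g(n) = n grows without bound: f is not O(g).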
I think if two non-negative functions have the property that f(n)/g(n) has a (perhaps infinite) limit as n approaches infinity, then it follows that one of them is big-O of the other. If the limit is 0 then f(n) is O(g(n)), if the limit is finite and nonzero then each is big-O of the other, and if the limit is infinite then g(n) is O(f(n)). But I'm too lazy to confirm by writing a proof.
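For the record, a sketch of that limit argument (assuming g(n) > 0 for all sufficiently large n):

    \text{If } \lim_{n\to\infty} f(n)/g(n) = L < \infty, \text{ then } f(n)/g(n) \le L + 1 \text{ for all sufficiently large } n,
    \text{i.e. } f(n) \le (L+1)\,g(n), \text{ so } f(n) = O(g(n)).
    \text{If the limit is } \infty, \text{ then } \lim_{n\to\infty} g(n)/f(n) = 0, \text{ and the same argument gives } g(n) = O(f(n)).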
