Complexity analysis after multiplication of two functions - algorithm

Given F(n) = Θ(n), H(n) = O(n), and G(n) = Ω(n), what will be the order of F(n) + [G(n) · H(n)]?

There isn't enough information to say anything about the function P(n) = G(n)*H(n). All we know is that G grows at least linearly; it could be growing quadratically, cubically, even exponentially. Likewise, we only know that H grows at most linearly; it could only be growing logarithmically, or be constant, or even be decreasing. As a result, P(n) itself could be decreasing or increasing without bound, which means the sum F(n) + P(n) could also be decreasing or increasing without bound.
Suppose, though, that we could assume that H(n) = Ω(1) (i.e., it is at least not decreasing). Then H(n) ≥ C1 for some constant C1 > 0 and all large enough n, so:
P(n) = H(n) * G(n)
     >= C1 * G(n)
     = Ω(G(n)) = Ω(n)
Likewise, H(n) = O(n) means H(n) ≤ C2*n for some constant C2 and all large enough n, so:
P(n) <= C2*n * G(n)
     = O(n*G(n))
Thus F(n) + P(n) = Ω(n) and F(n) + P(n) = O(n*G(n)), but nothing more can be said; both bounds are as tight as we can make them without more information about H or G.
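To make those bounds concrete, here is a small numeric sketch. The particular choices of H and G below are illustrative assumptions, not part of the question: a constant H together with G(n) = n attains the Ω(n) lower bound, while H(n) = n together with G(n) = n² attains the O(n·G(n)) upper bound.

```python
# Numeric sketch of the bounds P(n) = H(n)*G(n) = Omega(n) and O(n*G(n)),
# assuming H(n) = Omega(1). The choices of H and G are illustrative only.

def lower_case(n):
    # H(n) = 1 (so H = O(n) and H = Omega(1)), G(n) = n (so G = Omega(n)).
    H, G = 1, n
    return H * G          # P(n) = n, matching the Omega(n) lower bound

def upper_case(n):
    # H(n) = n (exactly O(n)), G(n) = n**2 (so G = Omega(n)).
    H, G = n, n * n
    return H * G          # P(n) = n**3 = n * G(n), matching the upper bound

for n in (10, 100, 1000):
    assert lower_case(n) == n            # P(n)/n stays at the constant 1
    assert upper_case(n) == n * (n * n)  # P(n)/(n*G(n)) stays at the constant 1
```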

Related

Which pair of functions satisfy f(N) ∼ g(N)?

a. f(N) = N and g(N) = N + N²
b. f(N) = 2N and g(N) = √N
c. f(N) = N log N + N and g(N) = 2N log N + N
d. f(N) = 2√N + N and g(N) = √N + N
What is the best way of evaluating these functions? I have tried putting values into them, but some of them are very close in value and I am not sure which one to pick.
Calculate f(N)/g(N) in the limit N → ∞.
If f(N)/g(N) approaches a positive constant α as N → ∞, then f(N) ~ α g(N); for f(N) ∼ g(N) specifically, the ratio must approach 1.
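Following that recipe, a quick numeric check evaluates f(N)/g(N) at a large N for each pair (natural log is used for log N, which does not change the limiting ratio):

```python
import math

# Ratios f(N)/g(N) at a large N for the four pairs; f ~ g requires the
# ratio to approach 1 as N grows.
N = 10**6
pairs = {
    "a": (N, N + N**2),                                   # ratio -> 0
    "b": (2 * N, math.sqrt(N)),                           # ratio -> infinity
    "c": (N * math.log(N) + N, 2 * N * math.log(N) + N),  # ratio -> 1/2
    "d": (2 * math.sqrt(N) + N, math.sqrt(N) + N),        # ratio -> 1
}
for name, (f, g) in pairs.items():
    print(name, f / g)
# Only pair (d) has a ratio tending to 1, so only (d) satisfies f(N) ~ g(N).
```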

How to operate on Asymptotic Notation Function Sets ie. Big-O + Big-Omega?

I'm trying to determine if the following statement is true or false.
If f(n) ∈ O(n) and g(n) ∈ Ω(n), then f(n) + g(n) ∈ Θ(n).
I think I understand adding the same asymptotic big-O. O(n) + O(n) = O(n)
However, I am unsure about adding or operating on the others combined.
For example:
If f(n) ∈ Θ(n log n), then f(n) * n = ?
Could this answer be both O(n^2*logn) and Θ(n^2*logn)?
Thank you in advance!
You can use the definitions of these symbols and look for either a proof or a counterexample.
If f(n) = O(n) and g(n) = Omega(n), then f(n) + g(n) is not necessarily in Theta(n)! As a counterexample, if f(n) = n and g(n) = n^2, then f(n) + g(n) = Theta(n^2). On the other hand, if f(n) = n and g(n) = n, then f(n) + g(n) = Theta(n). Hence, you can only say f(n) + g(n) = Omega(n) and nothing more.
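The two cases in that counterexample can be checked numerically: the ratios below show f(n) + g(n) tracking n² in the first case and n in the second, so no single Θ class covers both (the function choices are the ones from the answer):

```python
# f(n) = n is O(n) in both cases; g(n) varies while staying Omega(n).
def sum_ratio(f, g, n, ref):
    # Ratio of f(n) + g(n) against a reference growth function ref(n).
    return (f(n) + g(n)) / ref(n)

n = 10**6
# Case 1: g(n) = n**2 -> f + g grows like n**2, not n.
r1 = sum_ratio(lambda n: n, lambda n: n * n, n, lambda n: n * n)
# Case 2: g(n) = n -> f + g grows like n.
r2 = sum_ratio(lambda n: n, lambda n: n, n, lambda n: n)
print(r1)  # close to 1: f + g = Theta(n^2) here
print(r2)  # exactly 2: f + g = Theta(n) here
```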

The Mathematical Relationship Between Big-Oh Classes

My textbook describes the relationship as follows:
There is a very nice mathematical intuition which describes these classes too. Suppose we have an algorithm which has running time N0 when given an input of size n, and a running time of N1 on an input of size 2n. We can characterize the rates of growth in terms of the relationship between N0 and N1:
Big-Oh     Relationship
O(log n)   N1 ≈ N0 + c
O(n)       N1 ≈ 2N0
O(n²)      N1 ≈ 4N0
O(2ⁿ)      N1 ≈ (N0)²
Why is this?
That is because if f(n) is in O(g(n)) then it can be thought of as acting like k * g(n) for some k.
So for example if f(n) = O(log(n)) then it acts like k log(n), and now f(2n) ≈ k log(2n) = k (log(2) + log(n)) = k log(2) + k log(n) ≈ k log(2) + f(n) and that is your desired equation with c = k log(2).
Note that this is a rough intuition only. An example of where it breaks down is that f(n) = (2 + sin(n)) log(n) = O(log(n)). The oscillating 2 + sin(n) bit means that f(2n)-f(n) can be basically anything.
I personally find this kind of rough intuition to be misleading and therefore worse than useless. Others find it very helpful. Decide for yourself how much weight you give it.
Basically what they are trying to show is just basic algebra after substituting 2n for n in the functions.
O(log n)
log(2n) = log(2) + log(n)
N1 ≈ c + N0
O(n)
2n = 2(n)
N1 ≈ 2N0
O(n²)
(2n)^2 = 4n^2 = 4(n^2)
N1 ≈ 4N0
O(2ⁿ)
2^(2n) = 2^(n*2) = (2^n)^2
N1 ≈ (N0)²
Since O(f(n)) ~ k * f(n) (almost by definition), you want to look at what happens when you put 2n in for n. In each case:
N1 ≈ k*log 2n = k*(log 2 + log n) = k*log n + k*log 2 ≈ N0 + c where c = k*log 2
N1 ≈ k*(2n) = 2*k*n ≈ 2N0
N1 ≈ k*(2n)^2 = 4*k*n^2 ≈ 4N0
N1 ≈ k*2^(2n) = k*(2^n)^2 ≈ N0*2^n ≈ N0^2/k
So the last one is not quite right, anyway. Keep in mind that these relationships are only true asymptotically, so the approximations will be more accurate as n gets larger. Also, f(n) = O(g(n)) only means g(n) is an upper bound for f(n) for large enough n. So f(n) = O(g(n)) does not necessarily mean f(n) ~ k*g(n). Ideally, you want that to be true, since your big-O bound will be tight when that is the case.
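The doubling relationships from the table can be checked empirically. The sketch below takes the constant k = 1 for simplicity and compares N1 = T(2n) against N0 = T(n) for each class:

```python
import math

# For each growth class, check the predicted relation between
# N1 = T(2n) and N0 = T(n), with the constant k taken as 1.
n = 10**6

N0, N1 = math.log(n), math.log(2 * n)
assert abs((N1 - N0) - math.log(2)) < 1e-9   # O(log n): N1 = N0 + c

N0, N1 = n, 2 * n
assert N1 == 2 * N0                          # O(n): N1 = 2*N0

N0, N1 = n * n, (2 * n) ** 2
assert N1 == 4 * N0                          # O(n^2): N1 = 4*N0

# O(2^n): use a small exponent so 2^n stays manageable;
# N1 = (N0)^2 holds exactly when k = 1.
m = 20
N0, N1 = 2 ** m, 2 ** (2 * m)
assert N1 == N0 ** 2
```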

Interview questions

This is an interview question:
Given: f(n) = O(n)
g(n) = O(n²)
find f(n) + g(n) and f(n)⋅g(n)?
What would be the answer for this question?
When this answer was prepared, f(n) was shown as Ω(n) and g(n) as Θ(n²).
From f(n) = Ω(n) and g(n) = Θ(n²) you get a lower bound of Ω(n²) for f(n) + g(n), but you don't get an upper bound on f(n) + g(n) because no upper bound was given on f(n).
For f(n)·g(n), you get a lower bound of Ω(n³) because Θ(n²) implies lower and upper bounds of Ω(n²) and O(n²) for g(n). Again, no upper bound on f(n)·g(n) is available, because f(n) can be arbitrarily large; for f(n), we only have an Ω(n) lower bound.
With the question modified to give only upper bounds on f and g, as f(n) = O(n) and g(n) = O(n²), we have that f(n)+g(n) is O(n²) and f(n)·g(n) is O(n³).
To show this rigorously is a bit tedious, but quite straightforward. E.g., for the f(n)·g(n) case, suppose that by the definitions of O(n) and O(n²) we are given C, X, K, Y such that n > X ⇒ C·n > f(n) and n > Y ⇒ K·n² > g(n). Let J = C·K and Z = max(X, Y). Then n > Z ⇒ J·n³ > f(n)·g(n), which proves that f(n)·g(n) is O(n³).
O(f(n) + g(n)) = O(max{f(n), g(n)})
so for first
f(n) + g(n) = O(max{n, n^2}) = O(n^2)
for
f(n) ⋅ g(n)
we will have
O(f(n) ⋅ g(n)) = O(n ⋅ n^2) = O(n^3)
Think about it this way. Informally, suppose f and g are polynomials that attain their bounds:
f(n) = c·n + d
g(n) = a·n² + b·n + p
Then,
f(n) + g(n) = a·n² + (lower powers of n)
And,
f(n)·g(n) = c·a·n³ + (lower powers of n)
It follows that O(f(n) + g(n)) = O(n²)
and O(f(n)·g(n)) = O(n³)
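Under this polynomial reading, the bounds are easy to check numerically: the ratios (f+g)/n² and (f·g)/n³ settle to the leading coefficients a and c·a (the coefficient values below are arbitrary choices for illustration):

```python
# Illustrative polynomials with f(n) = O(n) and g(n) = O(n^2).
def f(n):
    return 3 * n + 7              # c = 3, d = 7

def g(n):
    return 2 * n**2 + 5 * n + 1   # a = 2, b = 5, p = 1

n = 10**6
sum_ratio = (f(n) + g(n)) / n**2    # -> a = 2
prod_ratio = (f(n) * g(n)) / n**3   # -> c*a = 6
print(sum_ratio, prod_ratio)
```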
This question can be understood like this: f(n) = O(n) means it takes O(n) time to compute f(n), and similarly g(n) requires O(n²) time.
So P(n) = f(n) + g(n) would take O(n) + O(n²) + O(1) time (the O(1) is for the addition, once you know the values of both f and g). Hence, this new function P(n) would require O(n²) time.
The same is the case for Q(n) = f(n)·g(n), which requires O(n²) time.

algorithm analysis - orders of growth question

I'm studying orders of growth: "big oh", "big omega", and "big theta". Since I can't type the symbols for these I will denote them as follows:
ORDER = big oh
OMEGA = big omega
THETA = big theta
For example, I'll say n = ORDER(n^2) to mean that the function n is in the order of n^2 (n grows at most as fast as n^2).
Ok for the most part I understand these:
n = ORDER(n^2) //n grows at most as fast as n^2
n^2 = OMEGA(n) //n^2 grows at least as fast as n
8n^2 + 1000 = THETA(n^2) //same order of growth
Ok here comes the example that confuses me:
what is n(n+1) vs n^2
I realize that n(n+1) = n^2 + n; I would say it has the same order of growth as n^2; therefore I would say
n(n+1) = THETA(n^2)
but my question is, would it also be correct to say:
n(n+1) = ORDER(n^2)
please help because this is confusing to me. thanks.
Thank you guys!!
just to make sure I understand correctly, are these all true:
n^2+n = ORDER(2000n^2)
n^2+n = THETA(2000n^2)
n^2+n = OMEGA(2000n^2)
2000n^2 = ORDER(n^2+n)
2000n^2 = THETA(n^2+n)
2000n^2 = OMEGA(n^2+n)
So if f = THETA(g) then f=ORDER(g) and f=OMEGA(g) are also true.
Yes, n(n+1) = Order(n^2) is correct.
If f = Theta(g) then f = Order(g) and g = Order(f) are both true.
Moron is correct, and that is the easiest way to think about it.
But to understand it, return to the definition for f(n) = O(g(n)): there exists a positive M and n0 such that, for all n > n0, f(n) <= Mg(n).
Suppose M=2. Can you find a value, n0, such that for all n > n0, n^2+n <= M(n^2)?
(Plot both functions with pen and paper to understand how they grow in relation to one another.)
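A quick check of that exercise (a sketch, taking n0 = 1): for every n ≥ 1 we have n ≤ n², hence n² + n ≤ 2n².

```python
# Verify f(n) = n^2 + n <= M * g(n) = 2 * n^2 for all n > n0, with
# M = 2 and n0 = 1 (sampled over a range; the underlying inequality
# n <= n^2 holds for every n >= 1).
M, n0 = 2, 1
for n in range(n0, 10**5):
    assert n**2 + n <= M * n**2
print("n^2 + n <= 2*n^2 for all sampled n >= 1, so n^2 + n = O(n^2)")
```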
You can use this simple table to get an easy and intuitive understanding of what these symbols mean:
If f(n) and g(n) are two functions then
Growth Rate
if f(n) = Θ(g(n)) then growth rate of f(n) = growth rate of g(n)
if f(n) = O(g(n)) then growth rate of f(n) ≤ growth rate of g(n)
if f(n) = Ω(g(n)) then growth rate of f(n) ≥ growth rate of g(n)
if f(n) = o(g(n)) then growth rate of f(n) < growth rate of g(n)
if f(n) = ω(g(n)) then growth rate of f(n) > growth rate of g(n)
Also, the order is always written in terms of the highest order i.e if the order is O(n^2 + n + 1) then we simply write it as O(n^2) as n^2 is of the highest order.
