Is log(f(n)^c) = Ω(log(g(n))) or not?

You are given functions f and g such that f(n)=Ω(g(n)). Is log(f(n)^c)=Ω(log(g(n)))? (Here c is some positive constant.) You should assume that f and g are non-decreasing and always bigger than 1.
This is a question from my algorithms course, and I can't figure out whether it's true, false, or whether it depends on the constant or on the functions f and g.

It's straightforward to prove. As f(n) = Ω(g(n)), there exist s, k > 0 such that f(n) >= s*g(n) for all n >= k; equivalently, liminf{n -> ∞} f(n)/g(n) > 0.
As f and g are non-decreasing and greater than 1, and log is an increasing function, taking logarithms gives log(f(n)) >= log(g(n)) + log(s) for all n >= k. If g(n) -> ∞, then log(f(n))/log(g(n)) >= 1 + log(s)/log(g(n)) -> 1, so the ratio is eventually bounded below by a positive constant; if g is bounded, both log(f(n)) and log(g(n)) are bounded between positive constants. Either way, log(f(n)) = Ω(log(g(n))).
On the other hand, log(f(n)^c) = c log(f(n)). As c is a positive constant factor, log(f(n)^c) is Ω(log(g(n))) as well, and your claim is correct.
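As a quick numeric sanity check (not a proof, and with example functions I picked myself: f(n) = n^2, g(n) = n, c = 3), this short Python sketch watches the ratio log(f(n)^c)/log(g(n)) stay bounded away from 0:

import math

# Illustrative example only: f(n) = n^2, g(n) = n, c = 3.
# Here f(n) = Ω(g(n)), and the ratio log(f(n)^c) / log(g(n))
# should stay bounded away from 0 (it is constantly 6 here).
def f(n): return n ** 2
def g(n): return n

c = 3
for n in [10, 100, 1000, 10 ** 6]:
    print(n, math.log(f(n) ** c) / math.log(g(n)))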

First, I point out that instead of the notation f(n) = Ω(g(n)) I will use f(n) ∈ Ω(g(n)).
From the definition of Ω we have:
f(n) ∈ Ω(g(n)) <=> ∃s,k > 0 | f(n) >= s*g(n) ∀n >= k
So there exist s, k > 0 such that f(n) >= s*g(n) ∀n >= k. Taking logarithms (log is increasing):
log(f(n)) >= log(g(n)) + log(s) ∀n >= k
Multiplying by the positive constant c:
log(f(n)^c) = c*log(f(n)) >= c*log(g(n)) + c*log(s) ∀n >= k
Note that we cannot simply take s = c and drop the log(s) term, since f(n) ∈ Ω(g(n)) does not give f(n) >= g(n). Instead: if g(n) -> ∞, then for all large enough n we have c*log(s) >= -(c/2)*log(g(n)), hence log(f(n)^c) >= (c/2)*log(g(n)); if g is bounded, both log(f(n)^c) and log(g(n)) are bounded between positive constants, because f and g are non-decreasing and greater than 1. In either case we can state that log(f(n)^c) ∈ Ω(log(g(n))).
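A hedged numeric check of this argument (the functions are illustrative choices of mine, not part of the original claim): take f(n) = n + 1 and g(n) = 2*(n + 1), so f ∈ Ω(g) with s = 1/2 even though f(n) < g(n) everywhere; picking s = c as warned against above would fail here, yet the logarithmic claim still holds:

import math

# f(n) = n + 1, g(n) = 2 * (n + 1): f = (1/2) * g, so f ∈ Ω(g) with s = 1/2,
# and f < g everywhere, yet log(f(n)^c) / log(g(n)) tends to c > 0.
def f(n): return n + 1
def g(n): return 2 * (n + 1)

c = 2.0
for n in [10, 1000, 10 ** 6]:
    print(n, c * math.log(f(n)) / math.log(g(n)))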

Related

Asymptotic bounds and Big Θ notation

Suppose that f(n) = 4^n and g(n) = n^n. Would it be right to conclude that f(n) = Θ(g(n))?
In my opinion it's a correct claim, but I'm not 100% sure.
It is incorrect. f(n) = Theta(g(n)) if and only if both f(n) = O(g(n)) and g(n) = O(f(n)). It is true that f(n) = O(g(n)). We will show that it is not the case that g(n) = O(f(n)).
Assume g(n) = O(f(n)). Then there exist a positive real constant c and a positive natural number n0 such that for all n > n0, g(n) <= c * f(n). For our functions, this implies n^n <= c * 4^n. Taking the nth root of both sides gives n <= 4 * c^(1/n). We are free to assume c >= 1 and n0 >= 1, since if smaller values worked, larger values would work too. For all c >= 1 and n > 1, 4 * c^(1/n) is at most 4c. But then if we choose n > 4c, the inequality is false. So there cannot be an n0 such that the condition holds for all n >= n0. This is a contradiction; our initial assumption is disproven.
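If you want to see the failure numerically (a sanity check only; the contradiction argument above is the actual proof), the ratio n^n / 4^n grows without bound, so no constant c can satisfy n^n <= c * 4^n for all large n:

# The ratio n^n / 4^n grows without bound, so g(n) = n^n
# cannot be O(f(n)) for f(n) = 4^n.
for n in [5, 10, 20, 40]:
    print(n, n ** n / 4 ** n)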

If f(n) is Θ(h(n)) and g(n) = O(h(n)) then f(n) + g(n) is Θ(h(n)). True or False

I have been trying to prove/disprove the above.
I have proved that if f(n) is Θ(h(n)) and g(n) = O(h(n)) then f(n) + g(n) is O(h(n)),
but when I try to prove/disprove that f(n) + g(n) is also Ω(h(n)) I am facing a problem. Below is my approach.
From the given information:
There exist b, c > 0 such that b·h(n) <= f(n) <= c·h(n), and there exists a > 0 such that g(n) <= a·h(n)
I proved O(h(n)) by adding the two inequalities above, but I am stuck proving/disproving the lower bound formally, since I only have a lower bound for f(n), not for g(n).
Also, I am confused about whether big-O-style definitions always involve strict inequalities; e.g., if f(n) is Θ(h(n)), does the following statement hold:
There exist b, c > 0 such that b·h(n) <= f(n) <= c·h(n).
Thank you.
Assuming f and g are positive,
f + g >= f >= b·h(n) for all large n (using the lower bound from f = Θ(h(n))),
which implies
f + g = Ω(h(n)).
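A small numeric illustration (with functions I chose for the example: h(n) = n^2, f(n) = 3n^2 + n so that f = Θ(h), and g(n) = n so that g = O(h)): the ratio (f(n) + g(n)) / h(n) stays between positive constants, as the argument above predicts:

# Illustrative choices: h(n) = n^2, f(n) = 3n^2 + n (Θ(h)), g(n) = n (O(h)).
# (f(n) + g(n)) / h(n) should stay between positive constants; it tends to 3.
def h(n): return n ** 2
def f(n): return 3 * n ** 2 + n
def g(n): return n

for n in [10, 100, 10000]:
    print(n, (f(n) + g(n)) / h(n))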

How is it derived that f(x) = 4x^2 - 5x + 3 is O(x^2)?

Here are the steps that are used to prove the above
|f(x)| = |4x^2 - 5x + 3|
<= |4x^2|+ |- 5x| + |3|
<= 4x^2 + 5x + 3, for all x > 0
<= 4x^2 + 5x^2 + 3x^2, for all x > 1
<= 12x^2, for all x > 1
Hence we conclude that f(x) is O(x^2)
I referred to this, but it does not help.
Can someone explain the above proof step by step?
Why is the absolute value of f(x) taken?
Why and how was each term replaced by an x^2 term?
Preparations
We start by loosely stating the definition of a function or algorithm f being in O(g(n)):
If a function f is in O(g(n)), then c · g(n) is an upper bound on f(n), for some non-negative constant c such that f(n) ≤ c · g(n) holds for sufficiently large n (i.e., n ≥ n0 for some constant n0).
Hence, to show that f ∈ O(g(n)), we need to find a set of (non-negative) constants (c, n0) that fulfils
f(n) ≤ c · g(n), for all n ≥ n0, (+)
We note, however, that this pair is not unique; the problem of finding constants (c, n0) such that (+) holds is degenerate. In fact, if any such pair of constants exists, there exist infinitely many different such pairs.
Analysis
Following common convention, we'll analyse your example using the variable name n rather than x.
f(n) = 4n^2 - 5n + 3 (++)
Now, for your example, we may assume, without loss of generality (since we're studying asymptotic complexity: function/algorithm behavior for "large" n), that n > n0 where n0 > 0. This corresponds to the analysis in your question of the absolute values of x: for positive arguments, the absolute values may simply be dropped. Given this assumption, the following holds:
f(n) = 4n^2 - 5n + 3 < 4n^2 + 3, for all n > n0
Now let, again without loss of generality, n0 equal 2 (we could choose any value, but let's choose 2 here). For n0 = 2, n^2 > 3 naturally holds for n > n0, which means the following holds:
f(n) = 4n^2 - 5n + 3 < 4n^2 + 3 < 4n^2 + n^2, for all n > n0 = 2
f(n) < 5n^2, for all n > n0 = 2
Now choose c = 5 and let g(n) = n^2:
f(n) < c · g(n), for all n > n0,
with c = 5, n0 = 2, g(n) = n^2
Hence, from (+), we've shown that f as defined in (++) is in O(g(n)) = O(n^2).
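As a quick sanity check of the constants (and of the non-uniqueness remark in the Preparations: several (c, n0) pairs work), here is a small Python sketch testing both the pair derived here and the pair from the question's own derivation:

# Check f(n) <= c * n^2 for n > n0, for two different witness pairs:
# (c, n0) = (5, 2) from this answer and (12, 1) from the question.
def f(n): return 4 * n ** 2 - 5 * n + 3

for c, n0 in [(5, 2), (12, 1)]:
    ok = all(f(n) <= c * n ** 2 for n in range(n0 + 1, 2000))
    print("c =", c, ", n0 =", n0, ":", ok)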

Prove or disprove the following implication (Big O Notation)

I couldn't prove this:
f(n) = O(g(n)) implies f(n)^k = O(g(n)^k)
where k is a positive natural number.
I've found similar examples on the internet, but I'm not sure whether it's right to adapt those solutions to this example.
Go back to the definition of big-O.
f(n) = O(g(n)) <=> ∃M ∈ ℝ+, ∃n0 ∈ ℕ, such that ∀n > n0: |f(n)| < M·|g(n)|
It is then immediate that if k > 0, |f(n)|^k < (M·|g(n)|)^k = M^k·|g(n)|^k, so f(n)^k = O(g(n)^k) with the constant M^k.
If k < 0, the relation is reversed.
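A hedged numeric check (illustrative functions of my own choosing): with f(n) = 2n + 3 and g(n) = n, the constant M = 3 works for n > 3, and raising both sides to the power k shows M^k works for f^k versus g^k:

# f(n) = 2n + 3 <= 3 * g(n) for g(n) = n and n > 3,
# so f(n)^k <= 3^k * g(n)^k, witnessing f(n)^k = O(g(n)^k).
def f(n): return 2 * n + 3
def g(n): return n

M, k, n0 = 3, 4, 3
print(all(f(n) ** k <= M ** k * g(n) ** k for n in range(n0 + 1, 5000)))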

Prove that f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

I have been given the problem:
f(n) and g(n) are asymptotically positive functions. Prove f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Everything I have found points to this statement being invalid. For example an answer I've come across states:
f(n) = O(g(n)) implies g(n) = O(f(n))
f(n) = O(g(n)) means g(n) grows at least as fast as f(n). It cannot imply that f(n) grows
at least as fast as g(n). Hence not true.
Another states:
If f(n) = O(g(n)) then g(n) = O(f(n)). This is false. If f(n) = 1 and g(n) = n
for all natural numbers n, then f(n) <= g(n) for all natural numbers n, so
f(n) = O(g(n)). However, suppose g(n) = O(f(n)). Then there are a natural
number n0 and a constant c > 0 such that n = g(n) <= c·f(n) = c for all n >= n0,
which is impossible.
I understand that there are slight differences between my exact question and the examples I have found, but I've only been able to come up with solutions that do not prove it. Am I correct in thinking that it cannot be proved, or am I overlooking some detail?
You can start from here:
Formal Definition: f(n) = Θ(g(n)) means there are positive constants c1, c2, and k, such that 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ k.
Because you have an iff, you need to prove both directions: assume the left side and prove the right side, then assume the right side and prove the left side.
Left -> right
We consider that:
f(n) = Θ(g(n))
and we want to prove that
g(n) = Θ(f(n))
So, we have some positive constants c1, c2 and k such that:
0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n), for all n ≥ k
The first relation between f and g is:
c1*g(n) ≤ f(n) => g(n) ≤ 1/c1*f(n) (1)
The second relation between f and g is:
f(n) ≤ c2*g(n) => 1/c2*f(n) ≤ g(n) (2)
If we combine (1) and (2), we obtain:
1/c2*f(n) ≤ g(n) ≤ 1/c1*f(n)
If you consider c3 = 1/c2 and c4 = 1/c1, they exist and are positive (because the denominators are positive). And this is true for all n ≥ k (where k can be the same).
So, we have some positive constants c3, c4, k such that:
c3*f(n) ≤ g(n) ≤ c4*f(n), for all n ≥ k
which means that g(n) = Θ(f(n)).
Analogous for right -> left.
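As a numeric sanity check of the constant transformation (with illustrative functions I picked myself: f(n) = 2n^2 + 1 and g(n) = n^2, which satisfy the definition with c1 = 1, c2 = 3, k = 1):

# The proof predicts g = Θ(f) with c3 = 1/c2 and c4 = 1/c1.
def f(n): return 2 * n ** 2 + 1
def g(n): return n ** 2

c1, c2, k = 1, 3, 1
c3, c4 = 1 / c2, 1 / c1
print(all(c3 * f(n) <= g(n) <= c4 * f(n) for n in range(k, 1000)))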
