What is true when big O is of another function?

Imagine that we have three functions f(n), g(n) and h(n), and that f(n) = O(g(n)).
If g(n) = (1/5)h(n), is it possible that f(n) = O(h(n))?
Why is it possible / not possible?

Look at the formal definition of big-O. You should understand what the 1/5 represents. Hint: it is clearly a constant. I will not provide a formal proof, as I guess that is part of this homework, but you may try to show that the two big-O bounds are equivalent, i.e. that O(g(n)) = O(h(n)).
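To make the hint concrete, here is the shape of the argument written out as a sketch, not a full proof; c and n0 below are the usual witnesses from the definition of big-O, which the question itself does not name:

    \begin{align*}
      f(n) &\le c\,g(n)                  && \text{for all } n \ge n_0, \text{ since } f(n) = O(g(n)) \\
           &= c \cdot \tfrac{1}{5}\,h(n) && \text{since } g(n) = \tfrac{1}{5}\,h(n) \\
           &= \tfrac{c}{5}\,h(n),
    \end{align*}

so the constant c' = c/5 together with the same n0 witnesses f(n) = O(h(n)). Running the same rescaling in the other direction (h(n) = 5 g(n)) gives the reverse implication, which is the equality between the two big-O bounds the answer alludes to.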


If f(n) is Omega(g(n)), then 2^(f(n)) is Omega(2^(g(n))). Is this true or false?

For this question, I thought it was true, because I read it as asking: if f(n) is greater than or equal to g(n), is 2^(f(n)) greater than or equal to 2^(g(n))?
So if we take, for instance, f(n) = 2n and g(n) = n, then f(n) > g(n), and 2^(2n) is greater than 2^n.
But my friend said that's not correct. Can someone give me some insight? I think I might be misunderstanding the problem.
You're interested in proving or disproving this claim:
If f(n) = Ω(g(n)), then 2^(f(n)) = Ω(2^(g(n))).
When you see a statement like this, it's often helpful to clarify what f and g are here. Specifically, the statement above really means the following:
For any functions f and g, if f(n) = Ω(g(n)), then 2^(f(n)) = Ω(2^(g(n))).
So if you want to prove that this statement is true, you'd need to show that it holds for every possible choice of f and g, not just pick a single function f and a single function g and confirm that the relationship holds for those particular functions. In this sense, your friend is correct.
(On the other hand, if you want to disprove this claim, you just need to give examples of functions f and g where f(n) = Ω(g(n)) but 2^(f(n)) ≠ Ω(2^(g(n))).)
As a hint for this question: asymptotic notations like O, Ω, and Θ all completely ignore constant factors. If f(n) = Ω(g(n)), then you can scale either f or g by any constant factor that you'd like and the relationship will still hold. On the other hand, constant factors in an exponent radically change the properties of the exponential. For example, the function e^n grows exponentially slower than the function e^(2n), since e^(2n) = (e^2)^n, which is an exponential function with a higher base. In other words, you can't scale exponents by a constant factor without completely changing their rates of growth.
Based on this disconnect - that Ω notation can't tell apart functions that differ by a constant factor, but that exponential functions are very sensitive to constant factors - do you think this statement is true or false? Based on the advice above, how would you prove a statement like that?
For this question, I thought it was true, because I read it as asking: if f(n) is greater than or equal to g(n), is 2^(f(n)) greater than or equal to 2^(g(n))?
Nope. That's not what big-omega notation means at all. f(n) = Ω(g(n)) means that for sufficiently large n, the ratio f(n)/g(n) is bounded below by a positive constant.
To see that f(n) = Ω(g(n)) does not imply 2^(f(n)) = Ω(2^(g(n))), consider f(n) = n - lg(n) and g(n) = n, where lg is the base-2 logarithm. Here f(n) = Ω(g(n)), since f(n)/g(n) tends to 1. But 2^(f(n)) = (2^n)/n and 2^(g(n)) = 2^n, so 2^(f(n))/2^(g(n)) = 1/n tends to 0, and hence 2^(f(n)) != Ω(2^(g(n))).
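For what it's worth, here is a quick numerical sanity check of this counterexample in Python (just a sketch; finite ratios only illustrate the limiting behaviour):

    import math

    # Counterexample from the answer above: f(n) = n - lg(n), g(n) = n (lg = log base 2).
    def f(n):
        return n - math.log2(n)

    def g(n):
        return n

    for n in (2**4, 2**8, 2**16, 2**32):
        ratio = f(n) / g(n)              # tends to 1, so f(n) = Omega(g(n)) does hold
        # 2^f(n) / 2^g(n) is computed as 2^(f(n) - g(n)) to avoid overflow;
        # it equals 1/n, which tends to 0, so 2^f(n) is NOT Omega(2^g(n)).
        exp_ratio = 2.0 ** (f(n) - g(n))
        print(f"n = {n:>10}: f(n)/g(n) = {ratio:.8f}, 2^f(n) / 2^g(n) = {exp_ratio:.3e}")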
To answer your question directly: the statement is false in general. It is tempting to take the constants from the definition of big Omega and set, say, C_2 = 2^(C_1) with n >= max(n_1, n_2), but that does not give a proof: f(n) >= C_1 * g(n) only yields 2^(f(n)) >= 2^(C_1 * g(n)) = (2^(g(n)))^(C_1), which is at least 2^(g(n)) only when C_1 >= 1; for C_1 < 1 it can be vanishingly small compared to 2^(g(n)). The counterexample above (f(n) = n - lg(n), g(n) = n) shows that no choice of constants can work in general.

When to use Big O instead of theta or little o

A question about asymptotic notation. I've seen a lot of explanations of asymptotic notation say:
Θ(...) is analogous to =
O(...) is analogous to <=
o(...) is analogous to <
Which would seem to imply that if f(n) = O(g(n)), then either f(n) = Θ(g(n)) or f(n) = o(g(n)).
Is it possible to have f(n) = O(g(n)) such that neither f(n) = Θ(g(n)) nor f(n) = o(g(n))? If so, what is an example of this? And if not, then why would we ever use O(...) when Θ(...) or o(...) are stronger descriptors?
Let f(n) = k!, where k is the largest integer such that k! <= n (that is, f(n) is the largest factorial not exceeding n).
Then f(n) is not Θ(n), since f((k+1)!-1)/((k+1)!-1) = k!/((k+1)!-1) tends to zero, and it is not o(n), since f(k!) = k!; but clearly f(n) = O(n), as f(n) <= n.
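Here is a small Python sketch that tabulates this example (the helper name largest_factorial_at_most is mine, not from the answer): the ratio f(n)/n keeps returning to 1 but also dips toward 0, which is exactly why f(n) is O(n) but neither Θ(n) nor o(n).

    import math

    def largest_factorial_at_most(n):
        """f(n) from the answer: the largest k! with k! <= n (my own helper name)."""
        k = 1
        while math.factorial(k + 1) <= n:
            k += 1
        return math.factorial(k)

    for k in range(2, 8):
        # At n = k! the ratio f(n)/n is exactly 1; just below (k+1)! it is about 1/(k+1).
        for n in (math.factorial(k), math.factorial(k + 1) - 1):
            f_n = largest_factorial_at_most(n)
            print(f"n = {n:>6}: f(n) = {f_n:>5}, f(n)/n = {f_n / n:.4f}")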

Growth functions of an algorithm?

Well, I have two questions here:
If f(n) is a function whose growth rate is to be found, will g(n) be the same for all three notations, i.e. the same g(n) in f(n) = O(g(n)) and similarly for Omega and Theta?
Theta notation is "Omega and O". If in some case the O and Omega functions are different, how will we find the Theta function there?
Thanks :)
O, Θ and Ω notation represent related but very different concepts. O-notation expresses an asymptotic upper bound on the growth rate of a function; it says that the function is eventually bounded from above by some constant multiple of some other function. Ω notation is similar, but gives a lower bound. Θ notation gives an asymptotically tight bound: for sufficiently large inputs, the algorithm grows at a rate that is bounded both above and below by constant multiples of the same function.
If f(n) = O(g(n)), it is not necessarily true that f(n) = Ω(g(n)) or that f(n) = Θ(g(n)). For example, 1 = O(n), but 1 ≠ Ω(n) because n grows strictly faster than 1.
If you find that f(n) = O(g(n)) and Ω(h(n)), where g(n) ≠ h(n), you may have to do a more precise analysis to determine a function j(n) such that f(n) = Θ(j(n)). If g(n) = Θ(h(n)), then you can conclude that f(n) = Θ(g(n)), but if the upper and lower bounds are different there is no mechanical way to determine the Θ growth rate of the function.
Hope this helps!
f(n)=O(g(n)) means that n>N => |f(n)|≤C|g(n)| for some constants N and C>0.
f(n)=Ω(g(n)) means that n>N => |f(n)|≥C|g(n)| for some constants N and C>0.
f(n)=Θ(g(n)) means that f(n)=O(g(n)) and f(n)=Ω(g(n)).
It is not possible for all f to find a g such that f(n)=Θ(g(n)) if we want g to be a "good" function (i.e. something like n^r*Log(n)^s). For instance, if f(n)=cos(n)²*n+sin(n)²*n², we have f(n)=O(n²) and f(n)=Ω(n) but we can't find a "good" g such that f(n)=Θ(g(n)).
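A rough numerical illustration of that last function in Python (a sketch; it samples the real points x = kπ and x = (k + 1/2)π, where the answer's f collapses to x and to x² respectively):

    import math

    def f(x):
        # The function from the answer: f(x) = cos(x)^2 * x + sin(x)^2 * x^2.
        return math.cos(x) ** 2 * x + math.sin(x) ** 2 * x ** 2

    for k in (10, 100, 1000):
        for x in (k * math.pi, (k + 0.5) * math.pi):   # sin = 0, resp. cos = 0
            print(f"x = {x:10.2f}: f(x)/x = {f(x) / x:12.2f}, f(x)/x^2 = {f(x) / x ** 2:.6f}")

Depending on where you sample, f hugs either the x curve or the x² curve, so no single "good" g can be tight on both sides.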

Determining Asymptotic Notation

I have a set of problems where I am given an f(n) and a g(n), and I am supposed to determine whether f(n) is O(g(n)), Ω(g(n)) or Θ(g(n)).
I must also determine the constant(s) c and n0 for the correct relationship.
How do I get started on a problem like this?
Here's an example of the kind of problem I am given:
f(n) = lg(n^2), g(n) = n lg(n)
You need to reduce f(n) to a form that makes it easy to compare to g(n). For your case:
f(n) = lg(n^2)
f(n) = 2 lg(n)
That should be enough to answer your problem for that example - the process is going to be pretty much the same for the rest of the set.
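If you also need the constants (the question asks for c and n0 as well), one possible pair for this example, assuming base-2 logarithms, is:

    \[
      \frac{f(n)}{g(n)} \;=\; \frac{2\lg n}{n \lg n} \;=\; \frac{2}{n} \;\le\; 1
      \qquad \text{for all } n \ge 2,
    \]

so c = 1 and n0 = 2 witness f(n) = O(g(n)); and since 2/n tends to 0, no positive constant bounds the ratio from below, so f(n) is not Ω(g(n)) and hence not Θ(g(n)).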
You can also do this using limits, as follows: take the limit as n tends to infinity (sorry, I have no idea how to produce mathematical equations here) of f(n)/g(n).
If the value obtained is
a finite nonzero constant, then f(n) = Θ(g(n));
infinity, then f(n) = Ω(g(n));
zero, then f(n) = O(g(n)).
(If the limit does not exist, this test is inconclusive.)
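If you want to evaluate such a limit mechanically, a computer algebra system can do it. A sketch in Python, assuming sympy is available, using the example pair from the question above:

    import sympy as sp

    n = sp.symbols('n', positive=True)

    # Example from the question: f(n) = lg(n^2), g(n) = n * lg(n), logs base 2.
    f = sp.log(n**2, 2)
    g = n * sp.log(n, 2)

    print(sp.limit(f / g, n, sp.oo))   # prints 0, so f(n) = O(g(n)) and not Theta(g(n))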

How to prove big-o relations

Hey, the title is probably a bit off, so please correct it if you know how to put it better.
As a homework assignment I have been given several problems along the following lines:
Let f(n) and g(n) be asymptotically positive functions. Prove or disprove each of the following conjectures.
a. f(n) = O(g(n)) implies g(n) = O(f(n))
Now, my real question is: how would you go about proving this in a formal way? I know the one above would be easy to disprove, as I could easily provide a counterexample, but for the sake of argument let's say that we want to do this without counterexamples, since of course the assignment continues with other statements where that approach will not work.
I am a bit stuck. I have the following inequalities written up (with <= meaning less than or equal to):
f(n) <= c1 * g(n)
g(n) <= c2 * f(n)
But I am uncertain how I would combine these two inequalities into a single (in)equality and disprove the claim. I am most certain that this is something quite easy that I have simply overlooked and that I am being rather stupid at the moment - but any pointers / concrete examples of how to do this would be great, so that I should be able to work the rest of these questions out on my own.
Why do you want to disprove it without using a counterexample? That is the most direct way to disprove a claim.
If you had to prove it instead, of course you would not be able to use a counterexample. In this case, proof by contradiction can work very well: assume that the claim is false, and then show how that leads to a logical inconsistency.
In this case, you start with f(n) <= c1 * g(n) being true, since this is what is meant by f(n) = O(g(n)). Now you want to assume that g(n) <= c2 * f(n) also holds (remember the claim has to hold for all f and g, which is the important part, because you can certainly pick particular f and g for which it is true), and show why this can't work. My hint for you: pick an f and a g for which it can't work, and show that it fails no matter which constants c1 and c2 you choose.
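To make that hint concrete, here is one possible pick written out (f(n) = n and g(n) = n^2 are my own example, not from the original assignment):

    \begin{gather*}
      f(n) = n, \qquad g(n) = n^2: \qquad n \le 1 \cdot n^2 \ \text{ for } n \ge 1,
        \ \text{ so } f(n) = O(g(n)). \\
      \text{Now suppose } g(n) \le c_2\,f(n), \text{ i.e. } n^2 \le c_2\,n.
        \ \text{ Dividing by } n \text{ gives } n \le c_2, \\
      \text{which fails for every } n > c_2, \text{ so no constant } c_2 \text{ works and }
        g(n) \ne O(f(n)).
    \end{gather*}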
A few hints:
Don't forget that f(n) = O(g(n)) is really set notation, and that you can convert it into an explicit inequality.
Simple operations you can do with the O-notation:
f(n) = O(f(n))
c * O(f(n)) = O(f(n)), if c is constant
O(f(n)) + O(f(n)) = O(f(n))
O(O(f(n))) = O(f(n))
O(f(n)) * O(g(n)) = O(f(n)g(n))
O(f(n)g(n)) = f(n) * O(g(n))
(The Art of Computer Programming, vol 1 - The O-Notation)
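As a tiny worked illustration of how a few of these rules combine (my own example, not one from the book):

    \begin{align*}
      3n^2 + 5n + 7 &= O(n^2) + O(n) + O(1)
        && \text{by } f(n) = O(f(n)) \text{ and } c \cdot O(f(n)) = O(f(n)) \\
      &= O(n^2) + O(n^2) + O(n^2)
        && \text{since } n \le n^2 \text{ and } 1 \le n^2 \text{ for } n \ge 1 \\
      &= O(n^2)
        && \text{by } O(f(n)) + O(f(n)) = O(f(n)).
    \end{align*}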
