Understanding Big Theta - complexity-theory

Consider the function f(n) = e^(n^2)/log(n). I need to find a function g(n) such that f(n) = Θ(g(n)).
So I understand that you can multiply f(n) by a constant and the resulting function is always Big Theta of f(n), but I'm wondering: is there another function that does the trick?
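One way to see that the answer is yes: Θ ignores constant factors and lower-order additive terms, so adding any term that grows more slowly than f(n) leaves the Θ-class unchanged. A minimal sketch, assuming the standard Θ definition (the added term n is an illustrative choice, not from the question):

```latex
% One illustrative choice (an assumption, not from the question):
\[
  g(n) = \frac{e^{n^2}}{\log n} + n .
\]
% Since n grows strictly slower than e^{n^2}/\log n, for all large n
\[
  \tfrac{1}{2}\, g(n) \;\le\; f(n) \;\le\; g(n),
\]
% so the constants c_1 = 1/2 and c_2 = 1 witness f(n) = \Theta(g(n)).
```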

Related

Explanation for Big-O notation comparison in different complexity classes

Why can Big-O notation not compare algorithms in the same complexity class? Please explain; I cannot find any detailed explanation.
So, saying an algorithm is O(n^2) says only that it requires at most on the order of n^2 operations. Now suppose you have algorithm A, which requires f(n) = 1000n^2 + 2000n + 3000 operations, and algorithm B, which requires g(n) = n^2 + 10^20 operations. They're both O(n^2).
For small n the first algorithm performs better than the second one (the 10^20 constant dominates). And for big n the second algorithm looks better, since its leading term is 1 * n^2, while the first's is 1000 * n^2.
Also, h(n) = n is O(n^2), and k(n) = 5 is O(n^2). So I can say that k(n) is better than h(n), because I know what these functions look like.
Now consider the case where I don't know what k(n) and h(n) look like. The only thing I'm given is k(n) = O(n^2) and h(n) = O(n^2). Can I say which function is better? No.
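A quick numerical check of the crossover described above, using the two cost functions from this answer (the sample values of n are illustrative):

```python
# Compare the two O(n^2) cost functions from the example above.
def cost_a(n):           # algorithm A: 1000n^2 + 2000n + 3000
    return 1000 * n**2 + 2000 * n + 3000

def cost_b(n):           # algorithm B: n^2 + 10^20
    return n**2 + 10**20

for n in (10, 10**6, 10**9, 10**12):
    better = "A" if cost_a(n) < cost_b(n) else "B"
    print(f"n = {n:>14,}: cheaper algorithm is {better}")

# For small n, A wins (B's 10^20 constant dominates); for large n, B wins
# (1*n^2 beats 1000*n^2) -- yet both are O(n^2).
```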
Summary
You can't say which function is better, because Big O notation stands for "less than or equal". And the following is true:
O(1) is O(n^2)
O(n) is O(n^2)
How to compare functions?
There is also Big Omega notation, which stands for "greater than or equal". For example, f(n) = n^2 + n + 1 is Omega(n^2), Omega(n) and Omega(1). When a function's growth matches some asymptotic bound exactly, Big Theta is used, so for the f(n) described above we can say that:
f(n) is O(n^3)
f(n) is O(n^2)
f(n) is Omega(n^2)
f(n) is Omega(n)
f(n) is Theta(n^2) // this is the only tight description of f(n) in Theta notation
So, to compare the asymptotics of functions, you need to use Theta instead of Big O or Omega.
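A minimal numerical sanity check of the tight bound above; the constants c1 = 1, c2 = 3 and the threshold n0 = 1 are hand-picked assumptions:

```python
# Check c1*n^2 <= f(n) <= c2*n^2 for f(n) = n^2 + n + 1, witnessing Theta(n^2).
def f(n):
    return n**2 + n + 1

C1, C2 = 1, 3            # candidate Theta constants (chosen by hand)
N0 = 1                   # threshold from which the bounds must hold

assert all(C1 * n**2 <= f(n) <= C2 * n**2 for n in range(N0, 10**5))
print("f(n) = Theta(n^2) witnessed with c1=1, c2=3, n0=1")
```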

What is true when Big O is given in terms of another function

Imagine that we have three functions f(n), g(n) and h(n), and that f(n) = O(g(n)).
If g(n) = (1/5)h(n), is it possible that f(n) = O(h(n))?
Why is it possible / not possible?
Look at the formal definition of big-O. You should understand what 1/5 represents. Hint: it is clearly a constant. I will not provide a formal proof, as I guess that is part of this homework, but you may try to show equality in big-O between these functions.
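For reference, a hedged sketch of the key step (how the constant gets absorbed), assuming the standard definition of big-O; it stops short of the full homework proof:

```latex
% Sketch, assuming the standard definition of big-O:
\[
  f(n) = O(g(n)) \;\Longrightarrow\; \exists\, C, N:\ f(n) \le C\, g(n) \ \text{for all } n > N.
\]
% Substituting g(n) = \frac{1}{5} h(n):
\[
  f(n) \;\le\; C \cdot \tfrac{1}{5}\, h(n) \;=\; \tfrac{C}{5}\, h(n),
\]
% and C/5 is again a constant, so f(n) = O(h(n)).
```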

Complexity of absolute difference and sum

How can I solve the following questions in complexity:
|O(f(n)) – O(f(n))| = ? (Complexity of absolute value of O(f(n)) – O(f(n))).
O(f(n)) + O(f(n)) = ?
|O(f(n)) – O(f(n))| = O(f(n)). That is basically the only thing you can say about it. This is because you're subtracting, from a function growing at most like f(n), another function growing at most like f(n). The result (before taking the absolute value) can even be negative (e.g., when subtracting 2f(n), which is itself O(f(n))), but its magnitude is clearly not larger than that of a function growing like f(n).
O(f(n)) + O(f(n)) = O(f(n)). This is easy to prove through the basic definition of O.
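A minimal sketch of the sum case via the basic definition of O (the constants C1, C2 are the ones the definition provides):

```latex
% Take any g(n) = O(f(n)) and h(n) = O(f(n)); by definition, for large n
\[
  g(n) \le C_1\, f(n), \qquad h(n) \le C_2\, f(n).
\]
% Adding the two bounds gives
\[
  g(n) + h(n) \;\le\; (C_1 + C_2)\, f(n),
\]
% and C_1 + C_2 is a constant, so g(n) + h(n) = O(f(n)).
% The difference behaves the same way: |g(n) - h(n)| \le (C_1 + C_2) f(n).
```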

How can we denote the following function in terms of big-O notation?

I have got a function and want to express it in big-O notation.
f(n) = log4(n) + (1/3)n. Is this function O(n)? Thanks for your help.
According to Wikipedia
If a function f(n) can be written as a finite sum of other functions, then the fastest growing one determines the order of f(n).
Between log4(n) and (1/3)n, the fastest growing one is (1/3)n, and
O((1/3)n) = O(n)
So f(n) is O(n)
I think it is O(1), because the calculation can be done in the same amount of time no matter how big n is.
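A quick numerical check of the dominant-term reasoning in the first answer (a sketch; math.log(n, 4) computes the base-4 logarithm):

```python
import math

# f(n) = log4(n) + (1/3)n from the question above.
def f(n):
    return math.log(n, 4) + n / 3

# The ratio f(n)/n settles near 1/3, consistent with f(n) = O(n)
# (in fact Theta(n)) -- the function itself keeps growing, so it is not O(1).
for n in (10, 10**3, 10**6, 10**9):
    print(f"n = {n:>10}: f(n)/n = {f(n)/n:.6f}")
```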

Growth functions of an algorithm?

Well, I have two questions here:
If f(n) is a function whose growth rate is to be found, will g(n) be the same for all three notations, i.e. in f(n) = O(g(n)) and similarly for Omega and Theta?
Theta notation is "Omega and O". If in some case the O and Omega functions are different, how will we find the Theta function there?
Thanks :)
O, Θ and Ω notation represent related but very different concepts. O-notation expresses an asymptotic upper bound on the growth rate of a function; it says that the function is eventually bounded from above by some constant multiple of some other function. Ω notation is similar, but gives a lower bound. Θ notation gives an asymptotic tight bound - for sufficiently large inputs, the algorithm grows at a rate that is bounded from both above and below by a constant multiple of a function.
If f(n) = O(g(n)), it is not necessarily true that f(n) = Ω(g(n)) or that f(n) = Θ(g(n)). For example, 1 = O(n), but 1 ≠ Ω(n) because n grows strictly faster than 1.
If you find that f(n) = O(g(n)) and Ω(h(n)), where g(n) ≠ h(n), you may have to do a more precise analysis to determine a function j(n) such that f(n) = Θ(j(n)). If g(n) = Θ(h(n)), then you can conclude that f(n) = Θ(g(n)), but if the upper and lower bounds are different there is no mechanical way to determine the Θ growth rate of the function.
Hope this helps!
f(n)=O(g(n)) means that n>N => |f(n)|≤C|g(n)| for some constants N and C.
f(n)=Ω(g(n)) means that n>N => |f(n)|≥C|g(n)| for some constants N and C.
f(n)=Θ(g(n)) means that f(n)=O(g(n)) and f(n)=Ω(g(n)).
It is not possible, for every f, to find a g such that f(n) = Θ(g(n)) if we want g to be a "good" function (i.e. something like n^r * log(n)^s). For instance, if f(n) = n*cos(n)^2 + n^2*sin(n)^2, we have f(n) = O(n^2) and f(n) = Ω(n), but we can't find a "good" g such that f(n) = Θ(g(n)).
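A numerical illustration of this last example (a sketch, not part of the original answer): scanning a window of large n shows f(n)/n^2 swinging between nearly 0 and 1, so no single constant c > 0 can bound f from below by c*n^2 for all large n.

```python
import math

# f(n) = n*cos(n)^2 + n^2*sin(n)^2 from the answer above.
def f(n):
    return n * math.cos(n)**2 + n**2 * math.sin(n)**2

# Over a window of large n, f(n)/n^2 = cos(n)^2/n + sin(n)^2 swings between
# (almost) 0 and (almost) 1: f is O(n^2) and Omega(n), but Theta of neither.
window = range(10**6, 10**6 + 1000)
ratios = [f(n) / n**2 for n in window]
print(f"min f(n)/n^2 = {min(ratios):.6f}, max f(n)/n^2 = {max(ratios):.6f}")
```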
