Asymptotic complexity between n! and n^n

What would be an example of a function f(n) that is asymptotically slower than O(n^n) and faster than O(n!), i.e.
O(n!) < O(f(n)) < O(n^n)?

f: n ↦ (n+1)! is one such function: it exceeds n! by a factor of n+1, so it grows strictly faster than n!, yet it still grows strictly slower than n^n.
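A quick numeric sanity check (my own sketch, not part of the original answer) prints (n+1)!/n! and (n+1)!/n^n for a few values of n; the first ratio grows without bound while the second tends to 0, which is exactly the claimed sandwich n! < (n+1)! < n^n:

    from math import factorial

    # Sketch: compare n!, (n+1)! and n^n for a few n to illustrate n! < (n+1)! < n^n.
    for n in (5, 10, 20, 40):
        vs_factorial = factorial(n + 1) / factorial(n)  # equals n + 1, grows without bound
        vs_power     = factorial(n + 1) / n**n          # tends to 0 as n grows
        print(n, vs_factorial, vs_power)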

Related

Is n to the power n, i.e. n^n, polynomial or not? Is there any polynomial difference between n^2 and n^n?

Is n to the power n (i.e. n^n) a polynomial? Can T(n) = 2T(n/2) + n^n be solved using the master method?
It's not only not polynomial, it's also worse than factorial: O(n^n) dominates O(n!). Also, in the master method f(n) must be polynomial, so you cannot use it here.
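For intuition on why O(n^n) dominates O(n!) (a sketch of mine using Stirling's approximation, not part of the original answer): n! is roughly sqrt(2*pi*n) * (n/e)^n, so n^n / n! is roughly e^n / sqrt(2*pi*n), which goes to infinity. A quick check of that approximation:

    from math import factorial, e, pi, sqrt

    # Sketch: n^n / n! grows roughly like e^n / sqrt(2*pi*n) (Stirling), so n^n dominates n!.
    for n in (5, 10, 20, 40):
        exact    = n**n / factorial(n)
        stirling = e**n / sqrt(2 * pi * n)
        print(n, exact, stirling)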

Big O complexity when two functions f(n) [O(1)] and g(n) [O(n)] are multiplied together

f(n) and g(n) represent the running time of two different algorithms. f(n) has algorithm complexity O(1), and g(n) has algorithm complexity O(n). Can we claim f(n)*g(n) has complexity O(n)? Why/Why not?
A mathematical proof:
If we want to prove that f(n)*g(n) is O(n), we must show that there exist n0 and a constant c such that:
f(n)*g(n) < c*n for every n > n0
We have as a fact that f(n) is O(1), which means that there are c1, n1 such that:
f(n) < c1 for every n > n1 (1)
and for g, which is O(n), there are c2, n2 such that:
g(n) < c2*n for every n > n2 (2)
Now we have that, for every n > max(n1, n2) (max because we want both inequalities, for f and for g, to hold):
f(n)*g(n) < c1*c2*n (by multiplying (1) and (2); running times are non-negative, so the inequalities can be multiplied)
So we have found c = c1*c2 and n0 = max(n1, n2) such that the inequality below holds:
f(n)*g(n) < c*n for every n > n0, hence f(n)*g(n) is O(n).
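To make the proof concrete, here is a small check with hypothetical running-time functions and witness constants of my own choosing (not from the original answer):

    # Sketch: concrete witnesses for the proof above (illustrative choices).
    # f(n) = 7       is O(1) with c1 = 8, n1 = 0  (7 < 8 for all n)
    # g(n) = 3*n + 5 is O(n) with c2 = 4, n2 = 5  (3n + 5 < 4n once n > 5)
    def f(n):
        return 7

    def g(n):
        return 3 * n + 5

    c, n0 = 8 * 4, 5  # c = c1*c2, n0 = max(n1, n2)
    assert all(f(n) * g(n) < c * n for n in range(n0 + 1, 10000))
    print("f(n)*g(n) <", c, "* n holds for every tested n >", n0)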
When we multiply f(n) and g(n), the O(1) factor only contributes a constant, so the higher of the two time complexities remains and the algorithm complexity is O(n).

Algorithm domination

Studying for a test, I came across this question:
Comparing two algorithms with asymptotic complexities O(n) and O(n + log(n)),
which one of the following is true?
A) O(n + log(n)) dominates O(n)
B) O(n) dominates O(n + log(n))
C) Neither algorithm dominates the other.
O(n) dominates log(n), correct? So here do we just take O(n) from both and deduce that neither dominates?
[C] is true, because of the summation property of Big-O
Summation
O(f(n)) + O(g(n)) -> O(max(f(n), g(n)))
For example: O(n^2) + O(n) = O(n^2)
In Big-O, you only care about the largest-growing term and ignore all the other additive terms.
Edit: originally I put [A] as the answer; I just didn't pay much attention to all the options and misinterpreted option [A]. Here is a more formal proof:
O(n) ~ O(n + log(n)) <=>
O(n) ~ O(n) + O(log(n)) <=>
O(n) ~ O(n).
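A concrete way to see the last step (an illustrative check of mine, not part of the original answer): for n >= 1 we have log(n) <= n, so n + log(n) <= 2*n, i.e. the constant c = 2 witnesses that n + log(n) is O(n):

    from math import log

    # Sketch: n + log(n) <= 2*n for every n >= 1, so n + log(n) is O(n) with c = 2.
    assert all(n + log(n) <= 2 * n for n in range(1, 100000))
    print("n + log(n) <= 2*n holds for every tested n >= 1")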
Yes, that's correct. If the runtime is the sum of several runtimes, the term with the largest order of magnitude dominates.
Assuming that big-O notation is used here in the sense of an asymptotically tight bound, which really should be denoted with a big-Theta, I would answer C), because Theta(n) = Theta(n + log(n)) (log(n) is dominated by n).
If I am being formally (mathematically) precise, then I would say that none of these answers is correct, because O(n) and O(n + log(n)) only give upper bounds, not lower bounds, on the asymptotic behaviour:
Let f(n) be in O(n) and g(n) be in O(n + log(n)). Then there are the following counterexamples:
For A): Let f(n) = n in O(n) and g(n) = 1 in O(n + log(n)). Then g(n) does not dominate f(n).
For B): Let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then f(n) does not dominate g(n).
For C): Let f(n) = 1 in O(n) and g(n) = n in O(n + log(n)). Then g(n) does dominate f(n).
As this would be a very tricky question, I assume that you use the more common, sloppy definition, which would give the answer C). (But you might want to check your definition of big-O.)
If my answer confuses you, then you probably didn't use the formal definition, and you should probably ignore my answer...

What will be the asymptotic time complexity of the following function?

I came across this problem on asymptotic complexity of a function:
The complexities of 3 functions are as follows:
f(n) = O(n)
g(n) = Big-Omega(n)
h(n) = Theta(n)
So what will be the asymptotic complexity of the resultant function [f(n).g(n)] + h(n)?
I can figure out that the answer will be Big-Omega(n) by elementary trial and error. For example, if I say f(n) = n, g(n) = n and h(n) = n, then f(n) is O(n), g(n) is Big-Omega(n) and h(n) is Theta(n). Now f(n).g(n) is n^2, which is Big-Omega(n) but not O(n). Adding h(n) to this gives n^2 + n, which is also Big-Omega(n) but not Theta(n).
But I'm not able to figure out a proper logical or mathematical proof to this. Can someone please help me out with this?
Here's an attempt at a logical explanation:
f(n) = O(n) means that f's running time is at most linear (may be constant time).
h(n) = Theta(n) means that h's running time is linear.
g(n) = Big-Omega(n) means that g's running time is at least linear (it may be polynomial, exponential... we don't know).
Now let's analyse the best case: f(n) is constant time, g(n) is linear, h(n) is linear. What can we say about the function f(n)*g(n)+h(n)? That it's also linear.
What can we say about the worst case? Nothing, as we have no clue about the behaviour of g(n) in the worst case.
So we can conclude that f(n)*g(n)+h(n) = Big-Omega(n), as this function is linear in the best case.
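To make the "we know the best case but not the worst case" point concrete, here are two instantiations of my own choosing, both consistent with f = O(n), g = Big-Omega(n), h = Theta(n); the combined function is Theta(n) in one case and exponential in the other, so the only bound guaranteed in general is Big-Omega(n):

    # Sketch: two instantiations consistent with f = O(n), g = Omega(n), h = Theta(n)
    # (illustrative choices, not from the question).
    def combo(f, g, h, n):
        return f(n) * g(n) + h(n)

    for n in (4, 8, 16):
        linear_case = combo(lambda n: 1, lambda n: n, lambda n: n, n)     # 1*n + n = 2n   -> Theta(n)
        fast_case   = combo(lambda n: n, lambda n: 2**n, lambda n: n, n)  # n*2^n + n      -> exponential
        print(n, linear_case, fast_case)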

Why is the complexity of a linear function the same as that of a quadratic equation

I am learning algorithms, and I came across something very interesting.
The asymptotic bound of the linear function a*n + b is O(n^2), for all a > 0.
This is the same as that of, not so surprisingly, a*n^2 + b*n + c.
Why?
Because big-oh gives you an upper bound. Your first function is also O(n^3), O(n^4), O(n^2012) etc.
The definition of big-oh basically says that f(n) is O(g(n)) if there exist constants c > 0 and k such that, for all n > k, we have c*g(n) > f(n).
Look into big-theta for stronger / tight bounds.
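Here is a small check of that definition for a concrete linear function, with constants picked by me for illustration (any a > 0 and b work with a large enough c and k):

    # Sketch: 5*n + 100 (a linear function, a = 5, b = 100) is O(n^2).
    # Witnesses (my own choice): c = 1 and k = 15, since 5*n + 100 < n**2 for every n > 15.
    a, b = 5, 100
    c, k = 1, 15
    assert all(a * n + b < c * n**2 for n in range(k + 1, 100000))
    print("5*n + 100 < n^2 holds for every tested n > 15")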
