What Big O is this equation?

I am trying to find the Big O of this equation.
n^2*2^(2n+1)
I know that n^2 is smaller than the other part, but I don't know what Big O value this would be. It's obviously not O(n^2), and I don't think 2^(2n+1) can be simplified in any way.
If someone could help that'd be great.

n^2 * 2^(2 * n + 1) = n^2 * 2^(2 * n) * 2 = O(n^2 * 2^(2 * n)), i.e. O(n^2 * 4^n). It cannot be simplified further.
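Since 2^(2n) = 4^n, the original expression differs from n^2 * 4^n only by a constant factor of 2, which Big O absorbs. A quick Python sanity check of that constant factor:

```python
# Check that n^2 * 2^(2n + 1) equals exactly 2 * (n^2 * 4^n),
# i.e. the two differ only by a constant factor.
for n in range(1, 20):
    f = n**2 * 2**(2 * n + 1)
    g = n**2 * 4**n
    assert f == 2 * g
print("n^2 * 2^(2n+1) == 2 * n^2 * 4^n for all tested n")
```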

Related

What is the complexity of n/2 * log(n^n)

I would like to know the Big O notation of n/2 * log(n^n).
I'm trying to figure out whether it's O(n/2 * log(n^n)) just like that, or whether it's O(n * log(n^n)).
And how do I check it?
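The identity behind this question is log(n^n) = n * log(n), so both candidate forms describe the same class, Θ(n^2 log n), once constant factors are dropped. A small Python check of the identity (using natural log; the base only changes a constant):

```python
import math

# log(n^n) == n * log(n), so (n/2) * log(n^n) == (n**2 / 2) * log(n).
for n in range(2, 50):
    lhs = (n / 2) * math.log(n**n)
    rhs = (n**2 / 2) * math.log(n)
    assert math.isclose(lhs, rhs)
print("(n/2) * log(n^n) == (n^2/2) * log(n) for all tested n")
```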

Solve the recurrence of T(n) = 3T(n/5) + T(n/2) + 2^n

I am solving the recurrence T(n) = 3T(n/5) + T(n/2) + 2^n under the assumption that T(n) is constant for n <= 2. I started with the tree method, since the master method cannot be used directly here, but when I draw the tree the per-level costs C I compute are very non-trivial and weird: for the root I get
c = 2^n, and for the next level I get 3 * 2^(n/5) + 2^(n/2).
I don't know how to continue with these values. Is there anything I am doing wrong, or what procedure should I follow to solve this?
You might want to bound the lower-level costs rather than simplify them exactly. Since 2^(n/5) <= 2^(n/2), the second level costs at most
3 * 2^(n/5) + 2^(n/2) <= 4 * 2^(n/2)
which is already exponentially smaller than the root cost 2^n. The same pattern continues further down: level i has 4^i nodes, each costing at most 2^(n/2^i), and all of these level sums are o(2^n). So the root term dominates, and the whole tree sums to
constant * 2^n
which is Θ(2^n), because the constant is insignificant as n gets very large.
You can simplify the case. Since T(n) is increasing, we know that T(n/2) > T(n/5). Hence, T(n) < 4T(n/2) + 2^n. Now you can use the master theorem and say that T(n) = O(2^n). On the other hand, without this replacement, since 2^n appears in T(n), we can say T(n) = Ω(2^n). Therefore, T(n) = Θ(2^n).
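As a numeric sanity check of the Θ(2^n) conclusion, here is a small memoized evaluation of the recurrence (assuming T(n) = 1 for n <= 2 and rounding n/5 and n/2 down to integers; neither detail is fixed by the question). The ratio T(n)/2^n stays bounded:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Assumption: T(n) = 1 for n <= 2; n/5 and n/2 are floored.
    if n <= 2:
        return 1
    return 3 * T(n // 5) + T(n // 2) + 2**n

for n in (10, 20, 40, 80):
    print(n, T(n) / 2**n)  # ratio stays bounded (here it heads toward 1)
```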

Is (n+1)! in the order of n!? Can you show me a proof?

What about (n-1)!?
Also if you could show me a proof that would help me understand better.
I'm stuck on this one.
To show that (n+1)! is in O(n!) you have to show that there is a constant c so that for all big enough n (n > n0) the inequality
(n+1)! < c n!
holds. However, since (n+1)! = (n+1) n!, this simplifies to
n+1 < c
which clearly does not hold since c is a constant and n can be arbitrarily large.
On the other hand, (n-1)! is in O(n!). The proof is left as an exercise.
(n+1)! = n! * (n+1)
O((n+1) * n!) = O(n * n! + n!) = O(n * n!), which strictly contains O(n!)
(n-1)! = n!/n
O((n-1)!) = O(n!/n), which is strictly contained in O(n!)
I wasn't formally introduced to algorithmic complexity, so take what I write with a grain of salt.
That said, we know n^3 is way worse than n, right?
Well, since (n + 1)! = (n - 1)! * n * (n + 1),
comparing (n + 1)! to (n - 1)! is like comparing n^3 to n.
Sorry, I don't have a proof, but expanding the factorial as above should lead to one.
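Both claims in this thread can be checked mechanically: (n+1)!/n! = n + 1 is unbounded, so no constant c can witness (n+1)! = O(n!), while (n-1)! = n!/n <= n! for all n >= 1. A short Python check:

```python
import math

for n in range(2, 10):
    # (n+1)!/n! = n + 1: grows without bound, so no constant c works.
    assert math.factorial(n + 1) // math.factorial(n) == n + 1
    # (n-1)! = n!/n, i.e. (n-1)! * n = n!, so (n-1)! <= n!.
    assert math.factorial(n - 1) * n == math.factorial(n)
print("(n+1)!/n! = n + 1 and (n-1)! = n!/n for all tested n")
```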

(Beginner) Questions about the Big O notation

I have some questions related to the Big O notation:
n^3 = Ω(n^2)
This is true because:
n^3 >= c * n^2 for all n >= n0
-> lim(n -> infinity) n^3/n^2 >= c
Then I used L'Hospital and got 6n/2 >= c, which is true if I, for example, choose c = 1 and n0 = 1.
Are my thoughts right on this one ?
Now I have two pairs:
log n and n/log n: do they lie in Θ, O, or somewhere else relative to each other? Just tell me where they lie; I can do the proof by myself.
The same question for n^(log n) and 2^n.
And at last:
f(n) = O(n) -> f(n)^2 = O(n^2)
f(n)g(n) = O(f(n)g(n))
The question is: Are these statements correct ?
I would say yes to the first one, but I don't really know why; it seems like there is a hidden trick to it. Could someone help me out here?
The second one should be true if g(n) lies in O(n), but I don't really know here either.
Seems like you're right here.
As for log(n) and n/log(n), you can check it by finding lim log(n)/(n/log(n)) as n -> infinity, and vice versa.
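Following that suggestion, here is a numeric look at the ratio log(n)/(n/log(n)) = log(n)^2/n, which tends to 0, suggesting log n = o(n/log n):

```python
import math

# log(n) / (n / log(n)) = log(n)**2 / n -> 0 as n grows,
# which suggests log n = o(n / log n).
for n in (10, 10**3, 10**6, 10**9):
    print(n, math.log(n)**2 / n)  # the ratio shrinks toward 0
```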
Using the fact that a^b = e^(b * ln(a)):
n^log(n) = e^(log(n) * log(n)) = e^(log^2(n)), while 2^n = e^(n * ln(2)). Since log^2(n) = o(n), it follows that n^log(n) = o(2^n): n^log(n) is O(2^n), but 2^n is not O(n^log(n)).
Let's use the definition and some properties of the Big O(f):
O(f) = f * O(1)
O(1) * O(1) = O(1)
Now we have:
f(n)^2 = f(n) * f(n) = O(n) * O(n) = n * O(1) * n * O(1) = n^2 * O(1) = O(n^2).
f(n)g(n) = f(n)g(n) * O(1) = O(f(n)g(n)).
So, yes, it is correct.
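A concrete instance of the first statement may help. Assuming the (hypothetical) witness f(n) = 3n + 5, which is O(n) with c = 4 and n0 = 5, its square is O(n^2) with c = 16:

```python
# Concrete instance of: f(n) = O(n)  ==>  f(n)^2 = O(n^2).
# f(n) = 3n + 5 is a hypothetical example; f(n) <= 4n for n >= 5.
def f(n):
    return 3 * n + 5

for n in range(5, 1000):
    # Witness constants for the square: c = 16, n0 = 5.
    assert f(n)**2 <= 16 * n**2
print("f(n)^2 <= 16 * n^2 for all n >= 5 tested")
```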

Are the two complexities O((2n + 1)!) and O(n!) equal?

This may be a naive question, but I am new to the concepts of Big-O notation and complexity and could not find any answer to this. I am dealing with a problem whose algorithm checks a condition (2n + 1)! times. Can I say that the complexity of the problem is O(n!), or is the complexity O((2n + 1)!)?
Use Stirling's approximation:
n! ~ (n / e)^n * sqrt(2 * pi * n)
Then
(2n + 1)! ~ ((2n + 1) / e)^(2n + 1) * sqrt(2 * pi * (2n + 1))
>= (2n / e)^(2n) * sqrt(2 * pi * 2n)
= 2^(2n) * (n / e)^(2n) * sqrt(2) * sqrt(2 * pi * n)
= sqrt(2) * (2^n)^2 * ((n / e)^n)^2 * sqrt(2 * pi * n)
And now it's pretty clear why there's no hope of O((2n + 1)!) being O(n!): the exponential factors are way worse. It's more like O((2n + 1)!) is O((n!)^2).
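For anyone who wants to see how tight Stirling's approximation is numerically, the ratio n! / ((n/e)^n * sqrt(2*pi*n)) tends to 1 (roughly 1 + 1/(12n)):

```python
import math

# Stirling: n! ~ (n/e)**n * sqrt(2*pi*n); the ratio tends to 1.
for n in (5, 10, 50, 100):
    stirling = (n / math.e)**n * math.sqrt(2 * math.pi * n)
    print(n, math.factorial(n) / stirling)  # slightly above 1, shrinking
```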
Let (N,c) be any ordered pair of positive constants. Let n be any integer such that n>N and n>c.
Then (2n+1)! > (2n+1)*n! > cn!
Thus for any pair of positive constants (N,c) there exists n>N such that (2n+1)! > cn!, so (2n+1)! is not O(n!).
O((2n+1)!) contains a function, (2n+1)!, that is not in O(n!), so O((2n+1)!) and O(n!) are not the same.
Here is the definition: https://en.wikipedia.org/wiki/Big_O_notation . So we need to check whether there exists a constant c, and n0 such that:
(2n+1)! < cn! for n > n0
Intuitively, from observing how (2n+1)! and n! behave:
http://www.wolframalpha.com/input/?i=n!+%3E%282n+%2B1%29!
(2n+1)! simply grows much faster, so no matter which constant c you choose, (2n+1)! will eventually exceed c * n!. So you can't simplify it to n!.
(2n+1)! = n! * (n+1)(n+2)...(2n+1)
The cofactor (n+1)(n+2)...(2n+1) grows without bound, i.e. it is not O(1), so
n! = o((2n+1)!)
and therefore O(n!) and O((2n+1)!) are different classes.
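The factorization above can be verified numerically: the cofactor (n+1)(n+2)...(2n+1) = (2n+1)!/n! has n + 1 factors, each at least n + 1, so it is at least (n+1)^(n+1) and unbounded:

```python
import math

# (2n+1)!/n! = (n+1)(n+2)...(2n+1): n + 1 factors, each >= n + 1,
# so the ratio is >= (n+1)**(n+1) and grows without bound.
prev = 0
for n in range(1, 15):
    ratio = math.factorial(2 * n + 1) // math.factorial(n)
    assert ratio >= (n + 1)**(n + 1)
    assert ratio > prev  # strictly increasing
    prev = ratio
print("(2n+1)!/n! is strictly increasing and >= (n+1)^(n+1)")
```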
