What is the complexity of n/2 * log(n^n)?

I would like to know the Big O notation of n/2 * log(n^n).
I'm trying to figure out whether it's O(n/2 * log(n^n)) just like that, or whether it's O(n * log(n^n)).
And how do I check it?
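One way to check is to simplify with log rules first: log(n^n) = n * log(n), so n/2 * log(n^n) = (n^2 / 2) * log(n) = O(n^2 * log(n)), and the constant 1/2 is dropped. As a quick empirical check (a hypothetical Python sketch; the helper f and the candidate bound are my own choices), the ratio below settles at the constant 1/2, which says the bound is tight up to a constant factor:

```python
import math

def f(n):
    # n/2 * log(n^n); note log(n^n) = n * log(n), so this is (n^2 / 2) * log(n)
    return (n / 2) * (n * math.log(n))

for n in [10, 100, 1000, 10000]:
    # ratio against the candidate bound n^2 * log(n);
    # it sits at the constant 1/2, so f(n) = O(n^2 * log(n))
    print(n, f(n) / (n ** 2 * math.log(n)))
```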

Related

Shouldn't Merge Sort be described as O(n * log(n) + log(n))

When you think about merge sort, you split the set (log(n)) and then you join it back together while sorting each group (n * log(n)). In sum, that should be log(n) + n * log(n).
As shown in this diagram: https://en.wikipedia.org/wiki/Merge_sort#/media/File:Merge_sort_algorithm_diagram.svg
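That sum is right, but Big-O keeps only the dominant term. For n >= 2:

log(n) + n * log(n) <= n * log(n) + n * log(n) = 2 * n * log(n)

so log(n) + n * log(n) = O(n * log(n)), which is why merge sort is usually quoted as O(n * log(n)).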

How to prove the complexity of T(n) = T(n-1) + log n is Θ(n log n)

How can I prove that
T(n) = T(n-1) + log n = Θ(n log(n))
My thinking:
We can get T(n) = ∑ log(i), summing from i = 1 to N. To prove the result, we have to prove both ∑ log(i) >= c * N * log(N) (for some constant c > 0) and ∑ log(i) <= N * log(N). The second is easy; I want to know how to prove ∑ log(i) >= c * N * log(N).
Assume N > 10. (You only need to prove the bound for "large enough" N.)
Suppose we take ∑ log(i) but ignore the terms where i < N/10.
We have at least 9/10 * N terms left, and each term is at least log(N/10). Then:
∑ log(i) >= (9/10 * N) * log(N/10)
= (9/10 * N) * (log(N) - log(10))
= (constant) * N * log(N) - (constant) * N
which is clearly Omega(N log N).
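This is not part of the proof, but as a quick numeric illustration (a Python sketch), the ratio of ∑ log(i) to N * log(N) climbs toward 1, consistent with Θ(N log N):

```python
import math

for N in [10, 100, 1000, 100000]:
    total = sum(math.log(i) for i in range(1, N + 1))  # sum of log(i), i.e. log(N!)
    # ratio to N * log(N); it climbs toward 1 as N grows
    print(N, total / (N * math.log(N)))
```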

Solve the recurrence of T(n) = 3T(n/5) + T(n/2) + 2^n

I am solving this recurrence under the assumption that T(n) is constant for n <= 2. I started with the tree method, since we cannot use the master method directly here, but when I draw the tree and compute the cost c at each level, my costs are non-trivial and weird. I get
c = 2^n at the root, and at the next level I get 3 * 2^(n/5) + 2^(n/2).
I don't know how to continue with these values. Is there anything I am doing wrong, or what procedure should I follow to solve this?
You might want to compare each level's cost against the root's cost, 2^n.
Note that 2^(n/5) = (2^n)^(1/5) and 2^(n/2) = (2^n)^(1/2), so the second level costs
3 * (2^n)^(1/5) + (2^n)^(1/2)
which is exponentially smaller than the root's 2^n. Each deeper level shrinks even faster (the largest exponent at least halves every level), so the level costs form a rapidly decreasing series dominated by the first term. Summing them gives
constant * 2^n
which is just Θ(2^n), because the constant is insignificant as n gets very large.
You can simplify the recurrence. As T(n) is increasing, we know that T(n/5) < T(n/2). Hence, T(n) < 4T(n/2) + 2^n. Now you can use the master theorem and say that T(n) = O(2^n). On the other hand, without this replacement, since the 2^n term appears in T(n), we can say T(n) = Omega(2^n). Therefore, T(n) = Theta(2^n).
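If you want to sanity-check that bound numerically, here is a small sketch in Python (assumptions: T(n) = 1 for n <= 2, and n/5, n/2 interpreted as integer division):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 2:
        return 1  # assumed constant base case
    # integer division stands in for n/5 and n/2
    return 3 * T(n // 5) + T(n // 2) + 2 ** n

for n in [10, 50, 200, 1000]:
    # the ratio tends to a constant near 1, consistent with T(n) = Theta(2^n)
    print(n, T(n) / 2 ** n)
```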

What Big O is this expression?

I am trying to find the Big O of this expression.
n^2 * 2^(2n+1)
I know that n^2 is smaller than the other part, but I don't know what the Big O value would be. It's not O(n^2), obviously, and I don't think 2^(2n+1) can be simplified in any way.
If someone could help that'd be great.
n^2 * 2^(2 * n + 1) = n^2 * 2^(2 * n) * 2 = O(n^2 * 2^(2 * n)). It cannot be simplified further.
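One further identity, if a simpler base helps: since 2^(2n) = (2^2)^n = 4^n,

n^2 * 2^(2n + 1) = 2 * n^2 * 4^n = O(n^2 * 4^n)

which is the same bound written with base 4.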

Drawing Recurrence Tree and Analysis

I am watching Intro to Algorithms (MIT) lecture 1. There's something like the below (analysis of merge sort):
T(n) = 2T(n/2) + O(n)
A few questions:
Why does the work at the bottom level become O(n)? The lecture said that the boundary case may have a different constant... but I still don't get it.
It's said that total = cn(lg n) + O(n). Where does the O(n) part come from? The original O(n)?
Although this one's been answered a lot, here's one way to reason about it:
If you expand the recursion you get:
t(n) = 2 * t(n/2) + O(n)
t(n) = 2 * (2 * t(n/4) + O(n/2)) + O(n)
t(n) = 2 * (2 * (2 * t(n/8) + O(n/4)) + O(n/2)) + O(n)
...
t(n) = 2^k * t(n / 2^k) + O(n) + 2*O(n/2) + ... + 2^(k-1) * O(n / 2^(k-1))
The above stops when 2^k = n, that is, when k = log_2(n).
That makes n / 2^k = 1, which makes the first part of the equality simple to express, if we consider t(1) = c (constant).
t(n) = n * c + O(n) + 2*O(n/2) + ... + 2^(k-1) * O(n / 2^(k-1))
If we consider the sum O(n) + 2*O(n/2) + ... + 2^(k-1) * O(n / 2^(k-1)), we can observe that there are exactly k terms, and that each term is actually equivalent to n. So we can rewrite it like so:
t(n) = n * c + {n + n + ... + n} <-- where n appears k times
t(n) = n * c + n * k
but since k = log_2(n), we have
t(n) = n * c + n * log_2(n)
And since in Big-Oh notation n * log_2(n) is equivalent to n * log n, and it grows faster than n * c, it follows that the Big-O of the closed form is:
O(n * log n)
I hope this helps!
EDIT
To clarify your first question, regarding why the work at the bottom becomes O(n): it is basically because you have n unit operations taking place (you have n leaf nodes in the expansion tree, and each takes a constant time c to complete). In the closed formula, the work at the bottom is expressed as the first term in the sum: 2^k * t(1). As I said above, you have k levels in the tree, so at the bottom level there are 2^k = n leaves, and the unit operation t(1) takes constant time.
To answer the second question, the O(n) does not actually come from the original O(n); it represents the work at the bottom (see answer to first question above).
The original O(n) is the time complexity required to merge the two sub-solutions t(n/2). Since the time complexity of the merge operation is assumed to grow linearly with the size of the problem, at each level you have a sum of 2^level terms, each of size O(n / 2^level); this is equivalent to one O(n) operation per level. Now, since you have k levels, the merge complexity for the initial problem is {O(n) at each level} * {number of levels}, which is essentially O(n) * k. Since there are k = log(n) levels, it follows that the time complexity of the merge operation is O(n * log n).
Finally, when you examine all the operations performed, you see that the work at the bottom is less than the actual work performed to merge the solutions. Mathematically speaking, the work performed for the n leaves grows asymptotically slower than the work performed to merge the sub-solutions; put differently, for large values of n, the merge operation dominates. So in Big-O analysis, the formula becomes O(n * log(n)).
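To see the closed form concretely, here is a small Python sketch of the same cost model (assumptions: c = 1 unit of work per leaf, linear merge work per level, and n a power of two so the halving is exact):

```python
import math

def t(n, c=1):
    # cost model from the lecture: constant work at the base,
    # two half-size subproblems plus linear merge work
    if n <= 1:
        return c
    return 2 * t(n // 2, c) + n

for n in [2 ** k for k in (4, 8, 12, 16)]:
    # for powers of two the ratio is exactly 1:
    # t(n) = n * log2(n) (merge work) + n * c (leaf work)
    print(n, t(n) / (n * math.log2(n) + n))
```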
