Time complexity of a divide-and-conquer algorithm

I am trying to find the time complexity of the following:
Algorithm C solves X by dividing a problem of size n into 2 subproblems of size n/2, recursively solving each subproblem, and then combining the solutions in O(n^3) time.
My try:
T(n) = 2T(n/2) + O(n^3)
Can I write directly that the time complexity will be O(n^3)?

Using the Master Theorem: here a = 2, b = 2 and f(n) = n^3, so we compare n^(log_2 2) = n with n^3. The recursive part on its own contributes only O(n); then, as per the question, we add the O(n^3) combine cost.
Thus,
O(n) + O(n^3) = O(n^3)
As you have mentioned, the time complexity is O(n^3); this is case 3 of the Master Theorem, since f(n) dominates. But be sure to work out the 2T(n/2) part, since if that part had a greater complexity it would affect the final result.
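To see this concretely, here is a minimal Python sketch (my own illustration, with an assumed base case T(1) = 1) that evaluates the recurrence numerically; the ratio T(n)/n^3 settles near the constant 4/3, consistent with Θ(n^3):

```python
# Evaluate T(n) = 2*T(n/2) + n^3 for powers of two (assumed base case T(1) = 1).
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1                    # constant work at the base case
    return 2 * T(n // 2) + n ** 3   # two half-size subproblems + cubic combine

for k in range(4, 21, 4):
    n = 2 ** k
    print(n, T(n) / n ** 3)         # ratio settles near 4/3, so T(n) = Theta(n^3)
```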

Related

What will be the time complexity of the given algorithm, by the substitution or recurrence tree method

What will be the time complexity of this algorithm? Can anyone explain it in detail? We have just figured out that it reaches the base case only when n is of the form 2^x.
And we were taking it as T(n) = T(n/2) + T(3n+1).
Note this is not an infinite loop, by the Collatz conjecture (3n+1).
The time complexity is the length of the Collatz sequence starting from n.
See http://oeis.org/A006577.
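For reference, a minimal sketch of A006577, the sequence named above: it counts the halving and 3n+1 steps needed to reach 1, which is exactly what the recursion performs.

```python
# Number of Collatz steps to reach 1 from n (OEIS A006577).
def collatz_steps(n):
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print([collatz_steps(n) for n in range(1, 11)])
# -> [0, 1, 7, 2, 5, 8, 16, 3, 19, 6], matching A006577
```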

Time complexity of a recursive algorithm that divides the problem into n^(1/2) subproblems

I am learning time analysis of algorithms and I have a question about the following problem:
Let's say we have a problem of size n and an algorithm that solves it by dividing it into n^(1/2) subproblems (each of size n^(1/2)), recursively solving each subproblem, and then combining the solutions in linear time.
My goal is to find the time complexity of this algorithm. I believe I have to solve one of these two recurrences: T(n) = n^(1/2) * T(n^(1/2)) + n or T(n) = n^(1/2) * T(n^(1/2)) + O(n). However, I do not know which one. Could you tell me the difference between the two?
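The difference is only in how precisely the combine cost is specified: with +n you can derive a tight Θ bound, while with +O(n) you only get an upper bound. Here is a rough numerical sketch (my own illustration, with an assumed base case T(2) = 2) of the first form, compared against n log log n, the order it solves to:

```python
# Evaluate T(n) = sqrt(n) * T(sqrt(n)) + n (assumed base case T(2) = 2)
# and compare against n * log2(log2(n)): the recurrence solves to
# Theta(n log log n), so the ratio slowly approaches a constant.
import math

def T(n):
    if n <= 2:
        return n                    # base case
    r = math.sqrt(n)
    return r * T(r) + n             # sqrt(n) subproblems + linear combine

for i in range(2, 7):
    n = 2.0 ** (2 ** i)             # n = 16, 256, 65536, ...
    print(n, T(n) / (n * math.log2(math.log2(n))))
```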

What's the difference between Theta(n) and T(n) when considering time complexity?

The professor was discussing the time complexity of merge sort and he divided the whole process into three steps.
Check whether the size of the array is 1 -> time complexity: Θ(1)
Recursively sort the two halves -> time complexity: 2T(n/2)
Merge the two sorted sequences -> time complexity: Θ(n)
I don't understand step 2: why did he describe it as 2T(n/2) instead of 2Θ(n/2)? What's the difference between Θ(n) and T(n)?
Here is the link from Youtube: https://www.youtube.com/watch?v=JPyuH4qXLZ0
And it's between 1:08:45 - 1:10:33
What the professor means by T(n) is the exact complexity, i.e. the number of steps the algorithm needs to complete, which may vary depending on the implementation. What's more interesting is the asymptotic complexity, denoted here as Θ(n), which shows how fast T grows with n.
The first step of the mergesort algorithm is to split the array into halves and sort each half with the same algorithm (which is therefore recursive). That step obviously takes 2T(n/2). Then you merge both halves (hence the name), which takes linear time, Θ(n). From that recursive definition, T(n) = 2T(n/2) + Θ(n), he derives that T(n) = Θ(n log n), which is the complexity class of the mergesort algorithm.
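For concreteness, here is a short mergesort sketch in Python (my own illustration, not the professor's code) with the three steps labeled with their costs in the recurrence:

```python
# Mergesort with each step labeled with its cost in the recurrence
# T(n) = 2T(n/2) + Theta(n).
def merge_sort(a):
    if len(a) <= 1:                 # step 1: base-case check, Theta(1)
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # step 2: two recursive calls,
    right = merge_sort(a[mid:])     #         contributing 2T(n/2)
    return merge(left, right)       # step 3: merge, Theta(n)

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # -> [1, 2, 2, 3, 4, 5, 6, 7]
```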

Amortized Runtime Cost for an algorithm alternating between O(n^2) & O(n^4)

If I implement an algorithm that runs in O(n^4) at the current timestep and then O(n^2) at the next,
is the complexity still max[O(n^4), O(n^2)]?
Is there a way to get an exponent in the range [2, 4) for the complexity, i.e. something like O(n^2.83) on average?
How would I calculate the average runtime cost, amortized over t = 0...inf? Is it just [O(n^2) + O(n^4)] / 2?
O(n^2) is negligible compared to O(n^4), since the quotient of the first over the second has limit zero as n grows indefinitely.
So your algorithm is just O(n^4).
Read the Wikipedia page on Big O notation and any good textbook on limits of polynomials.
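A tiny numerical sketch of that limit argument: the averaged per-step cost (n^2 + n^4)/2, divided by n^4, tends to the constant 1/2, so the amortized cost is still Θ(n^4).

```python
# Average of the two alternating step costs relative to n^4:
# the ratio tends to 1/2, a constant, so the amortized cost is Theta(n^4).
for n in (10, 100, 1000, 10000):
    avg = (n ** 2 + n ** 4) / 2
    print(n, avg / n ** 4)
```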

Asymptotic complexity based on running time?

How do you go about finding the asymptotic complexity from a given running time? For example:
If the run time of a recursive algorithm is given as
T(n) = 2 T(n/2) + O(n)
considering the Master Theorem, what is its asymptotic complexity?
Could someone explain the steps on how to figure this out?
For the Master Theorem, there are 3 different cases to consider.
The first step is to work out which case applies.
For questions involving the Master Theorem, the general form is T(n) = aT(n/b) + f(n).
So the first thing to do is compare n^(log_b a) with f(n).
Simply put, whichever is bigger determines the complexity (these are cases 1 and 3), and if they are equal you multiply the result by lg n (case 2). Here a = 2, b = 2 and f(n) = n, so n^(log_2 2) = n matches f(n), and case 2 gives T(n) = Θ(n lg n). For another example of case 2, take T(n) = 16T(n/4) + O(n^2):
n^(log_4 16) = n^2 equals f(n) = n^2, so the answer is Θ(n^2 * lg n).
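Here is a small helper sketch (a hypothetical function of my own, not from the answer) that classifies a recurrence T(n) = aT(n/b) + Θ(n^k) into the three polynomial cases:

```python
# Classify T(n) = a*T(n/b) + Theta(n^k) by the polynomial Master Theorem cases.
import math

def master_theorem(a, b, k):
    crit = math.log(a, b)                     # critical exponent log_b(a)
    if math.isclose(k, crit):
        return f"case 2: Theta(n^{k} * log n)"
    if k < crit:
        return f"case 1: Theta(n^{crit:g})"
    return f"case 3: Theta(n^{k})"

print(master_theorem(2, 2, 1))    # T(n) = 2T(n/2) + O(n)    -> Theta(n^1 * log n)
print(master_theorem(16, 4, 2))   # T(n) = 16T(n/4) + O(n^2) -> Theta(n^2 * log n)
print(master_theorem(2, 2, 3))    # T(n) = 2T(n/2) + O(n^3)  -> Theta(n^3)
```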
