After expanding the recurrence relation T(n) = 3T(n/3) + n log n,
I get the following equation: T(n) = 3^k * T(n/3^k) + n log n + n log(n/3) + n log(n/3^2) + ... + n log(n/3^k)
How can I simplify the summation, and how do I determine the asymptotic behavior?
Since log a + log b = log(a*b), the sum of logs collapses into the log of a product:

T(n) = 3^k * T(n/3^k) + n * log(Π_{i=0..k} (n/3^i))

The product of the 1/3^i factors, Π_{i=0..k} (1/3^i), is

3^(-k(k+1)/2)

so the k+1 log terms simplify to the single term

n * log(3^(-k(k+1)/2) * n^(k+1))

That takes care of the simplification, I guess.
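If it helps, the simplification can be sanity-checked numerically. This is a quick sketch in Python, taking the sum over i = 0..k as in the question:

```python
import math

# Check: n*log(n) + n*log(n/3) + ... + n*log(n/3^k)
# equals n * log(3^(-k(k+1)/2) * n^(k+1)).
k = 5
n = 3 ** k
lhs = sum(n * math.log(n / 3 ** i) for i in range(k + 1))
rhs = n * math.log(3 ** (-k * (k + 1) / 2) * n ** (k + 1))
assert abs(lhs - rhs) < 1e-6
```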
Hello everyone, I have a quick question about one recurrence: T(n) = n^2 * T(n-1).
I am using the "recursion-tree method" of CLRS, and got

T(n) = n^2 + (n-1)^2 * n + (n-2)^2 * n(n-1) + (n-3)^2 * n(n-1)(n-2) + ... + 1^2 * n!
I don't know how to reduce this expression to an upper bound.
Could someone help here?
You seem to be overcomplicating things. If T(n) = n^2 * T(n - 1) is correct, you would simply have a product of squares:

T(n) = n^2 * (n-1)^2 * ... * 1^2 = (n!)^2

(assuming the stopping condition is n = 1).
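The product of squares telescopes to (n!)^2; here is a quick numeric check, assuming the base case T(1) = 1:

```python
import math

# Unrolling T(n) = n^2 * T(n-1) with T(1) = 1 gives (n!)^2.
def T(n):
    return 1 if n == 1 else n * n * T(n - 1)

for n in range(1, 10):
    assert T(n) == math.factorial(n) ** 2
```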
I have the following "divide and conquer" algorithm A1.
A1 divides a problem of size n into 4 sub-problems of size n/4.
Then it solves them and combines the solutions in 12n time.
How can I write the recurrence that gives the running time of the algorithm?
To answer the question "How can I write the recurrence that gives the running time of the algorithm?":
You should write it this way:
Let T(n) denote the running time of your algorithm for an input of size n:
T(n) = 4*T(n/4) + 12*n;
Although the master theorem does give a shortcut to the answer, it is imperative to understand the derivation of the Big O runtime. Divide and conquer recurrence relations are written in the form T(n) = q * T(n/j) + cn, where q is the number of subproblems, j the amount we divide the data for each subproblem, and cn is the time it takes to divide/combine/manipulate each subproblem at each level. cn could also be cn^2 or c, whatever the runtime would be.
In your case, you have 4 subproblems of size n/4 with each level being solved in 12n time giving a recurrence relation of T(n) = 4 * T(n/4) + 12n. From this recurrence, we can then derive the runtime of the algorithm. Given it is a divide and conquer relation, we can assume that the base case is T(1) = 1.
To solve the recurrence, I will use a technique called substitution. We know that T(n) = 4*T(n/4) + 12n, so we substitute for T(n/4):

T(n/4) = 4*T(n/16) + 12(n/4)

Plugging this into the equation gives T(n) = 4*(4*T(n/16) + 12n/4) + 12n, which simplifies to

T(n) = 4^2 * T(n/16) + 2*12n

We still have more work to do to capture the work at all levels, so we substitute for T(n/16):

T(n) = 4^3 * T(n/64) + 3*12n

We see the pattern emerge and know that we want to go all the way down to our base case T(1), so after k substitutions we get

T(n) = 4^k * T(1) + k*12n

This equation captures the total work in the divide and conquer algorithm, because we have substituted in all of the levels. However, we still have an unknown variable k, and we want it in terms of n. We get k by solving n/4^k = 1, since at that point the algorithm is being called on an input of size one. Solving gives k = log_4 n, which means we have done log_4 n substitutions. Plugging that in for k:

T(n) = 4^(log_4 n) * T(1) + (log_4 n) * 12n = n*1 + 12n * log_4 n

Since this is Big O analysis and log_4 n is in O(log_2 n) by the change-of-base property of logarithms, we get T(n) = n + 12n*log n, which means T(n) is in O(n log n).
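The derivation can be sanity-checked numerically. This is a small sketch assuming the base case T(1) = 1 and n a power of 4, where the unrolling is exact:

```python
# With T(1) = 1 and n = 4^k, the recurrence T(n) = 4*T(n/4) + 12n
# unrolls exactly to T(n) = n + 12*n*log4(n) = n + 12*n*k.
def T(n):
    return 1 if n == 1 else 4 * T(n // 4) + 12 * n

for k in range(1, 8):
    n = 4 ** k
    assert T(n) == n + 12 * n * k  # k = log4(n)
```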
The recurrence relation that best describes the algorithm is:

T(n) = 4*T(n/4) + 12*n

where T(n) = running time of the given algorithm for an input of size n, 4 = number of subproblems, and n/4 = size of each subproblem.

Using the master theorem, the time complexity is calculated to be Theta(n*log n).
I am trying to find the runtime of the following recurrence using iterative substitution:
T(n) = T(n/2) + T(n/3) + n
The issue is that there are two T(n/x) terms, and finding a general form for this case has proven to be quite challenging.
Is there a general guideline one should follow using iterative substitution for cases like this?
This recurrence is from the class of Akra–Bazzi recurrences. Following the formula, the solution is T(n) = Theta(n).
Alternatively, suppose that T(1) = c0 then you can prove that T(n) <= max(6,c0)*n by induction.
You can also use the substitution rule. Here's how:
T(n) = T(n/2) + T(n/3) + n
= n+(n/2+n/3)+T(n/(2*2))+T(n/(2*3))+T(n/(3*2))+T(n/(3*3))
= n+(n/2+n/3)+(n/(2*2)+n/(2*3)+n/(3*2)+n/(3*3))
+T(n/(2*2*2))+T(n/(2*2*3))
+T(n/(2*3*2))+T(n/(2*3*3))
+T(n/(3*2*2))+T(n/(3*2*3))
+T(n/(3*3*2))+T(n/(3*3*3))=
...
= n * (1 + 5/6 + (5/6)^2 + (5/6)^3 + (5/6)^4 + ...)
= 6 * n (assuming n = 2^k*3^k; otherwise you get < 6*n)
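The 6n bound can also be checked numerically. This is a sketch assuming integer division and the base case T(1) = 1 (i.e. c0 = 1 in the induction above):

```python
from functools import lru_cache

# T(n) = T(n//2) + T(n//3) + n with T(n) = 1 for n <= 1.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(n // 2) + T(n // 3) + n

for n in (10, 1_000, 1_000_000):
    assert T(n) <= 6 * n  # matches the max(6, c0)*n bound
```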
Nothing formal here, but
T(n) = 2T(n/2) + n // O(n log n)
So your recurrence might still be O(n log n)?
Also, what is the base case?
In the master theorem we're given a "plug-in" formula to find the big O, provided certain conditions are satisfied.
However, what if we have problems like the ones below? Can anyone show me how to do them step by step? And what topics would help me learn more about these types of questions? Assume that the person asking this question knows nothing about induction.
T(n)=T(n^(1/2))+1
T(n)=T(n-1) + 1
T(n) = T(n-1) + n^c, where c is a natural number > 1
T(n) = T(n-1) + c^n, where c is a natural number > 1
You'll need to know a little math to do some of these. You can figure out what the recursion looks like when you expand it out all the way to the base case; e.g., for T(n) = T(n-1) + n^c you get T(n) = 1^c + 2^c + ... + n^c, but then you need to know some math to see that this is O(n^(c+1)). (The easiest way is to bound the sum above and below by integrals of x^c.) Similarly, for T(n) = T(n-1) + c^n you easily get T(n) = c^1 + c^2 + ... + c^n, but you again need some calculus to figure out that this is T(n) = O(c^n).
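The integral bounds on the power sum can be checked numerically; here is a sketch with c = 3 as an example:

```python
# 1^c + 2^c + ... + n^c is squeezed between the integral bound
# n^(c+1)/(c+1) below and the trivial bound n * n^c = n^(c+1) above,
# so the sum is Theta(n^(c+1)) and T(n) = O(n^(c+1)).
c = 3
for n in (10, 100, 1000):
    s = sum(i ** c for i in range(1, n + 1))
    assert n ** (c + 1) / (c + 1) <= s <= n ** (c + 1)
```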
For T(n) = T(n^(1/2)) + 1 you need to count how many times you apply the recurrence before you get to the base case. Again math helps here. When you take square-root, the logarithm gets cut in half. So you want to know how many times you can cut the logarithm in half until you get to the base case. This is O(log log n).
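The halving-the-logarithm argument can be illustrated with a small sketch; `sqrt_depth` here is a hypothetical helper, not part of any question above:

```python
import math

# Count how many times n -> sqrt(n) must be applied before n drops
# to the base case (here, n <= 2). Each square root halves log2(n),
# so the depth is about log2(log2(n)), i.e. O(log log n).
def sqrt_depth(n):
    count = 0
    while n > 2:
        n = math.sqrt(n)
        count += 1
    return count

assert sqrt_depth(65536) == 4    # log2(log2(2^16)) = log2(16) = 4
assert sqrt_depth(2 ** 32) == 5  # log2(log2(2^32)) = log2(32) = 5
```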
You can expand upon the formula and work on it:
For example:
T(n) = T(n-1) + 1
T(n) = [T(n-2) + 1] + 1
...
T(n) = 1 + 1 + 1 ... (n times)
So T(n) = O(n).
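The unrolled sum can be confirmed with a one-liner, assuming the base case T(0) = 0:

```python
# T(n) = T(n-1) + 1 with T(0) = 0 unrolls to 1 + 1 + ... (n times) = n.
def T(n):
    return 0 if n == 0 else T(n - 1) + 1

assert all(T(n) == n for n in range(50))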
I understand that it is similar to the Fibonacci recurrence, which has exponential running time. However, this recurrence relation has more branches. What are the asymptotic bounds of T(n) = 2T(n-1) + 3T(n-2) + 1?
Usually, you will need to make some assumptions about T(0) and T(1), because their values may determine the functional form of T(n). In this case, however, it doesn't seem to matter.
Then, recurrence relations of this form can be solved by finding their characteristic polynomials. I found this article: http://www.wikihow.com/Solve-Recurrence-Relations
The characteristic polynomial of the homogeneous part is x^2 - 2x - 3 = (x - 3)(x + 1), with roots 3 and -1, so the general solution has the form T(n) = c_1*3^n + c_2*(-1)^n + p, where p is a constant particular solution. In particular, T(n) = 1/2*3^n - 1/4 satisfies the recurrence relation, and we can verify this:
1/2*3^n - 1/4 = 2*T(n-1) + 3*T(n-2) + 1
= 2*(1/2*3^(n-1) - 1/4) + 3*(1/2*3^(n-2) - 1/4) + 1
= 3^(n-1) - 1/2 + 1/2*3^(n-1) - 3/4 + 1
= 3/2*3^(n-1) - 1/4
= 1/2*3^n - 1/4
Hence T(n) = Theta(3^n). This may not be the only function that satisfies the recurrence; the other possibilities depend on the values you defined for T(0) and T(1), but they should all be O(3^n).
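The verification above can also be done in exact arithmetic:

```python
from fractions import Fraction

# Check that T(n) = (1/2)*3^n - 1/4 satisfies
# T(n) = 2*T(n-1) + 3*T(n-2) + 1, using Fraction to avoid
# floating-point error.
def closed(n):
    return Fraction(1, 2) * 3 ** n - Fraction(1, 4)

for n in range(2, 20):
    assert closed(n) == 2 * closed(n - 1) + 3 * closed(n - 2) + 1
```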
Recurrences of this type are called non-homogeneous recurrence relations, and you first have to solve the corresponding homogeneous recurrence (the one without the constant term at the end). If you are interested, read up on the math behind it.
I will show you an easy way. Just type your equation into Wolfram Alpha and you will get a closed-form solution, which is clearly of exponential complexity: O(3^n).