Here are three short procedures:
First code:
procedure Proc(n: integer);
begin
  if n > 0 then
  begin
    writeln('x');
    Proc(n - 2);
    writeln('*');
    Proc(n - 2)
  end
end;
Second code:
procedure Proc(n: integer);
begin
  if n > 0 then
  begin
    writeln('*');
    Proc(n - 1)
  end
end;
Third code:
procedure Proc(n: integer);
begin
  if n > 0 then
  begin
    writeln('x');
    Proc(n div 2);
    writeln('*');
    Proc(n div 2)
  end
end;
I would like to know how to determine the computational complexity of each of these, because it will help me understand the topic better. Can someone show, step by step, how to determine the computational complexity of sample code, in a way that can be applied to other examples as well?
First Question: Assume you know that for the value n - 2, Proc is called T(n - 2) times. Then for the value n, T(n) = 1 + 2T(n - 2): there is one call to Proc(n), which in turn calls Proc(n - 2) twice. T(n) = 1 + 2T(n - 2) is a variant of the Tower of Hanoi recurrence T(n) = 1 + 2T(n - 1). There are proofs here http://en.wikipedia.org/wiki/Tower_of_Hanoi to show that T(n) = 1 + 2T(n - 1) solves to 2^n - 1. In your case the argument drops by 2 instead of 1, so the recursion is only about n/2 levels deep, and the same unrolling gives T(n) = 1 + 2T(n - 2) = 2^(ceil(n/2)) - 1. In other words, skipping every other term of the Tower of Hanoi recurrence halves the exponent: 2^(n/2) - 1 is O(2^(n/2)), and therefore certainly O(2^n).
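If you want to check the closed form empirically, here is a minimal Python sketch, assuming we count calls that do work and take T(n) = 0 for n <= 0:

import math

def calls(n):
    # call count for the first code: T(n) = 1 + 2*T(n - 2), T(n <= 0) = 0
    if n <= 0:
        return 0
    return 1 + 2 * calls(n - 2)

for n in range(1, 11):
    print(n, calls(n), 2 ** math.ceil(n / 2) - 1)  # last two columns agree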
Second Question: This is easier. T(0) = 1 and T(n) = 1 + T(n-1). You can solve this many different ways, but one is via telescoping:
T(n) = 1 + T(n-1)
T(n-1) = 1 + T(n-2)
...
T(1) = 1 + T(0)
Adding up both sides...
T(n) + T(n-1)+...+T(1) = 1 + T(n-1) + ... + 1 + T(0) = n + T(n-1)+...+T(0)
Subtract out like terms.
T(n) = n + T(0) = n + 1. So this is O(n).
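To convince yourself, a tiny sketch (assuming the same call-counting convention, T(0) = 1):

def T(n):
    # second code: one call, then a single recursive call on n - 1
    return 1 if n == 0 else 1 + T(n - 1)

print([T(n) for n in range(6)])  # [1, 2, 3, 4, 5, 6], i.e. T(n) = n + 1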
Third Question: Similar to the first. T(0) = 1, and each call with n > 0 makes two recursive calls on n/2, so T(n) = 1 + 2T(n/2). Note here that T(n) = 1 + 2T(n/2) <= n + 2T(n/2) for n >= 1, so solving the right-hand side gives an upper bound.
So solve T(n) = 2T(n/2) + n by rolling out the recurrence:
T(n) = 2 T(n/2) + n
T(n/2) = 2 T(n/4) + n/2
So T(n) = 4T(n/4) + n + n
T(n/4) = 2T(n/8) + n/4
So T(n) = 8T(n/8) + n + n + n
... It looks like T(n) = 2^kT(n/2^k)+kn for positive k.
Prove it by induction.
k = 1: T(n) = 2 T(n/2)+n which was given. This is our base case.
If true for k-1, show true for k:
T(n) = (2^(k-1))T(n/2^(k-1))+(k-1)n //Inductive hypothesis
T(n/2^(k-1)) = 2T((n/2^(k-1))/2) + n/2^(k-1) // given recurrence
= 2T(n/2^k)+n/2^(k-1)
=> T(n) = (2^k)T(n/2^k)+ n + (k-1)n = (2^k)T(n/2^k) + kn. So true for k.
T(n) = 2^k T(n/2^k) + kn; choose an appropriate positive k, such as k = lg(n) (log base 2).
T(n) = 2^lg(n) T(n/2^lg(n)) + n lg(n) = n T(1) + n lg(n).
T(1) = 1, since Proc would just end. So n T(1) + n lg(n) = n lg(n) + n = O(n lg n), and the base of the logarithm does not matter inside the O.
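A quick numeric check of that upper-bound recurrence, in a sketch assuming the base case S(1) = 1 and n restricted to powers of two:

import math

def S(n):
    # the upper-bound recurrence used above: S(n) = 2*S(n/2) + n, S(1) = 1
    return 1 if n == 1 else 2 * S(n // 2) + n

for n in [2 ** k for k in range(1, 8)]:
    print(n, S(n), int(n * math.log2(n) + n))  # last two columns agree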
Unfortunately, there is no one-size-fits-all procedure for complexity. You have to take it on a case-by-case basis and figure out the problem.
Related
def maximum(array):
    max = array[0]
    counter = 0
    for i in array:
        counter += 1
        if i > max:
            max = i
    return max
I need to analyze this algorithm, which finds the maximum number in an array of n numbers. The only thing I want to know is how to get the recursive and general formula for the average case of this algorithm.
Not sure what you mean by "recursive and general formula for the average case of this algorithm". Your algorithm is not recursive, so how can there be a recursive formula?
Recursive way to find maximum in an array:
def findMax(Array, n):
    # maximum of the first n elements of Array
    if n == 1:
        return Array[0]
    return max(Array[n - 1], findMax(Array, n - 1))
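A quick usage example (the sample data here is made up):

data = [3, 7, 2, 9, 4]
print(findMax(data, len(data)))  # 9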
I guess you want a recurrence relation.
Let T(n) be the time taken to find the maximum of n elements. Then, for the code above:
T(n) = T(n-1) + 1 .... Equation I
In case you are interested to solve the recurrence relation:
T(n-1) = T((n-1)-1) + 1 = T(n-2) + 1 .... Equation II
If you substitute value of T(n-1) from Equation II into Equation I, you get:
T(n) = (T(n-2) + 1) + 1 = T(n-2) + 2
Similarly,
T(n) = T(n-3) + 3
T(n) = T(n-4) + 4
and so on..
Continuing the above for k times,
T(n) = T(n-k) + k
If n - k = 0, i.e. k = n, the equation then becomes
T(n) = T(0) + n = 1 + n (taking T(0) = 1)
Therefore, the recursive algorithm we came up with has time complexity O(n).
Hope it helped.
How do I determine the running time (in terms of Big-Theta) of an algorithm of input size n that satisfies the recurrence relation T(n) = T(n-1) + n for n >= 1, with initial condition T(1) = 1?
Edit: I was practicing with a past exam paper and got stuck on this question. I need guidance.
Look at it this way: T(n) = T(n-1) + n = T(n-2) + (n-1) + n = T(n-3) + (n-2) + (n-1) + n. Continuing down to the initial condition, you get T(n) = 1 + 2 + 3 + ... + n. If you work out the pattern of this series, you will see that it sums to n(n+1)/2. Therefore, T(n) = Θ(n^2).
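A short sketch that evaluates the recurrence directly and compares it against the closed form n(n+1)/2:

def T(n):
    # T(n) = T(n-1) + n with T(1) = 1
    return 1 if n == 1 else T(n - 1) + n

for n in [1, 2, 5, 10, 100]:
    print(n, T(n), n * (n + 1) // 2)  # the two columns are identical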
I have been trying to solve a recurrence relation.
The recurrence is T(n) = T(n/3)+T(2n/3)+n^2
I solved the recurrence and got T(n) = n T(1) + (9/5)(n^2)((5/9)^(log n)).
Can anyone tell me the runtime of this expression?
I think this recurrence works out to Θ(n^2). To see this, we'll show that T(n) = Ω(n^2) and that T(n) = O(n^2).
Showing that T(n) = Ω(n^2) is pretty straightforward - since T(n) has an n^2 term in it, it's certainly Ω(n^2).
Let's now show that T(n) = O(n^2). We have that
T(n) = T(n/3) + T(2n/3) + n^2
Consider this other recurrence:
S(n) = S(2n/3) + S(2n/3) + n^2 = 2S(2n/3) + n^2
Since T(n) is increasing and T(n) ≤ S(n), any upper bound for S(n) will also be an upper bound for T(n).
Using the Master Theorem on S(n), we have that a = 2, b = 3/2, and c = 2. Since log_b a = log_{3/2} 2 = 1.709511291... < c, the Master Theorem says that this will solve to O(n^2). Since S(n) = O(n^2), we also know that T(n) = O(n^2).
We've shown that T(n) = Ω(n^2) and that T(n) = O(n^2), so T(n) = Θ(n^2), as required.
Hope this helps!
(By the way - (5/9)^(log n) = (2^(log 5/9))^(log n) = 2^(log n * log 5/9) = (2^(log n))^(log 5/9) = n^(log 5/9). That makes it a bit easier to reason about.)
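As a numeric sanity check, here is a sketch (the integer division and the base case T(n) = 1 for n < 3 are assumptions) showing that T(n)/n^2 levels off, which is what Θ(n^2) predicts:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + n^2, with an assumed base case T(n) = 1 for n < 3
    if n < 3:
        return 1
    return T(n // 3) + T(2 * n // 3) + n * n

for n in [10, 100, 1000, 10000]:
    print(n, T(n) / (n * n))  # the ratio levels off, consistent with Θ(n^2)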
You can't tell the actual runtime from T(n) or from the time complexity! It is simply an estimate of the running time in terms of the order of the input size n.
One thing I'd like to add:
I haven't solved your recurrence relation, but assuming your derived relation is correct, putting n = 1 into the given recurrence gives
T(1) = T(1/3) + T(2/3) + 1
So either you'll be provided with the values of T(1/3) and T(2/3) in your question, or you have to work out from the problem statement what T(1) should be, just as you would for the Tower of Hanoi problem!
For this recurrence the base case is T(1), and by definition its value is the following:
T(1) = T(1/3) + T(2/3) + 1
Now, since T(n) denotes the running time, the running time of any input that will never be processed is 0; this includes all terms below the base case. So we have:
T(X < 1) = 0
T(1/3) = 0
T(2/3) = 0
T(1) = T(1/3) + T(2/3) + 1^2
T(1) = 0 + 0 + 1
T(1) = 1
Then we can substitute the value:
T(n) = n T(1) + [ (9/5)(n^2)( (5/9)^(log n) ) ]
T(n) = n + ( 9/5 n^2 (5/9)^(log n) )
T(n) = n^2 (9/5)^(1-log(n)) + n
We can bound (9/5)^(1-log(n)) by 9/5 for an asymptotic upper bound, since (9/5)^(1-log(n)) <= 9/5:
T(n) ~ 9/5 n^2 + n
O(T(n)) = O(n^2)
Given..
T(n) = 3 for n <= 1
T(n) = 3T(n/3) + n/3 for n > 1
So the answer is supposed to be O(n log n). Here's how I did it, and it's not giving me the right answer:
T(n) = 3T(n/3) + n/3
T(n/3) = 3T(n/3^2) + n/3^2
Subbing this into T(n) gives..
T(n) = 3(3T(n/3^2) + n/3^2) + n/3
T(n) = 3(3(3T(n/3^3) + n/3^3) + n/3^2) + n/3
Eventually it'll look like..
T(n) = 3^k (T(n/3^k)) + cn/3^k
Setting k = lgn..
T(n) = 3^lgn * (T(n/3^lgn)) + cn/3^lgn
T(n) = n * T(0) + c
T(n) = 3n + c
My answer comes out to O(n), though. What is wrong with my steps?
T(n) = 3T(n/3) + n/3
T(n/3) = 3T(n/9) + n/9
T(n) = 3(3T(n/9) + n/9) + n/3
= 9T(n/9) + 2*n/3 //statement 1
T(n/9)= 3T(n/27) + n/27
T(n) = 9 (3T(n/27)+n/27) + 2*n/3 // replacing T(n/9) in statement 1
= 27 T (n/27) + 3*(n/3)
T(n) = 3^k* T(n/3^k) + k* (n/3) // eventually
replace k with log n to the base 3.
T(n) = n T(1) + (log n) (n/3);
// T(1) = 3
T(n) = 3*n + (log n) (n/3);
Hence, O(n log n)
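The unrolled formula can also be checked numerically; a sketch using the given base case T(1) = 3 and powers of three:

import math

def T(n):
    # T(n) = 3*T(n/3) + n/3 with T(1) = 3
    return 3 if n == 1 else 3 * T(n // 3) + n // 3

for n in [3 ** k for k in range(1, 8)]:
    expected = 3 * n + (n // 3) * round(math.log(n, 3))
    print(n, T(n), expected)  # last two columns agree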
Eventually it'll look like.. T(n) = 3^k (T(n/3^k)) + cn/3^k
No. Eventually it'll look like
T(n) = 3^k * T(n/3^k) + k*n/3
You've expanded the parentheses incorrectly.
These types of problems are easily solved using the Master Theorem. In your case a = b = 3, c = log_3(3) = 1, and because n^c grows at the same rate as your f(n) = n/3, you fall into the second case.
That case contributes one extra log factor, and therefore the answer is O(n log(n)).
This question can be solved by the Master Theorem (in its extended form):
For a recurrence of the form
T(n) = a T(n/b) + Θ(n^k log^p n)
where a >= 1, b > 1, k >= 0 and p is a real number:
1. if a > b^k, then T(n) = Θ(n^(log_b a))
2. if a = b^k
a.) if p > -1, then T(n) = Θ(n^k log^(p+1) n)
b.) if p = -1, then T(n) = Θ(n^k log log n)
c.) if p < -1, then T(n) = Θ(n^k)
3. if a < b^k
a.) if p >= 0, then T(n) = Θ(n^k log^p n)
b.) if p < 0, then T(n) = O(n^k)
So, for the above equation
T(n) = 3T(n/3) + n/3
we have a = 3, b = 3, k = 1, p = 0,
and it falls into case 2.a, where a = b^k.
So the answer will be
O(n⋅log(n))
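For what it's worth, the case analysis above is mechanical enough to put in code. A sketch (the function name and output strings are made up) that classifies T(n) = a·T(n/b) + Θ(n^k log^p n):

import math

def master(a, b, k, p):
    # classify T(n) = a*T(n/b) + Theta(n^k * log^p n) using the cases above
    if a > b ** k:
        return f"Theta(n^{math.log(a, b):.3f})"
    if a == b ** k:
        if p > -1:
            return f"Theta(n^{k} log^{p + 1} n)"
        if p == -1:
            return f"Theta(n^{k} log log n)"
        return f"Theta(n^{k})"
    return f"Theta(n^{k} log^{p} n)" if p >= 0 else f"O(n^{k})"

print(master(3, 3, 1, 0))  # Theta(n^1 log^1 n), i.e. O(n log n)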
I know that
T(n) = T(n/2) + Θ(1) results in O(log N),
and my book says this is the case for Binary Search.
But how do you know that? Is it just by the fact that Binary Search cuts the problem in half each time, so it is O(log N)?
And for T(n) = 2T(n/2) + Θ(1),
why is the result O(N) and not O(log N), when the algorithm divides the problem in half each time as well?
Then does T(n) = 2T(n/2) + Θ(n)
result in O(N log N)? I see that the O(N) comes from Θ(n) and the O(log N) comes from T(n/2).
I am really confused about how to determine the Big O of an algorithm, to the point that I don't even know how to word my question properly. I hope it makes sense.
Thanks in advance!
An intuitive way to solve these problems is to look at the result of unfolding the recursive formula.
Let's assume Θ(1) is actually 1 and Θ(n) is n, for simplicity.
T(n) = T(n/2) + 1 = T(n/4) + 1 + 1 = T(n/8) + 1 + 1 + 1 = ... =
= T(1) + 1 + ... + 1 [log n times] = log n
T'(n) = 2T'(n/2) + 1 = 2(2T'(n/4) + 1) + 1 = 4T'(n/4) + 2 + 1 =
= 8T'(n/8) + 4 + 2 + 1 = ... = 2^(log n) + 2^(log n - 1) + ... + 1 = n + n/2 + ... + 1 =
= 2n - 1
T''(n) = 2T''(n/2) + n = 2(2T''(n/4) + n/2) + n = 4T''(n/4) + 2*(n/2) + n =
= 8T''(n/8) + 4*(n/4) + 2*(n/2) + n = ... = n + n + ... + n [log n times] = n log n
To formally prove these equations, you should use induction: assume the bound holds for all smaller inputs and use that to prove it for n.
For example, for the first formula [T(n) = T(n/2) + 1], assume the base case is T(1) = 0.
The base trivially holds for n = 1.
Assume T(k) <= log k for every k <= n - 1, and prove it for k = n:
T(n) = T(n/2) + 1 <= (induction hypothesis) log(n/2) + 1 = log(n/2) + log(2) = log(n/2 * 2) = log(n)
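All three recurrences can also be tabulated directly; a small sketch with assumed base cases that reproduces the three closed forms:

def T1(n):
    # T(n) = T(n/2) + 1, T(1) = 0  ->  log n
    return 0 if n <= 1 else T1(n // 2) + 1

def T2(n):
    # T(n) = 2*T(n/2) + 1, T(1) = 1  ->  2n - 1
    return 1 if n <= 1 else 2 * T2(n // 2) + 1

def T3(n):
    # T(n) = 2*T(n/2) + n, T(1) = 0  ->  n log n
    return 0 if n <= 1 else 2 * T3(n // 2) + n

n = 1024
print(T1(n), T2(n), T3(n))  # 10, 2047, 10240 = log n, 2n - 1, n log n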
I find an easy way to understand these is to consider the time the algorithm spends on each step of the recurrence, and then add them up to find the total time. First, let's consider
T(n) = T(n/2) + O(1)
where n = 64. Let's add up how much work the algorithm does at each step:
T(64) = T(32) + 1 ... 1 so far
T(32) = T(16) + 1 ... 2 so far
T(16) = T(08) + 1 ... 3 so far
T(08) = T(04) + 1 ... 4 so far
T(04) = T(02) + 1 ... 5 so far
T(02) = T(01) + 1 ... 6 so far
T(01) = 1 ... 7 total
So, we can see that the algorithm took '1' time at each step. And, since each step divides the input in half, the total work is the number of times the algorithm had to divide the input in two... which is log2 n.
Next, let's consider the case where
T(n) = 2T(n/2) + O(1)
However, to make things simpler, we'll build up from the base case T(1) = 1.
T(01) = 1 ... 1 so far
now we have to do T(01) twice and then add one, so
T(02) = 2T(01) + 1 ... (1*2)+1 = 3
now we have to do T(02) twice, and then add one, so
T(04) = 2T(02) + 1 ... (3*2)+1 = 7
T(08) = 2T(04) + 1 ... (7*2)+1 = 15
T(16) = 2T(08) + 1 ... (15*2)+1 = 31
T(32) = 2T(16) + 1 ... (31*2)+1 = 63
T(64) = 2T(32) + 1 ... (63*2)+1 = 127
So we can see that here the algorithm has done 127 units of work - which is the input size multiplied by a constant (2) plus a constant (-1), i.e. 2n - 1, which is O(n). Basically this recursion corresponds to the geometric series 1 + 1/2 + 1/4 + 1/8 + ..., which sums to 2; the total work is about that constant times n.
Try using this method on T(n) = 2T(n/2) + n and see if it makes more sense to you.
One visual way to find T(n) for a recursive equation is to sketch it as a tree; then:
T(n) = number of nodes * time spent at each node.
In your case T(n) = 2T(n/2) + 1.
I write the 1 in the node itself and expand it into two T(n/2) nodes.
Note that T(n/2) = 2T(n/4) + 1, so again I do the same for each of them.
T(n) + 1
/ \
T(n/2)+1 T(n/2)+1
/ \ / \
T(n/4)+1 T(n/4)+1 T(n/4)+1 T(n/4)+1
... ... .. .. .. .. .. ..
T(1) T(1) .......... ............T(1)
In this tree the number of leaves equals
2^(height of tree) = 2^(log n) = n, so the total number of nodes is 2n - 1.
Then T(n) = (2n - 1) * 1 = O(n)
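To see the node count concretely, a small sketch (with n restricted to powers of two):

def nodes(n):
    # nodes in the recursion tree of T(n) = 2*T(n/2) + 1
    return 1 if n <= 1 else 1 + 2 * nodes(n // 2)

for n in [2 ** k for k in range(1, 8)]:
    print(n, nodes(n), 2 * n - 1)  # node count is 2n - 1, hence O(n)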