Order of growth determination - algorithm

I am practicing order of growth and I was having trouble determining the order of growth for the following function:
def ri(na):
    if na <= 1:
        return na
    def han(na):
        i = 1
        while i < na:
            i *= 2
        return i
    return ri(na/2) + ri(na/2) + han(na-2)
I believe the function han has an order of growth of $\Theta(\log n)$, but I'm not sure how to think about this when adding the ri(na/2) calls. I would appreciate it if anyone could help me figure out how to compute the running time. Thanks so much!

The time complexity of the han function is Theta(log(n)), since i is multiplied by 2 on each iteration and the loop therefore runs about log2(n) times. Hence, the running time of ri satisfies the recurrence T(n) = 2T(n/2) + Theta(log(n)). Since log(n) is polynomially smaller than n^(log_2 2) = n, the master theorem (case 1) gives T(n) = Theta(n).
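If you want to sanity-check that bound empirically, here is a minimal sketch (the counted wrapper and the operation counter are my own additions, not part of the original code) that counts loop iterations and recursive calls; the ratio ops/n should level off near a constant as n grows, consistent with Theta(n):

def counted(na):
    ops = 0

    def han(m):
        nonlocal ops
        i = 1
        while i < m:
            ops += 1          # one unit of work per loop iteration
            i *= 2
        return i

    def ri(m):
        nonlocal ops
        ops += 1              # one unit of work per call
        if m <= 1:
            return m
        return ri(m / 2) + ri(m / 2) + han(m - 2)

    ri(na)
    return ops

for n in (2 ** k for k in range(6, 14)):
    ops = counted(n)
    print(n, ops, ops / n)    # the last column should approach a constant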

Related

T (Running Time) vs. T (Recurrence)

I don't understand what the difference is between the T from a recurrence and the T from running time. I took some courses that taught me recursion and linear recurrences. For example, in this code:
factorial(n) {
    if (n == 0)
        return 1
    else
        return n * factorial(n-1)
}
Why is the time complexity O(n)?
I took a course about recurrences and I'm a little confused. I would analyze the code this way:
T(n) = 1, n = 0
T(n) = n * T(n-1)
So if we expand the recursion:
T(n) = n * (n-1) * T(n-2)
so the recursion grows like n! and the growth is O(n!). However, the standard analysis says something different, so what am I doing wrong?
I also have a similar question. I took a course on linear recurrences, where I learned how to solve a recurrence such as f(n) = f(n-1) + f(n-2).
So in the Fibonacci program:
def Fibonacci(n):
    if n < 0:
        print("Incorrect input")
    # First Fibonacci number is 0
    elif n == 0:
        return 0
    # Second Fibonacci number is 1
    elif n == 1:
        return 1
    else:
        return Fibonacci(n-1) + Fibonacci(n-2)
I would solve the Fibonacci linear recurrence with a closed form like this:
$$\frac{1}{\sqrt{5}}\cdot\frac{1+\sqrt{5}}{2}\cdot\left(\frac{1+\sqrt{5}}{2}\right)^{n} - \frac{1}{\sqrt{5}}\cdot\frac{1-\sqrt{5}}{2}\cdot\left(\frac{1-\sqrt{5}}{2}\right)^{n}$$
Why would the order of growth not be $O\!\left(\frac{1}{\sqrt{5}}\cdot\frac{1+\sqrt{5}}{2}\cdot\left(\frac{1+\sqrt{5}}{2}\right)^{n}\right)$?
There's a difference between the value of the function and the time to compute that value.
When you analyze your linear recurrence, you claim that the analysis is:
T(n) = 1, n = 0
T(n) = n * T(n-1)
But the work involved in computing each next term is really just 1 (one multiplication), given the previous value. So it should be T(n) = 1 + T(n-1). When you rerun your analysis with this recurrence, the linear result becomes clear.
A similar separation between the value and the runtime will help you analyze your second question.
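A small sketch may make the distinction concrete (the counting wrapper below is my own illustration, not from the question): it returns both the value of factorial(n) and the number of multiplications performed, so you can see the value grow like n! while the work grows like n.

def factorial_counted(n):
    # Returns (value, multiplications) so value and work can be compared.
    if n == 0:
        return 1, 0
    value, mults = factorial_counted(n - 1)
    return n * value, mults + 1          # one extra multiplication per level

for n in (5, 10, 20):
    value, mults = factorial_counted(n)
    print(n, value, mults)               # value grows like n!, work grows like n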

How to calculate time complexity of this recursion?

I'm interested in calculating the following code's time and space complexity, but I'm struggling a lot. I know that the deepest the recursion can reach is n, so the space should be O(n). However, I have no idea how to calculate the time complexity; I don't know how to write the recurrence for recursions of this form, like f(f(n-1)).
If it were something like return f3(n-1) + f3(n-1), then I know it should be O(2^n), since T(n) = 2T(n-1), correct?
Here's the code:
int f3(int n)
{
    if (n <= 2)
        return 1;
    f3(1 + f3(n-2));
    return n - 1;
}
Thank you for your help!
Notice that f3(n) = n - 1 for all n ≥ 2 (it returns 1 when n ≤ 2 and n - 1 otherwise). So in the line f3(1 + f3(n-2)), first f3(n-2) is computed, which returns n - 3, and then f3(1 + (n - 3)) = f3(n-2) is computed again!
So f3(n) computes f3(n-2) twice, along with some O(1) overhead.
This gives the recurrence T(n) = 2T(n-2) + c for some constant c, where T(n) is the running time of f3(n).
Solving the recurrence, we get T(n) = O(2^(n/2)).
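If you want to check that growth rate numerically, here is a minimal sketch (a Python port of the C function with a call counter added; the counter is my own instrumentation): the ratio of the call count to 2^(n/2) should stay roughly bounded as n increases.

calls = 0

def f3(n):
    global calls
    calls += 1
    if n <= 2:
        return 1
    f3(1 + f3(n - 2))
    return n - 1

for n in range(10, 31, 4):
    calls = 0
    f3(n)
    print(n, calls, calls / 2 ** (n / 2))   # last column stays roughly bounded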

Solve recurrence T(n) = T(6n/5) + 1

So I'm preparing for the algorithms exam and I don't know how to solve the recurrence T(n) = T(6n/5) + 1, since b = 5/6 < 1 and the master theorem cannot be applied.
I hope that someone can give me a hint on how to solve this. :)
Given just that recurrence relation (and no additional information like T(x) = 1 when x > 100), an algorithm with time complexity as described by the relation will never terminate, as the amount of work increases at each call.
T(n) = T(6n/5) + 1
= T(36n/25) + 2
= T(216n/125) + 3
= ...
You can see that the amount of work increases with each call, and that there is no limit to how much it grows. As a result, there is no bound on the time complexity of the function.
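To put the same expansion in general form (my own restatement of the lines above): after k substitutions,

$$T(n) = T\!\left(\left(\tfrac{6}{5}\right)^{k} n\right) + k,$$

and since the argument only grows as k increases, the expansion never reaches a base case, so the additive k term grows without bound.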
We can even argue (informally) that such an algorithm cannot exist: growing the input to 6/5 of its size on each call requires handling at least 0.2n extra input, which is Ω(n) work, but the cost at each step is claimed to be 1, i.e. O(1). So no algorithm can be described by this exact recurrence (recurrences such as T(n) = T(6n/5) + n are fine, though).

Calculating Algorithm Efficiency and Complexity

My instructor taught us how to calculate the computational complexity of an algorithm, but she only did so very briefly and not so well. More specifically, could someone help me calculate the computational complexity of the following:
While (PLength > 0)
    chartest = sHex(i)
    ascvalue = Strings.Asc(chartest)
    decvalue = Convert.ToDecimal(ascvalue)
    shiftdecvalue = decvalue + 1
    asc = ChrW(shiftdecvalue)
    emptychararray(i) = asc
    i = i + 1
    PLength = PLength - 1
End While
The way I've thought about it, it just comes out to T(n) = C_1*n + C_2*(n-1) + C_3*(n-1) + C_4*(n-1) + C_5*(n-1) + C_6*(n-1) + C_7*(n-1) + C_8*(n-1) + C_9*(n-1) + C_10*(n-1).
But I feel like I might be oversimplifying it. Furthermore, how do I get the big O notation for this? I've been looking online for resources on how to teach myself this, so if you have any recommendations, those would be greatly appreciated. Thank you in advance.
First note that T(n) can be rewritten as T(n) = C*n + D for some constants C and D: collect the coefficients of n into C and the leftover constant terms into D. Big O notation ignores constant factors and lower-order terms, so the complexity is O(n).
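Spelled out with the constants from the question, the collapsing step is just:

$$T(n) = C_1 n + (C_2 + \dots + C_{10})(n-1) = (C_1 + C_2 + \dots + C_{10})\,n - (C_2 + \dots + C_{10}),$$

which has the form C*n + D, hence O(n).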

Finding time complexity of relation T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3

Somewhat similar to the Fibonacci sequence.
The running time of an algorithm is given by
T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3
T(n) = n otherwise.
What is the order of this algorithm?
If calculated via the characteristic equation, then
T(n) = T(n-1) + T(n-2) + T(n-3)
Let us assume T(n) to be some function aⁿ
then aⁿ = aⁿ⁻¹ + aⁿ⁻² + aⁿ⁻³
=> a³ = a² + a + 1
which also gives complex solutions; according to my calculations, the roots of the above equation are
a = 1.839286755
a = -0.419643 - i(0.606291)
a = -0.419643 + i(0.606291)
Now, how can I proceed further or is there any other method for this?
If I remember correctly, once you have determined the roots of the characteristic equation, T(n) can be written as a linear combination of powers of those roots:
T(n) = A1*root1^n + A2*root2^n + A3*root3^n
So the dominant term here will be (maxroot)^n, where maxroot is the root with the maximum absolute value. For your case that is roughly 1.839^n.
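As a quick numeric check (the code below is my own illustration, using the base case T(n) = n for n ≤ 3 from the question): computing T(n) directly and dividing by 1.839286755^n should give a ratio that settles near a constant, which is what the dominant-root argument predicts.

ROOT = 1.839286755        # the dominant root computed above

def T(n):
    # T(n) = n for n <= 3, otherwise T(n-1) + T(n-2) + T(n-3)
    if n <= 3:
        return n
    a, b, c = 1, 2, 3
    for _ in range(n - 3):
        a, b, c = b, c, a + b + c
    return c

for n in range(5, 41, 5):
    print(n, T(n), T(n) / ROOT ** n)   # last column settles near a constant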
Asymptotic analysis is done on the running times of programs; it tells us how the running time grows with the size of the input.
For recurrence relations (like the one you mentioned), we use a two-step process:
1. Estimate the running time using the recursion tree method.
2. Validate (confirm) the estimate using the substitution method.
You can find explanations of these methods in any algorithms text (e.g., Cormen).
As a rough upper bound, it can be approximated by 3 + 9 + 27 + ... + 3^n, which is O(3^n) (a looser bound than the ~1.839^n estimate above).

Resources