How many calls to the generator are made?

Suppose I have the following algorithm:
procedure(n)
    if n == 1 then return
    R = generaterandom()
    procedure(n/2)
Now I understand that the complexity of this algorithm is log(n), but does it make log(n) calls to the random generator, or log(n) - 1, since the generator is not called when n == 1?
Sorry if this is obvious, but I've been looking around and the exact answer isn't really stated anywhere.

There are ceil(log(n)) calls to the generator.
Proof using induction:
Hypothesis:
There are ceil(log(k)) calls to the generator for each k < n.
Base:
log_2(1) = 0 => 0 calls
Step:
For arbitrary n > 1 there is one call, and then by the hypothesis ceil(log(n/2)) more calls in the recursive call.
Since ceil(x) + 1 = ceil(x + 1), this gives a total of ceil(log(n/2)) + 1 = ceil(log(n/2) + log(2)) = ceil(log(n/2 * 2)) = ceil(log(n)) calls.
QED
Note: here, all logs are base 2.
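
To check the count empirically, here is a minimal Python sketch (generaterandom is a hypothetical stand-in that only counts its invocations; n/2 is taken as ceiling division so the identity in the proof also holds for odd n):

import math

calls = 0

def generaterandom():
    # stand-in for the real generator: we only count invocations
    global calls
    calls += 1

def procedure(n):
    if n == 1:
        return
    generaterandom()
    procedure((n + 1) // 2)  # halve, rounding up

for n in (1, 2, 7, 100, 1024):
    calls = 0
    procedure(n)
    print(n, calls, math.ceil(math.log2(n)))  # last two columns agree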

By the Master theorem, your method can be written as T(n) = T(n/2) + O(1), since you halve n on every call, and this is exactly O(log n). I realize you are not asking for a complexity analysis, but as mentioned, the idea is the same: counting the calls is equivalent to analyzing the complexity.

What is the time complexity for this generic bruteforce/backtracking function?

I am having trouble reasoning about the time complexity of this. I was writing a backtracking function to solve a problem. To simplify, let's just say I have a list of size "a" and I am allowed to place either 0 or 1 into each element of the list. After trying all combinations, I return. This is clearly O(2^a).
However, what if during each recursive call I have a constant amount of work to do? I am stuck reasoning through what the complexity is here. Can you point me to sources? From my undergrad studies all I remember is the Master theorem, but that approach is not relevant here. (We subtract rather than divide to get the subproblem.)
def myfunc(x, a):
    if x == a:
        return
    myfunc2()  # some constant-time work
    myfunc(x + 1, a)
    myfunc(x + 1, a)
In your case, the recurrence is T(n) = m + 2T(n - 1), where m is the constant work per call and n is the remaining depth (a - x). Although we can't apply the Master theorem here, we can expand by substitution:
T(n) = m + 2T(n - 1)
     = m + 2(m + 2T(n - 2))
     = m + 2m + 4m + 8T(n - 3)
     = ...
     = m(2^0 + 2^1 + ... + 2^(n - 1)) + 2^n T(0)
     = m(2^n - 1) + O(2^n)
Evaluating this, we have ϴ(2^n).
Recursion doesn't really offer you any benefit here other than readability. You could see savings if you combined this with memoization of what you've already evaluated, however. In that case, evaluating the time complexity becomes more... complex.
The time complexity is obviously exponential.
And the myfunc2() contribution is negligible unless x is less than 2.
Maybe if myfunc2() is searching for 42.
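
As a quick empirical check, here is a minimal Python sketch (myfunc2 is replaced by a hypothetical counter so we can tally the constant-time work):

def count_calls(a):
    calls = 0

    def myfunc2():
        nonlocal calls
        calls += 1  # stand-in for the constant-time work

    def myfunc(x):
        if x == a:
            return
        myfunc2()
        myfunc(x + 1)
        myfunc(x + 1)

    myfunc(0)
    return calls

for a in range(1, 8):
    print(a, count_calls(a))  # prints 2^a - 1, matching m(2^n - 1) with m = 1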

Recursive equation from algorithm

I started my master's degree in bioinformatics this October; for a former biologist, finding a recurrence equation from a piece of code is pretty hard. If somebody could explain this to me, I would be very grateful.
How do I find a recurrence equation from this piece of code?
procedure DC(n)
    if n < 1 then return
    for i <- 1 to 8 do DC(n/2)
    for i <- 1 to n³ do dummy <- 0
My guess is T(n) = c + 8T(n/2), because the if condition needs constant time c and the first for loop is the recursive case, performing eight calls, hence 8T(n/2); but I don't know how to add the last line of code to my equation.
You’re close, but that’s not quite it.
Usually, a recurrence relation only describes the work done by the recursive step of a recursive procedure, since it's assumed that the base case does a constant amount of work. You'd therefore want to look at
what recursive calls are made and on what input sizes, and
how much work is done outside of those calls.
You've correctly identified that there are eight recursive calls on inputs of size n/2, so the 8T(n/2) term is correct. However, notice that this is followed by a loop that does O(n³) work. As a result, your recursive function is more accurately modeled as
T(n) = 8T(n/2) + O(n³).
It's then worth seeing if you can argue why this recurrence solves to O(n³ log n).
This turns out to be T(n) = 8T(n/2) + O(n^3).
I will give you a stab at solving this with the iteration/recursion-tree method:
T(n) = 8T(n/2) + O(n^3)
     ~ 8T(n/2) + n^3
     = 8(8T(n/4) + (n/2)^3) + n^3
     = 8^2 T(n/4) + 8(n/2)^3 + n^3
     = 8^2 T(n/2^2) + n^3 + n^3
     = 8^2 (8T(n/2^3) + (n/2^2)^3) + n^3 + n^3
     = 8^3 T(n/2^3) + n^3 + n^3 + n^3
     ...
     = 8^k T(n/2^k) + n^3 + n^3 + ... (k times) ... + n^3
This stops when n/2^k = 1, i.e. k = log_2(n). At that point the first term is 8^(log_2 n) T(1) = n^3 * O(1), and the remaining terms add up to n^3 log_2(n).
So the complexity is O(n^3 log(n)).
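
To see this in numbers, here is a minimal Python sketch (a hypothetical instrumented version of DC that returns how many dummy assignments are executed; for powers of two the count is exactly n^3 * (1 + log_2 n)):

import math

def dc_work(n):
    # number of dummy assignments executed by DC(n)
    if n < 1:
        return 0
    return 8 * dc_work(n / 2) + int(n) ** 3

for n in (2, 4, 16, 64):
    print(n, dc_work(n), n**3 * (1 + math.log2(n)))  # the two counts agree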

What is the big-O complexity of the following pseudocode?

What would be the computational complexity of the following pseudocode?
integer recursive (integer n) {
    if (n == 1)
        return (1);
    else
        return (recursive (n-1) + recursive (n-1));
}
In the real world, the calls would get optimized and yield linear complexity, but with the RAM model on which big-Oh is calculated, what would be the complexity? 2^n?
The complexity of this algorithm in its current form is indeed O(2^n), because on each level of the call tree there are twice as many calls.
The first call (recursive(n)) constitutes 1 call
The next level (recursive(n-1)) constitutes 2 calls
At the base case (recursive(1)) there are 2^(n-1) calls.
So the total number of function calls is 1 + 2 + … + 2^(n-1) = 2^n - 1.
So the complexity is O(2^n).
Additional points:
As you said, this can easily be made O(n) by memoization or dynamic programming (or even O(log n) for this special case using fast exponentiation, since recursive(n) simply computes 2^(n-1)).
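A minimal sketch of the memoized version in Python, using functools.lru_cache (the two identical subcalls collapse into a single cached lookup, so each distinct n is computed only once):

from functools import lru_cache

@lru_cache(maxsize=None)
def recursive(n):
    if n == 1:
        return 1
    return recursive(n - 1) + recursive(n - 1)  # second call hits the cache

print(recursive(30))  # 536870912 == 2**29, computed with O(n) calls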
Your complexity will be 2^(N-1) steps.
Why is it so? A simple proof by mathematical induction:
N = 1: special case, count of steps = 1.
N = 2: obviously 2, so the formula is correct.
Let it be correct for N = K, i.e. for N = K the count is 2^(K-1).
Now take N = K + 1. The function recursive will call itself recursively for N = K two times: recursive(K+1) = recursive(K) + recursive(K), as follows from the code. That is, 2 * 2^(K-1) = 2^K. So for N = K + 1 we get 2^K = 2^((K+1)-1) steps.
So we have proved by mathematical induction that the count of steps for N is 2^(N-1), i.e. the complexity is O(2^N).

Finding time complexity of the relation T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3

Somewhat similar to the Fibonacci sequence. The running time of an algorithm is given by
T(n) = T(n-1) + T(n-2) + T(n-3) if n > 3
     = n otherwise
What is the order of this algorithm?
If calculated by the induction method:
T(n) = T(n-1) + T(n-2) + T(n-3)
Let us assume T(n) to be some function aⁿ.
Then aⁿ = aⁿ⁻¹ + aⁿ⁻² + aⁿ⁻³
=> a³ = a² + a + 1
which also gives complex solutions; the roots of the above equation, according to my calculations, are
a ≈ 1.839287
a ≈ -0.419643 - 0.606291i
a ≈ -0.419643 + 0.606291i
Now, how can I proceed further or is there any other method for this?
If I remember correctly, once you have determined the roots of the characteristic equation, T(n) is a linear combination of the powers of those roots:
T(n) = A1*root1^n + A2*root2^n + A3*root3^n
So the dominant term, and hence the complexity, is (maxroot)^n, where maxroot is the root with the maximum absolute value. For your case that is the real root, so T(n) grows as ≈ 1.839^n.
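A quick numerical check, as a minimal Python sketch (using numpy for the roots of the cubic; the base cases T(1..3) = n come from the problem statement):

import numpy as np

# roots of a^3 - a^2 - a - 1 = 0, the characteristic equation
print(np.roots([1, -1, -1, -1]))  # one real root ~1.8393, two complex roots

# the ratio T(n)/T(n-1) converges to the dominant root
T = {1: 1, 2: 2, 3: 3}
for n in range(4, 40):
    T[n] = T[n - 1] + T[n - 2] + T[n - 3]
print(T[39] / T[38])  # ~1.8393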
Asymptotic analysis is done on the running times of programs to tell us how the running time grows with the input size.
For recurrence relations (like the one you mentioned), we use a two-step process:
Estimate the running time using the recursion-tree method.
Validate (confirm) the estimate using the substitution method.
You can find an explanation of these methods in any algorithms text (e.g. Cormen).
Since T(n) ≤ 3T(n-1), it can be bounded above by 3 + 9 + 27 + … + 3^n, which is O(3^n) (a loose upper bound).

Recursive function with a specific runtime of Theta(...)

I'm stuck with this homework from my Algorithms course:
Write a recursive function with a runtime of Theta(n^4 log n).
I thought of something like this, but I'm very unsure about my approach.
function(int n)
{
    for-loop (1..n^4)
        // do something
    return function(n/2);
}
You should be unsure; your function has a problem:
It doesn't have a base case, so it runs forever.
If you add a base case, your function will be:
T(n) = T(n/2) + O(n^4)
and by the Master theorem this is Θ(n^4).
Hint: you should increase the coefficient of T(n/2), but by how much? Find it yourself. To increase the coefficient to x, you call function(n/2) x times.
By the Master theorem, a log n factor appears when we have a recurrence like this:
T(n) = a T(n/b) + n^(log_b a)
In your case you need log_b a = 4, so you can fix b = 2 and a = 2^4 = 16 to achieve this:
T(n) = 16T(n/2) + n^4
and to arrive at this, you call function(n/2) 16 times.
Hint: n^4 could be four nested loops from 1 to n. A logarithmic run-time factor is usually obtained by halving the problem size recursively until reaching 1. Your proposal kind of works, but it is quite blunt; you could even do
func(int n)
    for i = 1 to n^4 * log n
        nop()
but I don't believe that is what is being looked for.
Your approach is sane, and your insecurity is normal. You should now prove that your algorithm is Theta(n^4 log n). Apply your usual algorithmic analysis techniques to show that function executes its "do something" n^4 log_2 n times.
Hint: count how many times function is called recursively and how often the loop runs in each call. You'll see that there is still a small bug in your function: the n in the n^4 factor is halved in each recursive call, so the per-level work shrinks geometrically and the total is only Θ(n^4).
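
A minimal Python sketch of the corrected idea (assuming the a = 16 fix above: sixteen recursive calls on n/2 plus an n^4 loop give T(n) = 16T(n/2) + n^4, which is Theta(n^4 log n) by case 2 of the Master theorem):

def function(n):
    if n <= 1:
        return 0
    work = 0
    for _ in range(n ** 4):    # Theta(n^4) local work
        work += 1
    for _ in range(16):        # 16 = 2^4 recursive calls on half the input
        work += function(n // 2)
    return work  # for n a power of two this totals n^4 * log_2(n)

print(function(8))  # 12288 == 8**4 * 3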
