What is the worst case time complexity of the following:
def fun(n):
    count = 0
    i = n
    while i > 0:
        for j in range(0, i):
            count += 1
        i //= 2  # integer halving; i /= 2 would make i a float and break range()
    return count
Worst case time complexity if n is really big integer.
O(log(n)*n) (Guessing).
Here's how I came to my conclusion.
while i > 0:
    ...
    i //= 2
This will run log(n) times because it's getting halved every time it runs.
for j in range(0,i):
This will run n times the first time, n/2 times the second time, and so on. So the total running time for this line is n + n/2 + n/4 + ... + 1 = (2n - 1).
count+=1
This is a cheap operation so is O(1).
Thus the total running time of this function would be O(log(n)) * O(2n - 1), which simplifies to O(n*log(n)).
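As a sanity check on that geometric sum, here is a quick empirical run (my addition, using integer halving with // so that range() always gets an int): the count that fun returns is exactly the total number of inner-loop iterations, and it comes out to about 2n.

```python
def fun(n):
    count = 0
    i = n
    while i > 0:
        for j in range(0, i):
            count += 1
        i //= 2  # integer halving, so range() always gets an int
    return count

# count equals n + n/2 + n/4 + ... + 1
print(fun(1024))  # 2047, which is 2*1024 - 1
```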
Related
BinaryConversion:
We are inputting a positive integer n with the output being a binary representation of n on a stack.
What would the time complexity here be? I'm thinking it's O(n) as the while loop halves every time, meaning the iterations for a set of inputs size 'n' decrease to n/2, n/4, n/8 etc.
Applying sum of geometric series whereby n = a and r = 1/2, we get 2n.
Any help appreciated ! I'm still a noob.
create empty stack S
while n > 0 do
    push (n mod 2) onto S
    n = floor(n / 2)
end while
return S
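A direct Python transcription of the pseudocode (my addition, using a plain list as the stack) makes it easy to see that the loop runs once per bit of n:

```python
def to_binary(n):
    stack = []
    while n > 0:
        stack.append(n % 2)  # push (n mod 2) onto S
        n //= 2              # n = floor(n / 2)
    return stack

bits = to_binary(13)
# popping the stack yields the binary digits most-significant first
print("".join(str(bits.pop()) for _ in range(len(bits))))  # 1101
```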
If the loop were

while n > 0:
    for i in range(n):
        # some action
    n = n // 2
Then the complexity would have been O(n + n/2 + n/4 ... 1) ~ O(n), and your answer would have been correct.
while n > 0 do
    # some action
    n = n / 2
Here, however, the complexity is just the number of times the outer loop runs, since the amount of work done in each iteration is O(1). So the answer is O(log(n)) (since n is halved each time).
The number of iterations is the number of times you have to divide n by 2 to get 0, which is O(log n).
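That count has a closed form: repeatedly halving a positive integer until it reaches 0 takes exactly floor(log2 n) + 1 steps, which Python exposes as n.bit_length(). A small check (my addition):

```python
def halvings(n):
    # count how many times n can be halved before reaching 0
    count = 0
    while n > 0:
        n //= 2
        count += 1
    return count

for n in (1, 2, 1000, 10**9):
    assert halvings(n) == n.bit_length()  # floor(log2 n) + 1
```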
I have this algorithm and I wanna analyse the time complexity but I am not sure I am correct:
n = int(input("Enter Target Value: "))
x = 1
count = 0
while n != x:
    if n % 2 == 0:
        n //= 2  # integer division keeps n an int
        count += 1
    else:
        n -= 1
        count += 1
print(count)
For the while loop, n /= 2 will have a time complexity of O(log n) and n -= 1 will be O(n), so O(log n) + O(n) will still be O(log n) for the whole loop. The three initializations will be O(1), so the running time complexity of this algorithm will be O(log n). Am I correct? Thanks
The outcome is correct, but the reasoning is not. The n-=1 statement will not be executed O(n) times, and O(logn)+O(n) is actually O(n), not O(logn).
Imagine n in its binary representation. Then the n-=1 operation will be executed just as many times as there are 1-bits in that representation. The n/=2 statement will be executed just as many times as there are bits in the representation, regardless of whether they are 0 or 1. This is because a 1-bit will first be converted to a 0-bit with the n-=1 operation, and then the next iteration will pick up that same bit (which has become 0) for the n/=2 operation, which actually drops that bit.
So in the worst case, all the significant bits of n are 1-bits. Then you have O(log n) executions of the n -= 1 operation, and O(log n) executions of n /= 2. In total the loop makes 2·O(log n) iterations, which gives this algorithm an O(log n) time complexity.
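To see the bit-counting argument in action, here is the loop instrumented to count each kind of operation (my addition; note that because the loop stops at n == 1 rather than 0, the final 1-bit is never subtracted away, hence the -1 offsets below):

```python
def count_ops(n):
    subs = divs = 0
    while n != 1:
        if n % 2 == 0:
            n //= 2
            divs += 1
        else:
            n -= 1
            subs += 1
    return subs, divs

n = 0b101101                           # 45: four 1-bits, six bits total
subs, divs = count_ops(n)
assert subs == bin(n).count("1") - 1   # one n -= 1 per 1-bit (except the last)
assert divs == n.bit_length() - 1      # one n //= 2 per bit dropped
```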
I have the following pseudocode:
for i = 1 to 3*n
    for j = 1 to i*i
        for k = 1 to j
            if j mod i = 1 then
                s = s + 1
            endif
        next k
    next j
next i
When I want to analyze the number of times the part s=s+1 is performed, assuming that this operation takes constant time, do I end up with a quadratic complexity, or is it linear? The value of n can be any positive integer.
The calculations that I made are the following:
Definitely not quadratic, but it should at least be polynomial.
It goes through 3n iterations.
On each iteration it does up to 9n^2 more.
On each of those it does up to 9n^2 more.
So I think it would be O(n^5).
When talking about running time, you should always make explicit in terms of what you are defining your running time.
If we assume you are talking about your running time in terms of n, the answer is O(n^5). This is because what you are doing boils down to this, when we get rid of the constant factors:
do n times:
    do n^2 times:
        do n^2 times:
            do something
And n * n^2 * n^2 = n^5
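A brute-force count (my addition) confirms that growth: summing the k-loop length j over all (i, j) pairs gives the total number of times the innermost body runs, and doubling n multiplies it by roughly 2^5 = 32.

```python
def total_iterations(n):
    # count how many times the innermost body (the `if` test) runs
    total = 0
    for i in range(1, 3 * n + 1):
        for j in range(1, i * i + 1):
            total += j  # the k-loop runs j times
    return total

t2, t4 = total_iterations(2), total_iterations(4)
print(t2, t4, t4 / t2)  # 1183 30680 ~25.9 (ratio approaches 32 for larger n)
```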
x = 0
for i = 1 to ceiling(log(n))
    for j = 1 to i
        for k = 1 to 10
            x = x + 1
I've included the answer I've come up with here:
I think the time complexity is θ(n^2 log(n)), but I am not sure my logic is correct. I would really appreciate any help understanding how to do this type of analysis!
The outermost loop will run ceil(log n) times. The middle loop depends on the value of i, so its behaviour will be:
1st iteration of outermost-loop - 1
2nd iteration of outermost-loop - 2
.....................................
ceil(log n) iteration of outermost-loop - ceil(log n)
The innermost loop is independent of the other variables and will always run 10 times for each iteration of the middle loop.
Therefore, net iterations
= [1*10 + 2*10 + 3*10 + ... + ceil(log n)*10]
= 10 * {1 + 2 + ... + ceil(log n)}
= 10 * {ceil(log n) * (ceil(log n) + 1) / 2}
= 5 * [ceil(log n)]^2 + 5 * ceil(log n)
= Θ{(log n)^2}.
I hope this is clear to you. Hence, your answer is incorrect.
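The closed form is easy to verify by running the loops (my addition; I take the log base 2 here, which only changes the constant hidden inside Θ):

```python
import math

def run(n):
    x = 0
    L = math.ceil(math.log2(n))
    for i in range(1, L + 1):
        for j in range(1, i + 1):   # middle loop runs i times
            for k in range(10):     # innermost loop always runs 10 times
                x += 1
    return x

# x should equal 10 * L(L+1)/2 = 5L^2 + 5L
n = 1024                              # L = ceil(log2 1024) = 10
assert run(n) == 5 * 10**2 + 5 * 10   # 550
```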
You have three loops. Lets consider one by one.
Innermost loop: It is independent of n and i, and will always run 10 times. So the time complexity of this loop is Theta(10).
Outermost loop: Very simply, the time complexity of this loop is Theta(log n).
Middle loop: As the value of i can be up to log n, the time complexity of this loop is also O(log n).
Overall complexity: Theta(log n) * O(log n) * Theta(10), i.e. O(log n * log n * 10) = O((log n)^2).
What is the time complexity of this algorithm:
sum = 0
i = 1
while (i < n) {
    for j = 1 to i {
        sum = sum + 1
    }
    i = i * 2;
}
return sum
I know that the while loop is O(logn), but what is the complexity of the for loop? Is it O(n) or O(logn)?
One way to analyze this would be to count up the number of iterations of the inner loop. On the first iteration, the loop runs one time. On the second iteration, it runs two times. It runs four times on the third iteration, eight times on the fourth iteration, and more generally 2^k times on the kth iteration (counting iterations from k = 0). This means that the number of iterations of the inner loop is given by
1 + 2 + 4 + 8 + ... + 2^r = 2^(r+1) - 1
where r is the number of times that the inner loop runs. As you noted, r is roughly log n, meaning that this summation works out to (approximately)
2^(log n + 1) - 1 = 2 * 2^(log n) - 1 = 2n - 1
Consequently, the total work done by the inner loop across all iterations is O(n). Since the program does a total of O(log n) work running the outer loop, the total runtime of this algorithm is O(n + log n) = O(n). Note that we don't multiply these terms together, since the O(log n) term is the total amount of work done purely in the maintenance of the outer loop and the O(n) term is the total amount of work done purely by the inner loop.
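Running the algorithm bears this out (my addition): the returned sum is the geometric series itself, and it stays below 2n.

```python
def run(n):
    total = 0
    i = 1
    while i < n:
        for j in range(i):  # inner loop runs i times
            total += 1
        i *= 2
    return total

# for n = 1024 the inner loop runs 1 + 2 + ... + 512 = 1023 times: O(n), not O(n log n)
assert run(1024) == 1023
assert all(run(n) < 2 * n for n in range(2, 200))
```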
Hope this helps!