Calculating Big O running time - algorithm

Hello, I am having trouble finding the running time of this algorithm, under the assumptions that s = O(log N) and that the running time of Random_Integer is constant:
express N-1 as 2^t * u, where u is odd
for i <-- 1 to s do
    a <-- Random_Integer(2, N-2)
    if EuclidGCD(a, N) != 1 then
        return false
    x_0 <-- a^u mod N
    for j <-- 1 to t do
        x_j <-- (x_{j-1})^2 mod N
        if x_j = 1 and x_{j-1} != 1 and x_{j-1} != N-1 then
            return false
    if x_t != 1 then
        return false
return true
Starting from the inner loop: the modular exponentiation operation takes n^3 time and the loop runs for n iterations, giving a total of n^4. Then, working my way out to the outer loop, we have another modular exponentiation, which again takes n^3 time, and the EuclidGCD call, which also takes n^3 time. Finally, the outer loop also runs for n iterations. I believe that these values are correct, but I'm confused as to how I get a total running time. I'm also confused about whether the running times of these two nested loops should be multiplied together, and whether the cost of the EuclidGCD call within the outer loop should be multiplied by the number of outer-loop iterations. I hope this is clear; thanks for any help.

The inner loop is n^4 (the slowest part inside the outer loop) and runs once for every iteration of the outer loop, which is EDIT: log n times, so n^4 log n.
HOWEVER...
Depending on how often return false is reached early, that n^4 log n may only be hit in the worst case; e.g. if almost all of the time you return false on the first iteration, then you've only spent n^3 to n^4 of work, so you'd have an average of O(n^3) or O(n^4) (depending on which return false it was) instead.
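To make the structure of the analysis concrete, here is a minimal, runnable Python sketch of the procedure in the question (the function name, math.gcd standing in for EuclidGCD, and pow for the modular exponentiation are my own choices, not the original's); the comments mark where each cost term comes from, with n taken as the bit length of N:

import random
from math import gcd

def is_probable_prime(N, s):
    # Sketch of the pseudocode above, assuming N is an odd integer >= 5.
    t, u = 0, N - 1
    while u % 2 == 0:                  # express N-1 as 2^t * u with u odd
        t, u = t + 1, u // 2
    for _ in range(s):                 # outer loop: s iterations, s = O(log N) by assumption
        a = random.randint(2, N - 2)   # Random_Integer: O(1) by assumption
        if gcd(a, N) != 1:             # EuclidGCD: taken as O(n^3) in the question
            return False
        x = pow(a, u, N)               # modular exponentiation: O(n^3)
        for _ in range(t):             # inner loop: t <= log2(N), i.e. O(n) iterations
            x_prev, x = x, (x * x) % N # one modular squaring per iteration
            if x == 1 and x_prev != 1 and x_prev != N - 1:
                return False
        if x != 1:                     # x is x_t at this point
            return False
    return True

Reading the costs off the comments: each outer iteration does O(n^3) + O(n^3) + t * O(n^3) = O(n^4) work, and the outer loop repeats that s = O(log N) times, which is where the n^4 log n figure above comes from.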

Related

How can we find the Time Complexity of the Algorithm

I want to know the time complexity of the code below; I suspect it is O(√n):
i = n, sum = 0
while (i >= 0){
    i /= 2
    sum += i*i*i
}
I am really confused; can anyone help me out and explain?
If you're unsure, you can always use the time module to get a rough idea of the complexity. It would go something like this:
import time

start = time.time()   # put this before the loop
# ... the loop you are measuring goes here ...
end = time.time()     # put this after the loop
print(end - start)    # this gives you the evaluation time of your loop
Find evaluation times for different n's and determine the complexity.
Just by looking at it though, your loop gets executed roughly log2(n) times, and inside you have 2 multiplications and one division (so nothing complex). Therefore, I would assume O(log(n)) is a reasonable guess.
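If you'd rather count iterations than time them, a tiny sketch like this (my own, using the corrected condition i > 0 since, as the next answer notes, i >= 0 never terminates) makes the log2(n) growth visible:

def iterations(n):
    # Count how many times the halving loop from the question runs.
    i, count = n, 0
    while i > 0:
        i //= 2
        count += 1
    return count

for n in [10, 100, 1000, 10**6]:
    print(n, iterations(n))   # the count rises by roughly the same amount each time n grows 10x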
As mentioned in the comments, I'm assuming that the code was meant to be written as
i = n, sum = 0
while (i > 0) { // <--- Change >= to >
    i /= 2
    sum += i*i*i
}
since otherwise the code would be an infinite loop. With this in mind, let's take a look at what this code is doing.
For starters, note that the sum variable isn't doing anything that impacts our time complexity. On each iteration, it gets bigger, but we're doing only O(1) work to update it. That means that the time complexity here is going to depend on how many times the loop runs. Notice that, across the different iterations of the loop, the value of i will take on the sequence
n, n / 2, n / 4, n / 8, n / 16, n / 32, ...
and, in particular, on the kth iteration of the loop the value of i will be equal to n / 2^k (ignoring rounding down, which we can safely do here). The question, then, is at what iteration of the loop we end up with n / 2^k < 1, which is when the loop will stop. Solving, we get that
n / 2^k < 1
n < 2^k
log2 n < k
So this loop will stop as soon as the number of loop iterations k is greater than log2 n. This means that we do Θ(log n) loop iterations, of which each iteration does O(1) work, so the total work done is Θ(log n).

How to calculate big O notation for this nested loop?

If I have, for example, this simple code:
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        count++;
for this line
for (i = 1; i <= n; i++)
If I say that the time for i to get a value is T, then i will get a value n+1 times, since the condition is i <= n, so the time spent setting i is (n+1)*T. The condition will be checked n+1 times, so if the time needed to check the condition is also T, that part is another (n+1)*T. And i++ will be executed n times, because when the condition is checked with i equal to n+1, i <= n is false, so i won't be increased again. So the total time for executing this single loop would be (n+1)*T + (n+1)*T + n*T, or (n+1 + n+1 + n)*T = (3n+2)*T, so big O for this case would be n.
But I don't know how to calculate it for the second loop. I was thinking it would be n * [(3n+2)*T], and big O for this would be n^2, but I am not too sure. If you don't understand what I am saying, or if I made a mistake with the first loop too, please explain in detail how I should calculate it for that code.
The first loop will execute n times, and the second loop executes i times for each i from the outer loop. At the beginning, i = 1, so the inner loop has only one iteration; then i = 2, i = 3, ... until i reaches the value n. Therefore, the total number of iterations is 1 + 2 + 3 + ... + n = n * (n + 1) / 2, which gives O(n^2).
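A short Python check (my own sketch, not part of the original answer) confirms that the nested-loop count matches n * (n + 1) / 2 exactly:

def count_iterations(n):
    # Simulate the nested loops from the question and count the inner-loop body executions.
    count = 0
    for i in range(1, n + 1):        # outer loop: i = 1 .. n
        for j in range(1, i + 1):    # inner loop: j = 1 .. i
            count += 1
    return count

for n in [5, 10, 100]:
    assert count_iterations(n) == n * (n + 1) // 2
    print(n, count_iterations(n))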

How to determine the time complexity of this loop?

x = 1;
while (x < n)
{
    x = x + n/100;
}
I'm trying to figure out whether it's O(n) or O(1), because no matter what we put in n's place, I think the loop will run just 10 times.
Let's say n = 1.1: then it will run 10 times. If n = 1.2, the loop will run 17 times; if n = 2, it will run 50 times; and when n >= 101 the loop will be repeated 100 times, even if n = 10^10000. From there you can figure out the rest.
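For what it's worth, those counts can be derived and checked with a small sketch (mine, not from the question): x has to climb from 1 to n in steps of n/100, so the iteration count is ceil((n - 1) / (n / 100)), which matches a direct simulation of the loop for the values above:

import math

def simulated(n):
    # Run the loop from the question directly and count its iterations.
    x, count = 1.0, 0
    while x < n:
        x += n / 100
        count += 1
    return count

for n in [1.1, 1.2, 2, 101, 10**10]:
    predicted = math.ceil((n - 1) / (n / 100))
    print(n, simulated(n), predicted)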
Unfortunately you're wrong about it being O(n) or O(1), and this is immediately clear from the fact that it can't be O(1), because it takes different numbers of iterations for varying values of n (even just looking at n = 1, 2, 3, 4, 5), and it can't be O(n) because it doesn't grow linearly.
Even a bit of manual calculation clearly shows that it won't always run 10 times. Examine the following short Python program:
def t(n):
    x = 1
    c = 0
    while x < n:
        c += 1
        x += n/100
    return c

a = []
for i in range(10000):
    a += [i/100 + 1]

with open("out.csv", "w") as f:
    for i in a:
        f.write(str(i) + "," + str(t(i)) + "\n")
Using Excel or some other application you can easily plot the number of iterations taken against n and examine the resulting curve.
It is immediately clear at this point that the number of iterations grows logarithmically and stays in the range {0..100}, with any n < 1 taking 0 iterations and n > 100 taking 100 iterations. So, while Big-O notation wasn't my best subject, I would guess that the time complexity is thus O(log(n)).

How to effectively calculate an algorithm's time complexity? [duplicate]

I'm studying algorithmic complexity and I'm still not able to determine the complexity of some algorithms... OK, I'm able to figure out basic O(N) and O(N^2) loops, but I'm having some difficulty with routines like this one:
// What is time complexity of fun()?
int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
OK, I know that some people can calculate this with their eyes closed, but I would love to see a step-by-step walkthrough if possible.
My first attempt to solve this would be to "simulate" an input and put the values in some sort of table, like below:
for n = 100
Step i
1 100
2 50
3 25
4 12
5 6
6 3
7 1
OK, at this point I'm assuming that this loop is O(log n), but unfortunately, as I said, no one has solved this problem step by step, so in the end I have no clue at all what was done...
In case of the inner loop I can build some sort of table like below:
for n = 100
Step i j
1 100 0..99
2 50 0..49
3 25 0..24
4 12 0..11
5 6 0..5
6 3 0..2
7 1 0..0
I can see that both loops are decreasing, and I suppose a formula can be derived based on the data above...
Could someone clarify this problem? (The Answer is O(n))
Another simple way to look at it:
Your outer loop initializes i (which can be considered the step/iterator) at n and divides i by 2 after every iteration. Hence, it executes the i /= 2 statement log2(n) times, so a way to think about it is that your outer loop runs log2(n) times. Whenever you keep dividing a number by a base until it reaches 0, you effectively perform that division a logarithmic number of times. Hence, the outer loop is O(log-base-2 n).
Your inner loop iterates j (now the iterator, or the step) from 0 to i on every iteration of the outer loop. i takes a maximum value of n, hence the longest run your inner loop will ever have is from 0 to n. Thus, it is O(n).
Now, your program runs like this:
Run 1: i = n, j = 0->n
Run 2: i = n/2, j = 0->n/2
Run 3: i = n/4, j = 0->n/4
.
.
.
Run x: i = n/(2^(x-1)), j = 0->[n/(2^(x-1))]
Now, multiplying the two for the nested loops, O(log-base-2 n) * O(n), gives a safe upper bound of O(n log n) for your entire code; but because the inner loop's length halves on every run, summing the runs listed above (n + n/2 + n/4 + ..., a geometric series) gives the tighter bound O(n).
Let's break this analysis up into a few steps.
First, start with the inner for loop. It is straightforward to see that this takes exactly i steps.
Next, think about which different values i will assume over the course of the algorithm. To start, consider the case where n is some power of 2. In this case, i starts at n, then n/2, then n/4, etc., until it reaches 1, and finally 0, where it terminates. Because the inner loop takes i steps each time, the total number of steps of fun(n) in this case is exactly n + n/2 + n/4 + ... + 1 = 2n - 1.
Lastly, convince yourself this generalizes to non-powers of 2. Given an input n, find the smallest power of 2 greater than n and call it m. Clearly, n < m < 2n, so fun(n) takes fewer than 2m - 1 steps, which is fewer than 4n - 1. Thus fun(n) is O(n).
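To see that bound concretely, here is a small Python translation of fun() (my own sketch); the count it returns never exceeds 2n, which is the O(n) behaviour both answers describe:

def fun(n):
    # Direct translation of the C code in the question.
    count = 0
    i = n
    while i > 0:
        for j in range(i):
            count += 1
        i //= 2
    return count

for n in [10, 1000, 100000]:
    print(n, fun(n), 2 * n)   # fun(n) stays below 2n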

How to find the running time of a specific procedure?

For each of the procedures below, let T(n) be the running time. Find the order of T(n)
(i.e., find f(n) such that T(n) ∈ Θ(f(n))).
Procedure Fum(int n):
    for i from 1 to n do
        y ← 1/i
        x ← i
        while x > 0 do
            x ← x − y
        end while
    end for
I know how to find run times of simple functions but since this is a nested loop where the inner loop depends on a variable from the outer loop, I'm having trouble.
It should be 1 + 4 + 9 + ... + n^2 = n(n+1)(2n+1)/6, or simply O(n^3), in this case.
For each step of the for-loop, the while-loop will run i^2 times: given x = i and y = 1/i, it takes i^2 decrement steps x ← x − y (since x = y * i^2) for x to reach x <= 0.
For i = 1, 2, ..., n, summing these counts gives 1 + 4 + 9 + ... + n^2 = n(n+1)(2n+1)/6.
First, let's consider the runtime of the inner loop:
We want to figure out how many times the inner loop runs, in terms of i.
That is, we want to solve for f(i) in x-f(i)y = 0. If we sub in x = i, and y = 1/i, we get f(i) = i^2.
We know the outer loop will run exactly n times, so then, we get the total number of times the inner loop will be run:
= 1 + 4 + 9 + ... + n^2
This sum is equal to n(n+1)(2n+1)/6, which is O(n^3)
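A quick simulation (my own sketch; floating-point rounding can shift an individual inner-loop count by one) lines up with that closed form:

def fum_steps(n):
    # Simulate Procedure Fum and count every pass through the while-loop.
    total = 0
    for i in range(1, n + 1):
        y = 1 / i
        x = float(i)
        while x > 0:
            x -= y
            total += 1
    return total

for n in [5, 20, 50]:
    print(n, fum_steps(n), n * (n + 1) * (2 * n + 1) // 6)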
