Complexity analysis of an algorithm with if

I have the following code. What time complexity does it have?
I have tried to write a recurrence relation for it, but I can't figure out when the algorithm adds 1 to n and when it divides n by 4.
void T(int n) {
    for (int i = 0; i < n; i++);
    if (n == 1 || n == 0)
        return;
    else if (n % 2 == 1)
        T(n + 1);
    else if (n % 2 == 0)
        T(n / 4);
}

You can view it like this: you always divide by four; it's just that when n is odd you add 1 to n before dividing. So you should count how many times that 1 is added. If there are no increments, you have log_4(n) recursive calls. Now let's assume that you always have to add 1 before dividing. Then you can rewrite it like this:
void T(int n) {
    for (int i = 0; i < n; i++);
    if (n == 1 || n == 0)
        return;
    else if (n % 2 == 0)
        T(n / 4 + 1);
}
But n/4 + 1 < n/2 (for n > 4), so the recursion shrinks n at least as fast as T(n/2) would; in either case the number of calls is logarithmic, and the base of the logarithm doesn't affect big-O because it's just a constant factor. So the running time is O(log(n)).
EDIT:
As ALB pointed out in a comment, there is a loop of length n in each call. So, in accordance with the master theorem, the running time is Theta(n). You can also see it as the sum n * (1 + 1/2 + 1/4 + 1/8 + ...) = 2 * n.
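If you want to convince yourself empirically, here is a minimal Java sketch (the class name, the loopWork counter, and the test sizes are illustrative additions, not from the original post) that counts how much work the empty for loop represents across all recursive calls; the total stays within a small constant factor of n, consistent with Theta(n):
public class RecurrenceDemo {
    // counts how many iterations the empty for loop performs across all calls to T
    static long loopWork = 0;

    static void T(int n) {
        loopWork += n;                // the for (i = 0; i < n; i++); loop costs n
        if (n == 1 || n == 0)
            return;
        else if (n % 2 == 1)
            T(n + 1);                 // odd: add 1, the next call divides by 4
        else
            T(n / 4);                 // even: divide by 4
    }

    public static void main(String[] args) {
        for (int n : new int[]{1_000, 1_000_000, 100_000_000}) {
            loopWork = 0;
            T(n);
            System.out.printf("n=%d  total loop work=%d  ratio to n=%.2f%n",
                    n, loopWork, (double) loopWork / n);
        }
    }
}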

Interesting question. Be aware that even though your for loop does nothing, it is not optimized away (see Dukeling's comment), so it still counts toward your time complexity as if each iteration took real computing time.
First part
The first section is definitely O(n).
Second part
Let's suppose, for the sake of simplicity, that n is odd half the time and even the other half. Hence, the recursion goes to (n + 1) half the time and to (n / 4) the other half.
Conclusion
Each time T(n) is called, the loop implicitly runs n times. Hence, half the time we get a cost of n * (n + 1) = n^2 + n, and the other half of the time a cost of n * (n / 4) = (1/4)n^2.
For Big O notation, we care more about the upper bound than the precise behavior. Hence, your algorithm would be bounded by O(n^2).

Related

How to work out the complexity of an algorithm?

I have two questions about algorithm analysis and would like to know how to determine the complexity of the following two:
First:
for (int i = 2; i < n; i = i * i * i)
{
    // something O(1)
}
Second:
n/1 + n/2 + n/3 +...+ n/n
For the first:
If i starts at 1, it will loop forever, because 1*1*1 = 1, so i stays 1 and never becomes >= n.
The second is not really an algorithm, but evaluating the sum takes O(n) additions.
For the first algorithm:
Suppose that the initial value of i is 2 (rather than 1, which would lead to an infinite loop, as #tschaefemedia remarked).
At the 1st iteration, i == 2
At the 2nd iteration, i == 2*2*2 == 2^3
At the 3rd iteration, i == 2^3 * 2^3 * 2^3 == 2^(3*3)
At the 4th iteration, i == 2^(3*3) * 2^(3*3) * 2^(3*3) == 2^(3*3*3)
...
At iteration k+1, i == 2^(3*3*...*3) == 2^(3^k)
Suppose for simplicity that at iteration k+1, i becomes equal to n and the loop stops. Then:
n == 2^(3^k)
log2(n) == 3^k
log3(log2(n)) == k
So the complexity is O(log3(log2(n))), i.e. O(log log n).
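As a rough empirical check, here is an illustrative Java sketch (the class name and the sample sizes are arbitrary choices, and BigInteger is used only to avoid overflow when cubing) showing that the iteration count of the i = i*i*i loop tracks log3(log2(n)):
import java.math.BigInteger;

public class CubeLoopDemo {
    public static void main(String[] args) {
        // demo sizes: n = 10^6, 10^18, 10^100 (arbitrary)
        for (int exp : new int[]{6, 18, 100}) {
            BigInteger n = BigInteger.TEN.pow(exp);
            BigInteger i = BigInteger.valueOf(2);
            int iterations = 0;
            while (i.compareTo(n) < 0) {
                i = i.multiply(i).multiply(i);   // i = i * i * i
                iterations++;
            }
            // log2(n) = exp * log2(10); predicted iterations ~ log3(log2(n))
            double log2n = exp * (Math.log(10) / Math.log(2));
            double predicted = Math.log(log2n) / Math.log(3);
            System.out.printf("n=10^%d  iterations=%d  log3(log2(n))=%.2f%n",
                    exp, iterations, predicted);
        }
    }
}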
As for the second question, I suppose that you are given the complexity formula. So,
n/1 + n/2 + n/3 + ... + n/n = n * (1 + 1/2 + 1/3 + ... + 1/n)
This is the harmonic series, which is O(log(n)).
So the overall complexity is O(n * log(n)).
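A tiny illustrative sketch (the class name and sample sizes are arbitrary) that evaluates the sum directly and compares it to n * ln(n) shows the same growth:
public class HarmonicSumDemo {
    public static void main(String[] args) {
        for (int n : new int[]{10, 1_000, 1_000_000}) {
            double sum = 0;                       // n/1 + n/2 + ... + n/n
            for (int k = 1; k <= n; k++) {
                sum += (double) n / k;
            }
            double estimate = n * Math.log(n);    // n * H_n grows like n ln n
            System.out.printf("n=%d  sum=%.0f  n*ln(n)=%.0f%n", n, sum, estimate);
        }
    }
}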

Is my understanding of big-O for these Java functions correct?

My approach (which might be incorrect) is formulaic: if there is a loop, then (n+1); if there is a nested loop, (n^2); if a statement, then O(1); if division, then log(n).
Here are some examples and my reasoning for solving them. I'm not at all sure whether this approach is problematic or whether any of them are correct. I need help with this.
Example1:
i = n;          // I think O(1) because it's a statement
while (i > 1)   // I think O(n) because it's a loop
    i = i / 4;  // O(n) because it's in a loop, and log_4(n) because of the division
// I think that overall, combining the O(n) from earlier with the log_4(n),
// it is therefore O(n log(n))
Example2:
for (i = 1; i < n; i = i + i) // I think this is O(n+1) thus, O(n)
System.out.println("Hello World"); // O(n) because it's in a loop
// Therefore, overall I think O(n)
Example3:
for (i = 0; i < n; i = i + 1) // I think O(n+1), thus O(n)
for (j = 1; j < n; j++) // I think O(n^2) because in a nested loop
System.out.println("Hello Universe!"); // O(n^2) because in a nested
// Therefore, overall I think O(n^2)
Example4:
for (i = 1; i < (n * n + 3 * n + 17) / 4; i = i + 1) // O((20+n^3)/4) thus, O(n^3)
System.out.println("Hello Again!'); // O(n) because it's in a loop
// Therefore, overall I think O(n^3) because largest Big-O in the code
Thank you
Example1: Your result is wrong, because the loop runs log_4(n) times and the division takes O(1) (dividing by 4 is just a bitwise shift). Thus the overall time is only O(log(n)).
Example2: It's wrong too. In each iteration you double the loop variable, so the loop runs O(log(n)) times. The print command takes O(1), so the total time is O(log(n)).
Example3: Your answer is correct, because you have two nested O(n) loops. Note that this loop is different from the two previous examples.
Example4: I think you made a writing mistake. Does ((n * n + 3 * n + 17) / 4) equal O((20+n^3)/4)??? It is O(n^2). Thus, according to my previous explanations, the overall time is O(n^2).
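If it helps, here is a small illustrative sketch (the class name and the demo value of n are arbitrary additions) that simply counts the iterations in Example1 and Example2 and compares them to the corresponding logarithms:
public class LoopCountDemo {
    public static void main(String[] args) {
        int n = 1_000_000;   // arbitrary demo size

        // Example1: i starts at n and is divided by 4 each pass -> about log_4(n) iterations
        int count1 = 0;
        for (int i = n; i > 1; i /= 4) count1++;

        // Example2: i doubles each pass -> about log_2(n) iterations
        int count2 = 0;
        for (int i = 1; i < n; i = i + i) count2++;

        System.out.printf("Example1: %d iterations, log_4(n) = %.2f%n",
                count1, Math.log(n) / Math.log(4));
        System.out.printf("Example2: %d iterations, log_2(n) = %.2f%n",
                count2, Math.log(n) / Math.log(2));
    }
}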

Recursion, inner loop and time complexity

Consider the following function:
int testFunc(int n) {
    if (n < 3) return 0;
    int num = 7;
    for (int j = 1; j <= n; j *= 2) num++;
    for (int k = n; k > 1; k--) num++;
    return testFunc(n / 3) + num;
}
I get that the first loop is O(log n) while the second loop is O(n), which gives a time complexity of O(n) in total. But because of the recursive calls I thought the time complexity would be O(n log n), yet apparently it is only O(n). Can anyone explain why?
The recursive call pretty much gives the following for the complexity(denoting the complexity for input n by T(n)):
T(n) = log(n) + n + T(n/3)
The first observation, as you correctly noted, is that you can ignore the logarithm, since it is dominated by n. That leaves us with T(n) = n + T(n/3). Try unrolling this down to 0, for instance. We have:
T(n) = n + n/3 + n/9 + ...
You can easily prove that the above sum is always less than 2 * n. In fact, tighter bounds can be proven, but this one is enough to show that the overall complexity is O(n).
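To see the linear growth concretely, here is an instrumented version of testFunc from the question (the work counter, class name, and test sizes are illustrative additions) that counts all loop iterations across the recursion; the total stays within a small constant factor of n:
public class RecursionWorkDemo {
    static long work = 0;    // total loop iterations over all recursive calls

    static int testFunc(int n) {
        if (n < 3) return 0;
        int num = 7;
        for (int j = 1; j <= n; j *= 2) { num++; work++; }   // O(log n) part
        for (int k = n; k > 1; k--)     { num++; work++; }   // O(n) part
        return testFunc(n / 3) + num;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1_000, 100_000, 10_000_000}) {
            work = 0;
            testFunc(n);
            System.out.printf("n=%d  total work=%d  work/n=%.2f%n",
                    n, work, (double) work / n);
        }
    }
}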
For procedures using a recursive algorithm such as the following:
procedure T( n : size of problem ) defined as:
    if n < base_case then exit
    Do work of amount f(n)   // In this case, the O(n) for loop
    T(n/b)
    T(n/b)
    ... a times...           // In this case, b = 3, and a = 1
    T(n/b)
end procedure
Applying the Master theorem to find the time complexity: the f(n) in this case is O(n) (due to the second for loop, as you said). This makes c = 1.
Now, log_b(a) = log_3(1) = 0 < c, making this the 3rd case of the theorem, according to which the time complexity is T(n) = Θ(f(n)) = Θ(n).

Time complexity of the following algorithm?

I'm learning Big-O notation right now and stumbled across this small algorithm in another thread:
i = n
while (i >= 1)
{
    for j = 1 to i   // NOTE: i instead of n here!
    {
        x = x + 1
    }
    i = i/2
}
According to the author of the post, the complexity is Θ(n), but I can't figure out how. I think the while loop's complexity is Θ(log(n)). The for loop's complexity, I was thinking, would also be Θ(log(n)), because the number of iterations is halved each time.
So, wouldn't the complexity of the whole thing be Θ(log(n) * log(n)), or am I doing something wrong?
Edit: the segment is in the best answer of this question: https://stackoverflow.com/questions/9556782/find-theta-notation-of-the-following-while-loop#=
Imagine for simplicity that n = 2^k. How many times does x get incremented? It easily follows that this is a geometric series:
2^k + 2^(k - 1) + 2^(k - 2) + ... + 1 = 2^(k + 1) - 1 = 2 * n - 1
So this part is Θ(n). Also, i gets halved k = log n times, which has no further asymptotic effect on the Θ(n).
The value of i for each iteration of the while loop, which is also the number of iterations of the for loop, is n, n/2, n/4, ..., and the overall complexity is the sum of those. That puts it at roughly 2n, which gives you your Θ(n).
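The same counting argument can be checked with a quick sketch (the class name and sample sizes are arbitrary additions) that tallies how many times x would be incremented:
public class HalvingLoopDemo {
    public static void main(String[] args) {
        for (int n : new int[]{1_000, 1_000_000}) {
            long increments = 0;             // how many times x = x + 1 runs
            int i = n;
            while (i >= 1) {
                for (int j = 1; j <= i; j++) increments++;
                i = i / 2;
            }
            System.out.printf("n=%d  increments=%d  increments/n=%.2f%n",
                    n, increments, (double) increments / n);
        }
    }
}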

Time complexity for Shell sort?

First, here's my Shell sort code (using Java):
public char[] shellSort(char[] chars) {
    int n = chars.length;
    int increment = n / 2;
    while (increment > 0) {
        int last = increment;
        while (last < n) {
            int current = last - increment;
            while (current >= 0) {
                if (chars[current] > chars[current + increment]) {
                    // swap
                    char tmp = chars[current];
                    chars[current] = chars[current + increment];
                    chars[current + increment] = tmp;
                    current -= increment;
                }
                else { break; }
            }
            last++;
        }
        increment /= 2;
    }
    return chars;
}
Is this a correct implementation of Shell sort (setting aside for now the most efficient gap sequence, e.g. 1, 3, 7, 21, ...)? I ask because I've heard that the best-case time complexity for Shell sort is O(n) (see http://en.wikipedia.org/wiki/Sorting_algorithm). I can't see that level of efficiency being achieved by my code. If I added heuristics to it, then yes, but as it stands, no.
That being said, my main question now: I'm having difficulty calculating the Big O time complexity of my Shell sort implementation. I identified the outermost loop as O(log n), the middle loop as O(n), and the innermost loop also as O(n), but I realize the inner two loops would not actually be O(n) - they would be much less than that. What should they be? Because obviously this algorithm runs much more efficiently than O((log n) n^2).
Any guidance is much appreciated as I'm very lost! :P
The worst case of your implementation is Θ(n^2) and the best case is O(n log n), which is reasonable for Shell sort.
The best case ∈ O(n log n):
The best case is when the array is already sorted. That would mean the inner if statement is never true, making the inner while loop a constant-time operation. Using the bounds you used for the other loops gives O(n log n). The best case of O(n) is reached by using a constant number of increments.
The worst case ∈ O(n^2):
Given your upper bound for each loop, you get O((log n) n^2) for the worst case. But introduce another variable for the gap size, g. The number of compare/exchanges needed in the inner while is now <= n/g. The number of compare/exchanges in the middle while is <= n^2/g. Adding up the upper bound on the number of compare/exchanges for each gap gives: n^2 + n^2/2 + n^2/4 + ... <= 2n^2 ∈ O(n^2). This matches the known worst-case complexity for the gaps you used.
The worst case ∈ Ω(n^2):
Consider the array where all the even-positioned elements are greater than the median. The odd and even elements are not compared until we reach the last increment of 1, so the number of compare/exchanges needed for the last pass is Ω(n^2).
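To make the two bounds concrete, here is an instrumented copy of the shellSort above (the comparison counter, class name, demo size, and the particular interleaved input are illustrative additions; the input is one variant of the even/odd idea described above, with the large half of the values at odd indices) that counts comparisons for an already-sorted input and for the interleaved one:
public class ShellSortCountDemo {
    static long comparisons = 0;   // counts evaluations of chars[current] > chars[current + increment]

    static char[] shellSort(char[] chars) {
        int n = chars.length;
        int increment = n / 2;
        while (increment > 0) {
            int last = increment;
            while (last < n) {
                int current = last - increment;
                while (current >= 0) {
                    comparisons++;
                    if (chars[current] > chars[current + increment]) {
                        char tmp = chars[current];
                        chars[current] = chars[current + increment];
                        chars[current + increment] = tmp;
                        current -= increment;
                    } else break;
                }
                last++;
            }
            increment /= 2;
        }
        return chars;
    }

    public static void main(String[] args) {
        int n = 1 << 12;   // 4096, a power of two (demo size)

        // best case: already sorted
        char[] sorted = new char[n];
        for (int i = 0; i < n; i++) sorted[i] = (char) i;
        comparisons = 0;
        shellSort(sorted);
        System.out.println("sorted input:      " + comparisons + " comparisons (n log n ~ "
                + (long) (n * (Math.log(n) / Math.log(2))) + ")");

        // bad case: small values at even indices, large values at odd indices
        char[] bad = new char[n];
        for (int i = 0; i < n; i++) bad[i] = (char) (i % 2 == 0 ? i / 2 : n / 2 + i / 2);
        comparisons = 0;
        shellSort(bad);
        System.out.println("interleaved input: " + comparisons + " comparisons (n^2 = "
                + (long) n * n + ")");
    }
}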
Insertion Sort
If we analyse
static void sort(int[] ary) {
    int i, j, insertVal;
    int aryLen = ary.length;
    for (i = 1; i < aryLen; i++) {
        insertVal = ary[i];
        j = i;
        /*
         * while loop exits as soon as it finds a left-hand-side element less than insertVal
         */
        while (j >= 1 && ary[j - 1] > insertVal) {
            ary[j] = ary[j - 1];
            j--;
        }
        ary[j] = insertVal;
    }
}
Hence, in the average case, the while loop will exit in the middle,
i.e. 1/2 + 2/2 + 3/2 + 4/2 + ... + (n-1)/2 = Theta((n^2)/2) = Theta(n^2)
Notice that we got (n^2)/2, but the division by two makes no difference in Theta-notation.
Shell sort is just insertion sort using gaps like n/2, n/4, n/8, ..., 2, 1.
This means it takes advantage of insertion sort's best-case behaviour (the early exit of the while loop), which starts happening very quickly as soon as a smaller element is found to the left of the element being inserted, and that keeps the total execution time down.
n/2 + n/4 + n/8 + n/16 + ... + n/n = n * (1/2 + 1/4 + 1/8 + 1/16 + ... + 1/n); with about log n gap passes and roughly n work per pass in the favourable case, this works out to about n log n.
Hence its time complexity is something close to n * (log n)^2.

Resources