Time complexity of a loop with recursion

void call(int n)
{
    for (int j = 1; j <= n; j++)
    {
        call(n / 2);
    }
}

int main()
{
    int n = 16;  /* example size; the original snippet used n without declaring it */
    for (int i = 1; i <= n; i++)
    {
        call(i);
    }
}
Is my thought process for the time complexity of this code correct? In the main function, the loop is O(N). In the call function, the loop is O(N), and the recursion halves n, which gives O(log N) with base 2. So the overall time complexity is O(N) * [O(N) * O(log N)] = O(N^2 log N)?

You can use a recursion tree to figure out the number of calls; the cost of the recursive function is proportional to the number of nodes in that tree. The root call(n) has n children call(n/2), each of those has n/2 children call(n/4), and so on, so level k of the tree contains n * (n/2) * (n/4) * ... * (n/2^(k-1)) = n^k / 2^(k(k-1)/2) nodes, and the tree is about log2(n) levels deep.
To count all the nodes, calculate the summation over the levels. Each level is at least twice as large as the one above it (the ratio between consecutive levels is n/2^k >= 2), so, as with a geometric sequence, the sum is at most twice its largest term, the bottom level:
C(n) <= 2 * n^(log2 n) / 2^((log2 n)(log2 n - 1)/2) = n^Θ(log n)
So call(n) is quasi-polynomial, not O(N log N): the n recursive calls inside the loop multiply from level to level instead of adding. The order of the main loop is less than n * C(n), so the main loop's order is still n^Θ(log n).
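If you want to check this growth empirically, here is a small counting sketch (a Java port of the snippet; the class name and the calls counter are my own additions) that tallies the number of invocations:

class CallGrowth {
    static long calls = 0;

    static void call(int n) {
        calls++;                     // count every invocation
        for (int j = 1; j <= n; j++) {
            call(n / 2);
        }
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 64; n *= 2) {
            calls = 0;
            call(n);
            System.out.println(n + ": " + calls);
        }
    }
}

Each doubling of n multiplies the count by roughly the new n (2, 5, 21, 169, 2705, ...), the signature of quasi-polynomial rather than polynomial growth.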

Related

Complexity analysis for the permutations algorithm

I'm trying to understand the time and space complexity of an algorithm for generating an array's permutations. Given a partially built permutation where k out of n elements are already selected, the algorithm selects element k+1 from the remaining n-k elements and calls itself to select the remaining n-k-1 elements:
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public static List<List<Integer>> permutations(List<Integer> A) {
    List<List<Integer>> result = new ArrayList<>();
    permutations(A, 0, result);
    return result;
}

public static void permutations(List<Integer> A, int start, List<List<Integer>> result) {
    if (A.size() - 1 == start) {
        result.add(new ArrayList<>(A));
        return;
    }
    for (int i = start; i < A.size(); i++) {
        Collections.swap(A, start, i);       // fix element i at position start
        permutations(A, start + 1, result);  // permute the remaining suffix
        Collections.swap(A, start, i);       // undo the swap (backtrack)
    }
}
My thoughts are that in each call we swap the collection's elements 2n times, where n is the number of elements to permute, and make n recursive calls. So the running time seems to fit the recurrence
T(n) = nT(n-1) + n = n[(n-1)T(n-2) + (n-1)] + n = ... = n + n(n-1) + n(n-1)(n-2) + ... + n! = n![1/(n-1)! + 1/(n-2)! + ... + 1] ≈ n!·e,
hence the time complexity is O(n!) and the space complexity is O(max(n!, n)), where n! is the total number of permutations and n is the height of the recursion tree.
This problem is taken from the Elements of Programming Interviews book, and they're saying that the time complexity is O(n*n!) because "The number of function calls C(n)=1+nC(n-1) ... [which solves to] O(n!) ... [and] ... we do O(n) computation per call outside of the recursive calls".
Which time complexity is correct?
The time complexity of this algorithm, counted by the number of basic operations performed, is Θ(n * n!). Think about the size of the result list when the algorithm terminates: it contains n! permutations, each of length n, and we cannot create a list with n * n! total elements in less than that amount of time. The space complexity is the same, since the recursion stack only ever holds O(n) calls at a time, so the size of the output list dominates the space complexity.
If you count only the number of recursive calls to permutations(), the function is called O(n!) times, although this is usually not what is meant by 'time complexity' without further specification. In other words, you can generate all permutations in O(n!) time, as long as you don't read or write those permutations after they are generated.
The part where your derivation of the run-time breaks down is in the definition of T(n). If you define T(n) as 'the run-time of permutations(A, start) when the input, A, has length n', then you cannot define it recursively in terms of T(n-1) or any other function of T(), because the length of the input in all recursive calls is n, the length of A.
A more useful way to define T(n) is by specifying it as the run-time of permutations(A', start), when A' is any permutation of a fixed, initial array A, and A.length - start == n. It's easy to write the recurrence relation here:
T(x) = x * T(x-1) + O(x)    if x > 1
T(1) = Θ(A.length)
This takes into account the fact that the last recursive call, T(1), has to perform O(A.length) work to copy that array to the output, and this new recurrence gives the result from the textbook.
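Unrolling that recurrence shows where the extra factor of n comes from (writing c for the constant hidden in O(x), and using T(1) = Θ(A.length) = Θ(n)):

T(n) = n·T(n-1) + cn
     = n(n-1)·T(n-2) + cn(n-1) + cn
     = ...
     = n!·T(1) + c·n!·(1/(n-1)! + 1/(n-2)! + ... + 1/1!)
     = n!·Θ(n) + Θ(n!)    (the parenthesized sum is bounded by the constant e)
     = Θ(n·n!)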

Time Complexity of a recursive function where the base case isn't O(1)

Most recursive functions I have seen asked about (e.g. Fibonacci or Hanoi) have had O(1) base cases, but what would the time complexity be if the base case weren't O(1) but O(n) instead?
For example, a recursive Fibonacci with O(n) base case:
class fibonacci {
    static int fib(int n) {
        if (n <= 1) {
            for (int i = 0; i < n; i++) {
                // something
            }
            return n;
        }
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        int n = 9;
        System.out.println(fib(n));
    }
}
The base case for the function that you’ve written here actually still has time complexity O(1). The reason for this is that if the base case triggers here, then n ≤ 1, so the for loop here will run at most once.
Because base cases trigger when the input has become small, it's comparatively rare to see a base case whose runtime is, say, O(n), where n is the size of the original input to the algorithm. That would mean the cost of the base case depends on the original input size rather than on the small input that actually triggered it, which can happen but is somewhat unusual.
A more common occurrence - albeit one I think is still pretty uncommon - would be for a recursive function to have two different parameters to it (say, n and k), where the recursion reduces n but leaves k unmodified. For example, imagine taking the code you have here and replacing the for loop on n in the base case with a for loop on k in the base case. What happens then?
This turns out to be an interesting question. In the case of this particular problem, the total work done will be O(k) times the number of base cases triggered, plus O(1) times the number of non-base-case recursive calls. For the Fibonacci recursion, the number of base cases triggered when computing F(n) is F(n+1), and there are F(n+1) - 1 non-base-case calls, so the overall runtime would be Θ(k·F(n+1) + F(n+1)) = Θ(k·φ^n). For the Towers of Hanoi, you'd similarly see a scaling effect where the overall runtime would be Θ(k·2^n). But for other recursive functions the runtime might vary in different ways, depending on how those functions are structured.
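As a concrete illustration of that two-parameter case, here is a small sketch (the class name and the baseCaseWork counter are invented for this example) where the base-case loop runs on k instead of n:

class FibWithParamK {
    static long baseCaseWork = 0;   // counts loop iterations done in base cases

    static int fib(int n, int k) {
        if (n <= 1) {
            for (int i = 0; i < k; i++) {  // O(k) base case, independent of n
                baseCaseWork++;
            }
            return n;
        }
        return fib(n - 1, k) + fib(n - 2, k);
    }

    public static void main(String[] args) {
        fib(20, 100);
        // F(21) = 10946 base cases, each doing k = 100 units of work
        System.out.println(baseCaseWork);  // prints 1094600
    }
}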
Hope this helps!

Complexity of the following recursive code

Trying to figure out why complexity is O(n) for this code:
int sum(Node node) {
    if (node == null) {
        return 0;
    }
    return sum(node.left) + node.value + sum(node.right);
}
Node is:
class Node {
    int value;
    Node left;
    Node right;
}
This is from the CCI book. Shouldn't it be O(2^n), since it iterates through each node?
Yet this one is O(2^n), and it is clear to me why:
int f(int n) {
    if (n <= 1) {
        return 1;
    }
    return f(n - 1) + f(n - 1);
}
Thanks for the help.
An algorithm is said to take linear time, or O(n) time, if its time
complexity is O(n). Informally, this means that for large enough input
sizes the running time increases linearly with the size of the input.
For example, a procedure that adds up all elements of a list requires
time proportional to the length of the list.
From Wikipedia
It is very reasonable that the algorithm's complexity is O(n), since the number of calls to the recursive function is proportional to the number of items in the tree: there are n items in the tree, and we visit each item only once, which is a linear relation.
In contrast, the other algorithm, which is very similar to the recursive Fibonacci sequence algorithm, visits each number from 1 to n many more times than once, and not in linear proportion to n either, which explains why it has O(2^n) complexity.
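To make the difference concrete, here is a minimal counting sketch (the class and field names are mine) that tallies how many times each function is entered:

class CallCounter {
    static int calls = 0;

    static class Node { int value; Node left, right; }

    // sum() visits every node exactly once, plus one null check per missing child
    static int sum(Node node) {
        calls++;
        if (node == null) return 0;
        return sum(node.left) + node.value + sum(node.right);
    }

    // f() branches twice on n-1, so the number of calls doubles at every level
    static int f(int n) {
        calls++;
        if (n <= 1) return 1;
        return f(n - 1) + f(n - 1);
    }

    public static void main(String[] args) {
        // a chain of 5 nodes: sum() is entered 2*5 + 1 = 11 times
        Node root = null;
        for (int i = 0; i < 5; i++) { Node t = new Node(); t.left = root; root = t; }
        calls = 0; sum(root);
        System.out.println("sum calls: " + calls);   // 11

        calls = 0; f(5);
        System.out.println("f calls: " + calls);     // 2^5 - 1 = 31
    }
}

For a tree with n nodes, sum() is entered 2n + 1 times (once per node plus once per null child), which is linear in n; f(n) is entered 2^n - 1 times.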

What will be the complexity of this code?

My code is:
#include <cstdio>
#include <vector>
using namespace std;

const int N = 4;              // example size; N was left undeclared in the original
vector<int> permutation(N);
vector<int> used(N, 0);

void outputPermutation() {
    for (int v : permutation)
        printf("%d ", v);
    printf("\n");
}

// "try" is a reserved word in C++, so the function is named tryPerm here
void tryPerm(int which, int what) {
    // try taking the number "what" as the "which"-th element
    permutation[which] = what;
    used[what] = 1;
    if (which == N - 1)
        outputPermutation();
    else
        // try all possibilities for the next element
        for (int next = 0; next < N; next++)
            if (!used[next])
                tryPerm(which + 1, next);
    used[what] = 0;
}

int main() {
    // try all possibilities for the first element
    for (int first = 0; first < N; first++)
        tryPerm(0, first);
}
I was learning complexity from some website where I came across this code. As per my understanding, the following line iterates N times. So the complexity is O(N).
for (int first=0; first<N; first++)
Next I am considering the recursive call.
for (int next = 0; next < N; next++)
    if (!used[next])
        tryPerm(which + 1, next);
So this recursive call involves t(n) = N·c + t(0) steps (where c is some constant per step).
So for this step, the complexity is O(N).
Thus the total complexity is O(N·N) = O(N^2).
Is my understanding right?
Thanks!
The complexity of this algorithm is O(N!) (or even O(N! * N), if outputPermutation takes O(N), which seems likely).
This algorithm outputs all permutations of the natural numbers 0..N-1, without repetitions.
The recursive function tryPerm sequentially tries to put an element into position which, and for each choice it recursively invokes itself for the next position, until which reaches N-1. On each level one more element is marked as used in order to eliminate repetitions, so an invocation at depth which makes N - which - 1 recursive calls. Thus the algorithm takes N * (N - 1) * (N - 2) * ... * 1 = N! steps to reach all the leaves.
It is a recursive function. tryPerm calls itself recursively, so there is a loop in main(), a loop in tryPerm(), a loop in the recursive call to tryPerm(), a loop in the next recursive call, and so on.
You need to analyse very carefully what this function does, or you will get a totally wrong result (as you did). You might consider actually running this code with values of N = 1 to 20 and measuring the time. You will see that it is most definitely not O(N^2). Don't skip any of the values of N; you will see why.
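Following that suggestion, here is a quick counting sketch in Java (a port of the code above; the class name and the calls counter are mine) that counts how many times the recursive function is entered rather than measuring wall-clock time:

class PermutationCallCount {
    static int N;
    static int[] permutation;
    static boolean[] used;
    static long calls;

    static void tryPerm(int which, int what) {
        calls++;                          // count every invocation
        permutation[which] = what;
        used[what] = true;
        if (which != N - 1) {             // outputPermutation() omitted here
            for (int next = 0; next < N; next++)
                if (!used[next])
                    tryPerm(which + 1, next);
        }
        used[what] = false;
    }

    public static void main(String[] args) {
        for (N = 1; N <= 8; N++) {
            permutation = new int[N];
            used = new boolean[N];
            calls = 0;
            for (int first = 0; first < N; first++)
                tryPerm(0, first);
            // total calls = N + N(N-1) + ... + N!, dominated by the N! leaves
            System.out.println(N + ": " + calls);
        }
    }
}

The printed counts (1, 4, 15, 64, ...) track N! times a constant, not N^2.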

Tracking Runtime Complexity

What is the best way to calculate runtime complexity for any method? It's easy to do for non-recursive methods, like bubble sort:
outer-for loop
{
    inner-for loop
    {
        compare and exchange
    }
}
To check, the best way is to put a counter in the innermost loop. But when the method is recursive, where should I put the counter? For instance, merge sort:
sort(int[] array) {
    left = first-half
    right = second-half
    sort(left);
    sort(right);
    return merge(left, right);
}

merge(int[] left, right) {
    count = length(left + right);
    int[] result;
    loop count times {
        compare and put in result;
    }
    return result;
}
Since this is merge sort, the Big-O is O(n log n), so an array of 100 ints should give a count of exactly 200. Where should the counter go? If I put it at the top of sort(..), I get an average of 250, 280, 300, which seems wrong. What is the best place for this counter?
Reference: http://en.wikipedia.org/wiki/Mergesort
Thanks.
Since this is merge sort, the Big-O is O(n log n), so an array of 100 ints should give a count of exactly 200.
Not even close to right.
Computational complexity denoted using big-O notation does not tell you exactly how many steps or computational operations will be executed. There's a reason it's called asymptotic and not identical complexity: it only gives you a function that bounds (from above) the running time of the algorithm with respect to the size of the input.
So O(n log n) doesn't mean that 200 operations will be performed for 100 elements (how come, by the way, that the base of the logarithm must be 10?). It tells you that if you increase the size of your input, the (average-case) running time grows proportionally to n log n.
To the point: if you want to count the number of calls to a recursive function, you should put the counter in as an argument, like this:
void merge_sort(int array[], size_t length, int *counter)
{
    (*counter)++;                /* count this invocation */
    if (length <= 1)             /* base case: nothing left to split */
        return;

    /* recursively sort the two halves (the merge step is omitted here) */
    merge_sort(array, length / 2, counter);
    merge_sort(array + length / 2, length - length / 2, counter);
}
and call it like this:
int num_calls = 0;
merge_sort(array, sizeof(array) / sizeof(array[0]), &num_calls);
printf("Called %d times\n", num_calls);
I think you have slightly misunderstood the concept of Big-O notation. If the complexity is O(n log n) and the value of n is 100, there is no strict rule that the program should execute exactly 200 steps; Big-O only gives us an upper bound. For example, consider insertion sort, with its O(n^2) worst case: even if n is 100, a counter set inside the inner loop will not report 100^2 = 10,000 if the list is already sorted. So what you get as an answer (250, 280, 300, etc.) is perfectly valid, because all the answers are bounded by k times n log n, where k is some constant.
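To see that bound concretely, here is a rough Java sketch (the class name, random seed, and sizes are arbitrary choices of mine) that counts the comparisons merge sort makes on a random array and prints them next to n·log2(n):

class MergeComparisonCount {
    static long comparisons = 0;

    static int[] sort(int[] a) {
        if (a.length <= 1) return a;
        int mid = a.length / 2;
        int[] left = sort(java.util.Arrays.copyOfRange(a, 0, mid));
        int[] right = sort(java.util.Arrays.copyOfRange(a, mid, a.length));
        int[] out = new int[a.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            comparisons++;                     // one comparison per merge step
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }

    public static void main(String[] args) {
        int n = 100;
        int[] a = new java.util.Random(1).ints(n, 0, 1000).toArray();
        sort(a);
        long bound = Math.round(n * Math.log(n) / Math.log(2));  // ~664 for n = 100
        System.out.println(comparisons + " comparisons, n*log2(n) = " + bound);
    }
}

The count varies with the input, but it always stays below k·n·log2(n) for a small constant k.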
