Time complexity for algorithm

I'd like to know the Big-O of the following algorithm:
public List<String> getPermutations(String s){
    if(s.length()==1){
        List<String> base = new ArrayList<String>();
        base.add(String.valueOf(s.charAt(0)));
        return base;
    }
    List<String> combos = createPermsForCurrentChar(s.charAt(0),
            getPermutations(s.substring(1)));
    return combos;
}
private List<String> createPermsForCurrentChar(char a, List<String> temp){
    List<String> results = new ArrayList<String>();
    for(String tempStr : temp){
        // inserts a before each position 0..length-1; note this never
        // inserts at the very end of tempStr
        for(int i=0;i<tempStr.length();i++){
            String prefix = tempStr.substring(0, i);
            String suffix = tempStr.substring(i);
            results.add(prefix + a + suffix);
        }
    }
    return results;
}
Here's what I think: getPermutations is called n times, where n is the length of the string.
My understanding is that
createPermsForCurrentChar is O(l * m), where l is the length of the list temp and m is the length of each string in temp.
However, since we are looking at worst-case analysis, m <= n and l <= n!.
The length of the temp list keeps growing in each recursive call, and so does the number of characters in each string in temp.
Does this mean that the time complexity of this algorithm is O(n * n! * n)? Or is it O(n * n * n)?

Well, I will just write this up as an answer instead of having a long list of comments.
Denote the run time of getPerm on a string of length n as T(n). Observe that inside getPerm, it calls getPerm on a string of length n-1, so clearly
T(n)=T(n-1) + [run time of createPerm]
Note that createPerm has 2 nested loops. The outer loop iterates through the result of getPerm(string of length n-1) and the inner loop iterates n-1 times (the length of the individual strings). The result of getPerm(string of length n-1) is a list of at most T(n-1) strings, since producing each string takes at least one unit of time. From this, we get that
[run time of createPerm] = (n-1) T(n-1)
Substituting this into the previous equation gives
T(n) = T(n-1) + (n-1) T(n-1) = n T(n-1)
T(1) = 1 from the exit condition. We can just expand to find the solution (or, alternatively, use the Z-transform). Since it is a simple equation, expanding is faster:
T(n) = n T(n-1)
     = n (n-1) T(n-2)
     = n (n-1) (n-2) T(n-3)
     ...
     = n (n-1) ... 1
     = n!
So T(n) = n!
Exercise: prove this by induction! :p
Does this make sense? Let's think about it. We are creating all permutations of n characters, and there are exactly n! of them: http://en.wikipedia.org/wiki/Permutation.
EDIT: note that T(n)=n! is O(n!)
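For a quick numeric sanity check (a sketch of my own, not part of the proof), we can transcribe the recurrence directly into Python and compare it with n!:

from math import factorial

def T(n):
    # the recurrence derived above: T(1) = 1, T(n) = n * T(n-1)
    return 1 if n == 1 else n * T(n - 1)

for n in range(1, 10):
    assert T(n) == factorial(n)  # T(n) = n! for every n checked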

I'm not the best with combinatorics, but I think it's O(n^3) where n is the number of characters in your string.
My logic is this:
The number of times that
getPermutations(String)
is called is related to the call:
createPermsForCurrentChar(s.charAt(0), getPermutations(s.substring(1)));
On the first call you pass arguments (charAt(0), substring of length s.length-1), then (charAt(1), substring of length s.length-2)... for O(n) calls.
What's more important is the number of elements in List<String> temp each time we enter createPermsForCurrentChar.
First, let's analyze the function as a standalone thing:
Let's say there are k elements in List<String> temp, with monotonically increasing lengths, denoted by L (the current length), beginning with L=1 and ending with L=k.
The outer for loop will iterate k times; this is easy.
The inner for loop will iterate L times.
Our complexity is O(k*L), where L changes on each outer iteration. Let's see what that looks like: on the first outer loop iteration, the inner loop executes once; on the second outer loop iteration, the inner loop executes twice; and so on, until the inner loop executes k times, giving us 1+2+3+4+...+k = O(k^2).
So createPermsForCurrentChar has O(k^2) complexity, where k is the number of elements in List<String> temp (and also the size of the longest string in temp). What we want to know now is this: how many elements will be in List<String> temp for each call?
When we finally reach the base case in our recursion, we're passing the second-to-last character of our string, and the last character of our string, to createPermsForCurrentChar, so k=1. It will create a single string of length O(k).
This allows the next execution to pop off the stack and call createPermsForCurrentChar again, this time with k=2. Then k=3, k=4, k=5, etc.
We know that createPermsForCurrentChar is being called O(n) times due to our recurrence relation, so k will eventually equal n; summing k over all calls gives (1 + 2 + 3 + ... + n) = O(n^2). Taking into account the O(k^2) complexity of createPermsForCurrentChar, we get (1^2 + 2^2 + 3^2 + ... + n^2) = (1/3)n^3 + (1/2)n^2 + (1/6)n (from http://math2.org/math/expansion/power.htm).
Since we only care about our dominating value, we can say that the algorithm is O(n^3).
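If you want to check the assumption about how k (the number of elements in temp) grows per call, here is a quick instrumentation sketch in Python (my own port of the Java code above, keeping its inner-loop bound):

def create_perms(a, temp):
    results = []
    for temp_str in temp:
        for i in range(len(temp_str)):  # same bound as the Java inner loop
            results.append(temp_str[:i] + a + temp_str[i:])
    return results

def get_permutations(s):
    if len(s) == 1:
        return [s[0]]
    combos = create_perms(s[0], get_permutations(s[1:]))
    print("n=%d: temp has %d strings" % (len(s), len(combos)))
    return combos

get_permutations("abcde")  # prints sizes 1, 2, 6, 24, i.e. (n-1)!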

Related

Counting the frequency of target element in an unsorted array using divide and conquer

Suppose I have an unsorted array A of n integers and an integer b. I want to write an algorithm to compute the frequency of b in A (i.e., count the number of times b appears in A) using divide and conquer.
Here is a recursive divide-and-conquer algorithm to count the frequency of b in the array A:
Divide the array A into two sub-arrays: left half and right half.
Recursively count the frequency of b in left half of A and in right half of A.
Combine the results from step 2: the frequency of b in A is equal to the sum of the frequency of b in left half and the frequency of b in right half.
Base case: if the length of the array A is 1, return 1 if A[0] equals b; otherwise return 0.
The recurrence relation of the algorithm is T(n) = 2T(n/2) + O(1), where O(1) is the time to divide the array and combine the results. The solution of the recurrence is T(n) = O(n), so the time complexity of the algorithm is O(n).
This is because each recursive call divides the array into two sub-arrays of equal size, and each element is visited once at the bottom level of the recursion. Therefore, the algorithm visits each element of the array once, leading to a linear time complexity.
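For concreteness, here is a minimal Python sketch of the steps above (the function and variable names are my own):

def count_frequency(A, b, lo, hi):
    # base case: a single element contributes 1 if it equals b
    if lo == hi:
        return 1 if A[lo] == b else 0
    mid = (lo + hi) // 2
    # divide into two halves, recurse, and combine by adding the counts
    return count_frequency(A, b, lo, mid) + count_frequency(A, b, mid + 1, hi)

print(count_frequency([3, 1, 3, 2, 3], 3, 0, 4))  # prints 3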
Correct me if I'm wrong.
Let's just use concrete elements and do the math. The constant step during the splitting is O(1), so let's call it c, and the constant step at the very end (returning 1 or 0 for a length-1 array) is just one step.
So then:
T(n) = 2T(n/2) + c
T(1) = 1
We make the educated guess (or ansatz, if you want to use fancy language) that T(n) = a*n + b, i.e., a linear function. Let's plug that into the relations:
T(n) = a*n + b = 2 * (a*n/2 + b) + c
from which it follows after a bit of basic math that b = -c.
Next, we plug the ansatz into the base case:
T(1) = a*1 + b = a + b = 1
from which we can deduce that a = 1 - b = 1 + c.
So there! We solved for a and b without making a mess and indeed we have
T(n) = (1 + c) * n - c
which is indeed O(n).
Note that this is the "pedestrian" way. If we're not interested in the actual coefficients a and b but really just in the complexity, we can be more efficient like so:
T(n) = 2 T(n/2) + O(1) = 4 T(n/4) + 2 * O(1) + O(1) = ...
= 2^k T(1) + 2^(k-1) O(1) + 2^(k-2) O(1) ... + O(1)
= 2^k O(1) + 2^(k-1) O(1) + ...
where k = log_2(n).
Summing up all those prefactors, we then get roughly
T(n) = 2^(k+1) O(1) = 2 * n * O(1) = O(n)
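As a quick check of the closed form from the "pedestrian" derivation (a sketch; the recurrence halves exactly, so n is restricted to powers of two), plugging in a concrete constant c reproduces (1 + c) * n - c:

def T(n, c):
    # T(1) = 1, T(n) = 2 T(n/2) + c
    return 1 if n == 1 else 2 * T(n // 2, c) + c

c = 3
for k in range(6):
    n = 2 ** k
    assert T(n, c) == (1 + c) * n - c  # matches the closed form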

Runtime complexity of recursive permutation function

I wrote this code that returns all the permutations of the provided string. Now I want to calculate its run-time complexity and need help with that.
The code recursively calls the permutationRecursively function N times (once for every character of the string st). Then there are two for loops: one loops through all the permutations returned by the recursive call (i.e., for "a" it will be ['a'], for "ab" it will be ['ab', 'ba'], and so on), and the other loops through each position in a permutation. I am really confused about this part. What will be the complexity of this specific part?
I assume that for all the recursive calls it will be O(N), and then for the inner loops it will be O(A*B). So the total would be O(N*A*B). Is that correct?
import time

def permutationRecursively(st):
    if len(st) < 2:
        return [st]
    else:
        permutations = permutationRecursively(st[0:-1])
        newPermutations = []
        wordToInsert = st[-1]
        for permutationPair in permutations:
            # insert the new character at every possible position
            for index in range(len(permutationPair) + 1):
                newPermutations.append(permutationPair[0:index] + wordToInsert + permutationPair[index:])
        return newPermutations

start_time = time.time()
permutationRecursively("abbc")
print("--- %s seconds ---" % (time.time() - start_time))
Your function works by first recursively calling itself on an input of size n-1. Then it loops through each element of the result (of which there are (n-1)!), and for each element it does O(n²) work (the inner loop runs n times, and each string concatenation is O(n)).
Hence we get the following recurrence relation for the time complexity T(n):
T(n) = T(n-1) + (n-1)! n²
The asymptotic behaviour of this relation is as follows:
T(n) ∈ Θ((n-1)! · n²) = Θ(n! · n)
So, in particular, T(n) ∉ O(n!).
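A quick way to see the factorial factor concretely (a sketch reusing permutationRecursively from the question; it counts outputs, not running time):

from math import factorial

for n in range(1, 8):
    s = "abcdefg"[:n]
    assert len(permutationRecursively(s)) == factorial(n)  # n! strings built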

Time complexity of the word break recursive solution?

What is the time complexity of the recursive solution in the code below, taken from http://www.geeksforgeeks.org/dynamic-programming-set-32-word-break-problem/ ?
// returns true if string can be segmented into space-separated
// words, otherwise returns false
bool wordBreak(string str)
{
    int size = str.size();

    // Base case
    if (size == 0) return true;

    // Try all prefixes of lengths from 1 to size
    for (int i = 1; i <= size; i++)
    {
        // str.substr(0, i) is the prefix (of the input string) of
        // length i. We first check whether the current prefix is in
        // the dictionary, then recursively check the remaining string
        // str.substr(i, size-i), which is the suffix of length size-i.
        if (dictionaryContains(str.substr(0, i)) &&
            wordBreak(str.substr(i, size - i)))
            return true;
    }

    // If we have tried all prefixes and none of them worked
    return false;
}
I'm thinking it's n^2, because for n calls to the method, each call in the worst case does (n-1) work (iterating over the rest of the string recursively?). Or is it exponential / n!?
I'm having a tough time figuring out the Big-O of recursive functions such as these. Any help is much appreciated!
The answer is exponential; to be precise, O(2^(n-2)).
In each call, you call the recursive function on strings of length 1, 2, ..., n-1 (in the worst case). To do the work for length n, you recursively do the work for all the strings of length n-1, n-2, ..., 1. So if T(n) is the time complexity of your current call, internally you do work equal to the sum of T(n-1), T(n-2), ..., T(1).
Mathematically:
T(n) = T(n-1) + T(n-2) +.....T(1);
T(1) = T(2) = 1
If you really don't know how to solve this, an easier way to solve the above recurrence is by just substituting values.
T(1) = T(2) = 1
T(3) = T(1) + T(2) = 1+1 =2; // 2^1
T(4) = T(1)+ T(2) + T(3) = 1+1+2 =4; //2^2
T(5) = T(1) + T(2) +T(3) +T(4) = 1+1+2+4 =8; //2^3
So if you substitute the first few values, it becomes clear that the time complexity is O(2^(n-2)).
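If you'd like to check the substitution mechanically, a direct transcription of the recurrence into Python (just a sketch) confirms the 2^(n-2) pattern:

def T(n):
    # T(1) = T(2) = 1, T(n) = T(n-1) + T(n-2) + ... + T(1)
    if n <= 2:
        return 1
    return sum(T(k) for k in range(1, n))

for n in range(3, 12):
    assert T(n) == 2 ** (n - 2)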
The short version:
The worst-case runtime of this function is Θ(2ⁿ), which is surprising because it ignores the quadratic amount of work done by each recursive call simply splitting the string into pieces and checking which prefixes are words.
The longer version: let's imagine we have an input string consisting of n copies of the letter a followed by the letter b (we'll abbreviate this as aⁿb), and create a dictionary containing the words a, aa, aaa, ..., aⁿ.
Now, what will the recursion do?
First, notice that none of the recursive calls will return true, because there's no way to account for the b at the end of the string. This means that each recursive call will be to a string of the form aᵏb. Let's denote the amount of time required to process such a string as T(k). Each one of these calls will fire off k smaller calls, one to each suffix of aᵏb.
However, we also have to account for the other contributors to the runtime. In particular, calling string::substr to form a substring of length k takes time O(k). We also need to factor in the cost of checking if a prefix is a word. The code for that isn't shown here, but assuming we use a trie or a hash table, we can take the cost of checking if a string of length k is a word to be O(k) as well. This means that, at each point where we make a recursive call, we will do O(n) work - some amount of work to check if the prefix is a word, and some amount of work to form the substring corresponding to the suffix.
Therefore, we get that
T(k) = T(0) + T(1) + T(2) + ... + T(k-1) + O(k²)
Here, the first part of the recurrence corresponds to each of the recursive calls, and the second part of the recurrence accounts for the cost of making each of the substrings. (There are k substrings, each of which takes time O(k) to process.) Our goal is to solve this recurrence, and just for simplicity we'll assume that T(0) = 1.
To do this, we'll use the "expand and contract" technique. Let's write out the values of T(k) and T(k+1) next to each other:
T(k) = T(0) + T(1) + T(2) + ... + T(k-1) + O(k²)
T(k+1) = T(0) + T(1) + T(2) + ... + T(k-1) + T(k) + O((k+1)²)
Subtracting this first expression from the second gives us that
T(k+1) - T(k) = T(k) + O(k),
or that
T(k+1) = 2T(k) + O(k).
How did we get O(k) out of the difference of two quadratic terms? It's because (k+1)² - k² = 2k + 1 = O(k).
This is an easier recurrence to work with, since each term just depends on the previous one. For simplicity, we're going to assume that the O(k) term is literally just k, giving the recurrence
T(k+1) = 2T(k) + k.
This recurrence solves to T(k) = 2ᵏ⁺¹ - k - 1. To see this, we can use a quick inductive argument. Specifically:
T(0) = 1 = 2 - 1 = 2⁰⁺¹ - 0 - 1
T(k+1) = 2T(k) + k = 2(2ᵏ⁺¹ - k - 1) + k
= 2ᵏ⁺² - 2k - 2 + k
= 2ᵏ⁺² - k - 2
= 2⁽ᵏ⁺¹⁾⁺¹ - (k + 1) - 1
Therefore, we get that our runtime is Θ(2ⁿ), since we can ignore the lower-order n term.
I was very surprised to see this, because this means that the quadratic work done by each recursive call does not factor into the overall runtime! I would have initially guessed that the runtime would be something like Θ(n · 2ⁿ) before doing this analysis. :-)
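To test this empirically, here is a Python sketch of the same brute-force recursion with a call counter, run on the worst-case input aⁿb with the dictionary {a, aa, ..., aⁿ} (the names are mine):

def count_calls(n):
    dictionary = {"a" * k for k in range(1, n + 1)}  # a, aa, ..., a^n
    calls = [0]

    def word_break(s):
        calls[0] += 1
        if not s:
            return True
        for i in range(1, len(s) + 1):
            if s[:i] in dictionary and word_break(s[i:]):
                return True
        return False

    word_break("a" * n + "b")
    return calls[0]

for n in range(1, 12):
    print(n, count_calls(n))  # the call count doubles with each n: 2, 4, 8, ...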
I believe the answer should actually be O(2^(n-1)). You can see a proof of this, as well as a worst-case example, here:
https://leetcode.com/problems/word-break/discuss/169383/The-Time-Complexity-of-The-Brute-Force-Method-Should-Be-O(2n)-and-Prove-It-Below
One intuitive way I think about the complexity here is: how many ways are there to add spaces to the string, i.e., to break the word?
For a 4-letter word:
(ways to break at index 0-1) * (ways to break at index 1-2) * (ways to break at index 2-3) = 2 * 2 * 2.
The 2 signifies the two options: you break it, or you don't break it.
So O(2^(n-1)) is the recursive complexity of word break then ;)
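A sketch that makes the count explicit: enumerate every way of placing breaks in the n-1 gaps of a word (a 4-letter word here, so 2^3 = 8 segmentations):

from itertools import product

word = "abcd"
gaps = len(word) - 1  # n-1 positions where a break can go
segmentations = []
for mask in product([False, True], repeat=gaps):
    parts, start = [], 0
    for i, cut in enumerate(mask, start=1):
        if cut:  # place a break after character i-1
            parts.append(word[start:i])
            start = i
    parts.append(word[start:])
    segmentations.append(parts)

print(len(segmentations))  # 2^(4-1) = 8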

What is the complexity of an arithmetic progression?

I don't really understand how to calculate the complexity of code. I was told that I need to look at the number of actions performed on each item in my code. So when I have a loop that runs over an array and, based on the idea of an arithmetic progression, I want to calculate the sum from every index till the end of the array, I pass over n cells the first time, then n-1 cells, and so on... why is the complexity considered O(n^2) and not O(n)?
As I see it, n + (n-1) + (n-2) + ... is xn - c, in other words O(n). So why am I wrong?
"As I see it, n + (n-1) + (n-2) + ... is xn - c, in other words O(n). So why am I wrong?"
Actually, that is not true. The sum of this arithmetic progression is n(n+1)/2 = O(n^2).
P.S. I have read your task: you need only one loop over the array, reusing the previous result, so you can solve it with O(n) complexity:
for i = 1 to n
    result[i] = a[i] + result[i-1]
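In Python, the same O(n) idea for the suffix sums described in the question might look like this (a sketch; one right-to-left pass, each suffix sum reusing the previous one):

def suffix_sums(a):
    result = [0] * len(a)
    running = 0
    for i in range(len(a) - 1, -1, -1):  # single pass from the right
        running += a[i]
        result[i] = running
    return result

print(suffix_sums([1, 2, 3, 4]))  # [10, 9, 7, 4]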
What your code is doing is the following:
traverse the array from 1 to n
traverse the array from 2 to n
... and so on, until after a total of n-1 iterations
traverse the array's nth element
As you can see, the number of cells traversed decreases by 1 on each pass.
Each traversal is driven by a loop that starts at the current value of i, and the whole code is wrapped in a function of n.
The concrete idea for the number of actions performed on each item of the array is:
for ( i = 1 to n )
    for ( j = i to n )
        traverse array[j];
Hence, the complexity of your code is O(n^2), and the work is clearly an AP: it forms the series n + (n-1) + ... + 1, with a common difference of 1.
I hope it is clear...
The time complexity is: 1 + 2 + ... + n.
This is equal to n(n+1)/2.
For example, for n = 3: 1 + 2 + 3 = 6
and 3(4)/2 = 12/2 = 6
n(n+1)/2 = (n^2 + n) / 2 which is O(n^2) because we can remove constant factors and lower order terms.
As an arithmetic progression has a closed-form solution for its sum, computing that sum is O(1): the computation time does not depend on the number of elements.
If you were to use a loop, then it would be O(n), as the execution time would be linear in the number of elements.
You're adding up n numbers whose average value is about n/2, because they range from 1 to n. Thus n times n/2 = n²/2. We don't care about the constant multiple, so it's O(n^2).
You are getting it wrong somewhere! The sum of an arithmetic progression is of the order of n^2.
To clear your doubts on arithmetic progression, visit this link: http://www.mathsisfun.com/algebra/sequences-sums-arithmetic.html
And as you said, you face difficulty in finding the complexity of any code, you can read from these two links:
http://discrete.gr/complexity/
http://www.cs.cmu.edu/~adamchik/15-121/lectures/Algorithmic%20Complexity/complexity.html
These are good enough to get you going and to help you understand how to find the complexity of most algorithms.

Recurrence Relation based off Pseudo Code (Time complexity)

Consider the element uniqueness problem, in which we are given a range, i, i + 1, . . . , j, of indices for an array, A, and we want to determine if the elements of this range, A[i], A[i+1], . . . , A[j], are all unique, that is, there is no repeated element in this group of array entries. Consider the following (inefficient) recursive algorithm.
public static boolean isUnique(int[] A, int start, int end) {
    if (start >= end) return true; // the range is too small for repeats

    // check recursively if first part of array A is unique
    if (!isUnique(A, start, end-1)) // there is a duplicate in A[start],...,A[end-1]
        return false;

    // check recursively if second part of array A is unique
    if (!isUnique(A, start+1, end)) // there is a duplicate in A[start+1],...,A[end]
        return false;

    return (A[start] != A[end]); // check if first and last are different
}
Let n denote the number of entries under consideration, that is, let n = end − start + 1. What is an upper bound on the asymptotic running time of this code fragment for large n? Provide a brief and precise explanation.
(You lose marks if you do not explain.) To begin your explanation, you may say how many recursive calls the
algorithm will make before it terminates, and analyze the number of operations per invocation of this algorithm.
Alternatively, you may provide the recurrence characterizing the running time of this algorithm, and then solve it
using the iterative substitution technique.
This question is from a sample practice exam for an Algorithms class. Below is my current answer; can someone please help verify whether I'm on the right track?
Answer:
The recurrence equation:
T(n) = 1 if n = 1,
T(n) = 2T(n-1) if n > 1
After solving using iterative substitution, I got
T(n) = 2^k T(n-k)
and setting k = n-1 gives T(n) = 2^(n-1) T(1) = O(2^(n-1)), which I simplified to O(2^n).
Your recurrence relation should be T(n) = 2T(n-1) + O(1) with T(1) = O(1). However, this doesn't change the asymptotics; the solution is still T(n) = O(2^n). To see this, you can expand the recurrence relation to get T(n) = O(1) + 2(O(1) + 2(O(1) + ...)), so you have T(n) = O(1) * (1 + 2 + 4 + ... + 2^n) = O(1) * (2^(n+1) - 1) = O(2^n).
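To confirm the 2^n growth empirically, here is a Python sketch of the same function with a call counter (run on distinct elements, so the recursion never exits early):

def is_unique(A, start, end, calls):
    calls[0] += 1
    if start >= end:
        return True
    if not is_unique(A, start, end - 1, calls):
        return False
    if not is_unique(A, start + 1, end, calls):
        return False
    return A[start] != A[end]

for n in range(1, 10):
    calls = [0]
    is_unique(list(range(n)), 0, n - 1, calls)
    print(n, calls[0])  # 1, 3, 7, 15, ... = 2^n - 1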
