Determine complexity for a recursive function - asymptotic-complexity

I have a problem in determining the recurrence relations of the following code:
public static void Method1(String S) {
    if (S.length() > 1) {
        System.out.print(S.charAt(S.length() - 1));   // print the last character
        Method1(S.substring(1, S.length() - 1));      // recurse on the middle part
        System.out.print(S.charAt(0));                // print the first character
    }
    if (S.length() == 1)
        System.out.print(S.charAt(0));                // single character left
}
I understand that this code reverses the given string and that the recursion stops once the string length drops to 1 or 0, but I can't work out how to determine the recurrence relation and the complexity. What steps should I follow to solve these kinds of questions?

You could simply rewrite it as iterative, which shouldn't change the complexity. Anyhow, consider a single call of the recursive function:
What is its complexity excluding recursive calls?
There are two calls to print(), one for the first and one for the last character, or a single call in the terminating case; their cost is constant. There is also a substring() call to build the argument for the recursive call. In current Java implementations substring() copies the characters and is therefore linear, but it could also be constant (older implementations shared the backing array), so it depends on the implementation.
Then, how many recursive calls are there per call?
There is exactly one recursive call, making this pretty simple.
Further, how deep will this recurse?
Each call passes two fewer characters to the next one, so the recursion depth, and hence the number of calls, is linear in the input length (about n/2 calls for a string of length n).
Summary
For the overall complexity, multiply the cost of a single call by the number of calls. With a linear substring() the recurrence is T(n) = T(n - 2) + O(n), which solves to O(n^2); with a constant-time substring() it would be O(n). Considering that the function merely reverses a string, that's pretty lousy.
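As a sketch of the iterative rewrite mentioned at the start of this answer (the method name below is made up for illustration), the same output can be produced with a single backwards loop, which runs in O(n) time and avoids the substring() copies entirely:
public static void method1Iterative(String s) {
    // Walk the string from the last character to the first and print each one:
    // same output as Method1, but O(n) time and no intermediate copies.
    for (int i = s.length() - 1; i >= 0; i--) {
        System.out.print(s.charAt(i));
    }
}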

Related

Does "&&" in an if statement affects the time complexity?

Let's say there is an if statement like:
if (arraylist.indexOf(n) < 14 && arraylist.indexOf(m) < 20) {
    doSomething();
}
Would the if statement be O(n) or O(n^2)? The time complexity of indexOf(n) is O(n). I tried searching for the answer on Google but can't seem to find it.
You don't have a language tag here, but I'll assume it supports short-circuiting (most do).
The complexity will be whatever arraylist.indexOf is, times 2. So, if you're correct and indexOf is O(n), the complexity of your if is at worst O(2n) (both sides evaluated), and since constant factors are dropped, that is still O(n).
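To see why this is O(n) and not O(n^2), note that with short-circuiting the condition is equivalent to two nested ifs: the two indexOf scans run one after the other, never one inside the other. A small hedged sketch (identifiers taken from the question, the wrapper method is made up):
import java.util.List;

static void check(List<Integer> arraylist, int n, int m) {
    if (arraylist.indexOf(n) < 14) {          // first O(n) scan
        if (arraylist.indexOf(m) < 20) {      // second O(n) scan, only if the first test passed
            doSomething();                    // placeholder from the question; total cost O(n) + O(n) = O(n)
        }
    }
}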
It is not unreasonable to be worried about the hidden performance implications of using an O(N) function. However, in this case there is no additional penalty. The worst case occurs when indexOf is called for both sides of the &&, and the time complexity does not change, since the O(N) cost is incurred at most twice.
I have encountered hidden complexity issues with concatenation functions, where the concatenation has to traverse the original value before appending the additional data (e.g., C's strcat). A caller who is unaware that this traversal is happening may call the concatenation on every iteration of a loop, not realizing that this turns a linear loop into a quadratic one.
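The same pitfall shows up in Java string concatenation; a hedged sketch (method names made up): += inside a loop copies the whole accumulated string on every iteration, while a StringBuilder keeps each append cheap.
// O(n^2) in the total output length: every += traverses and copies
// everything accumulated so far before appending.
static String joinSlow(String[] parts) {
    String result = "";
    for (String p : parts) {
        result += p;
    }
    return result;
}

// O(n) overall: StringBuilder appends are amortized constant time.
static String joinFast(String[] parts) {
    StringBuilder sb = new StringBuilder();
    for (String p : parts) {
        sb.append(p);
    }
    return sb.toString();
}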

Does every divide-and-conquer algorithm have to use recursion?

I am arguing with a fellow student: he wants to convince me that a divide-and-conquer algorithm can be implemented without the use of recursion.
Is this truly the case?
Any algorithm that can be implemented with recursion can also be implemented non-recursively.
Recursion and iteration are equally expressive: recursion can be replaced by iteration with an explicit stack, while iteration can be replaced with tail recursion. Which approach is preferable depends on the problem under consideration and the language used.
http://en.wikipedia.org/wiki/Recursion_%28computer_science%29#Recursion_versus_iteration
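To illustrate the "iteration with an explicit stack" point, here is a sketch (not from the linked article; names are made up) of quicksort, a classic divide-and-conquer algorithm, driven by an explicit stack of index ranges instead of recursive calls:
import java.util.ArrayDeque;
import java.util.Deque;

static void quicksortIterative(int[] a) {
    Deque<int[]> stack = new ArrayDeque<>();
    stack.push(new int[] { 0, a.length - 1 });
    while (!stack.isEmpty()) {
        int[] range = stack.pop();
        int lo = range[0], hi = range[1];
        if (lo >= hi) continue;                 // base case: 0 or 1 elements
        int p = partition(a, lo, hi);
        stack.push(new int[] { lo, p - 1 });    // pending "recursive call" on the left part
        stack.push(new int[] { p + 1, hi });    // pending "recursive call" on the right part
    }
}

// Standard Lomuto partition: places a[hi] into its final position and returns that index.
static int partition(int[] a, int lo, int hi) {
    int pivot = a[hi], i = lo;
    for (int j = lo; j < hi; j++) {
        if (a[j] < pivot) {
            int t = a[i]; a[i] = a[j]; a[j] = t;
            i++;
        }
    }
    int t = a[i]; a[i] = a[hi]; a[hi] = t;
    return i;
}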
There's an important thing to understand: whether or not to use recursion is an implementation decision. Recursion does not add computing power (at least not to a Turing-complete language). Look up "tail recursion" for an easy example of how to transform a recursive function into a non-recursive one (in the case of a divide-and-conquer algorithm you can remove at most one of the recursive calls with this method).
A function/algorithm that is computable with recursion is also computable without it. What matters is whether the language in which you implement the algorithm is Turing complete.
Let's take an example: merge sort can also be implemented non-recursively, using a queue as an auxiliary data structure to keep track of the pending merges, as sketched below.
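A hedged sketch of that merge sort (helper names are made up): seed a queue with one-element runs, then repeatedly merge the two runs at the front and enqueue the result until a single sorted run remains. No recursion is involved, and the total work is still O(n log n).
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

static List<Integer> mergeSortWithQueue(List<Integer> input) {
    Queue<List<Integer>> runs = new ArrayDeque<>();
    for (int x : input) {
        List<Integer> run = new ArrayList<>();
        run.add(x);                              // every single element is already a sorted run
        runs.add(run);
    }
    if (runs.isEmpty()) {
        return new ArrayList<>();
    }
    while (runs.size() > 1) {
        runs.add(merge(runs.remove(), runs.remove()));   // merge the two front runs, enqueue the result
    }
    return runs.remove();
}

static List<Integer> merge(List<Integer> a, List<Integer> b) {
    List<Integer> out = new ArrayList<>(a.size() + b.size());
    int i = 0, j = 0;
    while (i < a.size() && j < b.size()) {
        out.add(a.get(i) <= b.get(j) ? a.get(i++) : b.get(j++));
    }
    while (i < a.size()) out.add(a.get(i++));
    while (j < b.size()) out.add(b.get(j++));
    return out;
}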

What are "relevant operations" in analysis of algorithms?

I've been studying analysis of algorithms, and multiple times I've come across the idea that to define the time complexity of an algorithm I have to find the number of "relevant operations" performed by the algorithm. However, not a single source (Cormen, Skiena, Sedgewick...), at least to my knowledge, mentions HOW I can know what operations are in fact relevant. Any help?
In general you just count the operations that you care about in your given context.
E.g.
static long fib(int n) {
    if (n < 2)
        return 1;
    return fib(n - 1) + fib(n - 2);   // two recursive calls per invocation
}
The operations you can find in this code are:
if statement
comparison
return a value
recursive call
addition
Every operation takes some time to execute, but each individual operation is very fast (assuming the input n is a 64-bit number and the program runs on a "normal" computer). So the complexity of this function comes from the recursive calls: not from the cost of a single call, but from the number of calls. So you count the recursive calls and ignore the other operations, knowing that they run fast, i.e. in O(1) (constant) time.
The reason why you can ignore some operations is that you express the complexity in terms of o, O, Θ, Ω or ω (Landau notation), which lets you work asymptotically, where slowly growing parts and constants don't play any role.
I think there is no easy answer for how to know which operations are relevant. There are easy examples (like the one I gave), but in general it is not that easy. Then you have to count everything that could be relevant until you notice that it isn't.
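To make "count the recursive calls" concrete, here is a hedged sketch (the counter and method names are made up) that instruments fib with a call counter; the number of calls grows roughly exponentially in n, which is exactly where the cost comes from:
static long calls = 0;

static long fibCounted(int n) {
    calls++;                                  // count every invocation
    if (n < 2)
        return 1;
    return fibCounted(n - 1) + fibCounted(n - 2);
}

public static void main(String[] args) {
    for (int n = 5; n <= 25; n += 5) {
        calls = 0;
        fibCounted(n);
        System.out.println("fib(" + n + ") made " + calls + " calls");
    }
}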

Calculating Big O of a recursive function

I have a method which checks whether an array is a heap. On every recursive call, I make two new recursive calls, one on the left subtree and one on the right subtree, to traverse the nodes and check that their values are correct.
I want to calculate the Big O of this. I think that in the worst case it is O(n), because if the array IS a heap the method never stops early and has to visit every node. I think the best case is O(3), i.e. constant, and that would occur when it checks the very first left subtree and right subtree and both return false (not a heap).
My question is: does this logic make sense? I think it does, but whenever I see the time complexity of recursive functions they always seem to involve some form of logarithmic time. It is almost as if there is some mysterious quality to recursive functions that nobody explicitly states. Why do recursive functions so often run in logarithmic time? And is my logic above valid?
Yes, it makes sense. The reason you see many recursive algorithms take logarithmic time is that they repeatedly divide the scope by some factor.
Yes, that makes sense. Only one of the three cases of the Master Theorem (though arguably the most interesting) has a logarithm.
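For reference, a hedged sketch of the kind of check described in the question (the method name and the max-heap convention are assumptions): every array index is visited at most once, so the recurrence is T(n) = 2T(n/2) + O(1), which the Master Theorem resolves to O(n).
// Checks the max-heap property for the subtree rooted at index i in the
// array representation (children of i sit at 2i + 1 and 2i + 2).
static boolean isMaxHeap(int[] a, int i) {
    int left = 2 * i + 1, right = 2 * i + 2;
    if (left < a.length && (a[left] > a[i] || !isMaxHeap(a, left)))
        return false;    // left child breaks the order, or its subtree does
    if (right < a.length && (a[right] > a[i] || !isMaxHeap(a, right)))
        return false;    // right child breaks the order, or its subtree does
    return true;         // leaf node, or both subtrees are valid heaps
}
// Usage: isMaxHeap(array, 0) checks the whole array.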

Is the best Big O time efficiency always the same as the best space efficiency for recursive solutions?

If a recursive solution ends up calling itself consecutively for say, ~N times, before going back up a level, the space efficiency is O(N) at best, because each of the N calls uses up a certain amount of stack space.
Does this also imply the time efficiency is also O(N) at best, because the code inside the recursive function is similar to an inner loop code that gets run ~N times?
In addition to @Ben's answer, there is also the case of "tail recursion", where the current stack frame is removed and replaced by the callee's stack frame, but only when the caller's last action is to return the callee's result. This can result in O(n)-time functions using only O(1) space when implemented in a language that eliminates tail calls, as purely functional languages typically do.
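A hedged sketch of that point (method names are made up): sumRec below is tail recursive because its last action is to return the recursive call's result. Java itself does not eliminate tail calls, so it still uses O(n) stack, but a language or compiler that performs the elimination effectively turns it into the loop that follows, which takes O(n) time and O(1) extra space.
static long sumRec(int n, long acc) {
    if (n == 0)
        return acc;
    return sumRec(n - 1, acc + n);   // tail call: nothing remains to do after it returns
}

static long sumLoop(int n) {
    long acc = 0;
    while (n > 0) {      // what tail-call elimination effectively produces
        acc += n;
        n--;
    }
    return acc;
}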
No, but your observation has some truth to it: basically, if you know that any algorithm (recursive or otherwise, since we don't distinguish the two; there isn't really anything that could distinguish them, it's more a matter of style) for a given problem has space complexity at least f(n), it must have time complexity at least f(n) too, since it takes at least one step to use each unit of space.
No, since each step of the recursive algorithm can take longer than O(1). If each step takes O(n), then the total time complexity is O(n^2).
