I am arguing with a fellow student who wants to convince me that it is possible to implement a divide-and-conquer algorithm without the use of recursion.
Is this truly the case?
Any algorithm that can be implemented with recursion can also be implemented non-recursively.
Recursion and iteration are equally expressive: recursion can be replaced by iteration with an explicit stack, while iteration can be replaced with tail recursion. Which approach is preferable depends on the problem under consideration and the language used.
http://en.wikipedia.org/wiki/Recursion_%28computer_science%29#Recursion_versus_iteration
There's an important thing to understand: whether or not to use recursion is an implementation decision. Recursion does not add computing power (at least not to a Turing-complete language). Look up "tail recursion" for an easy example of how to transform a recursive function into a non-recursive one (in the case of a divide-and-conquer algorithm you can remove at most one of the recursive calls with this method).
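A minimal Python sketch of that last point (quicksort is my choice of example here, not taken from the original answer): the tail call is turned into a loop, so only one recursive call remains.

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort; the second recursive call has been replaced by a loop."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        pivot = a[hi]                      # Lomuto partition around the last element
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        quicksort(a, lo, i - 1)            # the one remaining recursive call
        lo = i + 1                         # was: quicksort(a, i + 1, hi)
```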
A function/algorithm that is computable with recursion is also computable without it. What matters is whether the language in which you implement the algorithm is Turing complete.
Let's take an example. The mergesort algorithm can also be implemented non-recursively, using a queue as an auxiliary data structure to keep track of the various merges.
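A minimal sketch of that idea in Python (the queue holds sorted runs; the helper names are illustrative, not part of the original answer):

```python
from collections import deque

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def mergesort_iterative(items):
    """Non-recursive mergesort: a queue of sorted runs is merged pairwise."""
    if not items:
        return []
    queue = deque([x] for x in items)      # start with single-element runs
    while len(queue) > 1:
        left = queue.popleft()
        right = queue.popleft()
        queue.append(merge(left, right))   # merged run goes to the back of the queue
    return queue.popleft()
```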
Related
Why should one choose recursion over iteration, when a solution has the same time complexity for both cases but better space complexity for iterative?
Here's a particular example of a case where there are extra considerations. Tree search algorithms can be defined recursively (because each subtree of a tree is a tree) or iteratively (with a stack). However, while a recursive search can work perfectly for finding the first leaf with a certain property or searching over all leaves, it does not lend itself to producing a well-behaved iterator: an object or function state that returns a leaf, and later when called again returns the next leaf, etc. In an iterative design the search stack can be stored as a static member of the object or function, but in a recursive design the call stack is lost whenever the function returns and is difficult or expensive to recreate.
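For illustration, a leaf iterator built on an explicit stack might look roughly like this (a Python sketch assuming nodes expose a `children` list; the names are illustrative):

```python
class LeafIterator:
    """Returns one leaf per call to next_leaf(); the search stack is kept
    as object state, so the traversal can be resumed at any time."""

    def __init__(self, root):
        self.stack = [root]

    def next_leaf(self):
        while self.stack:
            node = self.stack.pop()
            if not node.children:                       # found the next leaf
                return node
            self.stack.extend(reversed(node.children))  # keep left-to-right order
        return None                                     # traversal exhausted
```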
Iteration is more difficult to understand in some algorithms. An algorithm that can naturally be expressed recursively may not be as easy to understand if expressed iteratively. It can also be difficult to convert a recursive algorithm into an iterative algorithm, and verifying that the algorithms are equivalent can also be difficult.
Recursion allows you to allocate additional automatic objects at each function call. The iterative alternative is to repeatedly dynamically allocate or resize memory blocks. On many platforms automatic allocation is much faster, to the point that its speed bonus outweighs the speed penalty and storage cost of recursive calls. (But some platforms don't support allocation of large amounts of automatic data, as mentioned above; it's a trade-off.)
Recursion is very beneficial when the iterative solution requires you to simulate recursion with a stack. Recursion acknowledges that the compiler already manages a stack to accomplish precisely what you need. When you start managing your own, not only are you likely re-introducing the function call overhead you intended to avoid, but you're also re-inventing a wheel (with plenty of room for bugs) that already exists in a pretty bug-free form.
Some benefits of recursion
The code is more elegant (compared to loops)
Very useful for backtracking and for data structures like linked lists and binary search trees, since a recursive function works by calling itself and the call stack automatically chains each call to the one before it, as the sketch below shows
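A minimal sketch of a recursive binary-search-tree lookup (Python, with illustrative names; each call handles one node and hands the rest of the tree to the next call):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def contains(node, key):
    """Recursive search in a binary search tree."""
    if node is None:
        return False                       # fell off the tree: key not present
    if key == node.key:
        return True
    if key < node.key:
        return contains(node.left, key)    # key can only be in the left subtree
    return contains(node.right, key)       # key can only be in the right subtree
```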
Is there any problem that can only be solved with recursion, or only with iteration? If not, can all algorithms be expressed in either form with the same complexity?
PS: I'm talking about theoretical complexity (O, Theta, and Omega), not about the time taken when implemented on real-world systems.
Recursion and iteration are equally expressive: recursion can be replaced by iteration with an explicit stack, while iteration can be replaced with tail recursion.[1]
So no: every problem that can be solved iteratively can also be solved with recursion, and vice versa. If you do a 1:1 conversion, the Big-O complexity stays the same. It can, however, still be better to use an iterative algorithm over a recursive one, because you are not limited to a 1:1 conversion and can structure the iterative version differently.
1: https://en.wikipedia.org/wiki/Recursion_(computer_science)#Recursion_versus_iteration
I am more comfortable implementing recursive methods than iterative ones. While studying for the exam, I implemented a recursive BFS (breadth-first search) using queues, but while searching online for a recursive BFS that uses queues, I kept reading that BFS is an iterative algorithm, not a recursive one. So is there any reason to choose one over the other?
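For reference, a minimal sketch of the two variants being compared (Python, assuming a graph stored as an adjacency-list dict; the names are illustrative):

```python
from collections import deque

def bfs_iterative(graph, start):
    """Standard BFS: a loop drains a FIFO queue of discovered vertices."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

def bfs_recursive(graph, queue, visited, order):
    """'Recursive' BFS: the same queue, but the loop is replaced by a
    recursive call that processes one vertex per invocation."""
    if not queue:
        return order
    v = queue.popleft()
    order.append(v)
    for w in graph[v]:
        if w not in visited:
            visited.add(w)
            queue.append(w)
    return bfs_recursive(graph, queue, visited, order)

# Example: bfs_recursive(g, deque(["a"]), {"a"}, []) matches bfs_iterative(g, "a").
```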
Iterative is more efficient for the computer. Recursive is more efficient for the programmer and more elegant (perhaps).
The problem with recursion is that each recursive call pushes a state/frame onto the call stack, which can quickly lead to resource exhaustion (stack overflow!) for deep recursion. But the solutions are often easier to code and read.
Iteration performs better because it's all done in the local frame. However, converting recursive code to iterative code can reduce readability, because of the extra variables introduced to track the progression of the algorithm.
Choose whichever implementation is easiest to code and maintain. Only worry about it if you have a demonstrated problem.
Iterative and recursive versions have the same time complexity. The difference is that recursive programs need more memory, since each recursive call pushes the state of the program onto the stack and a stack overflow may occur; but recursive code is easier to write and maintain. You can reduce the space complexity of a recursive program by using tail recursion.
Iterative implementations are usually faster. One example is the Fibonacci series: a simple loop is faster than the naive recursive solution, which recomputes the same subproblems over and over.
More discussion here Recursion or Iteration?
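To make the Fibonacci point concrete, here is a sketch of both versions (Python): the naive recursion takes exponential time, while the loop is linear.

```python
def fib_recursive(n):
    """Naive recursion: fib(k) is recomputed many times -> exponential time."""
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    """Simple loop: each value is computed once -> linear time, constant space."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```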
I believe that every problem with iterative logic can be solved using iteration, but can we solve every problem using recursion? Can recursion always substitute for iteration? Please provide a proof with your answer if you can. Also assume that we have an infinite stack or that we run the program on a Turing machine; I don't mind if the proof is purely theoretical (that's why I mentioned the Turing machine).
Yes, recursion can always substitute for iteration; this has been discussed before. Quoting from the linked post:
Because you can build a Turing-complete language using strictly iterative structures and a Turing-complete language using only recursive structures, the two are therefore equivalent.
Explaining a bit: we know that any computable problem can be solved by a Turing machine, and it's possible to construct a programming language A without recursion that is equivalent to a Turing machine. Similarly, it's possible to build a programming language B without iteration that is equal in computational power to a Turing machine.
Therefore, since both A and B are Turing-complete, we can conclude that for any iterative program there must exist an equivalent recursive program, and vice versa. This is a theoretical result, in the sense that it doesn't give you any hints on how to derive a recursive program from an arbitrary iterative program, or vice versa.
Yes. There is a type of recursion called tail recursion, which is directly translatable to iteration. One can be converted to the other without any problem. Thus, all iterative solutions can be converted into recursive solutions.
In fact, many compilers can detect that you are doing tail recursion, and then convert it into for-loop type code for efficiency.
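For example, here is a tail-recursive sum and the loop it can be mechanically rewritten into (a Python sketch; this is essentially the rewrite such a compiler performs):

```python
def total(values, acc=0):
    """Tail-recursive: the recursive call is the very last thing the function does,
    so no work is pending when it returns."""
    if not values:
        return acc
    return total(values[1:], acc + values[0])

def total_loop(values):
    """Mechanical translation of the tail call into a loop."""
    acc = 0
    while values:
        acc, values = acc + values[0], values[1:]
    return acc
```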
Recursion should be used when a plain loop won't do but the method still has to repeat itself in certain cases. For example, zipping a folder: if there is a subfolder, the method should call itself (recursion). Recursion could substitute for iteration if you wanted, but it's not recommended. Most people just use iteration and only reach for recursion when it is needed.
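A minimal sketch of that pattern (Python; the function merely collects file paths in the order a zip routine would visit them, and the names are illustrative):

```python
import os

def collect_files(folder):
    """Visit a folder; if an entry is a subfolder, the function calls itself."""
    paths = []
    for name in sorted(os.listdir(folder)):
        full = os.path.join(folder, name)
        if os.path.isdir(full):
            paths.extend(collect_files(full))   # recursive case: a subfolder
        else:
            paths.append(full)                  # base case: a plain file
    return paths
```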
I am not sure, but I have heard of an algorithm that can only be implemented with recursion. Does anyone know of such a thing?
You can always simulate recursion by keeping your own stack. So no.
You need to clarify what kind of recursion you are talking about. There's algorithmic recursion and there's recursion in the implementation. It is true that any recursive algorithm allows for a straightforward non-recursive implementation - one can easily do it by using the standard trick of simulating the recursion with a manual stack.
However, your question mentions algorithms only. If one assumes that it is specifically about algorithmic recursion, then the answer is yes: there are algorithms that are inherently and unavoidably recursive. In the general case it is not possible to replace an inherently recursive algorithm with a non-recursive one. The simplest way to build an inherently recursive algorithm is to start from an inherently recursive data structure. For example, let's say we need to traverse a tree with only parent-to-child links (and no child-to-parent links). It is impossible to come up with a reasonable non-recursive algorithm to solve this problem.
So, that's one example for you: traversal of a tree that has only parent-to-child links cannot be performed by a non-recursive algorithm.
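For reference, the recursive traversal being described looks roughly like this (a Python sketch assuming each node knows only its children):

```python
def traverse(node, visit):
    """Depth-first traversal that only ever follows parent-to-child links."""
    visit(node)
    for child in node.children:
        traverse(child, visit)    # the algorithm is defined in terms of itself
```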
Another example of an inherently recursive algorithm is the well-known QuickSort. QuickSort is always recursive, and it cannot be turned into a non-recursive algorithm, simply because if you succeeded in doing so it would no longer be QuickSort. It would be a completely different algorithm. Granted, this sounds like an exercise in pure semantics, but it is nevertheless worth mentioning.
If I remember my algorithms correctly, there is nothing doable by recursion that cannot be done with a stack and a loop. I don't have the formal proof here at my fingertips, however.
Edit: it occurs to me that, possibly, the only thing doable by recursion that is not doable with a stack+loop is a stack overflow?
The following compares recursive vs. non-recursive implementations: http://www.sparknotes.com/cs/recursion/whatisrecursion/section1.html
Excerpt:
Given that recursion is in general less efficient, why would we use it? There are two situations where recursion is the best solution:
The problem is much more clearly solved using recursion: there are many problems where the recursive solution is clearer, cleaner, and much more understandable. As long as the efficiency is not the primary concern, or if the efficiencies of the various solutions are comparable, then you should use the recursive solution.
Some problems are much easier to solve through recursion: there are some problems which do not have an easy iterative solution. Here you should use recursion. The Towers of Hanoi problem is an example of a problem where an iterative solution would be very difficult. We'll look at Towers of Hanoi in a later section of this guide.
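For reference, the classic recursive Towers of Hanoi solution is only a few lines (a Python sketch; parameter names are illustrative):

```python
def hanoi(n, source, target, spare):
    """Move n disks from source to target, using spare as scratch space."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)   # clear the top n-1 disks out of the way
    print(f"move disk {n} from {source} to {target}")
    hanoi(n - 1, spare, target, source)   # put them back on top of the moved disk
```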
Are you just looking for a practical example of recursion? Recently my friends and I implemented the Haar Wavelet function (as an exercise to start learning Ruby), which seemed to require recursion. Unless anybody has an implementation of it without recursion?
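For what it's worth, the recursive structure involved looks roughly like this (a Python sketch of one common unnormalized formulation of the Haar transform; the original exercise was in Ruby, and this is an illustration rather than that implementation):

```python
def haar(signal):
    """Recursive Haar wavelet transform of a list whose length is a power of two.

    Each level replaces the signal with pairwise averages, keeps the pairwise
    differences as detail coefficients, and recurses on the averages.
    """
    if len(signal) == 1:
        return signal
    averages = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    details  = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return haar(averages) + details
```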
I would imagine any time one doesn't know the depth of the stack one is iterating over, recursion is the logical way to go. Sure, it may be doable with some hacked up loops, but is that good code?