Recursive Algorithms

From NovaOrdis Knowledge Base

Internal

Overview

A recursive algorithm solves a problem by calling itself one or more times to deal with closely related subproblems, usually of smaller size. The running time of recursive algorithms can be estimated using recurrences, or recurrence equations. A recursive algorithm requires a base case, where the input is sufficiently small that the output can be immediately computed rather than recursing further.
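A minimal sketch of this structure, in Python (the choice of factorial as the example is illustrative, not from the knowledge base):

```python
def factorial(n):
    # Base case: the input is small enough that the output can be
    # computed immediately, without recursing further.
    if n <= 1:
        return 1
    # Recursive case: the algorithm calls itself on a closely related,
    # smaller subproblem.
    return n * factorial(n - 1)
```

The running time follows the recurrence T(n) = T(n-1) + O(1), which solves to O(n).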

A typical class of recursive algorithms is the class of "divide and conquer" algorithms, explained below. Note, however, that not all recursive algorithms are "divide and conquer" algorithms: the Huffman algorithm, for example, is a recursive greedy algorithm that does not follow the "divide and conquer" paradigm.

The fundamental intuition behind why a recursive algorithm may be better than a brute force iterative algorithm is that while the number of subproblems to be solved grows, the size of each subproblem, and the work required to solve it, decreases. In some cases, the rate of subproblem proliferation (RSP) is smaller than the rate of work shrinkage (RWS), so the total work decreases at each level of the recursion.
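Binary search is a concrete instance of this intuition, sketched here in Python: each call spawns only one subproblem (RSP = 1) of half the size (RWS = 2), so the total work is O(log n) rather than the O(n) of a brute force scan.

```python
def binary_search(a, target, lo=0, hi=None):
    """Return the index of target in the sorted list a, or -1 if absent."""
    if hi is None:
        hi = len(a)
    # Base case: empty range, the target is not present.
    if lo >= hi:
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    # Recurse into exactly one half -- one subproblem of half the size.
    if a[mid] < target:
        return binary_search(a, target, mid + 1, hi)
    return binary_search(a, target, lo, mid)
```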

Divide and Conquer

A common recursive technique is "divide and conquer", an algorithm design paradigm that has proven useful for a specific class of computational problems and that involves breaking a large problem down into smaller subproblems and recursing. The technique consists of three steps, applied at each level of the recursion:

  1. Divide the problem into several subproblems that are similar to the original problem but smaller in size. The subproblems may or may not be equal in size.
  2. Conquer the subproblems by solving them recursively, applying the same procedure we are currently in. If a subproblem is small enough, it may be solved with iterative techniques, or it may be a trivial case that requires no work at all. When we reach this stage, we say that the recursion has "bottomed out" and we have gotten down to the base case.
  3. Combine the solutions to the subproblems into a solution to the original problem. This is usually the step that requires the most ingenuity to complete the work efficiently.
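The three steps above can be sketched with merge sort, a classic "divide and conquer" algorithm (the Python code is illustrative, not part of the knowledge base):

```python
def merge_sort(a):
    # Base case: a list of 0 or 1 elements is already sorted;
    # the recursion has "bottomed out".
    if len(a) <= 1:
        return a
    # 1. Divide: split the problem into two subproblems of roughly equal size.
    mid = len(a) // 2
    # 2. Conquer: solve each subproblem recursively with the same procedure.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # 3. Combine: merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Here the combine step (the merge) is where the real work happens, which is typical of the paradigm.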

Divide and Conquer Algorithms

Recursive Algorithms Complexity

As long as the subproblems are equally sized, the complexity of a "divide and conquer" algorithm can be determined using the Master Method. For other cases, the complexity can be determined using recursion trees and other methods. More details about recursive and "divide and conquer" algorithm complexity are available in:

Recursive Algorithms Complexity
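For equally sized subproblems, a recurrence of the form T(n) = a·T(n/b) + O(n^d) falls into one of three Master Method cases depending on how d compares to log_b(a). A small Python sketch of that classification (a hypothetical helper, named here for illustration only):

```python
import math

def master_method(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) into its Master Method case."""
    c = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(d, c):
        return f"O(n^{d} log n)"  # work is evenly spread across levels
    if d > c:
        return f"O(n^{d})"        # work at the root dominates
    return f"O(n^{c:.2f})"        # work at the leaves dominates
```

For example, merge sort (a=2, b=2, d=1) falls into the balanced case, O(n log n), while binary search (a=1, b=2, d=0) yields O(n^0 log n), i.e. O(log n).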