Recursive Algorithms
Latest revision as of 19:17, 9 November 2021
Internal
Overview
A recursive algorithm solves a problem by calling itself one or more times to deal with closely related subproblems, usually of smaller size. The running time of recursive algorithms can be estimated using recurrences, or recurrence equations. A recursive algorithm requires a base case, in which the input is small enough that the output can be computed immediately, rather than by recursing further.
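To make the base-case idea concrete, here is a minimal recursive algorithm in Python (an illustration, not code from this wiki); its running time is described by the recurrence T(n) = T(n-1) + O(1):

```python
def factorial(n: int) -> int:
    """Compute n! by recursing on a smaller subproblem."""
    if n <= 1:
        # Base case: the input is small enough that the output
        # is known immediately, so the recursion stops here.
        return 1
    # Recursive case: solve the closely related, smaller subproblem.
    return n * factorial(n - 1)
```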
A typical class of recursive algorithms is the "divide and conquer" algorithms, explained below. Note that not all recursive algorithms are "divide and conquer" algorithms: the Huffman algorithm, for example, is a recursive greedy algorithm that does not follow the "divide and conquer" paradigm.
The fundamental intuition behind why a recursive algorithm may be better than a brute-force iterative algorithm is that, while the number of subproblems to be solved grows, the size of each subproblem, and the work required to solve it, decreases. In some cases, the rate of subproblem proliferation (RSP) is smaller than the rate of work shrinkage (RWS).
Divide and Conquer
A common recursive technique is "divide and conquer". It is an algorithm design paradigm, useful for a specific class of computational problems, that breaks a large problem into smaller subproblems and recurses. The technique consists of three steps, applied at each level of the recursion:
- Divide the problem into several subproblems that are similar to the original problem but smaller in size. The problems can be equal in size or not.
- Conquer the subproblems by solving them recursively, applying the same procedure we are currently in. If a subproblem is small enough, it may be solved through iterative techniques, or it may be a trivial case that needs no work at all. When we reach this stage, we say that the recursion has "bottomed out" and we have gotten down to the base case.
- Combine the solutions of the subproblems into a solution for the original problem. This is usually the step that requires the most ingenuity to complete the work efficiently.
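The three steps above can be sketched in Python using merge sort, the canonical example (a sketch for illustration, not code from the linked articles):

```python
def merge_sort(a: list) -> list:
    """Sort a list using the divide-and-conquer paradigm."""
    # Base case: a list of 0 or 1 elements is already sorted;
    # the recursion "bottoms out" here.
    if len(a) <= 1:
        return a
    # Divide: split the problem into two subproblems of roughly equal size.
    mid = len(a) // 2
    # Conquer: solve each subproblem recursively with the same procedure.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Here the "combine" step (the merge) does all the real work, in linear time per recursion level.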
Divide and Conquer Algorithms
- Merge sort. This is the canonical "divide and conquer" algorithm example.
- Binary search of sorted or unsorted arrays.
- Quicksort.
- Karatsuba multiplication, which multiplies two n-digit integers in time better than n².
- Strassen's algorithm for matrix multiplication.
- Finding the number of inversions in an array.
- Maximum subarray problem.
- Closest pair problem.
Recursive Algorithms Complexity
As long as the subproblems are equally sized, the "divide and conquer" algorithm complexity can be described using the Master Method. For other cases, the complexity can be determined using recursion trees and other methods. More details about recursive and "divide and conquer" algorithm complexity are available in Recursive Algorithms Complexity.
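As a worked example of the equally-sized-subproblem case (standard material, not specific to this wiki), merge sort divides into a = 2 subproblems, each of size n/b = n/2, and merges in linear time:

```latex
T(n) = 2\,T(n/2) + \Theta(n)
% Master Method parameters: a = 2, b = 2, f(n) = \Theta(n).
% n^{\log_b a} = n^{\log_2 2} = n, which matches f(n),
% so the "balanced" case applies:
T(n) = \Theta(n \log n)
```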