Matrix Multiplication
Revision as of 19:54, 20 September 2021
Internal
Overview
The straightforward iterative algorithm for multiplying an m x n matrix M by an n x p matrix N performs m * n * p scalar multiplications. For simplicity, we assume the matrices are square, with the number of rows and columns equal to n. In this case the obvious algorithm is O(n^3).
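The triple loop below is a minimal sketch of that straightforward algorithm, using plain nested lists; the function name `multiply` is mine, not from the text.

```python
def multiply(M, N):
    """Iterative O(m*n*p) multiplication of an m x n matrix M by an n x p matrix N."""
    m, n, p = len(M), len(N), len(N[0])
    result = [[0] * p for _ in range(m)]
    for i in range(m):          # each row of M
        for k in range(n):      # each column of M / row of N
            for j in range(p):  # each column of N
                result[i][j] += M[i][k] * N[k][j]
    return result
```

For square n x n matrices (m = n = p), the three nested loops give the O(n^3) bound directly.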
One may think that a divide-and-conquer recursive algorithm could help. The obvious divide-and-conquer approach is to split each matrix into four equal blocks and multiply the blocks through recursive invocations:
    │ A B │       │ E F │
M = │     │   N = │     │
    │ C D │       │ G H │
In this case the final result is obtained by multiplying the blocks as follows:
        │ AE+BG  AF+BH │
M x N = │              │
        │ CE+DG  CF+DH │
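The block scheme above can be sketched as the following recursive procedure, here assuming square matrices whose size is a power of two; the helper names `split`, `add` and `join` are mine, not from the text.

```python
def split(X):
    """Split a square matrix into four equal blocks A, B, C, D."""
    h = len(X) // 2
    return ([r[:h] for r in X[:h]], [r[h:] for r in X[:h]],
            [r[:h] for r in X[h:]], [r[h:] for r in X[h:]])

def add(X, Y):
    """Entry-wise sum of two matrices of the same shape."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def join(TL, TR, BL, BR):
    """Reassemble four blocks into one matrix."""
    return [l + r for l, r in zip(TL, TR)] + [l + r for l, r in zip(BL, BR)]

def multiply_recursive(M, N):
    """Divide-and-conquer multiplication with eight recursive calls."""
    if len(M) == 1:  # base case: 1x1 matrices
        return [[M[0][0] * N[0][0]]]
    A, B, C, D = split(M)
    E, F, G, H = split(N)
    return join(add(multiply_recursive(A, E), multiply_recursive(B, G)),
                add(multiply_recursive(A, F), multiply_recursive(B, H)),
                add(multiply_recursive(C, E), multiply_recursive(D, G)),
                add(multiply_recursive(C, F), multiply_recursive(D, H)))
```

Note the eight recursive invocations, one per block product AE, BG, AF, BH, CE, DG, CF and DH; the additions form the O(n^2) combine step.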
However, let us apply the Master Method to infer the time complexity of the recursive multiplication algorithm. Here a = 8 (we recurse eight times, computing AE, BG, AF, BH, CE, DG, CF and DH), b = 2 (each recursive call multiplies matrices half the size, n/2 x n/2), and the combine step is O(n^2), so d = 2. Since a/b^d = 8/2^2 = 2 > 1, the Master Method gives a bound of O(n^(log_b a)) = O(n^(log_2 8)) = O(n^3), the same as the straightforward iterative method.
An improvement on this upper bound is provided by the Strassen method, which cleverly gets by with just 7 recursive calls.
Strassen's Algorithm for Matrix Multiplication
The asymptotic complexity is Θ(n^(log_2 7)) ≈ Θ(n^2.81).
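The sketch below shows where the seven recursive products come from, using the standard formulation of Strassen's method with the block names A..H from the scheme above; it assumes square matrices of power-of-two size, and the helper names are mine.

```python
def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def sub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def split(X):
    """Split a square matrix into four equal blocks."""
    h = len(X) // 2
    return ([r[:h] for r in X[:h]], [r[h:] for r in X[:h]],
            [r[:h] for r in X[h:]], [r[h:] for r in X[h:]])

def strassen(M, N):
    """Strassen multiplication: seven recursive products instead of eight."""
    if len(M) == 1:
        return [[M[0][0] * N[0][0]]]
    A, B, C, D = split(M)
    E, F, G, H = split(N)
    P1 = strassen(A, sub(F, H))          # the seven products; additions and
    P2 = strassen(add(A, B), H)          # subtractions cost only O(n^2)
    P3 = strassen(add(C, D), E)
    P4 = strassen(D, sub(G, E))
    P5 = strassen(add(A, D), add(E, H))
    P6 = strassen(sub(B, D), add(G, H))
    P7 = strassen(sub(A, C), add(E, F))
    TL = add(sub(add(P5, P4), P2), P6)   # = AE + BG
    TR = add(P1, P2)                     # = AF + BH
    BL = add(P3, P4)                     # = CE + DG
    BR = sub(sub(add(P5, P1), P3), P7)   # = CF + DH
    return [l + r for l, r in zip(TL, TR)] + [l + r for l, r in zip(BL, BR)]
```

With a = 7, b = 2 and d = 2, the Master Method now gives Θ(n^(log_2 7)), which is how the improved bound arises.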
TODO CLRS page 75.