Karatsuba Multiplication
=Internal=
* [[Recursive_Algorithms#Divide_and_Conquer|Divide and Conquer Algorithms]]
* [[Matrix_Multiplication#Strassen|Strassen's Matrix Multiplication]]
=Overview=
Karatsuba multiplication is an integer multiplication algorithm that takes two integers of n digits each. The straightforward iterative solution multiplies the first integer by each digit of the second, then shifts and adds the partial results; this requires two nested loops of n iterations each, for an O(n<sup>2</sup>) time complexity.
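The iterative shift-and-add scheme described above can be sketched as follows (a minimal Python illustration; the function name is not from the article):

```python
def grade_school(m, n):
    """Iterative multiplication: for each digit of n, multiply m by that
    digit, shift by the digit's position, and add. With n-digit inputs this
    performs O(n^2) digit operations. Sketch for non-negative integers."""
    total = 0
    for i, digit in enumerate(reversed(str(n))):
        total += m * int(digit) * 10 ** i  # shift by i positions, then add
    return total
```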
A naive recursive solution, in the absence of the Gauss optimization, divides each integer M and N into two halves, M = [A B], N = [C D], recursively computes AD, BD, AC and BC, then shifts and adds. Applying the [[Master_Method#Overview|master theorem]], the number of subproblems is 4 (a=4), the size of each subproblem is n/2 (b=2) and the amount of work required to combine the solutions after the recursive calls is O(n<sup>1</sup>), so d=1. Since a/b<sup>d</sup> = 4/2<sup>1</sup> = 2 > 1, we find ourselves in [[Master_Method#Case_3|Case 3]] of the master theorem, where the number of subproblems proliferates more rapidly than the work per subproblem shrinks, so the asymptotic running time bound is O(n<sup>log<sub>b</sub>a</sup>) = O(n<sup>log<sub>2</sub>4</sup>) = O(n<sup>2</sup>), no better than the straightforward iterative method.
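The naive four-multiplication recursion can be sketched as below (an illustrative Python sketch for non-negative integers; the digit-splitting via <code>divmod</code> and the function name are assumptions, not from the article):

```python
def naive_mult(m, n):
    """Naive divide-and-conquer multiplication with FOUR recursive calls."""
    if m < 10 or n < 10:
        return m * n  # base case: single-digit operand
    half = max(len(str(m)), len(str(n))) // 2
    p = 10 ** half
    a, b = divmod(m, p)  # m = a * 10^half + b
    c, d = divmod(n, p)  # n = c * 10^half + d
    ac = naive_mult(a, c)
    ad = naive_mult(a, d)
    bc = naive_mult(b, c)
    bd = naive_mult(b, d)
    # m * n = ac * 10^(2*half) + (ad + bc) * 10^half + bd
    return ac * 10 ** (2 * half) + (ad + bc) * p + bd
```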
=Gauss Optimization= | |||
Assuming that we divide each integer into two halves, as in the naive recursive method, M = [A B], N = [C D], the Gauss optimization consists of observing that:
<font size=-1> | |||
 (A + B)(C + D) = AC + AD + BC + BD
</font>
The subproducts required to compute M * N are: | |||
<font size=-1> | |||
A B | |||
C D | |||
─────── | |||
AD BD + | |||
AC BC | |||
</font> | |||
which is: | |||
<font size=-1> | |||
10<sup>n</sup> AC + 10<sup>n/2</sup> (AD + BC) + BD | |||
</font> | |||
So if we compute AC, BD and (A+B)(C+D), we can compute AD + BC = (A+B)(C+D) - AC - BD = AC + AD + BC + BD - AC - BD with an O(n) operation after performing just '''one''' multiplication of sub-parts.
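The identity can be checked numerically; the half-values below are arbitrary illustrative numbers, not from the article:

```python
# Arbitrary half-values, chosen only for illustration
A, B, C, D = 12, 34, 56, 78
sums_product = (A + B) * (C + D)  # one multiplication of the sums
ac, bd = A * C, B * D             # the two other required multiplications
middle = sums_product - ac - bd   # equals AD + BC without computing either
assert middle == A * D + B * C    # 936 + 1904 = 2840
```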
The key insight of the Karatsuba algorithm is that applying the Gauss optimization reduces the four recursive sub-part multiplications to three. This reduces the number of recursive calls applied to problem halves from 4 to 3, which results in an O(n<sup>log 3</sup>) running time, as shown in the [[#Time_Complexity|Time Complexity]] section below. This is the same type of optimization applied in [[Matrix_Multiplication#Strassen|Strassen's matrix multiplication algorithm]].
=Time Complexity= | |||
If we apply the Gauss optimization, we reduce the number of recursive calls to 3 (a=3), the rest of the parameters remaining the same. a/b<sup>d</sup> = 3/2<sup>1</sup> > 1, so we are again in [[Master_Method#Case_3|Case 3 of the master theorem]], and the running time complexity is O(n<sup>log<sub>2</sub>3</sup>) = O(n<sup>log 3</sup>) ≈ O(n<sup>1.585</sup>), which is better than O(n<sup>2</sup>).
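Putting the pieces together, the three-multiplication recursion can be sketched as below (a minimal Python sketch splitting on decimal digits, for non-negative integers; the function name is illustrative):

```python
def karatsuba(m, n):
    """Karatsuba multiplication: THREE recursive calls per level instead
    of four, giving O(n^log2(3)) instead of O(n^2)."""
    if m < 10 or n < 10:
        return m * n  # base case: single-digit operand
    half = max(len(str(m)), len(str(n))) // 2
    p = 10 ** half
    a, b = divmod(m, p)  # m = a * 10^half + b
    c, d = divmod(n, p)  # n = c * 10^half + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # Gauss trick: recover ad + bc from one multiplication of the sums
    middle = karatsuba(a + b, c + d) - ac - bd
    return ac * 10 ** (2 * half) + middle * p + bd
```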
=Playground=
Latest revision as of 19:11, 9 November 2021