Karatsuba Multiplication
Overview
Karatsuba multiplication is an integer multiplication algorithm that takes two integers of n digits each. The straightforward iterative solution multiplies the first integer by each digit of the second, shifting and adding the partial results, so it involves two nested loops of length n, for an O(n^2) time complexity.
A naive recursive solution, in the absence of the Gauss optimization, divides each integer M and N into two halves M = [A B], N = [C D], recursively computes AD, BD, AC and BC, then shifts and adds. Applying the master theorem: the number of subproblems is 4 (a=4), the size of each subproblem is n/2 (b=2), and the work required to combine the solutions after the recursive calls is O(n^1), so d=1. a/b^d = 4/2^1 = 2 > 1, so we find ourselves in Case 3 of the master theorem, where the number of subproblems proliferates more rapidly than the work per subproblem shrinks, and the asymptotic running time bound is O(n^(log_b a)) = O(n^(log_2 4)) = O(n^2), no better than the straightforward iterative method.
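The naive recursion can be sketched as follows. This is a minimal illustration, assuming non-negative integers and base-10 splitting; naive_multiply is an illustrative name, not from any library:

 def naive_multiply(m: int, n: int) -> int:
     # Base case: single-digit operands multiply directly.
     if m < 10 or n < 10:
         return m * n
     # Split each operand into a high and a low half of `half` digits.
     half = max(len(str(m)), len(str(n))) // 2
     a, b = divmod(m, 10 ** half)   # m = a * 10^half + b
     c, d = divmod(n, 10 ** half)   # n = c * 10^half + d
     # Four recursive calls on half-size problems: a = 4.
     ac = naive_multiply(a, c)
     ad = naive_multiply(a, d)
     bc = naive_multiply(b, c)
     bd = naive_multiply(b, d)
     # Shifting and adding is O(n) work: d = 1.
     return ac * 10 ** (2 * half) + (ad + bc) * 10 ** half + bd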
Gauss Optimization
Assuming that we divide each integer into two halves, as in the naive recursive method, M = [A B], N = [C D], the Gauss optimization consists of observing that:
(A + B)(C + D) = AC + AD + BC + BD
The subproducts required to compute M * N are:
     A  B
   × C  D
  ────────
    AD BD
 + AC BC
which is:
10^n AC + 10^(n/2) (AD + BC) + BD
So if we compute AC, BD and (A+B)(C+D), we can recover AD + BC = (A+B)(C+D) - AC - BD = (AC + AD + BC + BD) - AC - BD with an O(n) operation.
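The identity can be checked on small concrete operands; the values below (M = 1234, N = 5678) are arbitrary, chosen only for illustration:

 # Verifying the Gauss identity on M = 1234, N = 5678.
 A, B = 12, 34    # M = A * 10^2 + B
 C, D = 56, 78    # N = C * 10^2 + D
 AC, BD = A * C, B * D
 # One extra product replaces the two cross products AD and BC:
 cross = (A + B) * (C + D) - AC - BD    # = AD + BC = 2840
 assert AC * 10**4 + cross * 10**2 + BD == 1234 * 5678   # 7006652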
The gain is that we reduce the number of recursive calls applied to problem halves from 4 to 3, which results in an O(n^(log_2 3)) running time, as shown in the Time Complexity section below.
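Putting the pieces together, here is a minimal recursive sketch of the full algorithm, under the same assumptions as the naive version above (non-negative integers, base-10 splitting, illustrative names):

 def karatsuba(m: int, n: int) -> int:
     # Base case: single-digit operands multiply directly.
     if m < 10 or n < 10:
         return m * n
     half = max(len(str(m)), len(str(n))) // 2
     a, b = divmod(m, 10 ** half)   # m = a * 10^half + b
     c, d = divmod(n, 10 ** half)   # n = c * 10^half + d
     # Only three recursive calls: a = 3.
     ac = karatsuba(a, c)
     bd = karatsuba(b, d)
     cross = karatsuba(a + b, c + d) - ac - bd   # = ad + bc
     return ac * 10 ** (2 * half) + cross * 10 ** half + bd

For example, karatsuba(1234, 5678) returns 7006652, the same as 1234 * 5678.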
Time Complexity
If we apply the Gauss optimization, we reduce the number of recursive calls to 3 (a=3), the rest of the parameters remaining the same. a/b^d = 3/2^1 > 1, so we are again in Case 3 of the master theorem, and the running time complexity is O(n^(log_2 3)) ≈ O(n^1.59), which is better than O(n^2).
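The difference can be made concrete by counting the single-digit multiplications each recursion performs; a rough sketch, assuming n is a power of two and every split is exact:

 import math

 # With branching factor a and subproblems of size n/2, the recursion
 # reaches a^(log2 n) base cases: n^2 for a = 4, n^(log2 3) for a = 3.
 def base_case_count(a: int, n: int) -> int:
     return a ** int(math.log2(n))

 for n in (64, 1024):
     print(n, base_case_count(4, n), base_case_count(3, n))
 # 64:   4096 vs 729
 # 1024: 1048576 vs 59049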
Overview
Apply the Gauss trick and end up with three recursive calls instead of four. This yields an O(n^(log_2 3)) complexity. If it had been four, the running time would have been O(n^2).
TODO