Karatsuba Multiplication

Internal

  • Algorithms | Divide and Conquer

Overview

Karatsuba multiplication is an integer multiplication algorithm that takes two integers of n digits. The straightforward iterative solution multiplies the first integer with each digit of the second integer, then shifts and adds the results, so it involves at least two nested n-iteration loops, for an O(n<sup>2</sup>) time complexity. A naive recursive solution, in the absence of the Gauss optimization, divides each integer M and N into two halves M = [A B], N = [C D], recursively computes AC, AD, BC and BD, then shifts and adds. Applying the master theorem, the number of subproblems is 4 (a=4), the size of each subproblem is n/2 (b=2) and the amount of work required to combine the solutions after the recursive calls is O(n<sup>1</sup>), so d=1. a/b<sup>d</sup> = 4/2<sup>1</sup> = 2 > 1, so we find ourselves in Case 3 of the master theorem, where the number of subproblems proliferates more rapidly than the work per subproblem shrinks, so the asymptotic running time bound is O(n<sup>log<sub>b</sub>a</sup>) = O(n<sup>log<sub>2</sub>4</sup>) = O(n<sup>2</sup>), no better than the straightforward iterative method.
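
What follows is a minimal Python sketch of the naive recursive scheme described above, not an implementation taken from this page or from the linked playground; the function name naive_recursive_multiply is an arbitrary choice, and Python's built-in arbitrary-precision integers stand in for explicit digit arrays.

# Naive recursive multiplication of non-negative integers: split each operand
# into halves M = [A B], N = [C D], recursively compute the four products
# AC, AD, BC, BD, then shift and add.
def naive_recursive_multiply(m, n):
    # Base case: single-digit operands are multiplied directly.
    if m < 10 or n < 10:
        return m * n
    # Split both operands around half the digit count of the larger one.
    half = max(len(str(m)), len(str(n))) // 2
    power = 10 ** half
    a, b = divmod(m, power)   # m = a * 10^half + b
    c, d = divmod(n, power)   # n = c * 10^half + d
    # Four recursive multiplications (a=4 in the master theorem).
    ac = naive_recursive_multiply(a, c)
    ad = naive_recursive_multiply(a, d)
    bc = naive_recursive_multiply(b, c)
    bd = naive_recursive_multiply(b, d)
    # Combine with shifts and additions: O(n) work per level (d=1).
    return ac * power * power + (ad + bc) * power + bd

For example, naive_recursive_multiply(5678, 1234) returns 7006652, the same value as 5678 * 1234.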

Time Complexity

In the absence of the Gauss optimization, the naive recursive algorithm makes 4 recursive calls (a=4), each on a subproblem of half the size (b=2). Upon exit from the recursion, the combine phase performs additions using a number of operations proportional to the size of the current problem, so the combine phase is O(n<sup>1</sup>) (d=1). a/b<sup>d</sup> = 4/2<sup>1</sup> = 2 > 1, so we are in Case 3 of the master theorem, and the running time is upper-bounded by O(n<sup>log<sub>2</sub>4</sup>) = O(n<sup>2</sup>), no better than the straightforward iterative algorithm.
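
Stated as a recurrence (written here in LaTeX notation; the recurrence is implied by the paragraph above rather than spelled out on the page):

% Naive recursive multiplication, no Gauss optimization:
% four half-size subproblems plus linear combine work.
\[
T(n) = 4\,T\!\left(\frac{n}{2}\right) + O(n), \qquad a = 4,\; b = 2,\; d = 1
\]
% Since a > b^d (4 > 2^1), Case 3 of the master theorem gives:
\[
T(n) = O\!\left(n^{\log_b a}\right) = O\!\left(n^{\log_2 4}\right) = O\!\left(n^2\right)
\]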

If we apply the Gauss optimization, we reduce the number of recursive calls to 3 (a=3), the rest of the parameters remaining the same. a/b<sup>d</sup> = 3/2<sup>1</sup> > 1, so we are also in Case 3 of the master theorem, and the running time complexity is O(n<sup>log<sub>2</sub>3</sup>) ≈ O(n<sup>1.59</sup>), which is better than O(n<sup>2</sup>).
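
Below is a minimal Python sketch of the Gauss-optimized recursion (Karatsuba proper), under the same assumptions as the earlier sketch; the function name karatsuba_multiply is an arbitrary choice. The key identity is AD + BC = (A + B)(C + D) - AC - BD, so a single recursive product of the sums, plus subtractions, replaces the two cross products AD and BC.

# Karatsuba multiplication of non-negative integers: only three recursive
# products per level, using Gauss' trick to recover AD + BC.
# Recurrence: T(n) = 3T(n/2) + O(n), i.e. O(n^log2(3)).
def karatsuba_multiply(m, n):
    # Base case: single-digit operands are multiplied directly.
    if m < 10 or n < 10:
        return m * n
    # Split both operands around half the digit count of the larger one.
    half = max(len(str(m)), len(str(n))) // 2
    power = 10 ** half
    a, b = divmod(m, power)   # m = a * 10^half + b
    c, d = divmod(n, power)   # n = c * 10^half + d
    # Three recursive multiplications (a=3 in the master theorem).
    ac = karatsuba_multiply(a, c)
    bd = karatsuba_multiply(b, d)
    # Gauss' trick: (a + b)(c + d) = ac + ad + bc + bd,
    # therefore ad + bc = (a + b)(c + d) - ac - bd.
    ad_plus_bc = karatsuba_multiply(a + b, c + d) - ac - bd
    # Combine with shifts and additions: O(n) work per level (d=1).
    return ac * power * power + ad_plus_bc * power + bd

karatsuba_multiply(5678, 1234) also returns 7006652, matching the naive recursion and the built-in product.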

TODO

  • problem statement
  • naïve solution
  • Gauss trick
  • complexity
  • Karatsuba: explain the key idea. Apply the master theorem to demonstrate that the complexity of the naive recursion is still O(n<sup>2</sup>). The key insight of the Karatsuba algorithm is Gauss' trick, which reduces the four recursive sub-part multiplications to 3; what is the resulting complexity?

Link to Strassen.

Overview

Apply Gauss' trick and end up with three recursive calls instead of four. This yields an O(n<sup>log<sub>2</sub>3</sup>) ≈ O(n<sup>1.59</sup>) complexity. If it were four, the complexity would have been O(n<sup>2</sup>).


TODO

Playground

https://github.com/ovidiuf/playground/tree/master/learning/stanford-algorithms-specialization/01-karatsuba