Karatsuba Multiplication
Internal
Algorithms | Divide and Conquer
Time Complexity
In the absence of the Gauss optimization, the naive recursive algorithm makes 4 recursive calls (a=4), each on half of the problem (b=2). Upon exit from recursion, the combine phase performs additions using a number of operations proportional to the size of the current problem, so the combine phase is O(n<sup>1</sup>) (d=1). a/b<sup>d</sup> = 4/2<sup>1</sup> = 2 > 1, so we are in Case 3 of the master theorem, and the running time is upper-bounded by O(n<sup>log<sub>2</sub>4</sup>) = O(n<sup>2</sup>), no better than the straightforward iterative algorithm.
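A minimal Python sketch of this four-call recursion may help make a, b and d concrete. It is not part of the original article; the function name naive_recursive_mult and the base-10 digit split are assumptions made for the example.
<syntaxhighlight lang="python">
# Sketch (assumed, not from the article): naive divide-and-conquer multiplication.
def naive_recursive_mult(x, y):
    # Base case: single-digit operands are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    n = max(len(str(x)), len(str(y)))
    half = n // 2
    x_hi, x_lo = divmod(x, 10 ** half)   # x = x_hi * 10^half + x_lo
    y_hi, y_lo = divmod(y, 10 ** half)   # y = y_hi * 10^half + y_lo
    # Four recursive calls (a=4), each on roughly half the digits (b=2).
    hh = naive_recursive_mult(x_hi, y_hi)
    hl = naive_recursive_mult(x_hi, y_lo)
    lh = naive_recursive_mult(x_lo, y_hi)
    ll = naive_recursive_mult(x_lo, y_lo)
    # Combine with O(n) shifts and additions (d=1).
    return hh * 10 ** (2 * half) + (hl + lh) * 10 ** half + ll
</syntaxhighlight>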
If we apply the Gauss optimization, we reduce the number of recursive calls to 3 (a=3), the rest of the parameters remaining the same. a/b<sup>d</sup> = 3/2<sup>1</sup> > 1, so we are also in Case 3 of the master theorem, and the running time complexity is O(n<sup>log<sub>2</sub>3</sup>) = O(n<sup>log 3</sup>), which is better than O(n<sup>2</sup>).
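A matching sketch with Gauss' trick applied, again an assumed illustration rather than the article's own code: the identity (x_hi + x_lo)(y_hi + y_lo) - x_hi*y_hi - x_lo*y_lo = x_hi*y_lo + x_lo*y_hi replaces the two cross products with a single recursive multiplication.
<syntaxhighlight lang="python">
# Sketch (assumed, not from the article): Karatsuba multiplication via Gauss' trick.
def karatsuba(x, y):
    # Base case: single-digit operands are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    n = max(len(str(x)), len(str(y)))
    half = n // 2
    x_hi, x_lo = divmod(x, 10 ** half)   # x = x_hi * 10^half + x_lo
    y_hi, y_lo = divmod(y, 10 ** half)   # y = y_hi * 10^half + y_lo
    # Only three recursive calls (a=3) thanks to Gauss' trick.
    hh = karatsuba(x_hi, y_hi)
    ll = karatsuba(x_lo, y_lo)
    cross = karatsuba(x_hi + x_lo, y_hi + y_lo) - hh - ll   # = x_hi*y_lo + x_lo*y_hi
    return hh * 10 ** (2 * half) + cross * 10 ** half + ll
</syntaxhighlight>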
TODO
- problem statement
- naïve solution
- Gauss trick
- complexity
- Karatsuba: explain the key idea; apply the master theorem to demonstrate that the complexity of the naive recursion is still O(n<sup>2</sup>). The key insight of the Karatsuba algorithm is Gauss' trick, which reduces the four recursive sub-multiplications to 3, and the resulting complexity is O(n<sup>log<sub>2</sub>3</sup>).
Link to Strassen.
Overview
Apply Gauss' trick and end up with three recursive calls instead of four. This yields O(n<sup>log<sub>2</sub>3</sup>) complexity. If it were four, the recursive complexity would have been O(n<sup>2</sup>).
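A quick sanity check of the two assumed sketches above against Python's builtin multiplication (this assumes both naive_recursive_mult and karatsuba from the earlier snippets are in scope):
<syntaxhighlight lang="python">
import random

# Compare both sketches against the builtin operator on random inputs.
for _ in range(1000):
    x = random.randrange(10 ** 12)
    y = random.randrange(10 ** 12)
    assert naive_recursive_mult(x, y) == x * y
    assert karatsuba(x, y) == x * y
print("All checks passed.")
</syntaxhighlight>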
TODO