Karatsuba Multiplication

From NovaOrdis Knowledge Base
=Internal=
* [[Algorithms#q23wLp|Algorithms | Divide and Conquer]]
* [[Recursive_Algorithms#Divide_and_Conquer|Divide and Conquer Algorithms]]
* [[Matrix_Multiplication#Strassen|Strassen's Matrix Multiplication]]
=Overview=
Karatsuba multiplication is an integer multiplication algorithm that takes two n-digit integers. The straightforward iterative solution multiplies the first integer by each digit of the second, shifting and adding the partial results, so it involves two nested loops of n iterations each, for an O(n<sup>2</sup>) time complexity.
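The iterative method can be sketched as follows (a minimal illustration; the function name is an assumption, and Python's small-number arithmetic stands in for the digit-level operations the analysis counts):

```python
def iterative_multiply(m: int, n: int) -> int:
    """Grade-school multiplication: multiply m by each digit of n, shift, add."""
    result = 0
    shift = 0
    while n > 0:
        digit = n % 10                       # next (least significant) digit of n
        result += m * digit * 10 ** shift    # multiply by the digit, shift, add
        n //= 10
        shift += 1
    return result

print(iterative_multiply(1234, 5678))  # 7006652
```

Multiplying m by a single digit costs O(n) digit operations and the loop runs once per digit of n, which is where the O(n<sup>2</sup>) bound comes from.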
 
A naive recursive solution, in the absence of the Gauss optimization, divides each of the integers M and N into two halves, M = [A B], N = [C D], recursively computes AC, AD, BC and BD, then shifts and adds. Applying the [[Master_Method#Overview|master theorem]], the number of subproblems is 4 (a=4), the size of each subproblem is n/2 (b=2), and the work required to combine the solutions after the recursive calls is O(n<sup>1</sup>), so d=1. a/b<sup>d</sup> = 4/2<sup>1</sup> = 2 > 1, so we find ourselves in [[Master_Method#Case_3|Case 3]] of the master theorem, where the number of subproblems proliferates more rapidly than the work per subproblem shrinks, and the asymptotic running time bound is O(n<sup>log<sub>b</sub>a</sup>) = O(n<sup>log<sub>2</sub>4</sup>) = O(n<sup>2</sup>), no better than the straightforward iterative method.
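The four-multiplication recursion might look like this (a sketch; the function name and the explicit digit-count parameter are assumptions made for illustration):

```python
def naive_recursive_multiply(m: int, n: int, digits: int) -> int:
    """Recursive multiplication with four sub-multiplications per level."""
    # Base case: single-digit operands are multiplied directly.
    if digits <= 1:
        return m * n
    half = digits // 2
    # Split M = [A B] and N = [C D], i.e. M = A*10^half + B, N = C*10^half + D.
    a, b = divmod(m, 10 ** half)
    c, d = divmod(n, 10 ** half)
    # Four recursive calls on half-size problems: a = 4, b = 2 in master-theorem terms.
    ac = naive_recursive_multiply(a, c, half)
    ad = naive_recursive_multiply(a, d, half)
    bc = naive_recursive_multiply(b, c, half)
    bd = naive_recursive_multiply(b, d, half)
    # Combine phase: shifts and additions, O(n) work (d = 1).
    return ac * 10 ** (2 * half) + (ad + bc) * 10 ** half + bd
```

The combine line implements M*N = 10<sup>n</sup>AC + 10<sup>n/2</sup>(AD + BC) + BD from the identity M*N = (A*10<sup>n/2</sup> + B)(C*10<sup>n/2</sup> + D).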
 
=Gauss Optimization=


Assuming that we divide each integer into two halves, as in the naive recursive method, M = [A B], N = [C D], the Gauss optimization consists of observing that:

<font size=-1>
 (A + B)(C + D) = AC + AD + BC + BD
</font>


The subproducts required to compute M * N are:
<font size=-1>
          A B
          C D
      ───────
        AD BD +
    AC BC
</font>
which is:
<font size=-1>
  10<sup>n</sup> AC + 10<sup>n/2</sup> (AD + BC) + BD
</font>


So if we compute AC, BD and (A+B)(C+D), we can recover AD + BC = (A+B)(C+D) - AC - BD = AC + AD + BC + BD - AC - BD with O(n) additions and subtractions, after performing just '''one''' additional multiplication of sub-parts instead of two.
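The trick in isolation (a minimal numeric check; the function name and the sample half-values are illustrative):

```python
def gauss_ad_plus_bc(a: int, b: int, c: int, d: int) -> int:
    """Recover AD + BC from three products instead of four."""
    ac = a * c
    bd = b * d
    # AC and BD are needed for the final result anyway; the only extra
    # multiplication is (A+B)(C+D).
    return (a + b) * (c + d) - ac - bd

# (A+B)(C+D) - AC - BD equals AD + BC
assert gauss_ad_plus_bc(12, 34, 56, 78) == 12 * 78 + 34 * 56
```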


The key insight of the Karatsuba algorithm is that applying the Gauss optimization reduces the four recursive sub-part multiplications to three. Reducing the number of recursive calls on problem halves from 4 to 3 results in an O(n<sup>log<sub>2</sub>3</sup>) running time, as shown in the [[#Time_Complexity|Time Complexity]] section below. This is the same type of optimization applied by [[Matrix_Multiplication#Strassen|Strassen's matrix multiplication algorithm]].
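Putting the pieces together, a three-multiplication Karatsuba can be sketched as follows (a compact illustration, not an optimized implementation; splitting on the decimal digit count is one common choice):

```python
def karatsuba(m: int, n: int) -> int:
    """Karatsuba multiplication: three recursive calls per level instead of four."""
    # Base case: single-digit factors are multiplied directly.
    if m < 10 or n < 10:
        return m * n
    half = max(len(str(m)), len(str(n))) // 2
    # M = [A B], N = [C D]: M = A*10^half + B, N = C*10^half + D.
    a, b = divmod(m, 10 ** half)
    c, d = divmod(n, 10 ** half)
    # Only three sub-multiplications (a = 3 in master-theorem terms).
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    ad_plus_bc = karatsuba(a + b, c + d) - ac - bd  # Gauss optimization
    return ac * 10 ** (2 * half) + ad_plus_bc * 10 ** half + bd

print(karatsuba(1234, 5678) == 1234 * 5678)  # True
```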


 
=Time Complexity=
If we apply the Gauss optimization, we reduce the number of recursive calls to 3 (a=3), the rest of the parameters remaining the same. a/b<sup>d</sup> = 3/2<sup>1</sup> = 1.5 > 1, so we are again in [[Master_Method#Case_3|Case 3 of the master theorem]], and the running time complexity is O(n<sup>log<sub>2</sub>3</sup>) ≈ O(n<sup>1.585</sup>), which is better than O(n<sup>2</sup>).
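The effect of a=3 can be observed directly by counting the base-case multiplications the recursion performs (a hypothetical counter, not part of the algorithm):

```python
def calls(digits: int) -> int:
    """Count base-case multiplications for the recurrence T(n) = 3*T(n/2)."""
    if digits <= 1:
        return 1
    return 3 * calls(digits // 2)

# Doubling the digit count triples the multiplication count:
# 3^(log2 n) = n^(log2 3).
print([calls(n) for n in (1, 2, 4, 8, 16)])  # [1, 3, 9, 27, 81]
```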


=Playground=

https://github.com/ovidiuf/playground/tree/master/learning/stanford-algorithms-specialization/01-karatsuba

''Latest revision as of 19:11, 9 November 2021''