Karatsuba Multiplication
Revision as of 20:07, 21 September 2021

Overview

Karatsuba multiplication is an integer multiplication algorithm that takes two integers of n digits each. The straightforward iterative solution multiplies the first integer by each digit of the second integer, then shifts and adds the partial results, so it involves two nested loops over n digits, for an O(n^2) time complexity.
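The iterative method above can be sketched as follows. This is an illustrative implementation, not taken from the linked playground; the function name and the least-significant-digit-first representation are choices made here:

```python
def grade_school_multiply(m_digits, n_digits):
    """Grade-school O(n^2) multiplication of two numbers given as digit
    lists, least significant digit first (illustrative representation)."""
    result = [0] * (len(m_digits) + len(n_digits))
    for i, a in enumerate(m_digits):          # outer loop over n digits
        for j, b in enumerate(n_digits):      # inner loop over n digits -> O(n^2)
            result[i + j] += a * b            # accumulate raw partial products
    carry = 0
    for k in range(len(result)):              # single carry-propagation pass, O(n)
        total = result[k] + carry
        result[k] = total % 10
        carry = total // 10
    while len(result) > 1 and result[-1] == 0:  # drop leading zeros
        result.pop()
    return result
```

For example, 1234 × 5678 is `grade_school_multiply([4, 3, 2, 1], [8, 7, 6, 5])`, which returns 7006652 as the digit list `[2, 5, 6, 6, 0, 0, 7]`.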

A naive recursive solution, in the absence of the Gauss optimization, involves dividing each integer M and N into two halves M = [A B], N = [C D], recursively computing AC, AD, BC and BD, then shifting and adding. Applying the master theorem, the number of subproblems is 4 (a=4), the size of each subproblem is n/2 (b=2), and the amount of work required to combine the solutions after the recursive calls is O(n^1), so d=1. Since a/b^d = 4/2^1 = 2 > 1, we find ourselves in Case 3 of the master theorem, where the number of subproblems proliferates more rapidly than the work per subproblem shrinks, so the asymptotic running time bound is O(n^(log_b a)) = O(n^(log_2 4)) = O(n^2), no better than the straightforward iterative method.
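The naive recursive method can be sketched as below. This is an illustrative version operating on Python ints, with num_digits assumed to be a power of 2; the function name is a choice made here:

```python
def naive_recursive_multiply(m, n, num_digits):
    """Recursive multiplication without the Gauss optimization:
    four recursive calls per level (sketch; num_digits is a power of 2)."""
    if num_digits == 1:
        return m * n                      # base case: single-digit product
    half = num_digits // 2
    split = 10 ** half
    a, b = divmod(m, split)               # M = [A B]
    c, d = divmod(n, split)               # N = [C D]
    ac = naive_recursive_multiply(a, c, half)
    ad = naive_recursive_multiply(a, d, half)
    bc = naive_recursive_multiply(b, c, half)
    bd = naive_recursive_multiply(b, d, half)
    # shift and add: M*N = 10^n AC + 10^(n/2) (AD + BC) + BD
    return (10 ** num_digits) * ac + split * (ad + bc) + bd
```

For example, `naive_recursive_multiply(1234, 5678, 4)` returns 7006652.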

Gauss Optimization

Assuming that we divide each integer into two halves, as in the naive recursive method, M = [A B], N = [C D], the Gauss optimization consists in observing that:

(A + B)(C + D) = AC + AD + BC + BD

The subproducts required to compute M * N are:

            A  B
    ×       C  D
    ────────────
           AD BD
  +     AC BC
    ────────────

which is:

 10^n AC + 10^(n/2) (AD + BC) + BD

So if we compute AC, BD and (A+B)(C+D), we can compute AD + BC = (A+B)(C+D) - AC - BD = AC + AD + BC + BD - AC - BD with an O(n) subtraction, after performing just one multiplication of sub-parts instead of two.
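A quick numeric check of the trick, using M = 1234, N = 5678 (example values chosen here for illustration), so A = 12, B = 34, C = 56, D = 78:

```python
# Gauss trick, checked numerically on example values A=12, B=34, C=56, D=78
A, B, C, D = 12, 34, 56, 78
ac = A * C                             # 672
bd = B * D                             # 2652
cross = (A + B) * (C + D) - ac - bd    # one multiplication plus subtractions
assert cross == A * D + B * C          # recovers AD + BC = 2840
```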

The key insight of the Karatsuba algorithm is that by applying the Gauss optimization we reduce the four recursive sub-part multiplications to three. This reduces the number of recursive calls applied to problem halves from 4 to 3, which results in an O(n^(log 3)) running time, as shown in the Time Complexity section below. This is the same type of optimization applied in the case of Strassen's matrix multiplication algorithm.
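Putting the pieces together, the Karatsuba algorithm can be sketched as follows. As with the earlier sketch, this is an illustrative version on Python ints with num_digits assumed to be a power of 2, not the code from the linked playground:

```python
def karatsuba(m, n, num_digits):
    """Karatsuba multiplication sketch: three recursive multiplications
    per level instead of four (num_digits is a power of 2)."""
    if num_digits == 1:
        return m * n                      # base case: exact product
    half = num_digits // 2
    split = 10 ** half
    a, b = divmod(m, split)               # M = [A B]
    c, d = divmod(n, split)               # N = [C D]
    ac = karatsuba(a, c, half)
    bd = karatsuba(b, d, half)
    # one multiplication replaces the two separate calls for AD and BC
    gauss = karatsuba(a + b, c + d, half)
    ad_plus_bc = gauss - ac - bd          # Gauss optimization: O(n) subtraction
    return (10 ** num_digits) * ac + split * ad_plus_bc + bd
```

For example, `karatsuba(1234, 5678, 4)` returns 7006652. Note that a + b and c + d may carry over into an extra digit; the sketch stays correct because the base case computes an exact product regardless of operand size.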

Time Complexity

If we apply the Gauss optimization, we reduce the number of recursive calls to 3 (a=3), the rest of the parameters remaining the same. Since a/b^d = 3/2^1 > 1, we are again in Case 3 of the master theorem, and the running time complexity is O(n^(log_2 3)) = O(n^(log 3)) ≈ O(n^1.585), which is better than O(n^2).
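The same bound can be obtained directly by unrolling the recurrence, where each level triples the number of subproblems while halving their size; a sketch of the derivation:

```latex
T(n) \le 3\,T\!\left(\tfrac{n}{2}\right) + c\,n
     \le c\,n \sum_{j=0}^{\log_2 n} \left(\tfrac{3}{2}\right)^j
     = O\!\left(n \cdot \left(\tfrac{3}{2}\right)^{\log_2 n}\right)
     = O\!\left(3^{\log_2 n}\right)
     = O\!\left(n^{\log_2 3}\right) \approx O\!\left(n^{1.585}\right)
```

The geometric sum is dominated by its last term, i.e. by the work at the leaves, which is characteristic of Case 3 of the master theorem.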

Playground

https://github.com/ovidiuf/playground/tree/master/learning/stanford-algorithms-specialization/01-karatsuba