Pi19404
July 28, 2013
Contents
Algorithms and Integer Multiplication
0.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . 3
0.2 Recursive method for Integer Multiplication . . . . . . . . . 3
0.3 Karatsuba Multiplication / Gauss Trick . . . . . . . . . . . 5
    0.3.1 Implementation . . . . . . . . . . . . . . . . . . . . 6
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
0.2 Recursive method for Integer Multiplication

Let x and y be two N-digit numbers. Writing the upper and lower halves of their digits as a, b and c, d respectively, we have

x = 10^(N/2) a + b    (2)
y = 10^(N/2) c + d    (3)
For simplicity we are assuming that the numbers have an even number of digits. We are required to compute
x · y = 10^N (a·c) + 10^(N/2) (a·d + b·c) + (b·d)    (4)
All the quantities being multiplied here have N/2 digits, so each of these products can in turn be computed using the same approach.
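The split and the identity above can be checked directly. The following is an illustrative sketch (the helper names are hypothetical, not taken from the article's code):

```haskell
-- Split an n-digit number x into (a, b) with x = 10^(n `div` 2) * a + b.
-- Assumes n is even, as in the text.
splitNum :: Int -> Integer -> (Integer, Integer)
splitNum n x = x `divMod` (10 ^ (n `div` 2))

-- Check the identity x*y = 10^n*(a*c) + 10^(n/2)*(a*d + b*c) + b*d.
checkIdentity :: Int -> Integer -> Integer -> Bool
checkIdentity n x y =
  let (a, b) = splitNum n x
      (c, d) = splitNum n y
      h      = n `div` 2
  in x * y == 10 ^ n * (a * c) + 10 ^ h * (a * d + b * c) + b * d
```

For example, `checkIdentity 4 1234 5678` splits 1234 into (12, 34) and 5678 into (56, 78) and evaluates to True.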
Thus the idea is to recursively compute ac, ad, bc and bd, each of which is a smaller subproblem: a multiplication of two N/2-digit numbers. We compute these products and add them suitably to get the result. We need to evaluate how this algorithm compares with the classical multiplication algorithm. Let T(N) denote the worst-case running time of the algorithm on N-digit numbers. Since the algorithm is recursive, T(N) is expressed in terms of the running times of the recursive calls. Consider the simplest case first: multiplying two single-digit numbers takes constant time, and this is the base case.
T(1) = O(1)

Apart from the recursive calls we also perform some addition operations, which take O(N) time:

T(N) <= 4 T(N/2) + O(N)    (5)
There are 4 multiplications of size N/2: a problem of size N is divided into 4 subproblems of size N/2, whose results are then combined by adding (shifted) N/2-digit numbers. Addition of two N-digit numbers takes approximately linear time. The computation required is thus expressed as T(N) <= 4 T(N/2) + O(N), which again needs to be expressed in terms of N alone. This is done using the master method. In general the computation required can be expressed as
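The four-way recursion described above can be sketched directly in Haskell. This is an illustrative sketch assuming non-negative integers (not the article's code):

```haskell
-- Naive divide-and-conquer multiplication using all four products
-- ac, ad, bc, bd.  Non-negative operands assumed; recursion bottoms
-- out at single-digit multiplications.
recMul :: Integer -> Integer -> Integer
recMul x y
  | x < 10 || y < 10 = x * y          -- base case: single-digit multiply
  | otherwise =
      let n      = max (numDigits x) (numDigits y)
          h      = n `div` 2
          (a, b) = x `divMod` (10 ^ h)   -- x = 10^h * a + b
          (c, d) = y `divMod` (10 ^ h)   -- y = 10^h * c + d
          ac = recMul a c
          ad = recMul a d
          bc = recMul b c
          bd = recMul b d
      in 10 ^ (2 * h) * ac + 10 ^ h * (ad + bc) + bd
  where numDigits = length . show
```

Because `divMod` guarantees x = 10^h·a + b exactly, the combining identity holds even when the operands have different or odd digit counts.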
T(N) <= a T(N/b) + O(N^d)
T(N) = O(1) for small N

where a >= 1 is the number of subproblems, b > 1 is the factor by which the input size shrinks, and d >= 0 is the exponent of the work done to combine the results.
The input to the master method is a recurrence relation, and the master method gives an upper bound on the running time of the algorithm. In general the bound depends on the parameters a, b and d:
T(N) = O(N^d log N)       if a = b^d
T(N) = O(N^d)             if a < b^d
T(N) = O(N^(log_b a))     if a > b^d

For the recursive multiplication algorithm a = 4, b = 2, d = 1, and since a > b^d,

T(N) = O(N^(log_2 4)) = O(N^2)
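The three cases can be evaluated mechanically. The helper below is a hypothetical illustration, not part of the multiplication code:

```haskell
-- Evaluate the master-method bound for T(N) <= a*T(N/b) + O(N^d).
-- Returns a description of the bound and the exponent e in O(N^e)
-- (ignoring the log factor in the first case).  Exact floating-point
-- comparison is acceptable for the small integer parameters used here.
masterExponent :: Double -> Double -> Double -> (String, Double)
masterExponent a b d
  | a == b ** d = ("O(N^d log N)", d)
  | a <  b ** d = ("O(N^d)", d)
  | otherwise   = ("O(N^(log_b a))", logBase b a)
```

For instance, `masterExponent 4 2 1` lands in the third case with exponent log_2 4 = 2, recovering the O(N^2) bound above.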
0.3 Karatsuba Multiplication / Gauss Trick

The product

x · y = 10^N (a·c) + 10^(N/2) (a·d + b·c) + (b·d)    (6)

requires the quantities (a·c), (a·d + b·c) and (b·d); it does not require the quantities ad and bc explicitly, just their sum. If we consider the multiplication

(a + b) · (c + d) = ac + ad + bc + bd

then

a·d + b·c = (a + b) · (c + d) - ac - bd
Instead of 4 multiplications we can thus compute the required components using 3 multiplications of N/2-digit numbers and a few extra addition/subtraction operations. In this case we have divided the N-digit problem into 3 subproblems of N/2 digits. The computation required can be expressed as

T(N) <= 3 T(N/2) + O(N)

which again needs to be expressed in terms of N. Using the master method with a = 3, b = 2, d = 1 we have a > b^d, and

T(N) = O(N^(log_2 3)) = O(N^1.58)
This method thus provides a complexity O(N^1.58), which is less than the O(N^2) of the classical algorithm.
The base algorithm takes time of the form M(N) = aN^2 + bN + c, while the new algorithm takes 3 M(N/2) + dN + e, which comes out to (3/4)aN^2 + (3/2)bN + dN + e + 3c. Thus at each level of recursion the quadratic (N^2) term is scaled by a factor of 3/4 and the original linear (N) term by a factor of 3/2.
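The algebraic expansion above can be checked with sample coefficients (the values are arbitrary, purely illustrative):

```haskell
-- M(N) = a*N^2 + b*N + c, the cost model of the base algorithm.
mQuad :: Double -> Double -> Double -> Double -> Double
mQuad a b c n = a * n * n + b * n + c

-- Check that 3*M(N/2) + d*N + e equals (3/4)aN^2 + (3/2)bN + dN + e + 3c.
expansionHolds :: Double -> Double -> Double -> Double -> Double -> Double -> Bool
expansionHolds a b c d e n =
  let lhs = 3 * mQuad a b c (n / 2) + d * n + e
      rhs = 0.75 * a * n * n + 1.5 * b * n + d * n + e + 3 * c
  in abs (lhs - rhs) < 1e-9
```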
0.3.1 Implementation
I'll be implementing all the algorithms using Haskell. Since recursion is the basic looping mechanism in Haskell, it fits naturally within the recursive structure of divide-and-conquer algorithms. The code can be found at https://github.com/pi19404/m19404/blob/master/Algorithm/multiplication/mul1.hs
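The linked file is not reproduced here; a minimal sketch of the Karatsuba scheme in Haskell might look like the following (illustrative names, non-negative integers assumed, not the actual mul1.hs):

```haskell
-- Karatsuba multiplication: three recursive multiplications
-- ac, bd and (a+b)(c+d), recovering ad + bc as (a+b)(c+d) - ac - bd.
karatsuba :: Integer -> Integer -> Integer
karatsuba x y
  | x < 10 || y < 10 = x * y                 -- single-digit base case
  | otherwise =
      let h      = max (nd x) (nd y) `div` 2
          (a, b) = x `divMod` (10 ^ h)       -- x = 10^h * a + b
          (c, d) = y `divMod` (10 ^ h)       -- y = 10^h * c + d
          ac     = karatsuba a c
          bd     = karatsuba b d
          mid    = karatsuba (a + b) (c + d) - ac - bd   -- = ad + bc
      in 10 ^ (2 * h) * ac + 10 ^ h * mid + bd
  where nd = length . show
```

The recursive calls on (a + b) and (c + d) operate on numbers of roughly half the digits, so the recursion makes exactly the 3 subproblems counted in the analysis above.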
Bibliography
[1] A Gentle Introduction to Haskell: Functions. url: http://www.haskell.org/tutorial/functions.html