
Merge Sort

• Sorting schemes can be classified as internal (sorting data already in memory)
  and external (sorting collections of data held on secondary storage).

• So far we have been looking at internal schemes.

• What if we want to sort, say, records from files? The sorts covered so far are
  not the best way to do this: they are inefficient, especially for the large data
  sets we expect to see in files.

• Merge Sort is a sorting algorithm that is useful for both internal and external
  sorting.

• We will cover Merge Sort for internal sorting of array data for now, but the
  extension to external sorting is straightforward.

Programming and Data Structures 1

Merge Sort (2)


• Many useful algorithms are recursive.

• These algorithms employ a divide and conquer approach.

• Think of merging two lists of sorted numbers.

• All that is needed is to look at the top items and add the
  appropriate one each time.

• The merge sort algorithm closely follows the Divide and
  Conquer paradigm. It operates as follows:

Merge Sort – Divide & Conquer
• Divide: Divide the n-element sequence to be sorted into two
  subsequences of n/2 elements each.

• Conquer: Sort the two subsequences recursively (i.e. by using
  merge sort again on the subsequences).

• Combine: Merge the two sorted subsequences to produce the
  sorted answer.

• This procedure bottoms out when the list to be sorted is of length 1,
  so we must take this into account in our algorithm design.


Merge Sort – Visual Example


  75 55 15 20 85 30 35 10 60 40 50 25 45 80 70 65

• Divide the list into 2 sublists, so at the next level of recursion we have:

  75 55 15 20 85 30 35 10   Sub-list 1

  60 40 50 25 45 80 70 65   Sub-list 2

• We keep splitting our sublists into smaller sublists until we can't split any further
  (each sublist is sorted before merging by recursively calling ourselves).

• Eventually we end up with distinct pairs of already sorted sublists which need to be
  merged together:

  75 55   15 20   85 30   35 10   60 40   50 25   45 80   70 65

Merge Sort – Visual Example (2)
• After the first merge we end up with half as many sublists as before the merge:

  55 75   15 20   30 85   10 35   40 60   25 50   45 80   65 70

• We again merge the subsequent sorted sublists together, thus reducing the number
  of sublists to half again:

  15 20 55 75   10 30 35 85   25 40 50 60   45 65 70 80

• Repeating the same procedure again, we end up with 2 sorted sublists to be merged:

  10 15 20 30 35 55 75 85   25 40 45 50 60 65 70 80

• Merging these two sorted sublists together, we end up with our original list sorted!

  10 15 20 25 30 35 40 45 50 55 60 65 70 75 80 85


Merge Sort – Pseudo Code


• If we leave the merge procedure as a subroutine, the MergeSort
  algorithm can be written very simply as:

MergeSort(List[], leftIndex, rightIndex)
Begin:
  if leftIndex < rightIndex then
    mid = (leftIndex + rightIndex) / 2
    MergeSort(List[], leftIndex, mid)         // sort left sublist
    MergeSort(List[], mid + 1, rightIndex)    // sort right sublist
    Merge(List[], leftIndex, mid, rightIndex) // merge sorted sublists
  endif
End
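The pseudo code above translates directly into a short runnable sketch. This Python version (function and variable names are my own) uses a straightforward two-pointer merge as the subroutine:

```python
def merge(a, left, mid, right):
    """Merge the sorted runs a[left..mid] and a[mid+1..right]."""
    result = []
    i, j = left, mid + 1
    while i <= mid and j <= right:
        if a[i] <= a[j]:               # <= keeps the sort stable
            result.append(a[i]); i += 1
        else:
            result.append(a[j]); j += 1
    result.extend(a[i:mid + 1])        # one of these two slices is empty
    result.extend(a[j:right + 1])
    a[left:right + 1] = result

def merge_sort(a, left=0, right=None):
    if right is None:
        right = len(a) - 1
    if left < right:
        mid = (left + right) // 2
        merge_sort(a, left, mid)       # sort left sublist
        merge_sort(a, mid + 1, right)  # sort right sublist
        merge(a, left, mid, right)     # merge sorted sublists
```

Calling `merge_sort` on the list from the visual example sorts it in place.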

Stable Sort
• The prime disadvantage of mergesort is that extra space
  proportional to N is needed in a straightforward
  implementation.

• Definition: A sorting method is said to be "stable" if it
  preserves the relative order of duplicate keys in the file.

• If an alphabetised list of students is sorted by year of
  graduation, a stable method produces a list in which people in
  the same class are still in alphabetical order, but a non-stable
  method is likely to produce a list with no vestige of the
  original alphabetic order.


Stable Sorting (2)


(a) Initial list sorted on name only

  Name        Year
  Adams       1
  Black       2
  Brown       4
  Jackson     2
  Jones       4
  Smith       1
  Thompson    4
  Washington  2
  White       3
  Wilson      3

(b) Unstable Year Sort        (c) Stable Year Sort

  Name        Year              Name        Year
  Adams       1                 Adams       1
  Smith       1                 Smith       1
  Washington  2                 Black       2
  Jackson     2                 Jackson     2
  Black       2                 Washington  2
  White       3                 White       3
  Wilson      3                 Wilson      3
  Thompson    4                 Brown       4
  Brown       4                 Jones       4
  Jones       4                 Thompson    4
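The stable year sort in table (c) can be reproduced with any stable sorting routine. As a quick illustration (this demo is my own; Python's built-in sort is guaranteed stable):

```python
# The graduating-class example: a list already in alphabetical order.
students = [("Adams", 1), ("Black", 2), ("Brown", 4), ("Jackson", 2),
            ("Jones", 4), ("Smith", 1), ("Thompson", 4), ("Washington", 2),
            ("White", 3), ("Wilson", 3)]

# Sorting on year alone: a stable sort keeps names within the same
# year in their original (alphabetical) order, as in table (c).
by_year = sorted(students, key=lambda s: s[1])
```

An unstable method would give no such guarantee about the order within each year, as table (b) shows.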

Stable Sorting Algorithms
• Most of the simple sorting methods are stable. However,
many of the sophisticated algorithms are not stable.

• The following is a table of the sorting algorithms you have
  met or mentioned, showing those that are stable.

  Insertion Sort   Stable
  Selection Sort   Unstable
  Bubble Sort      Stable
  Shell Sort       Unstable
  Quicksort        Unstable
  Merge Sort       Stable

• Most of the unstable sorts can be made stable by indexing
  or some other method, but this adds to their complexity.


Merging
• Given two ordered input files, we can combine them into one ordered
  output file simply by keeping track of the smallest element in each file
  and entering a loop where the smaller of the two elements is moved to
  the output.

• If merge sort is implemented using a "stable merge", then the result of
  the sort is stable.

• The method of merging to be presented is the "abstract in-place merge".

• This algorithm merges by copying the second array into an array aux
  (auxiliary array), in reverse order, back to back with the first.

• It is a stable merge procedure.

Abstract In-Place Merge - Example
• Let's take a sublist from our MergeSort example: 15 20 55 75 10 30 35 85

• We fill the auxiliary array with the contents of both sub-arrays in a specific
  order: the left half as normal, followed by the right half in reverse.

  15 20 55 75   10 30 35 85

  Auxiliary array:  15 20 55 75 85 35 30 10

• Now we work inwards from the two end points of the auxiliary array, doing our
  comparisons to sort and merge back into our original list.


Abstract In-Place Merge – Example (2)


At each step the values at the two ends of the auxiliary array are compared and
the smaller is moved back into the original array:

  Auxiliary array (remaining)     Part of original array (merged so far)

  15 20 55 75 85 35 30 10         10
  15 20 55 75 85 35 30            10 15
  20 55 75 85 35 30               10 15 20
  55 75 85 35 30                  10 15 20 30
  55 75 85 35                     10 15 20 30 35
  55 75 85                        10 15 20 30 35 55
  75 85                           10 15 20 30 35 55 75
  85                              10 15 20 30 35 55 75 85

Merging – Pseudo Code
Merge(A, left, mid, right)
Begin
  Item aux[maxN]

  for i = mid+1 downto left+1   // copy the left half as normal
    aux[i-1] = A[i-1]
  endfor
  for j = mid to right-1        // go forwards, but copy the right
    aux[right+mid-j] = A[j+1]   // half into aux in reverse order
  endfor
  i = left                      // i scans forwards from the left end
  j = right                     // j scans backwards from the right end
  for k = left to right
    if aux[i] <= aux[j] then    // compare the two end values (<= takes from
      A[k] = aux[i]             // the left half on ties, keeping the merge
      i = i + 1                 // stable); put the appropriate value back
    else                        // into our array, then update the
      A[k] = aux[j]             // appropriate index pointer
      j = j - 1
    endif
  endfor
End
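A runnable sketch of the abstract in-place merge in Python (a hedged translation of the pseudo code above; the slicing shorthand for the two copy loops is my own):

```python
def merge(a, left, mid, right):
    # Copy the left half as-is and the right half in reverse order,
    # so the two ends of aux act as sentinels for each other.
    aux = a[left:mid + 1] + a[mid + 1:right + 1][::-1]
    i, j = 0, len(aux) - 1        # i scans forwards, j scans backwards
    for k in range(left, right + 1):
        if aux[i] <= aux[j]:      # <= takes from the left half on ties,
            a[k] = aux[i]         # which keeps the merge stable
            i += 1
        else:
            a[k] = aux[j]
            j -= 1
```

Running it on the sublist from the example (15 20 55 75 | 10 30 35 85) merges the two halves in place.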


Abstract In-place Merge


• This is known as an abstract in-place merge because the procedure
  uses a local auxiliary array only and merges the two halves of the input
  array.

• The first two loops copy the halves into the auxiliary array in a
  particular manner.
  - The first loop copies from the mid-point down to the first array
    position as normal.
  - The second, however, copies from position mid+1 to the final position,
    but in reverse order.

• The last loop then does the sorted merging by comparing the data
  at either end of the auxiliary array and moving the auxiliary
  array start and end pointers on as appropriate.

Analysis of Merge Sort
• The running time of a recursive algorithm is often described by a
recurrence equation.

• A recurrence for the running time of a divide and conquer algorithm is
  based on the three steps of the basic paradigm.

• Divide: The divide step just computes the middle of the ‘subarray’,
which takes constant time, O(1)

• Conquer: We recursively solve two sub-problems, each of size n/2,
  which contributes 2T(n/2) to the running time.

• Combine: The Merge procedure will check the top items from the
two sorted arrays and therefore should not have to check more than n
items. Therefore, the Merge procedure will contribute O(n) to the
running time.


Analysis of Merge Sort (2)


• For Merge Sort, then, we can write the running time function as:
  T(n) = O(1)                     if n = 1,
  T(n) = 2T(n/2) + O(n) + O(1)    if n > 1.

• When we add O(n) and O(1) we are adding a linear function of n, that
  is O(n). Therefore the recurrence relation becomes:
  T(n) = O(1)                     if n = 1,
  T(n) = 2T(n/2) + O(n)           if n > 1.

• Using the iteration method to solve this recurrence relation we can
  work out that:
  T(n) = O(n lg n)
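The iteration method can be sketched as follows (this unrolling is illustrative, writing the O(n) term as cn for some constant c and assuming n is a power of 2):

```latex
\begin{aligned}
T(n) &= 2T(n/2) + cn \\
     &= 4T(n/4) + 2cn \\
     &= 8T(n/8) + 3cn \\
     &\;\;\vdots \\
     &= 2^{k}\,T\!\left(n/2^{k}\right) + k\,cn \\
     &= n\,T(1) + cn\lg n \qquad \text{(taking } k = \lg n \text{, so } n/2^{k} = 1\text{)} \\
     &= O(n \lg n)
\end{aligned}
```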

Analysis of Merge Sort (3)
• So Merge Sort, with its Θ(n lg n) running time, outperforms insertion sort,
  whose running time is Θ(n^2).

• There are advantages and disadvantages with all algorithms, including
  merge sort.

• Merge sort runs in O(n lg n) time no matter what the input. This can be an
  advantage, but in some cases it can be a liability.

• For example, Quicksort is also O(n lg n) on average, but in special cases it
  can often improve on that.

• Merge sort is always guaranteed to have upper bound O(n lg n).

• Merge sort, however, is a stable algorithm, whereas Quicksort is not stable.


QuickSort – A Brief Discussion


• One of the most popular ways of sorting.

• Utilises a Divide and Conquer approach to progressively break the data into
  manageable chunks.

• QuickSort is another O(n lg n) (on average) sorting algorithm.

• The algorithm is recursive, like MergeSort.

• Basic idea: choose a pivot and move all values less than the pivot to the left
  of the pivot and all values greater than the pivot to the right of the pivot.

• This produces two further sublists minus the pivot value (at this stage the
  pivot value is sorted into the right place in the list), which in turn get
  solved in the same way.

• The algorithm continues like this until the list is entirely sorted.
QuickSort – Visual Example
4 2 6 9 3 1 5 8 7 10

• Initial pivot: 4
• Scanning from left to right, placing values left or right of the pivot as appropriate:

  2 3 1 4 10 7 8 5 9 6

• We now have two sublists which we deal with in a similar manner:

  2 3 1   4   10 7 8 5 9 6

• We choose 2 to be the pivot point in the 1st sublist and 10 to be the pivot point in the 2nd sublist and
apply the same procedure as before to each sublist, thus breaking them down into further sublists
and placing the current pivots into their appropriate places in the list.
• All values in the list will eventually act as a pivot to some sublist and be placed into the appropriate
place in the list due to the sorting process, thus ordering the list.
1 2 3 4 5 6 7 8 9 10


QuickSort – Pseudo Code

Begin
Quicksort(List[], lptr, rptr) {           // take in a list, a left (start) and a right (end) pointer
  if (lptr < rptr) {                      // if there is more than one element in the list
    ppoint = Partition(List, lptr, rptr)  // split into two sublists; return where in the main
                                          // list the pivot resides after the split
    Quicksort(List, lptr, ppoint - 1)     // Quicksort the left sublist
    Quicksort(List, ppoint + 1, rptr)     // Quicksort the right sublist
  }
  // else there is only one element in the list so there is no need to sort
}
End

The Partition Method – Pseudo Code
Begin
partition(List[], left, right) {
  i = left            // Init i to the start of the sublist.
  j = right + 1       // Init j to one place past the end of the sublist (see second inner loop).
  pivot = List[left]  // Let the pivot be the first value in our sublist.
  do {
    do {              // Scan from left to right until we find a value >= the pivot.
      i = i + 1
    } while (List[i] < pivot)
    do {              // Scan from right to left until we find a value <= the pivot.
      j = j - 1       // Since we decrement before checking, this is why j = right + 1 above.
    } while (List[j] > pivot)
    if (i < j)                  // If our search is not exhausted yet – i.e. the left pointer is
      swap(List[i], List[j])    // still left of the right pointer – swap the two pointed-to values.
                                // This builds two sublists: one of values less than the pivot
                                // and one of values greater than the pivot.
  } while (i < j)     // Keep doing this while our search is not exhausted. Each iteration
                      // resumes from where the left and right pointers left off.

  swap(List[left], List[j])  // Put the pivot into its place – the right pointer stopped there.
  return j                   // Return the position in which the pivot value now resides.
}
End
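The two routines above can be combined into a runnable Python sketch. This is a hedged translation of the pseudo code (names are my own); one addition is a bounds check on `i`, since the pseudo code's left-to-right scan otherwise relies on a sentinel when the pivot is the largest value:

```python
def partition(lst, left, right):
    pivot = lst[left]                 # pivot is the first value of the sublist
    i, j = left, right + 1
    while True:
        i += 1
        while i <= right and lst[i] < pivot:   # scan right for a value >= pivot
            i += 1
        j -= 1
        while lst[j] > pivot:                  # scan left for a value <= pivot
            j -= 1
        if i >= j:                    # pointers crossed: search exhausted
            break
        lst[i], lst[j] = lst[j], lst[i]
    lst[left], lst[j] = lst[j], lst[left]      # pivot into its final place
    return j

def quicksort(lst, left=0, right=None):
    if right is None:
        right = len(lst) - 1
    if left < right:
        p = partition(lst, left, right)
        quicksort(lst, left, p - 1)   # Quicksort the left sublist
        quicksort(lst, p + 1, right)  # Quicksort the right sublist
```

Calling `quicksort` on the list from the visual example sorts it in place.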


Sorting in Linear Time


• All the sorting algorithms covered so far required that we compare values and then swap them
  in order to do our sorting.

• Is it possible to sort without having to compare and swap values?

• Answer: Yes.

• There are a number of linear-time sorting algorithms that do not require you to compare and
  swap values.

• These algorithms work by either finding a pattern in the data or grouping the data to be
  sorted according to some criteria.

• Three such algorithms are:
  - Bucket Sort (also known as Bin Sort)
  - Radix Sort (essentially a multi-phase bucket sort algorithm)
  - Counting Sort (sorts by counting the frequency of occurrences)

• Let's have a brief look at bucket sort (the basic idea is presented, but pseudo code and code
  are left as an exercise).

Bucket Sort
• Find a common key in the data to be sorted and place each value into a "bucket" set aside for
  that key.

• Remove data from the buckets in key order. This sorts the data for us.

• Example: 3 2 5 10 7 9 6 8 1 8   Our list to be sorted

• Since the data to be sorted is in the range 1 to 10, we can set up our buckets to "store"
  values in the range 1 to 10.
• Values that match a bucket's "criterion" are placed into that bucket appropriately.

  Criteria:  10   9   8    7   6   5   4   3   2   1
  Buckets:   10   9   8,8  7   6   5   –   3   2   1

• Removing the data from the buckets, lowest criterion first, sorts the list for us:

  1 2 3 5 6 7 8 8 9 10
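Although the code is left as an exercise, the basic idea above can be sketched as follows (a minimal illustration, assuming integer keys and one bucket per key; names are my own):

```python
def bucket_sort(a, num_buckets=10):
    # One bucket per key, assuming integer keys in the range 1..num_buckets.
    buckets = [[] for _ in range(num_buckets)]
    for x in a:
        buckets[x - 1].append(x)   # place each value into its bucket
    out = []
    for b in buckets:              # empty the buckets from the lowest
        out.extend(b)              # criterion to the highest
    return out
```

With more values than buckets, each bucket would hold a range of keys and be sorted individually before the buckets are emptied.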

Counting Sort
• Counting sort assumes that each of the n input elements is an integer in the
  range 1 to k.

• The basic idea is to determine, for each value, how many elements are less
  than or equal to it. This count tells us directly where the value belongs in
  the output list.

• Counting sort uses three arrays: the input array A, an output array B of the
  same length, and a counting array C of length k.

• Counting sort runs in O(n + k) time, which is O(n) when k = O(n).

Counting Sort – Pseudo Code

CountingSort(A[], B[], k)
Begin
  for i = 1 to k                // initialise all counts to zero
    C[i] = 0
  endfor
  for j = 1 to length(A)        // C[i] now holds the number of
    C[A[j]] = C[A[j]] + 1       // elements equal to i
  endfor
  for i = 2 to k                // prefix sums: C[i] now holds the
    C[i] = C[i] + C[i-1]        // number of elements <= i
  endfor
  for j = length(A) downto 1    // place each element into its final
    B[C[A[j]]] = A[j]           // position, working back to front
    C[A[j]] = C[A[j]] - 1       // to keep the sort stable
  endfor
  // Array B now holds the sorted sequence
End

Counting Sort – Example

• Our unsorted list A, with values in the range 1 to 6:

  A:  3 6 4 1 3 4 1 4

• We initialise the counting array C (length 6) to zero:

  C:  0 0 0 0 0 0

• After counting the occurrences of each value in A:

  C:  2 0 2 3 0 1

• After the prefix-sum pass, C[i] holds the number of elements less than or
  equal to i:

  C:  2 2 4 7 7 8

• We now scan A from back to front, placing each value A[j] into position
  C[A[j]] of B and then decrementing C[A[j]]:

  A[8] = 4:  B = _ _ _ _ _ _ 4 _    C = 2 2 4 6 7 8
  A[7] = 1:  B = _ 1 _ _ _ _ 4 _    C = 1 2 4 6 7 8
  A[6] = 4:  B = _ 1 _ _ _ 4 4 _    C = 1 2 4 5 7 8
  A[5] = 3:  B = _ 1 _ 3 _ 4 4 _    C = 1 2 3 5 7 8
  A[4] = 1:  B = 1 1 _ 3 _ 4 4 _    C = 0 2 3 5 7 8
  A[3] = 4:  B = 1 1 _ 3 4 4 4 _    C = 0 2 3 4 7 8
  A[2] = 6:  B = 1 1 _ 3 4 4 4 6    C = 0 2 3 4 7 7
  A[1] = 3:  B = 1 1 3 3 4 4 4 6    C = 0 2 2 4 7 7

• The list B is the unsorted list A, now sorted!

• Counting sort is also a stable sorting algorithm.
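The pseudo code above translates into a short runnable Python sketch (names are my own; 0-based list indexing replaces the 1-based positions of the pseudo code):

```python
def counting_sort(a, k):
    """Sort a list of integers in the range 1..k; returns a new list."""
    c = [0] * (k + 1)             # c[0] unused so values index directly
    for x in a:                   # count occurrences of each value
        c[x] += 1
    for i in range(2, k + 1):     # prefix sums: c[i] = # of values <= i
        c[i] += c[i - 1]
    b = [0] * len(a)
    for x in reversed(a):         # back to front keeps the sort stable
        b[c[x] - 1] = x           # c[x] is a 1-based final position
        c[x] -= 1
    return b
```

Running it on the example list A reproduces the sorted list B from the walkthrough.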