
Sorting Algorithms

Bubble sort
Compare each element (except the last one) with its neighbor to the right
If they are out of order, swap them. This puts the largest element at the very end. The last element is now in its correct and final place.

Compare each element (except the last two) with its neighbor to the right
If they are out of order, swap them. This puts the second-largest element next to last. The last two elements are now in their correct and final places.

Compare each element (except the last three) with its neighbor to the right
Continue as above until there are no unsorted elements on the left

Example of bubble sort


7 2 8 5 4
2 7 8 5 4
2 7 8 5 4
2 7 5 8 4
2 7 5 4 8    (end of pass 1: 8 is in place)

2 7 5 4 8
2 5 7 4 8
2 5 4 7 8    (end of pass 2: 7 is in place)

2 5 4 7 8
2 4 5 7 8    (end of pass 3: 5 is in place)

2 4 5 7 8
(done)

Code for bubble sort


public static void bubbleSort(int[] a) {
    int outer, inner;
    for (outer = a.length - 1; outer > 0; outer--) {  // counting down
        for (inner = 0; inner < outer; inner++) {     // bubbling up
            if (a[inner] > a[inner + 1]) {            // if out of order...
                int temp = a[inner];                  // ...then swap
                a[inner] = a[inner + 1];
                a[inner + 1] = temp;
            }
        }
    }
}

Analysis of bubble sort


for (outer = a.length - 1; outer > 0; outer--) {
    for (inner = 0; inner < outer; inner++) {
        if (a[inner] > a[inner + 1]) {
            // code for swap omitted
        }
    }
}

Let n = a.length = size of the array
The outer loop is executed n-1 times (call it n, that's close enough)
Each time the outer loop is executed, the inner loop is executed
The inner loop executes n-1 times at first, linearly dropping to just once
On average, the inner loop executes about n/2 times for each execution of the outer loop
In the inner loop, the comparison is always done (constant time); the swap might be done (also constant time)
Result is n * n/2 + k, that is, O(n²/2 + k) = O(n²)

Loop invariants
You run a loop in order to change things. Oddly enough, what is usually most important in understanding a loop is finding an invariant: that is, a condition that doesn't change. In bubble sort, we put the largest elements at the end, and once we put them there, we don't move them again.
The variable outer starts at the last index in the array and decreases to 0. Our invariant is: every element to the right of outer is in the correct place. That is, for all j > outer, if i < j, then a[i] <= a[j]. When this is combined with outer == 0, we know that all elements of the array are in the correct place.

Selection sort
Given an array of length n,
Search elements 0 through n-1 and select the smallest
Swap it with the element in location 0

Search elements 1 through n-1 and select the smallest


Swap it with the element in location 1

Search elements 2 through n-1 and select the smallest


Swap it with the element in location 2

Search elements 3 through n-1 and select the smallest


Swap it with the element in location 3

Continue in this fashion until there's nothing left to search



Example and analysis of selection sort


7 2 8 5 4
2 7 8 5 4
2 4 8 5 7
2 4 5 8 7
2 4 5 7 8

The selection sort might swap an array element with itself; this is harmless, and not worth checking for.
Analysis:
The outer loop executes n-1 times
The inner loop executes about n/2 times on average (from n down to 2 times)
Work done in the inner loop is constant (swap two array elements)
Time required is roughly (n-1)*(n/2)
You should recognize this as O(n²)

Code for selection sort


public static void selectionSort(int[] a) {
    int outer, inner, min;
    for (outer = 0; outer < a.length - 1; outer++) {  // outer counts up
        min = outer;
        for (inner = outer + 1; inner < a.length; inner++) {
            if (a[inner] < a[min]) {
                min = inner;
            }
        }
        // a[min] is least among a[outer]..a[a.length - 1]
        int temp = a[outer];
        a[outer] = a[min];
        a[min] = temp;
        // Invariant: for all i <= outer, if i < j then a[i] <= a[j]
    }
}

Insertion sort
sorted portion                     next to be inserted
3 4 7 12 14 14 20 21 33 38   |   10 55 9 23 28 16

temp = 10; the elements 3 4 7 are less than 10, so shift the rest of the sorted portion right one position and insert 10:

3 4 7 10 12 14 14 20 21 33 38   |   55 9 23 28 16
sorted portion

while some elements are unsorted:
Using linear search, find the location in the sorted portion where the first element of the unsorted portion should be inserted
Move all the elements after the insertion location up one position to make space for the new element

An insertion sort of an array of five integers



Code for insertion sort


void insertionSort(vector<int> &a) {
    int j;
    for (int p = 1; p < a.size(); ++p) {
        int tmp = a[p];
        for (j = p; j > 0 && tmp < a[j-1]; j--)   /* compare */
            a[j] = a[j-1];                        /* move */
        a[j] = tmp;                               /* insert */
    }
}


Analysis of insertion sort


We run once through the outer loop, inserting each of n elements; this is a factor of n
On average, there are n/2 elements already sorted
The inner loop looks at (and moves) half of these; this gives a second factor of n/4

Hence, the time required for an insertion sort of an array of n elements is proportional to n²/4
Discarding constants, we find that insertion sort is O(n²)

Summary
Bubble sort, selection sort, and insertion sort are all O(n²); we can do much better than this with somewhat more complicated sorting algorithms
Within O(n²):
Bubble sort is very slow, and should probably never be used for anything
Selection sort is intermediate in speed
Insertion sort is usually the fastest of the three; in fact, for small arrays (say, 10 or 15 elements), insertion sort is faster than more complicated sorting algorithms

Selection sort and insertion sort are good enough for small arrays

Quicksort

Divide and Conquer


Recursive in structure
Divide the problem into sub-problems that are similar to the original but smaller in size
Conquer the sub-problems by solving them recursively. If they are small enough, just solve them in a straightforward manner.
Combine the solutions to create a solution to the original problem


Quicksort I
To sort a[left...right]:
1. If left < right:
   1.1. Partition a[left...right] such that:
        all of a[left...p-1] are less than a[p], and
        all of a[p+1...right] are >= a[p]
   1.2. Quicksort a[left...p-1]
   1.3. Quicksort a[p+1...right]
2. Terminate

Partitioning (Quicksort II)


A key step in the Quicksort algorithm is partitioning the array
We choose some (any) number p in the array to use as a pivot
We partition the array into three parts:

numbers less than p

the pivot p itself

numbers greater than or equal to p

Partitioning II
Choose an array value (say, the first) to use as the pivot
Starting from the left end, find the first element that is greater than or equal to the pivot
Searching backward from the right end, find the first element that is less than the pivot
Interchange (swap) these two elements
Repeat, searching from where we left off, until done

Partitioning
To partition a[left...right]:
1. Set p = a[left], l = left + 1, r = right
2. While l < r, do
   2.1. while l < right && a[l] < p  { l = l + 1 }
   2.2. while r > left && a[r] >= p  { r = r - 1 }
   2.3. if l < r { swap a[l] and a[r] }
3. a[left] = a[r]; a[r] = p
4. Terminate
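The partitioning and quicksort steps above can be sketched as runnable Java. The class and method names are ours, not from the slides, and the loop condition is checked after the two scans (a detail the terse pseudocode glosses over) so that two-element ranges partition correctly:

```java
public class QuickSort {
    // Partition a[left..right] around pivot p = a[left]; returns the pivot's final index.
    static int partition(int[] a, int left, int right) {
        int p = a[left];
        int l = left + 1, r = right;
        while (true) {
            while (l < right && a[l] < p) l++;   // scan right for an element >= p
            while (r > left && a[r] >= p) r--;   // scan left for an element < p
            if (l >= r) break;                   // pointers have met or crossed
            int t = a[l]; a[l] = a[r]; a[r] = t; // swap the out-of-place pair
        }
        a[left] = a[r];                          // move pivot between the two parts
        a[r] = p;
        return r;
    }

    public static void quickSort(int[] a, int left, int right) {
        if (left < right) {
            int p = partition(a, left, right);
            quickSort(a, left, p - 1);
            quickSort(a, p + 1, right);
        }
    }
}
```

Called as quickSort(a, 0, a.length - 1), this sorts the whole array in place.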

Example of partitioning
choose pivot (4):   4 3 6 9 2 4 3 1 2 1 8 9 3 5 6
search:             l stops at the 6, r stops at the final 3
swap:               4 3 3 9 2 4 3 1 2 1 8 9 6 5 6
search:             l stops at the 9, r stops at the 1
swap:               4 3 3 1 2 4 3 1 2 9 8 9 6 5 6
search:             l stops at the second 4, r stops at the second 2
swap:               4 3 3 1 2 2 3 1 4 9 8 9 6 5 6
search:             the pointers cross (l > r)
swap with pivot:    1 3 3 1 2 2 3 4 4 9 8 9 6 5 6

Best-case complexity


We cut the array size in half each time, so the depth of the recursion is log₂ n
At each level of the recursion, all the partitions at that level do work that is linear in n
O(log₂ n) * O(n) = O(n log₂ n)

Merge sort

Merge Sort
Sorting problem: sort a sequence of n elements into non-decreasing order.
Divide: divide the n-element sequence to be sorted into two subsequences of n/2 elements each.
Conquer: sort the two subsequences recursively using merge sort.
Combine: merge the two sorted subsequences to produce the sorted answer.

Merge Sort: Idea


Divide into two halves
Recursively sort

[Figure: the array A is divided into FirstPart and SecondPart; each part is sorted recursively; the two sorted parts are merged; A is sorted!]

Merge Sort: Algorithm


Merge-Sort(A, n)
  if n = 1, return                      // a size-1 array is already sorted
  n1 = ⌈n/2⌉, n2 = ⌊n/2⌋
  create arrays L[n1], R[n2]            // space: n
  for i = 0 to n1-1 do L[i] = A[i]      // time: n
  for j = 0 to n2-1 do R[j] = A[n1+j]
  Merge-Sort(L, n1)                     // recursive call
  Merge-Sort(R, n2)                     // recursive call
  Merge(A, L, n1, R, n2)

Merge-Sort: Merge
[Figure: two sorted halves, L and R, are merged back into the single sorted array A.]

Merge-Sort: Merge Example

Keep track of the smallest remaining element in each sorted half. Insert the smaller of the two into the auxiliary array. Repeat until done.

Merging L = 1 2 6 8 and R = 3 4 5 7 into A:

k=0:  L[0]=1 <= R[0]=3, so A[0]=1          A: 1 _ _ _ _ _ _ _   (i=1, j=0)
k=1:  L[1]=2 <= R[0]=3, so A[1]=2          A: 1 2 _ _ _ _ _ _   (i=2, j=0)
k=2:  L[2]=6 >  R[0]=3, so A[2]=3          A: 1 2 3 _ _ _ _ _   (i=2, j=1)
k=3:  L[2]=6 >  R[1]=4, so A[3]=4          A: 1 2 3 4 _ _ _ _   (i=2, j=2)
k=4:  L[2]=6 >  R[2]=5, so A[4]=5          A: 1 2 3 4 5 _ _ _   (i=2, j=3)
k=5:  L[2]=6 <= R[3]=7, so A[5]=6          A: 1 2 3 4 5 6 _ _   (i=3, j=3)
k=6:  L[3]=8 >  R[3]=7, so A[6]=7          A: 1 2 3 4 5 6 7 _   (i=3, j=4)
k=7:  R is exhausted,   so A[7]=L[3]=8     A: 1 2 3 4 5 6 7 8   (i=4, j=4)

merge(A, L, n1, R, n2)

  i = 0; j = 0
  for k = 0 to n1+n2-1
    if i < n1 and (j = n2 or L[i] <= R[j])
      A[k] = L[i]; i = i + 1
    else if j < n2
      A[k] = R[j]; j = j + 1

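The merge-sort pseudocode can be fleshed out into a runnable Java sketch (class and method names are ours, not from the slides):

```java
import java.util.Arrays;

public class MergeSortDemo {
    public static void mergeSort(int[] a) {
        if (a.length <= 1) return;                       // size-1 array is already sorted
        int n1 = a.length / 2;
        int[] L = Arrays.copyOfRange(a, 0, n1);          // first half
        int[] R = Arrays.copyOfRange(a, n1, a.length);   // second half
        mergeSort(L);
        mergeSort(R);
        merge(a, L, R);
    }

    // Merge sorted halves L and R back into a, exactly as in the pseudocode above.
    static void merge(int[] a, int[] L, int[] R) {
        int i = 0, j = 0;
        for (int k = 0; k < a.length; k++) {
            if (i < L.length && (j == R.length || L[i] <= R[j])) {
                a[k] = L[i++];   // take from L when R is empty or L's front is smaller
            } else {
                a[k] = R[j++];   // otherwise take from R
            }
        }
    }
}
```

Using <= in the comparison keeps the merge stable: equal elements from L come out before those from R.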

Heapsort

Why study Heapsort?


Heapsort is always O(n log n)
Quicksort is usually O(n log n), but in the worst case it slows to O(n²)
Quicksort is generally faster, but Heapsort is better in time-critical applications

Heapsort is a really cool algorithm!

What is a heap?
Definitions of heap:
1. A large area of memory from which the programmer can allocate blocks as needed, and deallocate them (or allow them to be garbage collected) when no longer needed
2. A balanced, left-justified binary tree in which no node has a value greater than the value in its parent

These two definitions have little in common. Heapsort uses the second definition.

Balanced binary trees


Recall:
The depth of a node is its distance from the root
The depth of a tree is the depth of the deepest node

A binary tree of depth n is balanced if all the nodes at depths 0 through n-2 have two children

[Figure: two balanced trees and one unbalanced tree, with levels n-2, n-1, and n marked.]

Left-justified binary trees


A balanced binary tree is left-justified if:
all the leaves are at the same depth, or
all the leaves at depth n+1 are to the left of all the nodes at depth n

[Figure: examples of left-justified trees.]

Plan of attack
First, we will learn how to turn a binary tree into a heap
Next, we will learn how to turn a binary tree back into a heap after it has been changed in a certain way
Finally, we will see how to use these ideas to sort an array

The heap property


A node has the heap property if the value in the node is as large as or larger than the values in its children

12 over children (8, 3): the node has the heap property
12 over children (8, 12): the node has the heap property
8 over children (12, 14): the node does not have the heap property

All leaf nodes automatically have the heap property
A binary tree is a heap if all nodes in it have the heap property

siftUp
Given a node that does not have the heap property, you can give it the heap property by exchanging its value with the value of the larger child

Before: 12 over children (8, 14), which does not have the heap property
After:  14 over children (8, 12), which has the heap property

This is sometimes called sifting up. Notice that the child may have lost the heap property.

Constructing a heap
A tree consisting of a single node is automatically a heap
We construct a heap by adding nodes one at a time:
Each time we add a node, we may destroy the heap property of its parent node
To fix this, we sift up
But each time we sift up, the value of the topmost node in the sift may increase, and this may destroy the heap property of its parent node
We repeat the sifting-up process, moving up in the tree, until either
  we reach nodes whose values don't need to be swapped (because the parent is still larger than both children), or
  we reach the root

Constructing a heap III


[Figure: building a heap from the values 8, 10, 5, 12, 14, one insertion at a time; each insertion is followed by sift-ups along the path toward the root.]

Other children are not affected:
The node containing 8 is not affected, because its parent gets larger, not smaller
The node containing 5 is not affected, because its parent gets larger, not smaller
The node containing 8 is still not affected because, although its parent got smaller, its parent is still greater than it was originally

A sample heap
Here's a sample binary tree after it has been heapified:

                25
        22              17
    19      22      14      15
  18  14  21   3   9  11

Notice that "heapified" does not mean sorted
Heapifying does not change the shape of the binary tree; this binary tree is balanced and left-justified because it started out that way

Removing the root


Notice that the largest number is now in the root
Suppose we discard the root:

        22              17
    19      22      14      15
  18  14  21   3   9  11

How can we fix the binary tree so it is once again balanced and left-justified?
(We can move the rightmost node at the deepest level, the 11, into the root.)

The reHeap method I


Our tree is balanced and left-justified, but no longer a heap; however, only the root lacks the heap property:

                11
        22              17
    19      22      14      15
  18  14  21   3   9

We can siftUp() the root
After doing this, one and only one of its children may have lost the heap property

The reHeap method II


Now the left child of the root (still the number 11) lacks the heap property:

                22
        11              17
    19      22      14      15
  18  14  21   3   9

We can siftUp() this node
After doing this, one and only one of its children may have lost the heap property

The reHeap method III


Now the right child of the left child of the root (still the number 11) lacks the heap property:

                22
        22              17
    19      11      14      15
  18  14  21   3   9

We can siftUp() this node
After doing this, one and only one of its children may have lost the heap property; but it doesn't, because it is a leaf

The reHeap method IV


Our tree is once again a heap, because every node in it has the heap property:

                22
        22              17
    19      21      14      15
  18  14  11   3   9

Once again, the largest (or a largest) value is in the root
We can repeat this process until the tree becomes empty
This produces a sequence of values in order largest to smallest

Sorting
What do heaps have to do with sorting an array? Here's the neat part:
Because the binary tree is balanced and left-justified, it can be represented as an array
All our operations on binary trees can be represented as operations on arrays
To sort:
  heapify the array;
  while the array isn't empty {
    remove and replace the root;
    reheap the new root node;
  }

Mapping into an array


                25
        22              17
    19      22      14      15
  18  14  21   3   9  11

index:   0  1  2  3  4  5  6  7  8  9 10 11 12
value:  25 22 17 19 22 14 15 18 14 21  3  9 11

Notice:
The left child of index i is at index 2*i+1
The right child of index i is at index 2*i+2
Example: the children of node 3 (value 19) are nodes 7 (value 18) and 8 (value 14)

Removing and replacing the root


The root is the first element in the array; the rightmost node at the deepest level is the last element
Swap them...

index:   0  1  2  3  4  5  6  7  8  9 10 11 12
before: 25 22 17 19 22 14 15 18 14 21  3  9 11
after:  11 22 17 19 22 14 15 18 14 21  3  9 25

...and pretend that the last element in the array no longer exists; that is, the last index is now 11

Reheap and repeat


Reheap the root node (index 0, containing 11)...

index:   0  1  2  3  4  5  6  7  8  9 10 11 12
        11 22 17 19 22 14 15 18 14 21  3  9 25
        22 22 17 19 21 14 15 18 14 11  3  9 25   (after reheaping; the 25 is no longer part of the heap)

...and again, remove and replace the root node:

         9 22 17 19 21 14 15 18 14 11  3 22 25   (last index is now 10)

Remember, though, that the last array index has changed
Repeat until the last becomes first, and the array is sorted!
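Putting the pieces together, here is a runnable Java sketch of heapsort on an array. The names are ours, and it builds the heap bottom-up with sift-downs rather than the node-at-a-time sift-up construction shown earlier; that is a standard variation with the same end result:

```java
public class HeapSort {
    // Sift the value at index i down into the heap occupying a[0..size-1].
    static void siftDown(int[] a, int i, int size) {
        while (2 * i + 1 < size) {
            int child = 2 * i + 1;                                    // left child
            if (child + 1 < size && a[child + 1] > a[child]) child++; // pick the larger child
            if (a[i] >= a[child]) break;                              // heap property holds
            int t = a[i]; a[i] = a[child]; a[child] = t;
            i = child;
        }
    }

    public static void heapSort(int[] a) {
        // Heapify: sift down every non-leaf node, bottom-up.
        for (int i = a.length / 2 - 1; i >= 0; i--) siftDown(a, i, a.length);
        // Repeatedly swap the root (the largest value) with the last heap element,
        // shrink the heap by one, and reheap the new root.
        for (int last = a.length - 1; last > 0; last--) {
            int t = a[0]; a[0] = a[last]; a[last] = t;
            siftDown(a, 0, last);
        }
    }
}
```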

Radix sort

Linear time sorting


Can we do better (a linear-time algorithm) if the input has special structure (e.g., uniformly distributed, or every number can be represented by d digits)? Yes: counting sort and radix sort

Counting Sort
Assume N integers to be sorted, each in the range 1 to M
Define an array B[1..M]; initialize all entries to 0: O(M)
Scan through the input list A; insert each A[i] into B[A[i]]: O(N)
Scan B once, reading out the nonzero integers: O(M)
Total time: O(M + N)
If M is O(N), then the total time is O(N)
Can be bad if the range is very big, e.g., M = O(N²)

Example: N = 7, M = 9. Input: 8 1 9 5 2 6 3. Output: 1 2 3 5 6 8 9

Counting sort
What if we have duplicates? Make B an array of lists. Each position in the array has two pointers, head and tail: tail points to the end of a linked list, and head points to the beginning. A[j] is inserted at the end of the list B[A[j]]. Again, array B is traversed sequentially and each nonempty list is printed out. Time: O(M + N)

Counting sort
M = 9. We wish to sort: 8 5 1 5 9 5 6 2 7

B[1]: 1   B[2]: 2   B[5]: 5 -> 5 -> 5   B[6]: 6   B[7]: 7   B[8]: 8   B[9]: 9

Output: 1 2 5 5 5 6 7 8 9
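A runnable Java sketch of counting sort for keys in the range 1..M (names are ours; since we sort bare integers with no satellite data, an array of counts stands in for the linked lists described above):

```java
public class CountingSort {
    // Sort values known to lie in 1..M; duplicates are handled by counting them.
    public static int[] countingSort(int[] a, int M) {
        int[] count = new int[M + 1];        // count[v] = number of occurrences of v; O(M) space
        for (int v : a) count[v]++;          // O(N) scan of the input
        int[] out = new int[a.length];
        int k = 0;
        for (int v = 1; v <= M; v++)         // O(M + N) scan to read the values back out
            for (int c = 0; c < count[v]; c++) out[k++] = v;
        return out;
    }
}
```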

Radix Sort
Extra information: every integer can be represented by at most k digits
d1 d2 ... dk, where the di are digits in base r
d1: most significant digit
dk: least significant digit

Radix Sort
Algorithm
Sort by the least significant digit first (using counting sort)
  Numbers with the same digit go to the same bin
Reorder all the numbers: the numbers in bin 0 precede the numbers in bin 1, which precede the numbers in bin 2, and so on
Sort by the next least significant digit
Continue this process until the numbers have been sorted on all k digits

Radix Sort
Least-significant-digit-first
Example: 275, 087, 426, 061, 509, 170, 677, 503   (base 10, k = 3 digits)

Pass 1 (ones digit):      170, 061, 503, 275, 426, 087, 677, 509
Pass 2 (tens digit):      503, 509, 426, 061, 170, 275, 677, 087
Pass 3 (hundreds digit):  061, 087, 170, 275, 426, 503, 509, 677

Each pass is a counting sort on one digit; the bins are emptied in FIFO order back into the array, so earlier passes are not disturbed.
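A runnable Java sketch of least-significant-digit-first radix sort in base 10 (names are ours; Java integer literals drop the leading zeros, so 087 and 061 are written 87 and 61):

```java
import java.util.ArrayList;
import java.util.List;

public class RadixSort {
    // LSD radix sort for non-negative integers of at most k decimal digits.
    public static void radixSort(int[] a, int k) {
        int div = 1;                                   // 1, 10, 100, ... selects the digit
        for (int pass = 0; pass < k; pass++, div *= 10) {
            List<List<Integer>> bins = new ArrayList<>();
            for (int b = 0; b < 10; b++) bins.add(new ArrayList<>());
            for (int v : a)
                bins.get((v / div) % 10).add(v);       // stable: bins are FIFO
            int i = 0;
            for (List<Integer> bin : bins)             // bin 0 first, then bin 1, ...
                for (int v : bin) a[i++] = v;
        }
    }
}
```

Stability of each pass is what makes the whole thing work: ties on the current digit keep the order established by the previous (less significant) passes.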
