
Lecture 6 Trees & Sort

Linked Lists

Binary Search Trees


A Binary Search Tree (BST) is a binary tree with the following properties:

The key of a node is always greater than the keys of the nodes in its left subtree.
The key of a node is always smaller than the keys of the nodes in its right subtree.

Binary Search Trees: Examples

[Figure: example BSTs — one with root C and children A and D; one with root 14 containing keys 8, 10, 11, 15, 16, 18]

Building a BST

Build a BST from a sequence of nodes, read one at a time.

Example: Inserting C A B L M (in this order!)

1) Insert C (becomes the root)

2) Insert A (A < C, so A becomes the left child of C)

Building a BST
3) Insert B (B > A and B < C, so B becomes the right child of A)
4) Insert L (L > C, so L becomes the right child of C)
5) Insert M (M > L, so M becomes the right child of L)

[Figure: the tree after each insertion]
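The build-up above can be sketched in Python (a minimal sketch: `Node`, `bst_insert`, and `inorder` are my names, not from the slides):

```python
class Node:
    """A BST node holding a key and two optional children."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    """Insert key into the BST rooted at root; return the (possibly new) root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root  # duplicate keys are ignored

def inorder(root):
    """In-order traversal yields the keys in sorted order."""
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []

# Build the example tree by inserting C A B L M (in this order!)
root = None
for key in "CABLM":
    root = bst_insert(root, key)

print(inorder(root))  # → ['A', 'B', 'C', 'L', 'M']
```

Note that an in-order traversal of the finished tree recovers the keys in sorted order, whatever the insertion order was.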

Building a BST
Is there a unique BST for the letters A B C L M?

NO! Different input sequences result in different trees.

Inserting A B C L M (already sorted) produces a degenerate chain to the right, while inserting C A B L M produces the tree built above.

[Figure: the two resulting trees]

Finding the Successor

Idea: We want to find the node with the smallest key greater than key[x]. We can do this without comparing keys!

Two cases for finding the successor:

Case 1: The right subtree of x is not empty. The successor of x is the minimum of the right subtree.

Case 2: The right subtree of x is empty. The successor of x (if it exists) is the lowest ancestor of x whose left child is also an ancestor of x (i.e. x is on the rightmost path of the left subtree of the successor of x).
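Both cases can be sketched in Python. Case 2 climbs ancestors, so the sketch assumes each node carries a parent pointer; the helper names and the example keys (taken from the 14/10/8/11/15/16 example tree) are mine:

```python
class Node:
    def __init__(self, key, parent=None):
        self.key = key
        self.parent = parent
        self.left = None
        self.right = None

def tree_minimum(x):
    """Follow left children down to the smallest key in x's subtree."""
    while x.left is not None:
        x = x.left
    return x

def successor(x):
    """Return the node with the smallest key greater than x.key, or None."""
    # Case 1: right subtree is not empty -> minimum of the right subtree
    if x.right is not None:
        return tree_minimum(x.right)
    # Case 2: climb until we step up out of a right subtree
    y = x.parent
    while y is not None and x is y.right:
        x, y = y, y.parent
    return y

def insert(root, key):
    """Iterative BST insert that maintains parent pointers."""
    if root is None:
        return Node(key)
    parent, cur = None, root
    while cur is not None:
        parent = cur
        cur = cur.left if key < cur.key else cur.right
    child = Node(key, parent)
    if key < parent.key:
        parent.left = child
    else:
        parent.right = child
    return root

def find(x, key):
    while x is not None and x.key != key:
        x = x.left if key < x.key else x.right
    return x

root = None
for k in [14, 10, 8, 11, 15, 16]:
    root = insert(root, k)

print(successor(find(root, 11)).key)  # → 14  (Case 2: climb to the root)
print(successor(find(root, 14)).key)  # → 15  (Case 1: min of right subtree)
```

Notice that no key comparisons are needed once x is in hand: both cases follow the tree structure only.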

Deleting from a BST


To delete the node with key x, first search for it. Once found, apply one of the following three cases.

CASE A: x is a leaf: simply remove x.

[Figure: leaf x removed from under its parent; BST property maintained]

Deleting from a BST cont.


Case B: x is interior with only one subtree: replace x with the root of its only subtree.

[Figure: x's single subtree L reattached to x's parent q after deleting x; BST property maintained]

Deleting from a BST cont.


Case C: x is interior with two subtrees: replace x's key with its in-order successor (the minimum of the right subtree), then delete the successor node, which falls under Case A or B.

[Figure: the two-subtree case before and after deleting x; BST property maintained]
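The three cases fit naturally into one recursive routine. A Python sketch (names mine; Case C is handled here by copying in the in-order successor's key, one common convention):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def inorder(root):
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []

def delete(root, key):
    """Remove key from the BST rooted at root; return the new root."""
    if root is None:
        return None
    if key < root.key:
        root.left = delete(root.left, key)
    elif key > root.key:
        root.right = delete(root.right, key)
    else:
        # Cases A and B: at most one subtree -- lift the only child (or None)
        if root.left is None:
            return root.right
        if root.right is None:
            return root.left
        # Case C: two subtrees -- copy in the in-order successor's key,
        # then delete that successor from the right subtree (Case A or B there)
        succ = root.right
        while succ.left is not None:
            succ = succ.left
        root.key = succ.key
        root.right = delete(root.right, succ.key)
    return root

root = None
for k in [8, 3, 10, 1, 6, 14, 4, 7]:
    root = insert(root, k)

root = delete(root, 3)   # interior node with two subtrees (Case C)
print(inorder(root))     # → [1, 4, 6, 7, 8, 10, 14]
```

After every deletion the in-order traversal stays sorted, which is exactly the "BST property maintained" claim on the slides.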

Complexity of Searching with BST


What is the complexity of searchBST? It depends on:

the key x
the other data
the shape of the tree (full, arbitrary)

Complexity Analysis: We are interested in the best case, worst case, and average case.

Complexity of Searching with BST


If all nodes in the tree exist, then it is called a full BST. If all levels are full except for the last level, then it is called a minimum-level BST.

Complexity of Searching with BST


Therefore, for a full BST with N nodes the following holds for searchBST:

best case analysis: O(1)
worst case analysis: O(log N)
average case analysis: O(log N)
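A small Python sketch (names mine) makes the best and worst cases concrete by counting visited nodes in a full BST with N = 7:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def search_bst(root, key):
    """Return (node or None, number of nodes visited).
    One child pointer is followed per level: O(height) work."""
    visited = 0
    while root is not None:
        visited += 1
        if key == root.key:
            return root, visited
        root = root.left if key < root.key else root.right
    return None, visited

root = None
for k in [4, 2, 6, 1, 3, 5, 7]:   # a full BST: N = 7 nodes, 3 levels
    root = insert(root, k)

print(search_bst(root, 4)[1])  # → 1   (best case: key at the root)
print(search_bst(root, 7)[1])  # → 3   (worst case: a leaf, log2(N+1) levels)
```

Every search follows a single root-to-node path, so the cost is bounded by the tree height, which is about log2 N for a full BST.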

Complexity of Searching with BST


Now that we know the run-time of searchBST, what about the other operations (again for a full BST)?

Insertion: O(log N)
Deletion: O(log N)
Find Min: O(log N)
Find Max: O(log N)
Search: O(log N)

Sorting

Insertion Sort
Merge Sort

Sunday, April 22, 2012


Sorting

Big idea:

Inserting an element into a sorted list in the appropriate position retains the order. So what?

Start with a singleton list sorted trivially. Repeatedly insert elements one at a time while keeping it sorted.

Leads to sorting technique known as Insertion Sort


Sorting by Insertion

Consider the top-down design for sorting. Reduce the problem of sorting a list A[0..n-1]

into the sub-problem of sorting the list A[0..n-2], and
composing the solution for the original problem from the solution for the sub-problem by

inserting A[n-1] into the sorted sub-list A[0..n-2].

A list of size 1 is the atomic case, with a trivial solution: it is already sorted.

This can be translated into a straightforward recursive implementation.



Insertion Sort (Cont)

Mechanism (in reordering a sequence):

Originally: we have an n-element sequence with the first p elements (defined when sorting starts) in the correct order.
Step 1: Initially p = 1.
Step 2: Let the first p elements be sorted.
Step 3: Insert the (p+1)th element properly in the list so that now p+1 elements are sorted.
Step 4: Increment p; stop if p = n now, otherwise go to step (3).


Insertion Sort (Cont)

Example: [Figure: step-by-step trace of insertion sort]


An Example: Insertion Sort


InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
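The same algorithm in runnable Python (0-indexed, unlike the 1-indexed pseudocode; a sketch, with the function name mine):

```python
def insertion_sort(a):
    """Sort list a in place; returns a for convenience."""
    for i in range(1, len(a)):        # pseudocode's i = 2..n, shifted to 0-indexing
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop key into the gap it left
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```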


Insertion Sort

Complexity:

Insertion of a single element: O(j), where j is the size of the list.
Insertion sorting: insert a single element into lists of size 1 up to size - 1:

1 + 2 + ... + (size - 1) = O(size^2)


Insertion Sort (Cont)

Running time analysis:

Best case: input already sorted
Number of comparisons for each new number: 1
Running time is 1 + 1 + 1 + ... + 1 = n - 1 = O(n)

Worst case: input in reverse order
Number of comparisons for each new number: p
Running time is 1 + 2 + 3 + ... + (n - 1) = n(n - 1)/2 = O(n^2)

Average case: on average the inner loop makes about half the worst-case comparisons, p/2 for the p-th insertion
Running time is 1/2 + 2/2 + ... + (n - 1)/2 = n(n - 1)/4 = O(n^2)

Advantages

simple implementation
efficient for (quite) small data sets
adaptive, i.e. efficient for data sets that are already substantially sorted: the time complexity is O(n + d), where d is the number of inversions
more efficient in practice than most other simple quadratic (i.e. O(n^2)) algorithms such as selection sort or bubble sort: the average running time is n^2/4, and the running time is linear in the best case
stable, i.e. does not change the relative order of elements with equal keys
in-place, i.e. only requires a constant amount O(1) of additional memory space
online, i.e. can sort a list as it receives it

Merge Sort

[Figure: merge-sort tree for 7 2 9 4, producing the sorted result 2 4 7 9]

Divide-and-Conquer

Divide the problem into a number of subproblems.
Conquer the subproblems by solving them recursively. If the subproblem sizes are small enough, however, just solve the subproblems in a straightforward manner.
Combine the solutions to the subproblems into the solution for the original problem.
The base cases for the recursion are subproblems of size 0 or 1.

Merge-Sort

Divide: Divide the n-element sequence to be sorted into two subsequences of n/2 elements each.
Conquer: Sort the two subsequences recursively using merge sort.
Combine: Merge the two sorted subsequences to produce the sorted answer.


Merging two sorted arrays

Once the sublists are sorted, the next step in the merge sort algorithm is to merge the sorted sublists. Suppose L1 and L2 are two sorted lists as follows:

L1: 2, 7, 16, 35
L2: 5, 20, 25, 40, 50

Merge L1 and L2 into a third list, say L3. The merge process is as follows: repeatedly compare, using a loop, the elements of L1 with the elements of L2 and copy the smaller element into L3.
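The loop just described can be sketched in Python, using the L1 and L2 data above (the function name is mine):

```python
def merge(l1, l2):
    """Repeatedly copy the smaller front element into l3, then copy leftovers."""
    i = j = 0
    l3 = []
    while i < len(l1) and j < len(l2):
        if l1[i] <= l2[j]:
            l3.append(l1[i]); i += 1
        else:
            l3.append(l2[j]); j += 1
    l3.extend(l1[i:])   # at most one of these two slices is non-empty
    l3.extend(l2[j:])
    return l3

L1 = [2, 7, 16, 35]
L2 = [5, 20, 25, 40, 50]
print(merge(L1, L2))  # → [2, 5, 7, 16, 20, 25, 35, 40, 50]
```

Each element of L1 and L2 is copied exactly once, so merging two lists of total length n takes O(n) time.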

Example

First compare L1[1] with L2[1] and see that L1[1] < L2[1], so copy L1[1] into L3[1] (see Figure 9.23). (Notice that i, j, and k are set to 1.)

Example (Cont)

After the first iteration, i = 2, j = 1, and k = 2. Next, compare L1[2] with L2[1] as shown in Figure 9.24.

Example (Cont)

After the second iteration, i = 2, j = 2, and k = 3. Next, compare L1[2] with L2[2] as shown in Figure 9.25.

Example (Cont)

After the third iteration, i = 3, j = 2, and k = 4. Next, compare L1[3] with L2[2] as shown in Figure 9.26.


Example (Cont)

After the fourth iteration, i = 4, j = 2, and k = 5. Next, compare L1[4] with L2[2] as shown in Figure 9.27. After the fifth iteration, i = 4, j = 3, and k = 6. Continue this process until all elements of one list are copied into the third list. Then copy the remaining elements of the list that has not yet been exhausted.

Merge-Sort Tree

An execution of merge-sort is depicted by a binary tree

each node represents a recursive call of merge-sort and stores:
the unsorted sequence before the execution and its partition
the sorted sequence at the end of the execution
the root is the initial call
the leaves are calls on subsequences of size 0 or 1
[Figure: merge-sort tree for 7 2 9 4, partitioned into 7 2 and 9 4, with base cases 7, 2, 9, 4 and merged results 2 7, 4 9, and 2 4 7 9]

Execution Example

These slides trace merge-sort on the input 7 2 9 4 3 8 6 1 (final result: 1 2 3 4 6 7 8 9), one step per slide:

Partition
Recursive call, partition
Recursive call, partition
Recursive call, base case
Recursive call, base case
Merge
Recursive call, ..., base case, merge
Merge
Recursive call, ..., merge, merge
Merge

[Figures: the merge-sort tree after each step, with completed subsequences shown in sorted order]

Pseudocode for mergesort

MERGE(A, lb, mb, ub)
1. n1 = mb - lb + 1, n2 = ub - mb
2. create arrays L[0..n1-1] and R[0..n2-1]
   j = 0
   for i = lb to mb do L[j++] = A[i]
   j = 0
   for i = mb+1 to ub do R[j++] = A[i]
3. i = 0, j = 0, k = lb
   while (i < n1 && j < n2)
       if (L[i] <= R[j]) A[k] = L[i], i++
       else A[k] = R[j], j++
       k++
4. while (i < n1) A[k] = L[i], i++, k++
   while (j < n2) A[k] = R[j], j++, k++

Pseudocode for mergesort

MERGE-SORT(A, lb, ub)
    if lb < ub {
        mb = (lb + ub) / 2
        MERGE-SORT(A, lb, mb)
        MERGE-SORT(A, mb+1, ub)
        MERGE(A, lb, mb, ub)
    }

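The MERGE and MERGE-SORT pseudocode translates directly into runnable Python (a sketch keeping the slides' inclusive lb/mb/ub bounds):

```python
def merge(a, lb, mb, ub):
    """MERGE step: combine sorted runs a[lb..mb] and a[mb+1..ub] (inclusive)."""
    left = a[lb:mb + 1]
    right = a[mb + 1:ub + 1]
    i = j = 0
    k = lb
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            a[k] = left[i]; i += 1
        else:
            a[k] = right[j]; j += 1
        k += 1
    while i < len(left):          # copy any leftovers from the left run
        a[k] = left[i]; i += 1; k += 1
    while j < len(right):         # ...or from the right run
        a[k] = right[j]; j += 1; k += 1

def merge_sort(a, lb, ub):
    """MERGE-SORT: sort a[lb..ub] in place by halving at mb."""
    if lb < ub:
        mb = (lb + ub) // 2
        merge_sort(a, lb, mb)
        merge_sort(a, mb + 1, ub)
        merge(a, lb, mb, ub)

data = [7, 2, 9, 4, 3, 8, 6, 1]   # the input from the execution example
merge_sort(data, 0, len(data) - 1)
print(data)  # → [1, 2, 3, 4, 6, 7, 8, 9]
```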

Analysis of Merge-Sort

The height h of the merge-sort tree is O(log n)

at each recursive call we divide the sequence in half

The overall amount of work done at the nodes of depth i is O(n)

we partition and merge 2^i sequences of size n/2^i
we make 2^(i+1) recursive calls

Thus, the total running time of merge-sort is O(n log n)

depth   #seqs   size
0       1       n
1       2       n/2
i       2^i     n/2^i
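The same bound can be read off the standard divide-and-conquer recurrence, where c is an assumed constant covering the per-element partition and merge work:

```latex
T(n) =
\begin{cases}
c, & n \le 1 \\
2\,T(n/2) + cn, & n > 1
\end{cases}
\qquad\Longrightarrow\qquad
T(n) = cn(\log_2 n + 1) = O(n \log n)
```

Unrolling the recurrence gives cn work on each of the log2 n levels, plus cn for the leaves, matching the level-by-level table above.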
