6-0 Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin, "Introduction to the Design & Analysis of Algorithms," 2nd ed., Ch. 6: Lecture 7


  • Slide 1
  • 6-0 Lecture 7
  • Slide 2
  • 6-1 Decrease-by-Constant-Factor Algorithms: In this variation of decrease-and-conquer, the instance size is reduced by the same factor (typically 2). The problems: binary search and the method of bisection; exponentiation by squaring; multiplication à la russe (Russian peasant method); fake-coin puzzle; Josephus problem.
  • Slide 3
  • 6-2 Russian Peasant Multiplication: The problem: compute the product of two positive integers. It can be solved by a decrease-by-half algorithm based on the following formulas. For even values of n: n * m = (n/2) * 2m. For odd values of n > 1: n * m = ((n-1)/2) * 2m + m. For n = 1: n * m = m.
  • Slide 4
  • 6-3 Example of Russian Peasant Multiplication: Compute 20 * 26.
         n     m
        20    26
        10    52
         5   104   104
         2   208
         1   416   416
                   520
    (Rows with odd n contribute their m to the sum: 104 + 416 = 520.)
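A minimal Python sketch of the halving/doubling scheme just illustrated; the function name and the sample call are ours, not the slides'.

```python
def russian_peasant(n, m):
    """Multiply two positive integers by repeated halving of n and doubling of m."""
    product = 0
    while n >= 1:
        if n % 2 == 1:        # odd n contributes the current m to the sum
            product += m
        n //= 2               # halve n (integer division drops the extra 1 for odd n)
        m *= 2                # double m
    return product

print(russian_peasant(20, 26))  # 520, matching the worked example
```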
  • Slide 5
  • Chapter 6: Transform-and-Conquer
  • Slide 6
  • 6-5 Transform and Conquer: This group of techniques solves a problem by a transformation: to a simpler/more convenient instance of the same problem (instance simplification); to a different representation of the same instance (representation change); or to a different problem for which an algorithm is already available (problem reduction).
  • Slide 7
  • 6-6 Instance Simplification: Presorting. Solve a problem's instance by transforming it into another, simpler/easier instance of the same problem. Presorting: many problems involving lists are easier when the list is sorted: searching; computing the median (selection problem); checking if all elements are distinct (element uniqueness). Also: topological sorting helps solve some problems on dags; presorting is used in many geometric algorithms.
  • Slide 8
  • 6-7 How fast can we sort? The efficiency of algorithms involving sorting depends on the efficiency of sorting. Theorem (see Sec. 11.2): ⌈log2 n!⌉ ≈ n log2 n comparisons are necessary in the worst case to sort a list of size n by any comparison-based algorithm. Note: about n log2 n comparisons are also sufficient to sort an array of size n (e.g., by mergesort).
  • Slide 9
  • 6-8 Searching with Presorting: Problem: search for a given K in A[0..n-1]. Presorting-based algorithm: Stage 1: sort the array by an efficient sorting algorithm. Stage 2: apply binary search. Efficiency: Θ(n log n) + O(log n) = Θ(n log n). Good or bad? Why do we have our dictionaries, telephone directories, etc. sorted?
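A short Python sketch of the presorting-based search: sort, then binary search via the standard bisect module. The function name and sample call are our own illustration.

```python
from bisect import bisect_left

def presort_and_search(a, K):
    """Stage 1: sort (Theta(n log n)); Stage 2: binary search (O(log n))."""
    s = sorted(a)                      # stage 1
    i = bisect_left(s, K)              # stage 2: leftmost position where K could be
    return i < len(s) and s[i] == K

print(presort_and_search([9, 5, 8, 3, 2, 4, 7], 8))  # True
```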
  • Slide 10
  • 6-9 Element Uniqueness with Presorting: Presorting-based algorithm: Stage 1: sort by an efficient sorting algorithm (e.g., mergesort). Stage 2: scan the array, checking pairs of adjacent elements. Efficiency: Θ(n log n) + O(n) = Θ(n log n). Brute-force algorithm: compare all pairs of elements. Efficiency: O(n^2).
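A similar sketch for element uniqueness: sort, then a single scan of adjacent pairs. The function name and sample inputs are ours.

```python
def all_distinct(a):
    """Presorting-based element uniqueness: sort, then compare adjacent elements."""
    s = sorted(a)                                             # Theta(n log n)
    return all(s[i] != s[i + 1] for i in range(len(s) - 1))   # O(n) scan

print(all_distinct([9, 5, 8, 3]))   # True
print(all_distinct([9, 5, 8, 5]))   # False
```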
  • Slide 11
  • 6-10 Instance Simplification: Gaussian Elimination. Given: a system of n linear equations in n unknowns with an arbitrary coefficient matrix. Transform to: an equivalent system of n linear equations in n unknowns with an upper-triangular coefficient matrix. Solve the latter by substitutions, starting with the last equation and moving up to the first one.
        a11 x1 + a12 x2 + ... + a1n xn = b1            a11 x1 + a12 x2 + ... + a1n xn = b1
        a21 x1 + a22 x2 + ... + a2n xn = b2    ==>              a22 x2 + ... + a2n xn = b2
        ...                                                               ...
        an1 x1 + an2 x2 + ... + ann xn = bn                                  ann xn = bn
  • Slide 12
  • 6-11 Gaussian Elimination (cont.): The transformation is accomplished by a sequence of elementary operations on the system's coefficient matrix (which don't change the system's solution): for i ← 1 to n-1 do: replace each of the subsequent rows (i.e., rows i+1, ..., n) by the difference between that row and an appropriate multiple of the i-th row, so that the new coefficient in the i-th column of that row becomes 0.
  • Slide 13
  • 6-12 Example of Gaussian Elimination: Solve
        2x1 - 4x2 + x3 = 6
        3x1 -  x2 + x3 = 11
         x1 +  x2 - x3 = -3
    Gaussian elimination:
        2 -4  1 |  6
        3 -1  1 | 11    row2 ← row2 - (3/2)*row1
        1  1 -1 | -3    row3 ← row3 - (1/2)*row1
        2 -4    1 |  6
        0  5 -1/2 |  2
        0  3 -3/2 | -6   row3 ← row3 - (3/5)*row2
        2 -4    1 |     6
        0  5 -1/2 |     2
        0  0 -6/5 | -36/5
    Backward substitution:
        x3 = (-36/5) / (-6/5) = 6
        x2 = (2 + (1/2)*6) / 5 = 1
        x1 = (6 - 6 + 4*1) / 2 = 2
  • Slide 14
  • 6-13 Pseudocode and Efficiency of Gaussian Elimination:
    Stage 1: reduction to an upper-triangular matrix
        for i ← 1 to n-1 do
            for j ← i+1 to n do
                for k ← i to n+1 do
                    A[j, k] ← A[j, k] - A[i, k] * A[j, i] / A[i, i]   //improve!
    Stage 2: backward substitution
        for j ← n downto 1 do
            t ← 0
            for k ← j+1 to n do
                t ← t + A[j, k] * x[k]
            x[j] ← (A[j, n+1] - t) / A[j, j]
    Efficiency: Θ(n^3) + Θ(n^2) = Θ(n^3). Read the pseudocode for the algorithm and determine its efficiency.
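A possible Python rendering of the pseudocode above, using 0-based indexing and computing the multiplier once per row (one way to read the "improve!" hint). It assumes all pivots A[i][i] are nonzero; a robust version would add partial pivoting.

```python
def gauss_solve(A, b):
    """Solve Ax = b by forward elimination followed by backward substitution."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    # Stage 1: reduce to upper-triangular form
    for i in range(n - 1):
        for j in range(i + 1, n):
            factor = M[j][i] / M[i][i]                 # computed once per row
            for k in range(i, n + 1):
                M[j][k] -= M[i][k] * factor
    # Stage 2: backward substitution
    x = [0.0] * n
    for j in range(n - 1, -1, -1):
        t = sum(M[j][k] * x[k] for k in range(j + 1, n))
        x[j] = (M[j][n] - t) / M[j][j]
    return x

print(gauss_solve([[2, -4, 1], [3, -1, 1], [1, 1, -1]], [6, 11, -3]))
# approximately [2.0, 1.0, 6.0], matching the worked example
```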
  • Slide 15
  • 6-14 Taxonomy of Searching Algorithms: List searching: sequential search; binary search. Tree searching: binary search tree; binary balanced trees (AVL (Adelson-Velsky and Landis) trees, red-black trees); multiway balanced trees (2-3 trees, 2-3-4 trees, B-trees). Hashing: open hashing (separate chaining); closed hashing (open addressing).
  • Slide 16
  • 6-15 Binary Search Tree: Arrange keys in a binary tree with the binary search tree property: for every node with key K, all keys in its left subtree are smaller than K and all keys in its right subtree are larger than K. Example: 5, 3, 1, 10, 12, 7, 9.
  • Slide 17
  • 6-16 Dictionary Operations on Binary Search Trees: Searching: straightforward. Insertion: search for the key, insert at the leaf where the search terminated. Deletion: 3 cases: deleting a key at a leaf; deleting a key at a node with a single child; deleting a key at a node with two children. Efficiency depends on the tree's height: ⌊log2 n⌋ ≤ h ≤ n-1. Thus all three operations have worst-case efficiency Θ(n) and average-case efficiency Θ(log n). Bonus: inorder traversal (left-root-right) produces a sorted list.
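A compact Python sketch of BST search and insertion as described above (the class and function names are ours); deletion and balancing are omitted.

```python
class BSTNode:
    """A minimal binary search tree node."""
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Insert at the leaf where an unsuccessful search terminates."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    elif key > root.key:
        root.right = bst_insert(root.right, key)
    return root

def bst_search(root, key):
    """Follow the BST property down from the root until the key is found or absent."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

root = None
for k in [5, 3, 1, 10, 12, 7, 9]:    # example list from the earlier slide
    root = bst_insert(root, k)
print(bst_search(root, 7), bst_search(root, 6))   # True False
```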
  • Slide 18
  • 6-17 Balanced Search Trees: The attractiveness of the binary search tree is marred by its bad (linear) worst-case efficiency. Two ideas to overcome it are: to rebalance the binary search tree when a new insertion makes the tree too unbalanced (AVL trees, red-black trees); to allow more than one key per node of a search tree (2-3 trees, 2-3-4 trees, B-trees).
  • Slide 19
  • 6-18 Balanced Trees: AVL Trees. Definition: An AVL tree is a binary search tree in which, for every node, the difference between the heights of its left and right subtrees, called the balance factor, is at most 1 (with the height of an empty tree defined as -1). Tree (a) is an AVL tree; tree (b) is not an AVL tree.
  • Slide 20
  • 6-19 Rotations: If a key insertion violates the balance requirement at some node, the subtree rooted at that node is transformed via one of four rotations. 1. Single left rotation (LL): b becomes the new root; a takes ownership of b's left child as its right child (in this case, null); b takes ownership of a as its left child.
  • Slide 21
  • 6-20 Rotations: 2. Single right rotation (RR): b becomes the new root; c takes ownership of b's right child as its left child (in this case, that value is null); b takes ownership of c as its right child.
  • Slide 22
  • 6-21 Rotations: 3. Left-right rotation (LR), or "double left": a single left rotation gets stuck because the right subtree has a negative balance. So first perform a right rotation on that right subtree (think of it in isolation from the main tree), then add it back under the root and perform the left rotation.
  • Slide 23
  • 6-22 Rotations: 4. Right-left rotation (RL), or "double right": a single right rotation gets stuck. After performing a left rotation on the left subtree, the tree can be balanced using a single right rotation.
  • Slide 24
  • 6-23 General case: single R-rotation. c becomes the new root; r takes ownership of c's right child as its left child; c takes ownership of r as its right child.
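A sketch of the single R-rotation in Python, reusing the BSTNode class from the earlier sketch and the node names r and c from the slide; AVL balance-factor bookkeeping is omitted.

```python
def rotate_right(r):
    """Single R-rotation at node r: r's left child c becomes the new subtree root,
    r takes ownership of c's right child as its left child, and c takes ownership
    of r as its right child."""
    c = r.left
    r.left = c.right
    c.right = r
    return c          # new subtree root; the caller must re-link the parent to it
```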
  • Slide 25
  • 6-24 General case: double LR-rotation.
  • Slide 26
  • 6-25 AVL tree construction, an example: Construct an AVL tree for the list 5, 6, 8, 3, 2, 4, 7.
  • Slide 27
  • 6-26 AVL tree construction, an example (cont.)
  • Slide 28
  • 6-27 Analysis of AVL trees: height h ≤ 1.4404 log2(n + 2) - 1.3277; average height ≈ 1.01 log2 n + 0.1 for large n (found empirically). Search and insertion are O(log n). Deletion is more complicated but is also O(log n). Disadvantages: frequent rotations; complexity. A similar idea: red-black trees (the heights of subtrees are allowed to differ by up to a factor of 2).
  • Slide 29
  • 6-28 Multiway Search Trees: Definition: A multiway search tree is a search tree that allows more than one key in the same node of the tree. Definition: A node of a search tree is called an n-node if it contains n-1 ordered keys k1 < k2 < ... < k(n-1), which divide the entire key range into n intervals pointed to by the node's n links to its children: (-∞, k1), [k1, k2), ..., [k(n-1), +∞). Note: every node in a classical binary search tree is a 2-node.
  • Slide 30
  • 6-29 2-3 Tree: Definition: A 2-3 tree is a search tree that may have 2-nodes and 3-nodes and is height-balanced (all leaves are on the same level). A 2-3 tree is constructed by successive insertions of the keys given, with a new key always inserted into a leaf of the tree. If the leaf is a 3-node, it is split into two, with the middle key promoted to the parent.
  • Slide 31
  • 6-30 Constructing a 2-3 tree (video).
  • Slide 32
  • 6-31 2-3 tree construction, an example (try it yourself): Construct a 2-3 tree for the list 9, 5, 8, 3, 2, 4, 7.
  • Slide 33
  • 6-32 Analysis of 2-3 trees: log3(n + 1) - 1 ≤ h ≤ log2(n + 1) - 1. Search, insertion, and deletion are in Θ(log n). The idea of the 2-3 tree can be generalized by allowing more keys per node: 2-3-4 trees; B-trees (each node of a B-tree may have a variable number of keys and children).
  • Slide 34
  • 6-33 Heaps and Heapsort: Definition: A heap is a binary tree with keys at its nodes (one key per node) such that: it is essentially complete, i.e., all its levels are full except possibly the last level, where only some rightmost keys may be missing; and the key at each node is ≥ the keys at its children.
  • Slide 35
  • 6-34 The heap property: A node has the heap property if the value in the node is as large as or larger than the values in its children. All leaf nodes automatically have the heap property. A binary tree is a heap if all nodes in it have the heap property. (Figures: a node 12 with children 8 and 3 has the heap property; a node 12 with the single child 8 has the heap property; a node 12 with children 8 and 14 does not.)
  • Slide 36
  • 6-35 siftUp: Given a node that does not have the heap property, you can give it the heap property by exchanging its value with the value of the larger child. This is sometimes called sifting up. Notice that the child may have lost the heap property. (Figure: 12 with children 8 and 14 lacks the heap property; after the exchange, 14 with children 8 and 12 has it.)
  • Slide 37
  • 6-36 Illustration of the heap's definition: a heap vs. not a heap. Note: a heap's elements are ordered top-down (along any path down from its root), but they are not ordered left to right.
  • Slide 38
  • 6-37 Some Important Properties of a Heap: Given n, there exists a unique binary tree with n nodes that is essentially complete, with h = ⌊log2 n⌋. The root contains the largest key. The subtree rooted at any node of a heap is also a heap. A heap can be represented as an array.
  • Slide 39
  • 6-38 Heap's Array Representation: Store the heap's elements in an array (whose elements are indexed, for convenience, from 1 to n) in top-down, left-to-right order. The left child of node j is at 2j; the right child of node j is at 2j+1; the parent of node j is at ⌊j/2⌋; parental nodes are represented in the first ⌊n/2⌋ locations. Example: the heap with root 9 (children 5 and 3; 5's children 1 and 4; 3's child 2) is stored as
        index: 1 2 3 4 5 6
        value: 9 5 3 1 4 2
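A small Python illustration of the 1-indexed array representation and the parent/child index formulas; the helper names and the unused slot at index 0 are our own convention.

```python
# 1-indexed array representation of the example heap (index 0 is unused).
H = [None, 9, 5, 3, 1, 4, 2]

def left(j):   return 2 * j        # left child of node j
def right(j):  return 2 * j + 1    # right child of node j
def parent(j): return j // 2       # parent of node j

n = len(H) - 1
print([H[left(1)], H[right(1)]])   # children of the root 9 -> [5, 3]
print(n // 2)                      # parental nodes occupy positions 1..n//2 -> 3
```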
  • Slide 40
  • 6-39 Constructing a heap I: A tree consisting of a single node is automatically a heap. We construct a heap by adding nodes one at a time: add the node just to the right of the rightmost node in the deepest level; if the deepest level is full, start a new level. (Figure: examples of where to add a new node.)
  • Slide 41
  • 6-40 Constructing a heap II: Each time we add a node, we may destroy the heap property of its parent node. To fix this, we sift up. But each time we sift up, the value of the topmost node in the sift may increase, and this may destroy the heap property of its parent node. We repeat the sifting-up process, moving up in the tree, until either we reach nodes whose values don't need to be swapped (because the parent is still larger than both children), or we reach the root. (A sketch of this sift-up step appears below.)
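A Python sketch of the sift-up step and of top-down heap construction by successive insertions; the function names and the small list of sample keys are our own, chosen only for illustration.

```python
def sift_up(H, j):
    """Restore the heap property upward from position j of a 1-indexed heap array H."""
    while j > 1 and H[j // 2] < H[j]:        # parent smaller than child: swap and continue
        H[j // 2], H[j] = H[j], H[j // 2]
        j //= 2

# Building a heap top-down by adding nodes one at a time:
H = [None]                    # index 0 unused
for key in [8, 10, 5, 12, 3, 4]:
    H.append(key)             # add just past the current rightmost node
    sift_up(H, len(H) - 1)    # fix the heap property along the path to the root
print(H[1:])                  # the root now holds the largest key, 12
```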
  • Slide 42
  • 6-41 Constructing a heap III: (Figure: step-by-step example of building a heap by adding nodes one at a time and sifting up.)
  • Slide 43
  • 6-42 Other children are not affected: The node containing 8 is not affected because its parent gets larger, not smaller. The node containing 5 is not affected because its parent gets larger, not smaller. The node containing 8 is still not affected because, although its parent got smaller, its parent is still greater than it was originally. (Figure: sifting 14 up past 10 and then 12.)
  • Slide 44
  • 6-43 A sample heap: Here's a sample binary tree after it has been heapified. Notice that "heapified" does not mean sorted. Heapifying does not change the shape of the binary tree; this binary tree is balanced and left-justified because it started out that way. (Figure: a sample heap with root 25.)
  • Slide 45
  • 6-44 Removing the root (animated): Notice that the largest number is now in the root. Suppose we discard the root: how can we fix the binary tree so it is once again balanced and left-justified? Solution: remove the rightmost leaf at the deepest level and use it for the new root. (Figure: the leaf 11 becomes the new root.)
  • Slide 46
  • 6-45 The reHeap method I: Our tree is balanced and left-justified, but no longer a heap. However, only the root lacks the heap property, so we can siftUp() the root. After doing this, one and only one of its children may have lost the heap property.
  • Slide 47
  • 6-46 The reHeap method II: Now the left child of the root (still the number 11) lacks the heap property. We can siftUp() this node. After doing this, one and only one of its children may have lost the heap property.
  • Slide 48
  • 6-47 The reHeap method III: Now the right child of the left child of the root (still the number 11) lacks the heap property: we can siftUp() this node. After doing this, one and only one of its children may have lost the heap property, but it doesn't, because it's a leaf.
  • Slide 49
  • 6-48 The reHeap method IV: Our tree is once again a heap, because every node in it has the heap property. Once again, the largest (or a largest) value is in the root. We can repeat this process until the tree becomes empty. This produces a sequence of values in order, largest to smallest.
  • Slide 50
  • 6-49 Heap Construction (bottom-up): Step 0: initialize the structure with the keys in the order given. Step 1: starting with the last (rightmost) parental node, fix the heap rooted at it if it doesn't satisfy the heap condition: keep exchanging it with its largest child until the heap condition holds. Step 2: repeat Step 1 for the preceding parental node. (A sketch of this bottom-up construction follows.)
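A Python sketch of the bottom-up construction: a sift-down that keeps exchanging a key with its larger child, applied to the parental nodes from the last one back to the root. The function names are ours; the sample list matches the example on the next slide.

```python
def sift_down(H, j, n):
    """Fix the heap rooted at position j of the 1-indexed array H[1..n]."""
    while 2 * j <= n:
        child = 2 * j
        if child < n and H[child + 1] > H[child]:
            child += 1                         # pick the larger child
        if H[j] >= H[child]:
            break                              # heap condition already holds
        H[j], H[child] = H[child], H[j]
        j = child

def build_heap(H, n):
    """Bottom-up construction: fix the heaps rooted at parental nodes n//2, ..., 1."""
    for j in range(n // 2, 0, -1):
        sift_down(H, j, n)

H = [None, 2, 9, 7, 6, 5, 8]   # the list from the example slide
build_heap(H, 6)
print(H[1:])                   # [9, 6, 8, 2, 5, 7]
```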
  • Slide 51
  • 6-50 Example of Heap Construction: Construct a heap for the list 2, 9, 7, 6, 5, 8.
  • Slide 52
  • 6-51 Pseudocode of bottom-up heap construction.
  • Slide 53
  • 6-52 Sorting: What do heaps have to do with sorting an array? Here's the neat part: because the binary tree is balanced and left-justified, it can be represented as an array, and all our operations on binary trees can be represented as operations on arrays. To sort: heapify the array; while the array isn't empty { remove and replace the root; reheap the new root node; }
  • Slide 54
  • 6-53 Mapping into an array: Notice (with 0-based indexing): the left child of index i is at index 2*i+1; the right child of index i is at index 2*i+2. Example: the children of node 3 (19) are nodes 7 (18) and 8 (14). The sample heap maps to the array
        index: 0  1  2  3  4  5  6  7  8  9  10 11 12
        value: 25 22 17 19 22 14 15 18 14 21 3  9  11
  • Slide 55
  • 6-54 Removing and replacing the root: The root is the first element in the array; the rightmost node at the deepest level is the last element. Swap them, and pretend that the last element in the array no longer exists, that is, the last index is 11 (containing the value 9).
        before: 25 22 17 19 22 14 15 18 14 21 3  9  11
        after:  11 22 17 19 22 14 15 18 14 21 3  9  25
  • Slide 56
  • 6-55 Reheap and repeat: Reheap the root node (index 0, containing 11). Remember, though, that the last array index has changed. Then remove and replace the root again. Repeat until the last becomes first, and the array is sorted! (Figure: the array after each reheap and swap.)
  • Slide 57
  • 6-56 Insertion of a New Element into a Heap: Insert the new element at the last position in the heap. Compare it with its parent and, if it violates the heap condition, exchange them. Continue comparing the new element with nodes up the tree until the heap condition is satisfied. Example: insert key 10. Efficiency: O(log n).
  • Slide 58
  • 6-57 Heapsort: Stage 1: construct a heap for a given list of n keys. Stage 2: repeat the operation of root removal n-1 times: exchange the keys in the root and in the last (rightmost) leaf; decrease the heap size by 1; if necessary, swap the new root with its larger child until the heap condition holds. (A sketch combining both stages appears below.)
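A heapsort sketch in Python built from the two stages above; it reuses build_heap and sift_down from the bottom-up construction sketch earlier, and the sample call is ours.

```python
def heapsort(a):
    """Heapsort: Stage 1 builds a heap bottom-up, Stage 2 removes the root n-1 times."""
    H = [None] + list(a)                  # 1-indexed working array
    n = len(a)
    build_heap(H, n)                      # Stage 1 (defined in the earlier sketch)
    for last in range(n, 1, -1):          # Stage 2: n-1 root removals
        H[1], H[last] = H[last], H[1]     # swap root with the last leaf
        sift_down(H, 1, last - 1)         # shrink the heap and re-fix the root
    return H[1:]

print(heapsort([2, 9, 7, 6, 5, 8]))       # [2, 5, 6, 7, 8, 9]
```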
  • Slide 59
  • 6-58 Analysis I: Here's how the algorithm starts: heapify the array. Heapifying the array: we add each of n nodes. Each node has to be sifted up, possibly as far as the root. Since the binary tree is perfectly balanced, sifting up a single node takes O(log n) time. Since we do this n times, heapifying takes n*O(log n) time, that is, O(n log n) time.
  • Slide 60
  • 6-59 Analysis II: Here's the rest of the algorithm: while the array isn't empty { remove and replace the root; reheap the new root node; }. We do the while loop n times (actually, n-1 times), because we remove one of the n nodes each time. Removing and replacing the root takes O(1) time. Therefore, the total time is n times however long the reheap method takes.
  • Slide 61
  • 6-60 Analysis III: To reheap the root node, we have to follow one path from the root to a leaf node (and we might stop before we reach a leaf). The binary tree is perfectly balanced, so this path is O(log n) long, and we only do O(1) operations at each node. Therefore, reheaping takes O(log n) time. Since we reheap inside a while loop that we do n times, the total time for the while loop is n*O(log n), or O(n log n).
  • Slide 62
  • 6-61 Analysis IV: Here's the algorithm again: heapify the array; while the array isn't empty { remove and replace the root; reheap the new root node; }. We have seen that heapifying takes O(n log n) time, and the while loop takes O(n log n) time. The total time is therefore O(n log n) + O(n log n), which is the same as O(n log n) time.
  • Slide 63
  • 6-62 Priority Queue: A priority queue is the ADT of a set of elements with numerical priorities, with the following operations: find the element with the highest priority; delete the element with the highest priority; insert an element with an assigned priority. A heap is a very efficient way of implementing priority queues. There are two ways to handle a priority queue in which the highest priority corresponds to the smallest number.
  • Slide 64
  • 6-63 Horner's Rule for Polynomial Evaluation: Given a polynomial of degree n, p(x) = a_n x^n + a_(n-1) x^(n-1) + ... + a_1 x + a_0, and a specific value of x, find the value of p at that point. Two brute-force algorithms:
    Brute-force algorithm 1:
        p ← 0
        for i ← n downto 0 do
            power ← 1
            for j ← 1 to i do
                power ← power * x
            p ← p + a_i * power
        return p
    Brute-force algorithm 2:
        p ← a_0; power ← 1
        for i ← 1 to n do
            power ← power * x
            p ← p + a_i * power
        return p
  • Slide 65
  • 6-64 Horner's Rule: Example: p(x) = 2x^4 - x^3 + 3x^2 + x - 5 = x(2x^3 - x^2 + 3x + 1) - 5 = x(x(2x^2 - x + 3) + 1) - 5 = x(x(x(2x - 1) + 3) + 1) - 5. Substitution into the last formula leads to a faster algorithm. The same sequence of computations is obtained by simply arranging the coefficients in a table and proceeding as follows:
        coefficients:  2   -1    3    1   -5
        x = 3:         2    5   18   55  160      so p(3) = 160
  • Slide 66
  • 6-65 Horner's Rule pseudocode. Efficiency of Horner's Rule: # multiplications = # additions = n.
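A minimal Python version of Horner's rule; the coefficients are passed highest degree first, and the sample call reproduces the p(3) = 160 example. The function name is ours.

```python
def horner(coeffs, x):
    """Evaluate a polynomial given its coefficients in order a_n, ..., a_1, a_0.
    Uses exactly n multiplications and n additions."""
    p = coeffs[0]
    for a in coeffs[1:]:
        p = p * x + a      # peel off one level of the nested factorization
    return p

print(horner([2, -1, 3, 1, -5], 3))   # 160, i.e. p(3) for 2x^4 - x^3 + 3x^2 + x - 5
```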
  • Slide 67
  • 6-66 Computing a^n (revisited): Left-to-right binary exponentiation. Initialize the product accumulator to 1. Scan n's binary expansion from left to right and do the following: if the current binary digit is 0, square the accumulator (S); if the binary digit is 1, square the accumulator and multiply it by a (SM). Example: compute a^13. Here, n = 13 = 1101 in binary.
        binary rep. of 13:  1              1                0              1
        operations:         SM             SM               S              SM
        accumulator:  1     1^2 * a = a    a^2 * a = a^3    (a^3)^2 = a^6  (a^6)^2 * a = a^13
    (computed left to right). Efficiency: (b-1) ≤ M(n) ≤ 2(b-1), where b = ⌊log2 n⌋ + 1.
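A Python sketch of left-to-right binary exponentiation (square for a 0 digit, square-and-multiply for a 1 digit); the function name is ours.

```python
def power_LR(a, n):
    """Left-to-right binary exponentiation: scan n's binary digits from the
    most significant end, squaring for 0 and squaring-and-multiplying for 1."""
    acc = 1
    for bit in bin(n)[2:]:     # binary expansion of n, left to right
        acc *= acc             # S
        if bit == '1':
            acc *= a           # M
    return acc

print(power_LR(2, 13))         # 8192 = 2**13
```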
  • Slide 68
  • 6-67 Computing a^n (cont.): Right-to-left binary exponentiation. Scan n's binary expansion from right to left and compute a^n as the product of the terms a^(2^i) corresponding to the 1s in this expansion. Example: compute a^13 by right-to-left binary exponentiation. Here, n = 13 = 1101 in binary.
        binary digits:   1     1     0     1
        a^(2^i) terms:   a^8   a^4   a^2   a
        product:         a^8 * a^4 * a = a^13
    (computed right to left). Efficiency: the same as that of left-to-right binary exponentiation.
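A matching sketch of right-to-left binary exponentiation, accumulating the terms a^(2^i) for the 1 digits; again the function name is ours.

```python
def power_RL(a, n):
    """Right-to-left binary exponentiation: multiply together the terms a^(2^i)
    that correspond to 1s in n's binary expansion."""
    term, product = a, 1
    while n > 0:
        if n % 2 == 1:         # current (rightmost) binary digit is 1
            product *= term
        term *= term           # next term: a^(2^(i+1))
        n //= 2
    return product

print(power_RL(2, 13))         # 8192 = 2**13
```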
  • Slide 69
  • 6-68 Problem Reduction: This variation of transform-and-conquer solves a problem by transforming it into a different problem for which an algorithm is already available. To be of practical value, the combined time of the transformation and of solving the other problem should be smaller than the time of solving the problem as given by another method.
  • Slide 70
  • 6-69 Examples of Solving Problems by Reduction: computing lcm(m, n) via computing gcd(m, n) (see the sketch below); counting the number of paths of length n in a graph by raising the graph's adjacency matrix to the n-th power; transforming a maximization problem to a minimization problem and vice versa (also, min-heap construction); linear programming; reduction to graph problems (e.g., solving puzzles via state-space graphs).
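As a tiny illustration of problem reduction, the first example in Python: lcm computed via gcd, using the identity lcm(m, n) * gcd(m, n) = m * n and the standard math.gcd. The sample call is ours.

```python
from math import gcd

def lcm(m, n):
    """Problem reduction: compute lcm(m, n) by reducing it to gcd(m, n)."""
    return m * n // gcd(m, n)

print(lcm(24, 60))   # 120
```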