Greedy Note


  • 7/30/2019 Greedy Note

    1/20

    Greedy Method: Definition

    An algorithm that always takes the best immediate, or local, solution while finding an answer. Greedy algorithms will always find the overall, or globally, optimal solution for some optimization problems, but may find less-than-optimal solutions for some instances of other problems.

    A greedy algorithm always makes the choice that looks best at the moment.

    It makes a locally optimal choice in the hope that this choice will lead to a globally optimal

    solution. Greedy algorithms yield optimal solutions for many (but not all) problems.

    Elements of the greedy strategy

    1. Determine the optimal substructure of the problem.

    2. Develop a recursive solution.

    3. Prove that at any stage of the recursion, one of the optimal choices is the greedy choice. Thus, it is always safe to make the greedy choice.

    4. Show that all but one of the subproblems induced by having made the greedy choice are empty.

    5. Develop a recursive algorithm that implements the greedy strategy.

    6. Convert the recursive algorithm to an iterative algorithm.

    Alternatively, we could have fashioned our optimal substructure with a greedy choice in

    mind.

    Designing a greedy algorithm

    1. Cast the optimization problem as one in which we make a choice and are left with one subproblem

    to solve.

    2. Prove that there is always an optimal solution to the original problem that makes the greedy choice, so that the greedy choice is always safe.

    3. Demonstrate that, having made the greedy choice, what remains is a subproblem with the property

    that if we combine an optimal solution to the subproblem with the greedy choice we have made, we

    arrive at an optimal solution to the original problem.

    There is no general way to tell whether a greedy algorithm is optimal, but two key ingredients are:

    Greedy-choice property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice.

    Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems.

    Designing Greedy Algorithms

    1. Cast the optimization problem as one for which:

    we make a choice and are left with only one subproblem to solve

    2. Prove the GREEDY CHOICE

    that there is always an optimal solution to the original problem that makes the greedy

    choice

    3. Prove the OPTIMAL SUBSTRUCTURE:

    the greedy choice + an optimal solution to the resulting subproblem leads to an optimal

    solution


    Greedy Approach VS Dynamic Programming (DP)

    Greedy and Dynamic Programming are methods for solving optimization problems.

    Greedy algorithms are usually more efficient than DP solutions.

    However, it is quite often that you need to use dynamic programming since the optimal

    solution cannot be guaranteed by a greedy algorithm.

    DP provides efficient solutions for some problems for which a brute-force approach would be very slow.

    To use Dynamic Programming we only need to show that the principle of optimality applies to the problem.

    Greedy Algorithm

    Start with a solution to a small subproblem

    Build up to a solution to the whole problem

    Make choices that look good in the short term

    Disadvantage: Greedy algorithms don't always work (short-term choices can be disastrous in the long term), and they are hard to prove correct.

    Advantage: Greedy algorithms work fast when they work, and they are simple and easy to implement.

    Structure of a Greedy Algorithm

    Initially the set of chosen items (the solution set) is empty.

    At each step

    o an item is selected for the solution set using the selection function.

    o IF the set would no longer be feasible,

    reject the item under consideration (it is never considered again).

    o ELSE IF the set is still feasible, THEN

    add the current item.

    Definitions of feasibility

    A feasible set (of candidates) is promising if it can be extended to produce not merely a solution, but an optimal solution to the problem. In particular, the empty set is always promising. Why? Because an optimal solution always exists.

    Unlike Dynamic Programming, which solves the subproblems bottom-up, a greedy strategy usually progresses in a top-down fashion, making one greedy choice after another, reducing each problem to a

    smaller one.

    Greedy-Choice Property

    The "greedy-choice property" and "optimal substructure" are two ingredients in the problem that lend to

    a greedy strategy.

    Greedy-Choice Property

    It says that a globally optimal solution can be arrived at by making a locally optimal choice.


    Greedy Algorithm (General method)

    Procedure GREEDY(A, n)
    // A(1:n) contains the n inputs //
        solution := Ø            // initialize the solution to empty //
        for i := 1 to n do
            x := SELECT(A)
            if FEASIBLE(solution, x) then
                solution := UNION(solution, x)
            endif
        repeat
        return (solution)
    end GREEDY
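    The procedure above can be written as a minimal Python sketch; `select` and `feasible` are caller-supplied stand-ins for SELECT and FEASIBLE, and the coin-change usage below is an illustrative assumption, not an example from the note.

    ```python
    # A minimal sketch of the generic greedy procedure: repeatedly pick the
    # most promising remaining candidate and keep it only if the solution
    # stays feasible.
    def greedy(candidates, select, feasible):
        solution = []                      # initialize the solution to empty
        remaining = list(candidates)
        while remaining:
            x = select(remaining)          # most promising candidate
            remaining.remove(x)
            if feasible(solution + [x]):   # keep x only if still feasible
                solution.append(x)
        return solution

    # Illustrative use: pick coins greedily (largest first) without
    # exceeding a target amount of 36.
    picked = greedy([1, 1, 5, 10, 25, 25], max, lambda s: sum(s) <= 36)
    ```

    With those inputs the procedure keeps 25, rejects the second 25, keeps 10, rejects 5, keeps one 1, and rejects the last 1.
    
    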

    Characteristics and Features of Problems solved by Greedy Algorithms

    To construct the solution in an optimal way, the algorithm maintains two sets: one contains chosen items and the other contains rejected items.

    The greedy algorithm consists of four functions:

    1. A function that checks whether the chosen set of items provides a solution.

    2. A function that checks feasibility, i.e., Feasible() checks whether adding the selected value to the current solution can result in a feasible solution (no constraints are violated).

    3. The Select() function chooses a candidate based on a local selection criterion, removes it from the candidate set, and returns its value, i.e., this function tells which of the candidates is the most promising.

    4. An objective function, Solves(), checks whether the problem is solved; it does not appear explicitly, but gives the value of a solution.

    An Activity Selection Problem

    Activity selection is the problem of scheduling a resource among several competing activities.

    Problem Statement

    Given a set S of n activities, where the ith activity has start time si and finish time fi, find a maximum-size set of mutually compatible activities.

    Compatible Activities: Activities i and j are compatible if the half-open intervals [si, fi) and [sj, fj) do not overlap, that is, i and j are compatible if si ≥ fj or sj ≥ fi.
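    The compatibility test can be written as a one-line predicate; this small helper is an illustrative addition, not part of the original note.

    ```python
    # Half-open intervals [si, fi) and [sj, fj) do not overlap iff one
    # activity starts no earlier than the other finishes.
    def compatible(a, b):
        si, fi = a
        sj, fj = b
        return si >= fj or sj >= fi
    ```

    Because the intervals are half-open, two activities that share an endpoint, such as (1, 4) and (4, 6), still count as compatible.
    
    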

    The Activity Selection Problem -- Consider the problem of scheduling activities into a room or auditorium over some period of time.

    Let X = the set of activities that have requested time in the auditorium where xi denotes the ith such

    request and it is delineated by its start and end times:

    xi = (si, fi)


    Let S be the set of scheduled activities, and P the problem of scheduling the largest number of requests in the given time interval from T0 to Tf. P(xi) will be true if, after ordering (or selecting) according to a

    greedy choice property, the ith activity selected can be scheduled without causing a conflict with any

    previously scheduled activity.

    The first requirement is to determine an appropriate greedy-choice property and then to show that

    selection based solely upon this property will always yield an optimal solution. There are three possible bases for the greedy-choice property:

    1. select according to earliest start time,

    2. select according to earliest finish time,

    3. select according to shortest duration.

    Although the third choice is intuitively appealing, the second choice -- select according to earliest finish

    time -- is the correct one.

    We need to prove that this is so.

    Proof by contradiction -- Suppose S is a maximal solution set and it does not contain the object with the earliest finish time, x1.

    Then a maximal set S' can be formed by removing the first item from S and replacing it with x1 (where by first we mean the object in S with the earliest finish time). No conflict with any of the other objects in S can occur, since they must all start later than the end time of the object that was removed.

    Since nothing better can be done in scheduling the first event, the problem then becomes the task of scheduling activities in the reduced time interval from f1 to Tf (the interval from the end of the first activity to the end of the scheduling period) from among the remaining choices with start times later than f1. (Note that all of the activities that start before x1 finishes can be eliminated, because at best all they can do is replace the activity x1 with one that ends later.) The problem now is the same one as before, with a shorter time interval and a reduced set of requests, so we apply the same argument to show that the non-overlapping activity with the earliest end time can be included in any maximal set for this sub-problem, and continue until the set of requests has been exhausted.

    Greedy Algorithm for Selection Problem

    Part I. Sort the input activities by increasing finishing time: f1 ≤ f2 ≤ . . . ≤ fn

    Part II. Call GREEDY-ACTIVITY-SELECTOR(s, f)

    1. n = length[s]
    2. A = {1}
    3. j = 1
    4. for i = 2 to n
    5.     do if si ≥ fj
    6.         then A = A ∪ {i}
    7.             j = i
    8. return set A
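    A direct Python rendering of GREEDY-ACTIVITY-SELECTOR follows; indices here are 0-based, so activity 1 of the pseudocode becomes index 0.

    ```python
    # s and f are the start and finish times, already sorted by
    # nondecreasing finish time (Part I).
    def greedy_activity_selector(s, f):
        A = [0]                  # the first activity is always chosen
        j = 0                    # index of the last activity added
        for i in range(1, len(s)):
            if s[i] >= f[j]:     # activity i starts after activity j finishes
                A.append(i)
                j = i
        return A
    ```

    On the 11 activities of the operation trace below, this returns indices 0, 3, 7 and 10, i.e., the activities {p, s, w, z}.
    
    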


    Operation of the algorithm

    Let 11 activities be given, S = {p, q, r, s, t, u, v, w, x, y, z}; the start and finish times for the proposed activities are (1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11), (8, 12), (2, 13) and (12, 14).

    A = {p} Initialization at line 2

    A = {p, s} line 6 - 1st iteration of FOR - loop

    A = {p, s, w} line 6 -2nd iteration of FOR - loop

    A = {p, s, w, z} line 6 - 3rd iteration of FOR-loop

    Out of the FOR-loop and Return A = {p, s, w, z}

    Example

    [Figure: timelines of the 7 candidate activities over the interval 0-15, and the schedule obtained when selecting by start time]


    [Figure: timelines over the interval 0-15 showing the activities chosen when selecting by minimum duration, and when selecting by finishing time]


    Analysis

    Part I requires O(n lg n) time (use merge or heap sort).

    Part II, i.e., lines 4 to 7, requires Θ(n) time, assuming that the activities were already sorted in Part I by their finish time.

    Proof that greedy solution is optimal

    1. It is easy to see that there is an optimal solution to the problem that makes the greedy choice.

    Proof of 1. Let A be an optimal solution. Let activity 1 be the greedy choice. If 1 ∈ A the proof is done. If 1 ∉ A, we will show that A' = A − {a} ∪ {1} is another optimal solution that includes 1.

    Let a be the activity with minimum finish time in A. Since activities are sorted by finishing time in the algorithm, f(1) ≤ f(a). If f(1) ≤ s(a) we could add 1 to A, and then A could not be optimal; so activity 1 overlaps only a. Replacing a with 1 therefore conflicts with nothing else in A, so A' is feasible and |A'| = |A|, which makes A' an optimal solution containing the greedy choice.


    Knapsack problem

    Given some items, pack the knapsack to get the maximum total value. Each item has some weight and some value. The total weight that we can carry is no more than some fixed number W. So we must consider weights of items as well as their values.

    Statement: A thief robbing a store can carry a maximal weight of W in their knapsack. There are n items; the ith item weighs wi and is worth pi dollars. What items should the thief take?

    There are two versions of the problem:

    1. 0-1 knapsack problem and

    2. Fractional knapsack problem

    1. Items are indivisible; you either take an item or not. Solved with dynamic programming.

    2. Items are divisible: you can take any fraction of an item. Solved with a greedy algorithm.

    I. Fractional knapsack problem

    The setup is the same, but the thief can take fractions of items, meaning that the items can be broken into smaller pieces, so that the thief may decide to carry only a fraction xi of item i, where 0 ≤ xi ≤ 1.

    Exhibits the greedy-choice property.

    A greedy algorithm exists.

    Exhibits the optimal-substructure property.

    II. 0-1 knapsack problem

    The setup is the same, but the items may not be broken into smaller pieces, so the thief may decide either to take an item or to leave it (a binary choice), but may not take a fraction of an item.

    Exhibits no greedy-choice property.

    No greedy algorithm exists.

    Exhibits the optimal-substructure property.

    Only a dynamic programming algorithm exists.


    The Fractional Knapsack Problem (By Greedy Method)

    There are n items in a store. For i = 1, 2, . . . , n, item i has weight wi > 0 and profit pi > 0. The thief can carry a maximum weight of M pounds in a knapsack. In this version of the problem the items can be broken into smaller pieces, so the thief may decide to carry only a fraction xi of object i, where 0 ≤ xi ≤ 1. Item i contributes xi·wi to the total weight in the knapsack, and xi·pi to the profit of the load.

    Given: We are given n objects and a knapsack. Object i has a weight wi and the knapsack has a capacity M. If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into the knapsack, then a profit of pi·xi is earned.

    Goal: Choose items with maximum total benefit but with weight at most M. The objective is to obtain a filling of the knapsack that maximizes the total profit of the chosen objects while keeping the total weight at most M.

    In symbols, the fractional knapsack problem can be stated as follows:

    maximize Σ(1 ≤ i ≤ n) pi·xi

    subject to the constraint Σ(1 ≤ i ≤ n) wi·xi ≤ M

    and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n

    It is clear that an optimal solution must fill the knapsack exactly, for otherwise we could add a fraction of one of the remaining objects and increase the value of the load. Thus in an optimal solution Σ xi·wi = M.

    Brute force Approach

    Generate all 2^n subsets. Discard all subsets whose sum of weights exceeds M (not feasible). Select the maximum total benefit among the remaining (feasible) subsets.

    What is the run time?

    O(n·2^n)
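    The brute-force idea above can be sketched in Python for the 0-1 version; this helper is an illustrative assumption, not an algorithm given in the note.

    ```python
    from itertools import combinations

    # Try all 2^n subsets, discard infeasible ones (weight > capacity),
    # and keep the best total profit among the rest.
    def knapsack_brute_force(profits, weights, capacity):
        n = len(profits)
        best = 0
        for r in range(n + 1):
            for subset in combinations(range(n), r):
                weight = sum(weights[i] for i in subset)
                if weight <= capacity:        # feasible subset
                    best = max(best, sum(profits[i] for i in subset))
        return best
    ```

    On the 5-object example of the next section (capacity 100 lbs), the best 0-1 load is worth 156, which the greedy fractional solution (164) beats by taking part of an item.
    
    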

    GREEDY APPROACH WITH EXAMPLE

    There are 5 objects with the prices and weights listed below; the knapsack can contain at most 100 lbs.

    Price ($US):    20  30  66  40  60
    Weight (lbs.):  10  20  30  40  50

    Method 1: choose the least weight first

    Total Weight = 10 + 20 + 30 + 40 = 100; Total Price = 20 + 30 + 66 + 40 = 156


    Method 2: choose the most expensive first

    Total Weight = 30 + 50 + 20 = 100; Total Price = 66 + 60 + 20 = 146

    Method 3: choose the highest price/weight ratio first

    Total Weight = 30 + 10 + 20 + 40 = 100; Total Price = 66 + 20 + 30 + 48 = 164

    An Optimal Greedy Algorithm for Knapsack with Fractions (KWF)

    In this problem a fraction of any item may be chosen.

    The following algorithm provides the optimal benefit. The optimal solution is given by Method 3 of the three methods above. The greedy algorithm uses the maximum benefit per unit as its selection criterion.

    1. Sort items in decreasing order of pi/wi.
    2. Add items to the knapsack (starting at the first) until there are no more items, or the next item to be added exceeds the remaining capacity.
    3. If the knapsack is not yet full, fill it with a fraction of the next unselected item.

    Algorithm GREEDY_KNAPSACK(m, n)

    1. // P[1:n] and W[1:n] contain the profits and weights respectively of the n objects, ordered so that P[i]/W[i] ≥ P[i+1]/W[i+1]. m is the knapsack size and X[1:n] is the solution vector //
    2. real P(1:n), W(1:n), X(1:n), U, m;
    3. {
    4.     for i = 1 to n do X[i] = 0.0;
    5.     U = m;
    6.     for i = 1 to n do
    7.     {
    8.         if (W[i] > U) then break;
    9.         X[i] = 1.0; U = U − W[i];
    10.    }
    11.    if (i ≤ n) then X[i] = U / W[i];
    12. }
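    The same algorithm can be sketched in Python; `p` and `w` are assumed to be pre-sorted by nonincreasing profit/weight ratio, as the pseudocode requires.

    ```python
    # Fractional knapsack: take whole items greedily by ratio, then a
    # fraction of the first item that no longer fits.
    def greedy_knapsack(p, w, m):
        x = [0.0] * len(p)
        U = m                          # remaining capacity
        for i in range(len(p)):
            if w[i] > U:
                x[i] = U / w[i]        # fractional part of the last item taken
                break
            x[i] = 1.0
            U -= w[i]
        return x
    ```

    On the 5-object example above, sorted by ratio (items worth 66, 20, 30, 60, 40 with weights 30, 10, 20, 50, 40), this returns x = (1, 1, 1, 0.8, 0), for the Method 3 profit of 164.
    
    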


    Part I requires O(n lg n) time for sorting the pi/wi into nonincreasing order (use merge or heap sort).

    Part II, i.e., lines 4 to 5 and lines 6 to 10, requires O(n) + O(n) time, so combined it is O(n), assuming that the items were already sorted in Part I by profit/weight ratio.

    Correctness: We show that no feasible solution earns more than the greedy one.

    Assume the objects are ordered so that p1/w1 ≥ p2/w2 ≥ . . . ≥ pn/wn.

    Let X = (x1, x2, . . . , xn) be the solution generated by GREEDY_KNAPSACK.

    Let Y = (y1, y2, . . . , yn) be any feasible solution.

    We want to show that Σ(i=1..n) (xi − yi)·pi ≥ 0, i.e., that X earns at least as much profit as Y.

    If all the xi are 1, then the solution is clearly optimal (it is the only solution that fills the knapsack). Otherwise, let k be the smallest index such that xk < 1. By construction of the algorithm,

    X = (1, 1, . . . , 1, xk, 0, 0, . . . , 0)

    with xi = 1 for i < k and xi = 0 for i > k.

    Writing pi = wi·(pi/wi) and splitting the sum at k:

    Σ(i=1..n) (xi − yi)·pi = Σ(i<k) (xi − yi)·wi·(pi/wi) + (xk − yk)·wk·(pk/wk) + Σ(i>k) (xi − yi)·wi·(pi/wi)

    For i < k we have xi = 1 ≥ yi, so (xi − yi) ≥ 0 and pi/wi ≥ pk/wk; for i > k we have xi = 0 ≤ yi, so (xi − yi) ≤ 0 and pi/wi ≤ pk/wk. In every case,

    (xi − yi)·wi·(pi/wi) ≥ (xi − yi)·wi·(pk/wk)

    and therefore

    Σ(i=1..n) (xi − yi)·pi ≥ (pk/wk) · Σ(i=1..n) (xi − yi)·wi

    Since the greedy solution fills the knapsack exactly, Σ xi·wi = M ≥ Σ yi·wi, so Σ(i=1..n) (xi − yi)·wi ≥ 0. Since pk/wk > 0, therefore

    Σ(i=1..n) (xi − yi)·pi ≥ 0

    and X is optimal.


    JOB Scheduling with deadlines

    Problem: Given a set of jobs {1, . . . , n} to execute, each of which takes one unit of time. At any instant t = 1, 2, . . . we can execute exactly one job. Job i, 1 ≤ i ≤ n, earns a profit pi > 0 if and only if it is executed no later than its deadline di. Only one machine is available.

    Goal: Maximize the profit with a feasible schedule.

    What is a feasible solution of this problem? A feasible solution is a subset J of the given job set such that each job in this subset can be completed by its deadline. Its value is the sum of the profits of the jobs in this subset, Σ(i ∈ J) pi.

    A greedy algorithm: Iteratively construct the schedule by adding, at each step, the job with the highest profit pi among those not yet considered, provided that the resulting set of jobs remains feasible.

    Definition: A set of jobs is feasible if there exists one (feasible) sequence that allows all jobs in the set to meet their respective deadlines.

    Finding a feasible schedule

    How do we determine whether a set is feasible without trying all possible sequences? Suppose a set J is feasible; then we can arrange it so that the corresponding deadlines are in increasing order, say j1, . . . , jk such that dj1 ≤ . . . ≤ djk.

    To find the optimal solution and check feasibility, we are required to find a subset J such that each job in this subset can be completed by its deadline. The value of a feasible solution J is the sum of the profits of all the jobs in J.

    Steps in finding the subset J are as follows:

    a. Σ(i ∈ J) pi is the objective function chosen as the optimization measure.

    b. Using this measure, the next job to be included should be the one which increases Σ(i ∈ J) pi the most.

    c. Begin with J = ∅ and Σ(i ∈ J) pi = 0.

    d. Add to J the job which has the largest profit.

    e. Add another job to J, keeping in mind the following conditions:

    i. Search for the job which has the next maximum profit.

    ii. See whether the union of this job with J is feasible or not.

    iii. If yes, go to step (e) and continue; else go to (iv).

    iv. Search for the job with the next maximum profit and go to step (ii).

    f. Terminate when the addition of no more jobs is feasible.


    Illustration

    Consider 5 jobs with profits (p1, p2, p3, p4, p5) = (20, 15, 10, 5, 1) and deadlines (d1, d2, d3, d4, d5) = (2, 2, 1, 3, 3).

    Here the maximum number of jobs that can be completed is min(n, max di) = min(5, 3) = 3. Hence there is a possibility of doing 3 jobs, and there are 3 units of time.

    Job | Slot [0-1] | Slot [1-2] | Slot [2-3] | Profit
     1  |     -      |    yes     |     -      |   20
     2  |    yes     |     -      |     -      |   15
     3  |      cannot accommodate              |
     4  |     -      |     -      |    yes     |    5
    Total profit: 40

    In the first unit of time, job 2 is done and a profit of 15 is gained; in the second unit, job 1 is done and a profit of 20 is obtained; finally, in the third unit, since job 3 is not available, job 4 is done and a profit of 5 is obtained. Jobs 3 and 5 could not be accommodated because of their deadlines.

    Algorithm:

    Step 1: Sort pi into nonincreasing order. After sorting, p1 ≥ p2 ≥ p3 ≥ . . . ≥ pn.
    Step 2: Add the next job i to the solution set if i can be completed by its deadline. Assign i to time slot [r−1, r], where r is the largest integer such that 1 ≤ r ≤ di and [r−1, r] is free.
    Step 3: Stop if all jobs are examined. Otherwise, go to Step 2.

    Algorithm JS(d, j, n)
    1. // d[i] ≥ 1, 1 ≤ i ≤ n, are the deadlines. The jobs are ordered such that p[1] ≥ p[2] ≥ . . . ≥ p[n]. J[i] is the ith job in the optimal solution, 1 ≤ i ≤ k //
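    The strategy of Steps 1-3 can be sketched in Python; the slot array is an illustrative layout for the schedule, not the exact data structure of Sahni's JS listing.

    ```python
    # Jobs are processed in nonincreasing profit order; each chosen job is
    # placed in the latest free unit slot at or before its deadline.
    def job_sequencing(profits, deadlines):
        n = len(profits)
        order = sorted(range(n), key=lambda i: -profits[i])
        max_d = max(deadlines)
        slot = [None] * max_d              # slot[r] holds the job run in [r, r+1]
        for i in order:
            # latest free slot no later than the deadline of job i
            for r in range(min(deadlines[i], max_d) - 1, -1, -1):
                if slot[r] is None:
                    slot[r] = i
                    break
        total = sum(profits[i] for i in slot if i is not None)
        return slot, total
    ```

    On the illustration above (profits 20, 15, 10, 5, 1 and deadlines 2, 2, 1, 3, 3), this schedules job 2 in [0-1], job 1 in [1-2], job 4 in [2-3], for a total profit of 40.
    
    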


    Analysis

    In the above algorithm we have two parameters: n, the number of jobs, and s, the number of jobs included in the solution J.

    The while loop in line 10 is iterated at most k times, and each iteration takes Θ(1) time. If the condition in line 11 is true, then lines 14 and 15 are executed; these lines take Θ(k − r) time to insert job i. Hence the total time for each iteration of the for loop of line 6 is Θ(k). This loop is iterated n − 1 times. If s is the final value of k, i.e., s is the number of jobs in the final solution, then the total time needed by the algorithm is Θ(sn). Since s ≤ n, the worst case is Θ(n²).

    Correctness: Compare the greedy solution with an optimal solution J, and let a (greedy) and b (optimal) be the first jobs, in schedule order, at which they differ, with profits ga and gb. ga < gb is impossible, since then the greedy algorithm would have chosen b before a. gb < ga is impossible, since otherwise replacing b with a in J would give a feasible and more profitable schedule than the optimal J. Therefore only the case ga = gb remains. This shows the greedy algorithm works correctly.


    Minimum Spanning Trees

    What is a Spanning Tree?

    A tree is a connected undirected graph that contains no cycles.

    A spanning tree of a graph G is a subgraph of G that is a tree and contains all the vertices of G.

    Properties of a Spanning Tree

    A spanning tree of an n-vertex undirected graph has exactly n − 1 edges.

    It connects all the vertices in the graph.

    A spanning tree has no cycles.

    Definition of MST

    Let G = (V, E) be a connected, undirected graph. For each edge (u, v) in E, we have a weight w(u, v) specifying the cost (length) of the edge connecting u and v.

    We wish to find an (acyclic) subset T of E that connects all of the vertices in V and whose total weight is minimized. Since the total weight is minimized, the subset T must be acyclic (no circuit). Thus, T is a tree; we call it a spanning tree.

    The problem of determining the tree T is called the minimum-spanning-tree problem.

    A graph and one of its minimum-cost spanning trees

    GENERAL METHOD

    Growing an MST (Generic Algorithm)

    GENERIC_MST(G, w)
    1 A := {}
    2 while A does not form a spanning tree do
    3     find an edge (u, v) that is safe for A
    4     A := A ∪ {(u, v)}
    5 return A

    Set A is always a subset of some minimum spanning tree. This property is called the invariant property. An edge (u, v) is a safe edge for A if adding the edge to A does not destroy the invariant. A safe edge is just the CORRECT edge to choose to add to T.

    Greedy Algorithms

    Kruskal's algorithm: Start with T = ∅. Consider edges in ascending order of cost. Insert edge e into T unless doing so would create a cycle.

    Prim's algorithm: Start with some root node s and greedily grow a tree T from s outward. At each step, add the cheapest edge e to T that has exactly one endpoint in T.


    The algorithms of Kruskal and Prim

    The two algorithms are elaborations of the generic algorithm. They each use a specific rule to determine a safe edge in line 3 of GENERIC_MST. In Kruskal's algorithm,

    The set A is a forest.

    The safe edge added to A is always a least-weight edge in the graph that connects two distinct components.

    In Prim's algorithm,

    The set A forms a single tree. The safe edge added to A is always a least-weight edge connecting the tree to a vertex not in

    the tree.

    Kruskal's Algorithm

    Basics of Kruskal's Algorithm

    It attempts to add edges to A in increasing order of weight (lightest edge first).

    If the next edge does not induce a cycle among the current set of edges, then it is added to A. If it does, then this edge is passed over, and we consider the next edge in order.

    As this algorithm runs, the edges of A will induce a forest on the vertices, and the trees of this forest are merged together until we have a single tree containing all vertices.

    Detecting a Cycle

    We could perform a DFS on the subgraph induced by the edges of A, but this takes too much time.

    Instead, use the disjoint-set UNION-FIND data structure. This data structure supports 3 operations:

    Make-Set(u): create a set containing u.
    Find-Set(u): find the set that contains u.
    Union(u, v): merge the sets containing u and v.

    Each can be performed in O(lg n) time.

    The vertices of the graph are the elements stored in the sets; each set holds the vertices of one tree of the current forest.

    MST-Kruskal(G, w)

    1. A ← ∅                                 // initially A is empty
    2. for each vertex v ∈ V[G]              // lines 2-3 take O(V) time
    3.     do MAKE_SET(v)                    // make a set for each vertex
    4. sort the edges of E by nondecreasing weight w
    5. for each edge (u, v) ∈ E, in order by nondecreasing weight
    6.     do if FIND_SET(u) ≠ FIND_SET(v)   // u and v are in different trees
    7.         then A ← A ∪ {(u, v)}
    8.             UNION(u, v)
    9. return A

    Total running time is O(|E| log |E|).

    Our implementation uses a disjoint-set data structure to maintain several disjoint sets of elements. Each set contains the vertices in one tree of the current forest. The operation FIND_SET(u) returns a representative element from the set that contains u.
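    A compact Python sketch of MST-Kruskal follows; it uses an array-based union-find with path compression (union by rank is omitted for brevity), and assumes a connected graph given as `(weight, u, v)` triples.

    ```python
    def kruskal(num_vertices, edges):
        parent = list(range(num_vertices))      # MAKE_SET for each vertex

        def find(u):                            # FIND_SET with path compression
            while parent[u] != u:
                parent[u] = parent[parent[u]]
                u = parent[u]
            return u

        A = []
        for w, u, v in sorted(edges):           # nondecreasing weight
            ru, rv = find(u), find(v)
            if ru != rv:                        # u and v are in different trees
                A.append((u, v, w))
                parent[ru] = rv                 # UNION
        return A
    ```

    For example, on a 4-vertex graph with edges 0-1 (weight 1), 1-2 (weight 2), 0-2 (weight 3), and 2-3 (weight 4), the edge 0-2 is skipped because 0 and 2 are already in the same tree, and the MST weighs 7.
    
    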


    Thus, we can determine whether two vertices u and v belong to the same tree by testing whether

    FIND_SET(u) equals FIND_SET(v).

    The combining of trees is accomplished by the UNION procedure. Running time O(|E| lg (|E|)). (The analysis is not required.)

    EXAMPLE The edges are considered by the algorithm in sorted order by weight.

    The edge under consideration at each step is shown with a bold line edge.

    Analysis of Kruskal

    Lines 1-3 (initialization): O(V)

    Line 4 (sorting): O(E lg E)

    Lines 6-8 (set-operation): O(E log E)

    Total: O(E log E)

    Correctness

    Consider the edge (u, v) that the algorithm seeks to add next, and suppose that this edge does not induce a cycle in A. Let A' denote the tree of the forest A that contains vertex u. Consider the cut (A', V − A'). Every edge crossing the cut is not in A, and so this cut respects A, and (u, v) is the light edge across the cut (because any lighter edge would have been considered earlier by the algorithm). Thus, by the MST Lemma, (u, v) is safe.

    Example with disjoint set data structure


    Prim's Algorithm

    Prim's algorithm constructs the minimum-cost spanning tree by selecting edges one at a time, like Kruskal's.

    The greedy criterion: from the remaining edges, select a least-cost edge whose addition to the set of selected edges forms a tree.

    Consequently, at each stage the set of selected edges forms a tree.

    Prim's algorithm

    Step 1: Pick any x ∈ V. Let A = {x}, B = V − {x}.
    Step 2: Select (u, v) ∈ E with u ∈ A, v ∈ B, such that (u, v) has the smallest weight between A and B.
    Step 3: Put (u, v) in the tree. A = A ∪ {v}, B = B − {v}.
    Step 4: If B = ∅, stop; otherwise, go to Step 2.

    Time complexity: O(n²), n = |V|.
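    The four steps can be sketched in Python on an adjacency matrix, matching the O(n²) bound; the matrix `W` and the use of `None` for a missing edge are assumptions of this sketch.

    ```python
    # W[u][v] is the weight of edge (u, v), or None if there is no edge.
    def prim(W, root=0):
        n = len(W)
        in_tree = [False] * n
        in_tree[root] = True                    # Step 1: A = {root}
        tree_edges = []
        for _ in range(n - 1):
            best = None
            for u in range(n):                  # Step 2: smallest edge between
                if not in_tree[u]:              # A (in_tree) and B (the rest)
                    continue
                for v in range(n):
                    if in_tree[v] or W[u][v] is None:
                        continue
                    if best is None or W[u][v] < best[2]:
                        best = (u, v, W[u][v])
            u, v, w = best
            in_tree[v] = True                   # Step 3: move v from B to A
            tree_edges.append((u, v, w))
        return tree_edges                       # Step 4: stop when B is empty
    ```

    On the same 4-vertex example used for Kruskal (edges 0-1 weight 1, 1-2 weight 2, 0-2 weight 3, 2-3 weight 4), growing from vertex 0 picks the edges 0-1, 1-2, 2-3, again for total weight 7.
    
    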

    WRITE DOWN THE ALGORITHM OF PRIM'S FROM SAHNI AND ITS ANALYSIS

    EXAMPLE
