Laboratoire de l’Informatique du Parallélisme
École Normale Supérieure de Lyon
Unité Mixte de Recherche CNRS-INRIA-ENS LYON-UCBL no 5668

The General Broadcast Scheduling Problem with uniform length and overlapping message sets

Master Thesis of Sandeep Dey
Advisor: Nicolas Schabanel
4th July 2005

École Normale Supérieure de Lyon
46 Allée d’Italie, 69364 Lyon Cedex 07, France
Telephone: +33(0)4.72.72.80.37
Fax: +33(0)4.72.72.80.80
E-mail: [email protected]




The General Broadcast Scheduling Problem with uniform length and overlapping message sets

    Master Thesis of

    Sandeep Dey

    Advisor: Nicolas Schabanel

    4th July 2005

    Abstract

The Broadcast Scheduling Problem consists of finding an infinite schedule that broadcasts news items so as to minimize the average service time for clients requesting subsets of the news items. The demand for this type of broadcasting service is currently very high, since it allows for efficient dissemination of data to a large number of passive clients, such as in satellite, radio and cable TV networks. Previous work in this area concentrated on scheduling news items when a client requires only one news item. Here we address a generalization of the problem and consider the situation where a client may request any subset of items. We show that the problem is NP-Hard and give two new approximation algorithms: a randomized algorithm with an approximation factor of 2Hn and a deterministic 4-factor algorithm. To complete the analysis, we introduce a new lower bound.

Keywords: Data broadcast, Approximation algorithms, Randomized algorithms, Lagrangian relaxation, Perfectly periodic schedules

1 Introduction

With the rapid growth of the Internet and its user base, and with the availability of high-bandwidth links to almost all places, networks have changed the way data is delivered and distributed between computers. With these technological improvements, the cost of transferring data has been greatly reduced, both monetarily and in terms of bandwidth, and many new applications such as distributing stock exchange data, traffic flow information and audio-video broadcasts have appeared. These systems have thousands of clients receiving data from a central server, e.g. ADSL TV or radio, so huge amounts of data have to be disseminated in real time.

However, these advances in communication, together with the increasing size of the networks, are testing the limits of many assumptions that were initially made when designing distributed systems. The principles and designs of the distributed systems studied here need to be revisited to keep them up to date with these technological advances.

An important change in network technologies has been the advent of systems in which servers must be capable of delivering large amounts of information to a huge number of users, especially during popular events such as the Olympic Games. As a result, new delivery technologies like satellite communication and cable networks have been devised to provide shared broadband Internet access.

Unlike traditional networks, these new technologies have one distinguishing feature: they support broadcasting much more naturally than any system before. In contrast to unicast, where an object of interest to many clients had to be transferred to each of them individually, broadcast sends the object only once, thus making very efficient use of the shared bandwidth.

With the advent of wireless networks as one of the major developing areas in computer networking today, broadcasting takes on a new meaning. There are cases where the bandwidth from the source to the client is much larger than in the reverse direction, e.g. when handheld wireless devices download traffic flow information from a centralized wireless server. In these cases the client device does not even need an emitter if the download is done through a broadcast.

Broadcast systems are push-based systems, in which the client does not request data but simply connects to the broadcast channel shared by all the clients, as with radio. The server "pushes" the data to the clients according to a schedule which does not depend on the incoming requests. These schedules are generally designed from user profiles which reflect the popularity of the messages relative to each other.

Data broadcasting protocols have a number of research and commercial applications. The Boston Community Information System (BCIS, 1982) was one of the first applications of these protocols, delivering news and other information to handheld radios. Teletext and videotex systems [1, 9] also use these protocols. The ATIS (Advanced Traveler Information System) [17] uses them to provide traffic flow data to vehicles specially equipped with computers, and Internet news delivery systems make use of them as well.

All previous work deals with the setting in which the demands of two clients are either exactly the same or completely disjoint. The idea we are driving at is a customized news service, where every client can choose any subset from a set of news items. Such systems did not exist before: information was not segmented, so people who wanted the traffic flow of a particular street would get all the information in one broadcast.

We consider the example of the broadcast newspaper, an Internet-based news service. The central server broadcasts a set of news items. Different clients may want different news items: one may want to download the sports and the entertainment news, while another may want the sports and the national news. The question posed is how often one should broadcast each news item. We will see that the general ideas of the past do not work here, and that new broadcast scheduling algorithms are needed. It is with this idea that we undertook a study of the theoretical basis of these practical problems and were able to provide efficient algorithms.

    1.1 State of Research

Research on data broadcast problems in the setting where all messages have the same length and the broadcast is done on a single channel with discrete time started in the early 1980's [9, 1, 2, 4, 11]. Ammar and Wong [1, 2] analysed periodic schedules, gave an algebraic expression for the cost (defined as the average response time to the user) and a lower bound, and proved the existence of an optimal periodic schedule. Bar-Noy, Bhatia, Naor and Schieber [4] proved that the problem with broadcast costs is NP-Hard and gave a constant factor algorithm. Kenyon, Schabanel and Young [11] designed a PTAS for the problem. In his thesis [15], Schabanel proposed several constant factor approximation algorithms for non-uniform lengths, preemption, and both together.

Our work also draws strongly from another topic, perfectly periodic schedules. Hameed and Vaidya [19, 18] propose weighted fair queuing to schedule broadcasts. Khanna and Zhou [12] show how to use indexing with periodic scheduling to minimize busy waiting; they also give an approximation algorithm for designing periodic schedules. Bar-Noy et al. [5] introduce the tree schedule design and the notion of perfect periodicity.


1.2 Our Contribution

In this report, we address the broadcast scheduling problem in which clients are free to choose any subset of messages. Schabanel [15, 16] worked on the special case of the same problem where no two subsets overlap. As mentioned earlier, research has been done on broadcast scheduling settings where each message has its own demand probability, and with preemption (meaning sets of messages are requested but no two sets overlap), but no algorithm had previously been proposed for the present problem.

We propose two separate algorithms for the aforementioned problem. The first one is a randomized approximation algorithm with an approximation factor of 2Hn, Hn being the harmonic number. The second is a deterministic approximation algorithm with an approximation factor of 4. Both algorithms are simple and offer an intuitive viewpoint on the problem at hand.

    1.3 Organisation of the report

The outline of the rest of the report is as follows:
– The next section deals with the notations and the preliminaries involved with the problem. First we introduce the model on which we define the problem and then the notations employed in the proofs and the algorithms. We then prove the NP-Hardness of the problem, which implies that no polynomial time algorithm exists unless P=NP. Moreover, we look into the relation between periodic and non-periodic schedules.
– The third section contains our original contribution. In this section we propose the randomized and the deterministic approximation algorithms developed during the course of the internship. We also discuss the importance of the lower bound we obtain and its derivation.
– The last section consists of the summary, the conclusions, and some further problems along with suggestions for further research on them.

    2 Notation and Preliminaries

In this section, we introduce the model on which we base the problem. Once the model is established, we go on to present the broadcasting problem on this model, along with a couple of notations which will facilitate the presentation of the proofs later on.


2.1 Model and Notation

Consider a news station. This station broadcasts news of all kinds, but there is always a limit on the number of news items it will broadcast on a given day. Clients receive the news simply by switching on their radio sets. Since clients may want to listen to the news at any time of the day, the station keeps broadcasting the news items all day long, not necessarily in a periodic manner. A client may want news of a particular kind, e.g. international, national, sports, entertainment, business, etc. Any news item can belong to one or many of these categories. A client, after switching on his set, waits until a news item of the category he is interested in starts broadcasting; he keeps listening until all the news items of this category have been broadcast and then switches the set off. The waiting time for that client is the total amount of time his radio set was switched on. At any time many clients are accessing the news broadcast, but for simplicity we make the following assumptions.

– News items are organized in possibly overlapping categories.
– A client waits for only one news category. (Strictly speaking this is not a restrictive assumption: if a client wants news from two categories, say sports and entertainment, we can create another category containing both the sports and the entertainment news.)
– The number of clients switching on their radio sets at any time is uniform over the whole day.
– The probability that a client waits for a particular category of news is the same at all times of the day.
– The broadcast duration is the same for all news items, and news items are broadcast one after another without any delay. We also assume that the broadcast time of any news item is one unit of time; if a news item is longer than a unit, we can break it into unit-size pieces and return to our basic assumptions.

A schedule is simply a way of broadcasting the news items one after another. The waiting time for a message at time t is the length of time after which it is first seen after time t. The average waiting time for a news category is the average of the waiting times of the clients waiting for that category who switch on their sets over the whole day. The cost of a schedule is the average of the waiting times of all clients over the whole day. Our goal is to minimize this quantity.

Now that the description of the problem has been given, we describe our model formally. From now on we call the news items messages.

A set of n messages (news items) M = {M_1, M_2, . . . , M_n} is given, together with a set ζ = {S_1, S_2, . . . , S_k} of subsets (news categories) of M, where S_i ⊆ M. S will denote a generic element of ζ. A set of probabilities (p_i)_{i=1,2,...,k} is also given: p_i is the probability that a random client requests the messages of S_i. We write p_S for the probability associated with the set S. It is therefore evident that

\sum_{i=1}^{k} p_i = 1     (1)

[Fig. 1 – an example schedule starting at t = 0 (M_1 M_3 M_4 M_2 M_7 M_6 . . .), illustrating the waiting times WT(M_6) and WT(S) for S = {M_6, M_2, M_1}]

We define a schedule Sch as an infinite sequence Sch = s_0 s_1 . . . where s_j ∈ {1, 2, . . . , n} for all j. If s_j = i, we say that message i is scheduled to be broadcast in slot j. A schedule is periodic if it is an infinite concatenation of a finite sequence.

Consider the length of time after which message M_i appears for the first time after time t; call it WT(M_i, t), the waiting time for message M_i at time t. The waiting time for all the messages of S at time t is

WT(S, t) = \max_{M_i \in S} WT(M_i, t)     (2)

Hence the expected waiting time (cost) for a request arriving at time t is

Cost(Sch, t) = \sum_{S \in \zeta} p_S \times \max_{M_i \in S} WT(M_i, t)     (3)

So the average cost of a schedule comes out to be:

Cost(Sch) = \limsup_{T \to \infty} \frac{1}{T} \sum_{t=1}^{T} Cost(Sch, t)     (4)

Cost(Sch) = \limsup_{T \to \infty} \frac{1}{T} \sum_{t=1}^{T} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} WT(M_i, t)     (5)

We use E(WT(M_i, t)) to denote the expectation of WT(M_i, t) when dealing with randomized algorithms. Notice that we use Cost both as a function of the schedule at time t and as an average; the intended meaning can be read from the arguments, so there is no ambiguity in the analysis.

The problem is defined as follows:


Definition 2.1 (General Broadcasting Problem with uniform length). Given a set of messages M = (M_i)_{i=1,2,...,n}, a set of subsets (S_i)_{i=1,2,...,k} such that S_i ⊆ M, and a set of probabilities (p_i)_{i=1,2,...,k}, the aim is to find a schedule Sch such that

Cost(Sch) ≤ Cost(Sch')     (6)

where Sch' ranges over the set of all schedules.

We additionally define f_i, the frequency associated with message M_i in a periodic schedule, as the number of times the message is broadcast divided by the total number of slots of the period. Let τ_i denote the inverse of f_i:

τ_i = \frac{1}{f_i}     (7)

so that τ_i is the "period" of message M_i, i.e. the average number of slots between two successive broadcasts of M_i.
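To make these definitions concrete, here is a minimal Python sketch (an illustration, not from the thesis) that evaluates the cost of a periodic schedule directly from equations (2)-(5), under the convention that a request arriving in slot t is served by the first broadcast strictly after t; the instance at the bottom is hypothetical.

def waiting_time(period, msg, t):
    # WT(M_i, t): number of slots until `msg` next appears strictly after
    # slot t in the infinite repetition of the given period.
    T = len(period)
    for d in range(1, T + 1):
        if period[(t + d) % T] == msg:
            return d
    raise ValueError("message %r never appears in the period" % (msg,))

def cost(period, sets, probs):
    # Equation (5) restricted to one period:
    # (1/T) * sum_t sum_S p_S * max_{M_i in S} WT(M_i, t).
    T = len(period)
    total = 0.0
    for t in range(T):
        for S, p in zip(sets, probs):
            total += p * max(waiting_time(period, m, t) for m in S)
    return total / T

# Hypothetical instance: three messages and two overlapping sets.
print(cost([1, 2, 1, 3], sets=[{1, 2}, {2, 3}], probs=[0.6, 0.4]))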

2.2 NP-Hardness of the problem

We show that the above-mentioned problem is NP-Hard, which implies that no polynomial time algorithm is available (unless P=NP); this also implies that our best hopes lie with approximation and randomized algorithms.

    Theorem 2.1. The General Broadcasting Problem with uniform length isNP-Hard.

Proof. Schabanel [15, 16] proved that the preemptive special case of this problem, in which no two sets overlap (S_i ∩ S_j = ∅ for all i ≠ j, where S_i corresponds to a packet of messages), is NP-Hard. Since it is a special case of the General Broadcast Problem with uniform length, the latter is NP-Hard as well.

2.3 Optimal periodic schedules are arbitrarily close to the optimum

We prove that for any ε > 0 there exists a periodic schedule whose cost is at most the optimum cost plus ε.

Theorem 2.2. Let Sch be any schedule for a general broadcasting problem. Then for all ε > 0 there exists a periodic schedule PerSch_ε such that

Cost(PerSch_ε) ≤ Cost(Sch) + ε     (8)


[Fig. 2 – New Periodic Schedule Constructed: one period consists of the previous schedule Sch up to slot T_ε (the sequence Seq), followed by all messages from 1 to n, and is then repeated.]

Proof. The basic idea is to take a prefix of Sch whose cost is close enough to the optimum and, after a small addition, use it as the cycle of a periodic schedule. Avg Cost denotes the average cost up to time T. By definition,

Avg Cost(Sch, T) = \frac{1}{T} \sum_{t=1}^{T} Cost(Sch, t)     (9)

and

Cost(Sch) = \limsup_{T \to \infty} Avg Cost(Sch, T)     (10)

Hence for every ε > 0 there exists T' such that for all T > T',

Avg Cost(Sch, T) ≤ Cost(Sch) + \frac{ε}{2}     (11)

We then choose T_ε > T' such that

\frac{2}{ε}(n + Cost(Sch, 0)) \times n − n ≤ T_ε     (12)

so that

\frac{(n + Cost(Sch, 0)) \times n}{T_ε + n} ≤ \frac{ε}{2}     (13)

We take the sequence Seq of messages of Sch from the beginning up to slot T_ε, and append to it a short sequence Seq' of length n consisting of all the messages M_1, M_2, . . . , M_n, in the order of their first appearance in Sch after time T_ε. We build a new periodic schedule PerSch_ε with period T_ε + n by repeating this sequence. The costs of the slots of Seq remain the same or decrease, since the added sequence Seq' provides every message no later than Sch does:

For t ≤ T_ε,   Cost(PerSch_ε, t) ≤ Cost(Sch, t)     (14)


For T_ε < t ≤ T_ε + n,

Cost(PerSch_ε, t) ≤ Cost(Sch, 0) + n     (15)

since for those slots the waiting time is at most n plus the waiting time of the first slot of the (periodic) schedule. Since PerSch_ε is periodic, its cost equals the average cost of its first T_ε + n slots. The contribution of the first T_ε slots is Avg Cost(Sch, T_ε) and that of the last n slots is at most Cost(Sch, 0) + n, so

Cost(PerSch_ε) ≤ \frac{Avg Cost(Sch, T_ε) \times T_ε + (Cost(Sch, 0) + n) \times n}{T_ε + n}     (16)

Cost(PerSch_ε) ≤ \frac{Avg Cost(Sch, T_ε) \times T_ε}{T_ε + n} + \frac{(Cost(Sch, 0) + n) \times n}{T_ε + n}     (17)

Cost(PerSch_ε) ≤ Cost(Sch) + \frac{ε}{2} + \frac{ε}{2}     (18)

Cost(PerSch_ε) ≤ Cost(Sch) + ε     (19)

Corollary 2.1. OPT over periodic schedules = OPT over all schedules.

This proves that there exists a periodic schedule whose cost is arbitrarily close to the cost of the optimum schedule.
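The construction used in the proof is itself a small algorithm. The sketch below is illustrative only (the schedules in the example are hypothetical): it builds one period of PerSch_ε from a prefix of Sch and a window of its continuation.

def periodic_from_prefix(prefix, continuation, messages):
    # One period of PerSch_eps: the prefix Seq of Sch up to slot T_eps,
    # followed by every message once, ordered by first appearance in the
    # continuation of Sch (messages never seen again are appended last).
    order = []
    for m in continuation:
        if m not in order:
            order.append(m)
    tail = order + [m for m in messages if m not in order]
    return list(prefix) + tail

# Hypothetical prefix of Sch up to slot T_eps, and a window of Sch after it:
print(periodic_from_prefix([1, 2, 1, 3, 2, 1], [3, 1, 2, 2, 3, 1], [1, 2, 3]))
# -> [1, 2, 1, 3, 2, 1, 3, 1, 2], one period of length T_eps + n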

    3 Our Contribution

Our contribution to this problem lies in giving efficient algorithms for the general case of the broadcast scheduling problem, in which the consumer gets to choose which news items he wants, including multiple choices from a predefined set of genres.

This problem is NP-Hard, which implies that a polynomial time algorithm finding the optimum schedule cannot exist unless P = NP. For NP-Hard or NP-Complete optimization problems, the best known algorithms tend to be approximation or randomized algorithms. Approximation algorithms are algorithms for optimization problems whose output is guaranteed to be within a given factor of the optimum solution; this factor, termed the approximation factor, can be anything from a constant to a polynomial in the size of the problem. Randomized algorithms, on the other hand, are algorithms which use a string of random bits.

We propose two new algorithms for the aforementioned problem: a deterministic approximation algorithm with a constant approximation factor of 4, and a randomized approximation algorithm with an approximation factor of 2 × H_n, where

H_n = 1 + \frac{1}{2} + \dots + \frac{1}{n}     (20)


Consider a minimization problem and an approximation algorithm. To show that the output of the algorithm is within a given factor α of the optimum solution, we need to define a lower bound and show that

    1. the lower bound is less than the cost of the optimum solution.

2. the output solution has a cost at most α times the lower bound.

Even to judge the effectiveness of a randomized algorithm, a lower bound is needed; the difference from the above is that, instead of comparing the output of the algorithm to the lower bound, we compare the expected cost of its output to the lower bound.

The rest of this chapter is organised as follows. The first section shows the importance of the lower bound and presents the lower bound we chose to analyse our algorithms. In the second section we present our randomized algorithm and analyze it with the help of our lower bound. The third section brings us to the construction of a constant factor approximation algorithm.

    3.1 Bounding the optimum value

The use of an appropriate lower bound is central to the analysis of an approximation or a randomized algorithm for an optimization problem. We explain the relation between the lower bound and the approximation factor in a little more detail.

Given a minimization problem P characterized by
– D, a set of input instances,
– S(I), the set of all feasible solutions for an instance I ∈ D,
– f, a function assigning a value to every solution, f : S(I) → R,
we have to find an optimum solution OPT(I), i.e. a feasible solution minimizing f.

If we devise an approximation algorithm A for the aforementioned problem with approximation factor α, then we must make sure that

\frac{f(A(I))}{f(OPT(I))} < α     (21)

where A(I) is the solution output by the algorithm. Since we do not know the value of the optimum, we bound it from below by a lower bound L.

Suppose we have two lower bounds L_1 and L_2 with L_1/L_2 = β, where β is not a constant but an increasing function of n, the size of the input. If the approximation algorithm A is analysed with the help of the lower bound L_2, the approximation factor comes out to be α, while if it is analysed with the help of L_1, the approximation factor is α/β, which is definitely better: for instance, a constant factor algorithm may wrongly be analysed as an O(n)-factor algorithm. So the lower bound is quite critical in the analysis of an approximation algorithm.

Coming back to the central problem, previous research on this topic has yielded a number of approximation algorithms. But since none of the previous work was concerned with overlapping message sets, the previously used lower bounds are not appropriate for any analysis here.

We use the notation developed in Section 2.1. The waiting time for any single message M_i is τ_i on average. Without loss of generality assume S = {M_1, M_2, . . . , M_q}. At any point of time, the waiting time for the set of messages S is at least the waiting time for any message M_i, i ∈ {1, 2, . . . , q}:

WT(S, t) ≥ WT(M_i, t)     (22)

WT(S, t) ≥ \max_{M_i \in S} WT(M_i, t)     (23)

Here WT(S, t) denotes the waiting time at time t for the set of messages S. The cost of a schedule Sch at time t is the weighted sum of the waiting times of the individual message packets S:

Cost(Sch, t) = \sum_{S \in \zeta} p_S \times WT(S, t)     (24)

Hence,

Cost(Sch, t) ≥ \sum_{S \in \zeta} p_S \times \max_{M_i \in S} WT(M_i, t)     (25)

where ζ is the set of all message subsets. Using Theorem 2.2, we can without loss of generality restrict ourselves to lower bounding the cost of periodic schedules in order to lower bound the optimum cost. Let Sch be a schedule with period T. Since the cost is the average over a period,

Cost(Sch) = \frac{1}{T} \sum_{t=1}^{T} Cost(Sch, t)     (26)

Cost(Sch) = \frac{1}{T} \sum_{t=1}^{T} \sum_{S \in \zeta} p_S \times WT(S, t)     (27)

Cost(Sch) = \frac{1}{T} \sum_{S \in \zeta} p_S \times \sum_{t=1}^{T} WT(S, t)     (28)

Cost(Sch) = \sum_{S \in \zeta} p_S \times \frac{1}{T} \sum_{t=1}^{T} WT(S, t)     (29)


Let message M_i be broadcast N(M_i) times during a period, and let the time slots in which M_i is broadcast be T_{i,j}, where i ∈ {1, 2, . . . , n} and j ∈ {1, 2, . . . , N(M_i)}.

Hence, for all M_i ∈ S,

\frac{1}{T} \sum_{t=1}^{T} WT(S, t) ≥ \frac{T_{i,1}^2 + (T_{i,2} − T_{i,1})^2 + \dots + (T_{i,N(M_i)} − T_{i,N(M_i)−1})^2}{2T}     (30)

Classically, this quantity is minimized when the broadcasts of M_i are evenly spaced in the schedule, i.e. when every gap equals

τ_i = \frac{T}{N(M_i)}     (31)

Since the frequency is f_i = 1/τ_i (the fraction of slots occupied by message M_i in the schedule), we get

\frac{1}{T} \sum_{t=1}^{T} WT(S, t) ≥ \frac{1}{2} \times \frac{1}{f_i} \quad \forall i     (32)

\frac{1}{T} \sum_{t=1}^{T} WT(S, t) ≥ \frac{1}{2} \times τ_i \quad \forall i     (33)

\frac{1}{T} \sum_{t=1}^{T} WT(S, t) ≥ \frac{1}{2} \times \max_{M_i \in S} τ_i     (34)

Cost(Sch) ≥ \frac{1}{2} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ_i     (35)

Hence we define the lower bound as follows:

L = \min_{τ_1, τ_2, \dots, τ_n ≥ 0} \frac{1}{2} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ_i \quad \text{such that} \quad \frac{1}{τ_1} + \frac{1}{τ_2} + \dots + \frac{1}{τ_n} ≤ 1

As shown above, L is at most the cost of the optimum schedule.

We now show, with the help of an example, that relaxing the bound further by replacing the max with the average (so that the lower bound can be computed easily by Lagrangian relaxation rather than by the ellipsoid algorithm) would yield a lower bound L' that is arbitrarily bad compared to the original L. As argued above, the analysis of the algorithms would then not be tight and the approximation factor obtained would be overly pessimistic.


Let us define the relaxed lower bound as

L' = \min_{τ_1, τ_2, \dots, τ_n ≥ 0} \frac{1}{2} \sum_{S \in \zeta} p_S \times \frac{\sum_{M_i \in S} τ_i}{|S|} \quad \text{such that} \quad \frac{1}{τ_1} + \frac{1}{τ_2} + \dots + \frac{1}{τ_n} ≤ 1     (37)

Example: let the message set be M = {A_1, . . . , A_n, B_1, . . . , B_k} and the set of message sets be ζ = {{A_1}, {A_2}, . . . , {A_n}, {A_1, . . . , A_n, B_1}, {A_1, . . . , A_n, B_2}, . . . , {A_1, . . . , A_n, B_k}}. So there are two types of sets: the singletons S_i = {A_i}, each with probability p = 1/(2n), and the sets S'_i = {A_1, . . . , A_n, B_i}, each with probability p = 1/(2k).

Since all the A_i play symmetric roles (and likewise all the B_i), we claim that in an optimum of the lower bound L' the periods are symmetrically distributed, i.e. all the A_i share a common period τ_1 and all the B_i share a common period τ_2. Suppose this were not so, i.e. two symmetric messages have different periods at the optimum. The objective is a convex function of the periods τ: exchanging the two values yields another solution with the same cost, and by convexity any convex combination of these two solutions has a cost less than or equal to the original one. Hence there is a symmetric optimum, and we may assume the symmetric form above.

With this symmetry, the lower bound L' becomes

L' = \min_{τ_1, τ_2 ≥ 0} \frac{1}{2} τ_1 + \frac{1}{2} \times \frac{τ_2 + n τ_1}{n + 1} \quad \text{such that} \quad \frac{n}{τ_1} + \frac{k}{τ_2} = 1

Under this constraint, we find the minimum of the function by partial differentiation, using a Lagrangian relaxation: we introduce an additional variable λ and redefine the objective function (the function to be minimized) as

f(τ_1, τ_2, λ) = \frac{1}{2} τ_1 + \frac{1}{2} \times \frac{τ_2 + n τ_1}{n + 1} − λ \left(1 − \frac{n}{τ_1} − \frac{k}{τ_2}\right)     (38)

Since the partial derivatives of f with respect to all the variables vanish at the minimum, we get:

\frac{∂f}{∂τ_1} = 0 \;⇒\; τ_1 = \sqrt{\frac{2(n+1)n}{2n+1} \times λ}

\frac{∂f}{∂τ_2} = 0 \;⇒\; τ_2 = \sqrt{2(n+1) k \times λ}

\frac{∂f}{∂λ} = 0 \;⇒\; \frac{n}{τ_1} + \frac{k}{τ_2} = 1


Substituting the first two results into the third, we get

λ = \left(\frac{\sqrt{k} + \sqrt{n(2n+1)}}{\sqrt{2(n+1)}}\right)^2

Since we are interested in asymptotic values as n and k go to infinity, we omit negligible terms and focus on the asymptotic behaviour of the parameters.

Solving the above equations, the relaxed lower bound comes out to be:

L' = \left(\sqrt{n} + \frac{\sqrt{k}}{\sqrt{n}}\right) \times \left(\frac{\sqrt{k}}{\sqrt{n}} + \sqrt{n}\right)     (39)

L' = n + \frac{k}{n} + \sqrt{k}     (40)

Consider now the newly constructed lower bound L:

L = \min_{τ_1, τ_2 ≥ 0} \frac{1}{2} τ_1 + \frac{1}{2} \max(τ_1, τ_2) \quad \text{such that} \quad \frac{n}{τ_1} + \frac{k}{τ_2} = 1

To find the minimum of this function, we have to consider three separate cases:

– If at the minimum τ_1 = τ_2: the constraint gives τ_1 = τ_2 = n + k, and the value is n + k.

– If at the minimum τ_1 > τ_2: the value is τ_1. Since τ_1 > τ_2 and n/τ_1 + k/τ_2 = 1, we get τ_1 > n + k, so the value is greater than n + k.

– If at the minimum τ_1 < τ_2: the value is (τ_1 + τ_2)/2 and, by the same argument, τ_2 > n + k. We use the same Lagrangian relaxation as before, with

f(τ_1, τ_2, λ) = \frac{τ_1 + τ_2}{2} − λ \left(1 − \frac{n}{τ_1} − \frac{k}{τ_2}\right)


\frac{∂f}{∂τ_1} = 0 \;⇒\; τ_1 = \sqrt{2 n λ}

\frac{∂f}{∂τ_2} = 0 \;⇒\; τ_2 = \sqrt{2 k λ}

\frac{∂f}{∂λ} = 0 \;⇒\; \frac{n}{τ_1} + \frac{k}{τ_2} = 1

Using the first two equations to express τ_1 and τ_2 in terms of λ, and substituting these relations into the third one, we obtain

τ_1 = \sqrt{n}(\sqrt{n} + \sqrt{k})

τ_2 = \sqrt{k}(\sqrt{n} + \sqrt{k})

and the value of this case is \frac{(\sqrt{k} + \sqrt{n})^2}{2}.

Hence the lower bound comes out to be L = \frac{(\sqrt{k} + \sqrt{n})^2}{2} if k > n, and L = k + n otherwise.
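As a quick check, the case-3 solution satisfies the constraint and yields the stated value:

\frac{n}{τ_1} + \frac{k}{τ_2} = \frac{n}{\sqrt{n}(\sqrt{n}+\sqrt{k})} + \frac{k}{\sqrt{k}(\sqrt{n}+\sqrt{k})} = \frac{\sqrt{n}+\sqrt{k}}{\sqrt{n}+\sqrt{k}} = 1, \qquad \frac{τ_1+τ_2}{2} = \frac{(\sqrt{k}+\sqrt{n})^2}{2}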

So if n < k < n², then L/L' = Ω(\sqrt{k/n}). Consider the analysis of a constant factor approximation algorithm A on the above example. With the new lower bound L,

\frac{Cost(A)}{L} = c \quad \text{(a constant)}     (41)

while

\frac{Cost(A)}{L'} ≥ c \times \sqrt{k/n}     (42)

So if analysed with the lower bound L', the constant factor algorithm would appear to be a \sqrt{k/n}-factor algorithm.

We have thus shown that a weak lower bound results in a poor analysis of approximation algorithms. This is why we propose the lower bound based on the objective function with the max:

L = \min_{τ_1, τ_2, \dots, τ_n} \frac{1}{2} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ_{M_i} \quad \text{where} \quad \frac{1}{τ_1} + \frac{1}{τ_2} + \dots + \frac{1}{τ_n} ≤ 1

The only remaining question is how to find this minimum in polynomial time. We use the ellipsoid method [13, 10, 14, 8] to solve the above program; the method applies since the program is convex.
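The thesis relies on the ellipsoid method for this step. Purely as an illustration, the following Python sketch evaluates the same convex program with a generic off-the-shelf solver (SLSQP from scipy) on a small hypothetical instance; the bound τ_i ≥ 1, which follows from the constraint, is added only to keep the solver away from a division by zero.

import numpy as np
from scipy.optimize import minimize

def lower_bound_L(sets, probs, n):
    # L = min (1/2) * sum_S p_S * max_{i in S} tau_i
    #     subject to sum_i 1/tau_i <= 1 and tau_i >= 1.
    def objective(tau):
        return 0.5 * sum(p * max(tau[i] for i in S) for S, p in zip(sets, probs))
    constraint = {"type": "ineq", "fun": lambda tau: 1.0 - np.sum(1.0 / tau)}
    x0 = np.full(n, float(n))                       # uniform starting point tau_i = n
    res = minimize(objective, x0, bounds=[(1.0, None)] * n,
                   constraints=[constraint], method="SLSQP")
    return res.fun, res.x

# Hypothetical instance: messages 0, 1, 2 and overlapping sets {0,1}, {1,2}.
value, tau = lower_bound_L([{0, 1}, {1, 2}], [0.6, 0.4], n=3)
print(value, tau)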

    3.2 Randomized Algorithm

    3.2.1 Introduction

Randomized algorithms are a common approach to optimization problems. They are usually not very complex in their construction, yet their solutions are generally sufficiently close to the optimum, although the analysis required to establish the efficiency of the algorithm can be somewhat heavy.

Most earlier work on this topic has produced randomized algorithms for specific subproblems of the general broadcasting problem. The most common approach for obtaining a randomized algorithm is the following.

– A lower bound on the cost of a schedule is first derived from a suitable objective function: we minimize the objective function, and its minimum is a lower bound on the cost. Intuitively, since the lower bound and the optimum are supposed to be close, we reuse the frequencies obtained while minimizing the objective function. Let the objective function be Obj(τ_1, τ_2, . . . , τ_n) and let Cost(Sch) be the cost of a schedule in which message M_i has frequency 1/τ_i. Then

Obj(τ_1, τ_2, . . . , τ_n) < Cost(Sch)     (43)

\inf_{τ_1, \dots, τ_n} Obj(τ_1, τ_2, . . . , τ_n) < \inf_{\text{all schedules}} Cost(Sch)     (44)

\inf_{τ_1, \dots, τ_n} Obj(τ_1, τ_2, . . . , τ_n) < OPT     (45)

So the minimum of Obj(τ_1, τ_2, . . . , τ_n) gives a lower bound on the optimum cost of a schedule.
– Next, we use the frequencies thus obtained and devise a randomized algorithm in which the probability of a message being broadcast in each slot equals its frequency.

    3.2.2 The Algorithm

Algorithm 1 Randomized approximation algorithm for the broadcast scheduling problem
Input:
– n messages M_1, M_2, . . . , M_n; k sets S_1, S_2, . . . , S_k of messages and demand probabilities (p_i)_{i=1...k} associated with each of the message sets S_i.
– A distribution of frequencies (f_i)_{i=1...n} such that f_1 + f_2 + . . . + f_n = 1.
while t > 0 do
  Pick i ∈ {1, 2, . . . , n} with probability f_i
  Broadcast M_i
end while

The frequencies f_i mentioned in the above algorithm are obtained from the objective function with the help of the ellipsoid method, as discussed in the previous section.
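The following Python sketch is a direct, illustrative rendering of Algorithm 1; the frequency values in the example are hypothetical, in practice one uses the frequencies 1/τ'_i returned by the lower-bound program. Feeding a long prefix of such a schedule to a cost routine like the one sketched in Section 2.1 gives an empirical estimate of the expected cost.

import random

def randomized_schedule(freqs, horizon, seed=0):
    # Algorithm 1: in every slot, independently pick message i with
    # probability f_i and broadcast it (freqs are assumed to sum to 1).
    rng = random.Random(seed)
    messages = list(range(1, len(freqs) + 1))
    return [rng.choices(messages, weights=freqs)[0] for _ in range(horizon)]

# Hypothetical frequencies for three messages:
print(randomized_schedule([0.5, 0.3, 0.2], horizon=20))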


3.2.3 The Analysis

Theorem 3.1. When using the frequency distribution f_i = 1/τ'_i, the randomized algorithm yields schedules with an approximation factor of 2H_n on the expected cost, where H_n is the harmonic number H_n = 1 + 1/2 + . . . + 1/n.

Proof. The input given to us is:
– a set of n messages M = {M_1, M_2, . . . , M_n};
– ζ: k sets S_1, S_2, . . . , S_k of messages, S_i ⊆ M;
– demand probabilities (p_i)_{i=1...k} associated with each of the message sets S_i.
We use the objective function

Obj(τ_1, τ_2, . . . , τ_n) = \frac{1}{2} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ_i

LB = \min Obj(τ_1, τ_2, . . . , τ_n) \quad \text{subject to} \quad \sum_{i=1}^{n} \frac{1}{τ_i} ≤ 1

and find the minimum of this function. Suppose the minimum of the objective function is attained at 1/τ'_1, 1/τ'_2, . . . , 1/τ'_n, and let the lower bound be LB. Working on the intuition that the objective function and the cost function are close, we randomly broadcast each message M_i with probability 1/τ'_i.

Lemma 3.1. For the schedule resulting from the algorithm, the expected waiting time of any set S satisfies

E(WT(S, t)) ≤ H_n \times \max_{M_i \in S} τ'_i

where WT(S, t) is the waiting time for the set S of messages at time t.

Proof. Without loss of generality, consider a set S of J messages, S = {M_1, M_2, . . . , M_J}. We may assume that we start at time t and are waiting for all J messages. In every slot, message M_i is broadcast with probability 1/τ'_i. Observing the J messages is a succession of J distinct events, one for each time a message of S appears for the first time after t. Name these events event_1, . . . , event_J, let t_i be the time of event_i, and let Interval_{i+1} be the time between event_i and event_{i+1}, so that

Interval_1 = t_1 − t     (46)

Interval_{i+1} = t_{i+1} − t_i     (47)

t_J − t = Interval_1 + Interval_2 + . . . + Interval_J     (48)

E(t_J − t) = E(Interval_1) + E(Interval_2) + . . . + E(Interval_J)     (49)


t_J is the time of the last event, the point at which all the messages have been received, so t_J − t is the waiting time at t. Since the messages are broadcast with the same probabilities at all times, the expected waiting time does not depend on t.

At time t_i, i distinct messages of S have already been broadcast after t. For the next event to happen, any of the other J − i messages has to be broadcast. Since every message has broadcast probability at least min_{M_i ∈ S}(1/τ'_i), the probability of getting one of the J − i missing messages in a given slot is at least (J − i) × min_{M_i ∈ S}(1/τ'_i); call it q.

If in every time slot an event can happen with probability q, then the expected time until the event first occurs is 1/q:

Pr(\text{event happens first at time } t) = (1 − q)^{t−1} q

E(\text{first time the event happens}) = \sum_{t=1}^{∞} t (1 − q)^{t−1} q = \frac{1}{q}

Applying this in the present case, we get

Pr(event_{i+1} \text{ in a given slot}) ≥ (J − i) × \min_{M_i \in S} \frac{1}{τ'_i}

E(Interval_{i+1}) ≤ \frac{1}{(J − i) × \min_{M_i \in S}(1/τ'_i)}

E(Interval_{i+1}) ≤ \frac{1}{J − i} × \max_{M_i \in S} τ'_i

E(t_J − t) = E(Interval_1) + E(Interval_2) + · · · + E(Interval_J)

E(t_J − t) ≤ \max_{M_i \in S}(τ'_i) × \left(\frac{1}{J} + \frac{1}{J−1} + \dots + 1\right) = \max_{M_i \in S}(τ'_i) × H_J

E(WT(S, t)) ≤ \max_{M_i \in S}(τ'_i) × H_J


So the expected cost of the schedule comes out to be

Cost(Sch, t) = \sum_{S \in \zeta} p_S × WT(S, t)

E(Cost) = \sum_{S \in \zeta} p_S × E(WT(S, t))

E(Cost) ≤ \sum_{S \in \zeta} p_S × \max_{M_i \in S}(τ'_i) × H_{|S|} ≤ H_n \sum_{S \in \zeta} p_S × \max_{M_i \in S}(τ'_i)

\text{Lower Bound} = \frac{1}{2} \sum_{S \in \zeta} p_S × \max_{M_i \in S}(τ'_i)

\text{Approximation Factor} = \frac{E(Cost)}{\text{Lower Bound}} ≤ 2 H_n

This proves that the approximation factor of the randomized algorithm is 2H_n, where n is the number of messages.

We now show that this bound is tight with the help of some examples. Consider first n messages M_1, M_2, . . . , M_n, a single set S = {M_1, M_2, . . . , M_n} and p_S = 1. By symmetry, all the frequencies should be equal to minimize our lower bound, and we can even guess the optimum schedule: a round-robin schedule of all the messages, M_1 : M_2 : . . . : M_n, whose cost is n. For the random schedule, the probability of any message being broadcast in a slot is 1/n, hence the expected waiting time equals 1 + n/(n−1) + . . . + n/2 + n = n H_n. So the ratio to the optimum comes out to be H_n = 1 + 1/2 + . . . + 1/n.
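A quick Monte Carlo simulation (an illustration, not part of the thesis) reproduces this coupon-collector behaviour of the randomized schedule; the instance below is the one just described, with n = 6.

import random
from statistics import mean

def expected_wt(freqs, S, trials=20000, seed=1):
    # Estimate E(WT(S, t)) for the randomized schedule: draw i.i.d. slots
    # with probabilities f_i and count the slots needed until every
    # message of S has appeared at least once.
    rng = random.Random(seed)
    msgs = list(range(len(freqs)))
    samples = []
    for _ in range(trials):
        missing, t = set(S), 0
        while missing:
            t += 1
            missing.discard(rng.choices(msgs, weights=freqs)[0])
        samples.append(t)
    return mean(samples)

n = 6
print(expected_wt([1.0 / n] * n, S=range(n)))       # empirical, close to n * H_n
print(n * sum(1.0 / i for i in range(1, n + 1)))    # n * H_n = 14.7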

The above example can be extended to one where we have k such subsets, each containing n messages, with probability 1/k each. Schabanel [16] worked on preemptive cases like this and proved that in the optimum schedule the messages of each subset are broadcast one after another, and the subsets themselves one after another. The costs come out to be

\text{Optimum Cost} = \frac{n (k^2 + 1)}{2k}     (50)

\text{Randomized Cost} = n k \times H(k)     (51)

\text{Approximation Factor} = \frac{2 H(k) k^2}{k^2 + 1}     (52)


Hence, taking k = n and letting k → ∞, the approximation factor tends to 2H(n). This demonstrates that the bound is tight, so the randomized algorithm is indeed a 2H_n-factor algorithm.

    3.3 Deterministic Approximation Algorithm

We propose a new 4-approximation algorithm for the general scheduling problem. While older algorithms were essentially derandomizations of randomized algorithms, our approach uses perfectly periodic schedules to construct a periodic schedule while guaranteeing an approximation factor of 4.

3.3.1 Introduction to perfectly periodic schedules

Consider a system with n messages which have to be broadcast over a single bandwidth resource, which they must share by time multiplexing. A schedule for resource allocation is called perfectly periodic if the resource gets allocated to each message i exactly once every β_i time slots.

The main question is the usefulness of perfectly periodic schedules. Since these schedules are mathematically very simple, they have some particularly pleasing consequences:
– They are very simple and easy to analyse.
– Inferring the schedule from the client's viewpoint is very simple, although this point is unrelated to the problem presently under consideration.

Even deciding whether a given set of periods admits a perfectly periodic schedule is NP-Hard, even if Σ_i 1/τ_i ≤ 1 [5]. So it is hopeless to look for a polynomial time algorithm that finds an optimal perfectly periodic schedule. However, tree schedules have been found to be very effective for constructing perfectly periodic schedules. A tree schedule is a schedule represented by a tree, where the leaves correspond to clients, and the period of each client is computed from the depth of its leaf and the product of the degrees of its ancestors. The tree schedule construction of Brakerski et al. [6] can be used very effectively in the present problem to construct a perfectly periodic schedule.

    3.3.2 Construction of a Perfectly periodic Schedule

A schedule S is defined as an infinite sequence S = s_0 s_1 . . . where s_j ∈ {1, 2, . . . , n} denotes one of the n messages. A schedule is perfectly periodic if the slots allocated to each message are equally spaced, i.e. for each message M_i there exist integers β_i ≥ 1 and 0 ≤ o_i < β_i such that i is scheduled in slot j if and only if j ≡ o_i (mod β_i); β_i is the period of message i and o_i its offset.


Algorithm 2 Dispatching algorithm for a schedule tree [6]
Dispatch function
Input: a schedule tree T
Output: a message (identifier)
  v ← root(T)
  while v is not a leaf do
    v ← Token(v)
  end while
  return the message at leaf v

Token function
Input: a non-leaf node u
Output: a node in T
  let e_0, e_1, . . . , e_{d−1} be the d outgoing edges of u
  if e_i holds the token and e_i = (u, v) then
    move the token to e_{(i+1) mod d}
    return v
  end if

A schedule tree is defined as follows: an ordered tree can be interpreted as a perfectly periodic schedule. The messages of the schedule are in bijection with the leaves of the tree. An ordered tree is a rooted tree in which the edges leaving each non-leaf node u are numbered 0, 1, . . . , deg(u) − 1. The period of the root r is β(r) = 1. The period of a non-root node u is computed recursively as

β(u) = β(par(u)) × deg(par(u))

where deg(v) is the degree of node v and par(v) denotes the parent of node v. To compute the offset of a node, we define a function h(u) giving the position of a node among its siblings, equal to the number on the edge from par(u) to u. The offset is then

o(u) = o(par(u)) + h(u) × β(par(u))

The schedule described above is a perfectly periodic schedule in which each leaf u appears with period β(u).

Now comes the question of dispatching the tree schedule: given the schedule tree, how do we construct a schedule from it? We describe the algorithm developed by Brakerski et al. [6]. The idea is to find the message to schedule by traversing the tree with the help of tokens placed on the tree edges. Every non-leaf node has a token placed on one of its outgoing edges. The algorithm descends to a leaf node by following the tokens down the tree. In addition, each time the algorithm crosses an edge (u, v), the token on this edge is moved to the next sibling of v, i.e. the next child of u after v.

The algorithm is easy to comprehend but a little difficult to visualize. One can easily prove that any leaf node u has period β(u) = β(par(u)) × deg(par(u)): u is scheduled when the tokens lead the algorithm to par(u), which happens once every β(par(u)) slots, and par(u) leads to u only once every deg(par(u)) such visits, hence the result.
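The token mechanism of Algorithm 2 can be rendered in a few lines of Python; the sketch below is illustrative, and the small example tree (giving M1 period 2 and M2, M3 period 4) is hypothetical.

class Node:
    # A schedule-tree node: leaves carry a message label, every internal
    # node keeps a token pointing at one of its outgoing edges.
    def __init__(self, label=None, children=None):
        self.label = label
        self.children = children or []
        self.token = 0

def dispatch(root):
    # One call = one slot: follow the tokens down to a leaf, advancing
    # each crossed token to the next sibling (Algorithm 2).
    v = root
    while v.children:
        i = v.token
        v.token = (i + 1) % len(v.children)
        v = v.children[i]
    return v.label

tree = Node(children=[Node("M1"), Node(children=[Node("M2"), Node("M3")])])
print([dispatch(tree) for _ in range(8)])
# -> ['M1', 'M2', 'M1', 'M3', 'M1', 'M2', 'M1', 'M3']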

Our approach to the present problem is as follows. Suppose we have the frequency distribution for all the messages; we still have to fit them into a schedule. We construct a perfectly periodic schedule for the messages. The first step is the construction of a schedule tree from the frequency distribution; we propose an algorithm to do that in the next section. Then, using the dispatching algorithm presented above, we construct the schedule and analyse it. The schedule comes out to be a 4-approximation.

3.3.3 Construction of a schedule tree from a frequency distribution

We use the same notation: the n messages {M_1, M_2, . . . , M_n} make up the message set, and the frequencies (f_i)_{i=1,2,...,n} of all the messages are given.

We round the frequencies f_i down to powers of 2 and get a new set of frequencies f'_i such that

f'_i = \frac{1}{2^{j+1}} \quad \text{if} \quad \frac{1}{2^j} > f_i ≥ \frac{1}{2^{j+1}}     (53)

We can construct a perfectly periodic schedule with these f'_i with the help of a dummy message M_0 whose frequency is f'_0 = 1 − Σ_{i=1}^{n} f'_i.

The algorithm of Brakerski et al. [6] constructs the schedule tree. We define two kinds of nodes: leaf nodes and non-leaf nodes. The messages M_i are attached as leaf nodes; all other nodes are non-leaf nodes. We construct the tree in stages, adding a new layer of nodes at depth i at stage i. Some non-leaf nodes from the previous stage serve as parents for the leaf nodes M_i attached in this stage. To every other non-leaf node from stage i − 1 we attach two nodes at depth i as its children; these in turn will serve either as parents of leaf nodes or as non-leaf nodes at depth i + 1. We stop when all the message nodes have been attached.

Theorem 3.2 (Brakerski et al. [6]). The algorithm constructs a schedule tree T from any set of frequencies (f_i)_{i=1,2,...,n} satisfying Σ_{i=1}^{n} f_i ≤ 1.


Algorithm 3 Construction of a schedule tree from (f_i)_{i=1,2,...,n}
Input: (f_i)_{i=1,2,...,n}
Output: a schedule tree T

Procedure 1: Change of frequency
  for 1 ≤ i ≤ n do
    if 1/2^j > f_i ≥ 1/2^{j+1} then
      f'_i ← 1/2^{j+1}
    end if
  end for

Procedure 2: Construction of the tree T
  initiate root r
  for 1 ≤ j ≤ max_i (−log_2 f'_i) do
    add all the messages M_i with f'_i = 1/2^j to the tree at depth j, as leaves attached to the non-leaf nodes of the previous stage
    (condition: no node may have outgoing degree more than 2)
    attach new non-leaf nodes as children of the remaining non-leaf nodes of the previous stage
  end for
  turn all remaining non-leaf nodes into leaves labelled M_0

Proof. Procedure 1 gives us a set of frequencies (f'_i) with

f_i ≥ f'_i     (54)

f_i < 2 × f'_i     (55)

\sum_{i=1}^{n} f_i ≤ 1     (56)

\sum_{i=1}^{n} f'_i ≤ 1     (57)

We use the following lemma.

Lemma 3.2 (Brakerski et al. [6]). All the messages M_i are inserted into the tree T: at any stage j there are enough non-leaf nodes left from the previous stage to accept as children the leaf nodes M_i with f'_i = 1/2^j.

By the above lemma, all the messages are inserted into the tree T, which implies that T can be transformed into a perfectly periodic schedule with the help of a dummy message. The dummy message is needed because the sum of the rounded frequencies is not necessarily equal to one; we therefore add a dummy message so that the total frequency becomes one.
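The rounding step and the staged growth of the tree can be sketched in Python as follows. This is an illustration of the construction only, reusing a Node class like the one in the dispatch sketch above; the example frequencies are hypothetical.

import math

class Node:
    def __init__(self, label=None):
        self.label, self.children, self.token = label, [], 0

def schedule_tree(freqs):
    # Round each f_i down to a power of 1/2 (equation (53)), then grow a
    # binary tree stage by stage: message i becomes a leaf at depth j when
    # its rounded frequency is 1/2**j; nodes still open at the end become
    # leaves for the dummy message M0.
    depth = {i: math.ceil(-math.log2(f)) for i, f in enumerate(freqs)}
    root = Node()
    open_nodes = [root]
    for j in range(1, max(depth.values()) + 1):
        todo = [i for i, d in depth.items() if d == j]
        next_open = []
        for u in open_nodes:
            for _ in range(2):                      # each open node gets two children
                if todo:
                    u.children.append(Node("M%d" % (todo.pop() + 1)))
                else:
                    child = Node()
                    u.children.append(child)
                    next_open.append(child)
        assert not todo                             # enough room when sum(f'_i) <= 1 (Lemma 3.2)
        open_nodes = next_open
    for u in open_nodes:
        u.label = "M0"                              # leftover capacity goes to the dummy message
    return root

# Hypothetical frequencies, rounded to 1/4, 1/4, 1/8: dispatching the
# resulting tree yields periods 4, 4 and 8 for M1, M2 and M3.
tree = schedule_tree([0.45, 0.30, 0.15])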

    3.3.4 Construction of the schedule and final analysis

The next step is to construct a perfectly periodic schedule PPS with the help of Algorithm 2.

Theorem 3.3. The schedule PPS constructed from the frequency set (1/τ'_i)_{i=1,2,...,n} of the lower bound section is a 4-approximation.

Proof. Before beginning the actual proof, we recall the bound on the cost of the optimum:

\text{Lower Bound} = \frac{1}{2} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ'_i ≤ OPT     (58)

We now bound the cost of our schedule PPS in terms of (f'_i)_{i=1,2,...,n}. The waiting time for a message M_i is at most 1/f'_i. Now consider a set S. Since PPS is perfectly periodic, the messages repeat at regular intervals, so at any point in time the waiting time for S is at most the maximum of the waiting times of its individual messages:

WT(S, t) ≤ \max_{M_i \in S} WT(M_i, t)     (59)

WT(M_i, t) ≤ \frac{1}{f'_i}     (60)

WT(S, t) ≤ \max_{M_i \in S} \frac{1}{f'_i}     (61)

Cost(PPS) ≤ \sum_{S \in \zeta} p_S \times \max_{M_i \in S} \frac{1}{f'_i}     (62)

Cost(PPS) ≤ \sum_{S \in \zeta} p_S \times \max_{M_i \in S} \frac{2}{f_i}     (63)

Cost(PPS) ≤ 2 \sum_{S \in \zeta} p_S \times \max_{M_i \in S} \frac{1}{f_i}     (64)

Since we are using the frequency set f_i = 1/τ'_i, the cost becomes

Cost(PPS) ≤ 2 \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ'_i     (65)

OPT ≥ \frac{1}{2} \sum_{S \in \zeta} p_S \times \max_{M_i \in S} τ'_i     (66)

α = \frac{Cost(PPS)}{OPT} ≤ 4     (67)


where α is the approximation factor. So we have proved that the approximation algorithm has an approximation factor of 4. One small issue remains: our schedule has been constructed with the help of a dummy message, which may cover up to half of the total time. We have kept it in our schedule, but if we were to schedule some other message in place of the dummy message, the waiting time for that message would only decrease, as would the total waiting time and hence the cost.

What remains is the analysis of the running time of the algorithm. The first part, which computes the τ'_i, uses the ellipsoid algorithm and runs in polynomial time. The second part, which builds and dispatches the tree, has complexity at most O(n²), since at any level we need to maintain only n non-leaf nodes and the maximum depth is O(log(length of the input)). The final step, producing the perfectly periodic schedule, takes O(n × length of the schedule). So the total computational complexity is polynomial.

    4 Conclusions

The broadcast scheduling problem gives a nice theoretical basis for analysing broadcasting problems. It also models other problems such as the maintenance scheduling problem and the multi-item replenishment problem [3].

We worked on the general broadcasting problem. We showed that it is NP-Hard, designed a lower bound on the optimum cost, a randomized 2Hn-approximation algorithm, and finally a constant factor approximation algorithm based on perfectly periodic schedules with approximation factor 4.

A couple of open questions and issues came out of this research:

– We did not consider the case where the lengths of individual messages differ. Although this will be hard to solve with the methods developed in this report, they may nonetheless work, since similar methods have been used to deal with variable lengths in other settings.
– Derandomizing the randomized algorithm may shed more light on the exact relationship between τ_i and p_i. A greedy algorithm devised along these lines will most probably achieve a better approximation ratio than the one obtained in this report.
– It would be interesting to see how extending the number of channels, making it a multi-channel network, affects the efficiency of the algorithms suggested in this report.
– Last but not least, several results have recently been obtained for the online setting of the basic broadcast problem, where the p_i are not pre-determined but are considered dynamically [7]. These results include impossibility results as well as competitive online algorithms. It would be interesting to determine whether the customized newspaper problem is still tractable in this setting.

References

[1] M. H. Ammar and J. W. Wong. The design of teletext broadcast cycles. In Performance Evaluation, volume 5(4), pages 235–242, 1985. Information available at //www.cc.gatech.edu/fac/Mostafa.Ammar/.
[2] M. H. Ammar and J. W. Wong. On the optimality of cyclic transmission in teletext systems. In IEEE Trans. on Comm., volume COM-35(11), pages 1159–1170, 1987. Information available at //www.cc.gatech.edu/fac/Mostafa.Ammar/.
[3] S. Anily, C. A. Glass, and R. Hassin. The scheduling of maintenance service. Paper available at //www.math.tau.ac.il/∼hassin/, July 1995.
[4] A. Bar-Noy, R. Bhatia, J. Naor, and B. Schieber. Minimizing service and operation costs of periodic scheduling. In Proc. of the 9th Annual ACM-SIAM Symp. on Discrete Algorithms (SODA'98), pages 11–20, 1998. Paper available at //www.eng.tau.ac.il/∼amotz/.
[5] Amotz Bar-Noy, Aviv Nisgav, and Boaz Patt-Shamir. Nearly optimal perfectly-periodic schedules. In PODC '01: Proceedings of the twentieth annual ACM symposium on Principles of distributed computing, pages 107–116, New York, NY, USA, 2001. ACM Press.
[6] Zvika Brakerski, Vladimir Dreizin, and Boaz Patt-Shamir. Dispatching in perfectly-periodic schedules. J. Algorithms, 49(2):219–239, 2003.
[7] Jeff Edmonds and Kirk Pruhs. Broadcast scheduling: when fairness is fine. In SODA '02: Proceedings of the thirteenth annual ACM-SIAM symposium on Discrete algorithms, pages 421–430, Philadelphia, PA, USA, 2002. Society for Industrial and Applied Mathematics.
[8] E. K. Rugenstein. Active set strategies and an ellipsoid algorithm for general nonlinear programming problems. PhD thesis, Rensselaer Polytechnic Institute, 2002.
[9] J. Gecsei. The Architecture of Videotex Systems. Prentice Hall, Englewood Cliffs, N.J., 1983.
[10] J. G. Ecker and M. Kupferschmid. Introduction to Operations Research. 1988.
[11] Claire Kenyon, Nicolas Schabanel, and Neal Young. Polynomial-time approximation scheme for data broadcast. In STOC '00: Proceedings of the thirty-second annual ACM symposium on Theory of Computing, pages 659–666, New York, NY, USA, 2000. ACM Press.
[12] Sanjeev Khanna and Shiyu Zhou. On indexed data broadcast. In STOC '98: Proceedings of the thirtieth annual ACM symposium on Theory of Computing, pages 463–472, New York, NY, USA, 1998. ACM Press.
[13] N. Z. Shor. Cut-off method with space extension in convex programming problems. Cybernetics, 12(94):6, 1977.
[14] S. Shah. An ellipsoid algorithm for equality constrained nonlinear programs. PhD thesis, Rensselaer Polytechnic Institute, 1998.
[15] Nicolas Schabanel. Algorithmes d'approximation pour les télécommunications sans fil : Ordonnancement pour la dissémination de données et Allocation statique de fréquences. PhD thesis, ENS Lyon, January 2000. Available at //perso.ens-lyon.fr/nicolas.schabanel/schabanel phd thesis.
[16] Nicolas Schabanel. The data broadcast problem with preemption. In LNCS Proc. of the 17th Symp. on Theoretical Aspects of Computer Science (STACS 2000), volume 1770, pages 181–192, Feb. 2000.
[17] S. Shekhar and D. Liu. Genesis: An approach to data dissemination in Advanced Traveler Information Systems (ATIS). IEEE Data Engineering Bulletin, Special issue on Data Dissemination, 19(3), Sept. 1996. Paper available at //www.cs.umn.edu/∼shekhar.
[18] N. H. Vaidya and S. Hameed. Log time algorithms for scheduling single and multiple channel data broadcast. In Proc. of the 3rd ACM/IEEE Conf. on Mobile Computing and Networking (MOBICOM), Sept. 1997. Paper available at //www.cs.tamu.edu/faculty/vaidya/Vaidya.html/.
[19] Nitin Vaidya and Sohail Hameed. Data broadcast scheduling: On-line and off-line algorithms. Technical report, College Station, TX, USA, 1996.
