Beneficial Caching in Mobile Ad Hoc Networks Bin Tang, Samir Das, Himanshu Gupta Computer Science Department Stony Brook University


Page 1: Beneficial Caching in Mobile Ad Hoc Networks

Beneficial Caching in Mobile Ad Hoc Networks

Bin Tang, Samir Das, Himanshu Gupta
Computer Science Department
Stony Brook University

Page 2: Beneficial Caching in Mobile Ad Hoc Networks

Outline

- Introduction: why caching in ad hoc networks?
- Problem formulation of the cache placement problem under memory constraint
- Beneficial caching
  - Centralized greedy algorithm with provable bound
  - Distributed caching algorithm: cache routing protocol, distributed caching policy
- Simulation and analysis
  - Comparison of the centralized and distributed algorithms
  - Comparison of the distributed algorithm with the latest existing work (Yin & Cao, Infocom '04)
- Conclusion and future work

Page 3: Beneficial Caching in Mobile Ad Hoc Networks

Motivation of Caching in MANET

MANET:
- Multi-hop wireless network consisting of mobile nodes without any infrastructure support; each node is both a host and a router
- Applications: rescue work, battlefield, outdoor assemblies…
- Scarce bandwidth and limited battery power/memory; wireless communication is a significant drain on the battery

Our goal: develop a communication-efficient caching technique under memory limitations

Page 4: Beneficial Caching in Mobile Ad Hoc Networks

Problem Formulation of the Cache Placement Problem under Memory Constraint

Page 5: Beneficial Caching in Mobile Ad Hoc Networks

- General ad hoc network graph G(V, E)
- p data items D1, D2, …, Dp; each Di is originally stored at a source node Si
- Each node i has a memory capacity of mi pages
- Node i requests Dj with access frequency aij
- The distance between nodes i and j is dij
- Definition: Aijk indicates that the jth memory page of node i is selected for caching Dk
- Our goal: minimize the total access cost
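The slide's cost formula is an image and does not survive in the transcript; under the definitions above, one plausible reconstruction of the objective (our notation, where C_j denotes the set of nodes currently caching D_j) is:

```latex
\min \sum_{i \in V} \sum_{j=1}^{p} a_{ij}
     \min_{k \in \{S_j\} \cup C_j} d_{ik}
\qquad \text{s.t.} \quad
\bigl|\{(j,k) : A_{ijk} = 1\}\bigr| \le m_i \quad \forall i \in V
```

That is, each request for Dj is served by the nearest copy (source or cache), and no node may cache more pages than its memory holds.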

Page 6: Beneficial Caching in Mobile Ad Hoc Networks

Centralized Greedy Algorithm

Benefit of a variable: let Γ denote the set of variables that have already been selected by the greedy algorithm at some stage. The benefit of Aijk with respect to Γ is defined as the reduction in total access cost obtained by adding Aijk to Γ.

Page 7: Beneficial Caching in Mobile Ad Hoc Networks

Theorem: Algorithm 1 returns a solution Γ whose benefit is at least half of the optimal benefit.
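The slides do not reproduce Algorithm 1 itself; the following is a minimal sketch of a benefit-greedy placement consistent with the definitions above. All function and variable names are ours, and benefits are recomputed naively for clarity rather than with whatever data structures the paper uses:

```python
def total_access_cost(dist, freq, sources, cache):
    """Sum over nodes i and items j of freq[i][j] times the distance
    from i to the nearest copy of item j (its source or any cache)."""
    n, p = len(dist), len(sources)
    cost = 0.0
    for i in range(n):
        for j in range(p):
            holders = [sources[j]] + [v for v in range(n) if j in cache[v]]
            cost += freq[i][j] * min(dist[i][h] for h in holders)
    return cost

def greedy_placement(dist, freq, sources, capacity):
    """Repeatedly add the single cached copy with the largest benefit
    (reduction in total access cost) until no addition helps or all
    memory pages are used."""
    n = len(dist)
    cache = [set() for _ in range(n)]
    while True:
        base = total_access_cost(dist, freq, sources, cache)
        best, best_gain = None, 0.0
        for v in range(n):
            if len(cache[v]) >= capacity[v]:
                continue  # node v is out of memory pages
            for j in range(len(sources)):
                if j in cache[v] or sources[j] == v:
                    continue  # already holds item j
                cache[v].add(j)  # tentatively place, measure benefit
                gain = base - total_access_cost(dist, freq, sources, cache)
                cache[v].remove(j)
                if gain > best_gain:
                    best, best_gain = (v, j), gain
        if best is None:
            return cache
        cache[best[0]].add(best[1])
```

On a 3-node line network where node 2 requests a distant item frequently, the greedy step caches that item at node 2 first, since that placement removes the most access cost.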

Page 8: Beneficial Caching in Mobile Ad Hoc Networks

Distributed Algorithm

Cache routing protocol:
- Cache routing table entry at node i: (Dj, Hj, Nj, dj)
  - Nj is the closest node to i that stores a copy of Dj
  - Hj is the next hop on the shortest path to Nj
  - dj is the weighted length of the shortest path to Nj
- Special cases:
  - If i is the source node of Dj, assume that Dj will not be removed
  - If i has cached Dj, then Nj is the nearest node (excluding i) that has a copy of Dj
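A minimal sketch of such a table, assuming routes are learned from advertisements carrying (item, next hop, holder, distance); the class and field names are illustrative, not from the paper:

```python
from dataclasses import dataclass

@dataclass
class CacheRoute:
    item: str       # data item Dj
    next_hop: str   # Hj: next hop toward the nearest copy
    nearest: str    # Nj: closest node holding a copy of Dj
    dist: float     # dj: weighted shortest-path length to Nj

class CacheRoutingTable:
    def __init__(self, node_id):
        self.node_id = node_id
        self.routes = {}  # item id -> CacheRoute

    def update(self, item, via, holder, dist):
        """Adopt an advertised route if it is closer than the one we
        know. A route whose holder is this node itself is ignored,
        matching the special case: Nj excludes i."""
        if holder == self.node_id:
            return
        cur = self.routes.get(item)
        if cur is None or dist < cur.dist:
            self.routes[item] = CacheRoute(item, via, holder, dist)
```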

Page 9: Beneficial Caching in Mobile Ad Hoc Networks

Distributed caching policy:
- Node i observes its local traffic and calculates the benefit Bij of caching or removing a data item Dj:

  Bij = Σ_{k known locally} akj · dj

- Node i decides to cache the mi most beneficial data items
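The policy can be sketched as follows. This is a toy illustration under our own names: the real protocol accumulates the frequencies from passing traffic rather than receiving them as arguments:

```python
def select_cache(observed_freq, dist_to_nearest, capacity):
    """observed_freq[j]: total access frequency for item j seen
    locally (the sum of a_kj over requesters k known locally);
    dist_to_nearest[j]: d_j, distance to the nearest copy of item j.
    Returns the `capacity` items with the largest benefit B_j."""
    benefit = {j: observed_freq[j] * dist_to_nearest[j]
               for j in observed_freq}
    # Keep the m_i most beneficial items.
    ranked = sorted(benefit, key=benefit.get, reverse=True)
    return set(ranked[:capacity])
```

Note that a rarely requested but far-away item can outrank a popular nearby one, which is exactly the distance-weighting the benefit metric introduces.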

Page 10: Beneficial Caching in Mobile Ad Hoc Networks

Performance Evaluation

Comparison of the centralized and distributed algorithms. Parameters:
- Number of nodes in the network
- Transmission radius Tr
- Number of data items
- Number of clients accessing each data item
- Memory capacity of each node

The distributed and centralized algorithms perform quite closely.

Page 11: Beneficial Caching in Mobile Ad Hoc Networks

Varying number of data items and memory capacity; varying number of nodes and transmission radius

Page 12: Beneficial Caching in Mobile Ad Hoc Networks

Varying number of clients

Page 13: Beneficial Caching in Mobile Ad Hoc Networks

Comparison of beneficial caching and cooperative caching (Yin & Cao, Infocom '04)

Experiment setup:
- ns-2 implementation
- Underlying routing protocol: DSDV
- 2000 m x 500 m field
- Random waypoint model in which 100 nodes move at speeds within (0, 20 m/s)
- Tr = 250 m, bandwidth = 2 Mbps

Experiment metrics:
- Average delay
- Message overhead
- Packet delivery ratio (PDR)

Page 14: Beneficial Caching in Mobile Ad Hoc Networks

Server model:
- Two servers, server0 and server1 (to be consistent with Cao's paper)
- 100 data items: even-id data items on server0, odd-id data items on server1
- Data size uniformly distributed between 100 bytes and 1500 bytes

Client model:
- Each node generates a single stream of read-only queries
- Query generation times follow an exponential distribution with some mean value (if the requested data does not return to the requesting node before the next query is sent out, it is counted as a packet loss)
- Each node accesses 20 data items chosen uniformly out of the 100 data items
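The client model above can be sketched as a small generator. The parameters (20 of 100 items, exponential inter-query times) are from the slide; the function name and structure are our illustration:

```python
import random

def client_queries(mean_interval, n_queries, seed=0):
    """Yield (time, item_id) query events for one client node:
    exponential inter-query gaps with the given mean, and items drawn
    uniformly from a working set of 20 of the 100 data items."""
    rng = random.Random(seed)
    working_set = rng.sample(range(100), 20)  # this node's 20 items
    t = 0.0
    for _ in range(n_queries):
        t += rng.expovariate(1.0 / mean_interval)  # exponential gap
        yield t, rng.choice(working_set)           # uniform over set
```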

Page 15: Beneficial Caching in Mobile Ad Hoc Networks

Beneficial caching:
- Each node maintains a cache routing table, each entry of which indicates the closest cache of each data item; it is maintained by flooding
- Each node observes the data requests passing by and records how many times it sees each item
- When a threshold number of data requests is reached (100 in our experiment), each node calculates the benefit of caching
- The cache replacement algorithm is based on the benefit

Cooperative caching (Yin & Cao, Infocom '04):
- Cache data: the data packet is cached if its size is smaller than some threshold value
- Cache path: otherwise, the id of the requestor is cached
- The requestor always caches the data packet; LRU is the cache replacement policy

Page 16: Beneficial Caching in Mobile Ad Hoc Networks
Page 17: Beneficial Caching in Mobile Ad Hoc Networks

Experiment Analysis

In a static network:
- Ours performs much better in average delay (3 times better); when traffic gets very heavy (query generation time < 5 s), ours is 4-5 times better
- Better PDR performance (100% vs. 98% in heavy traffic)
- Worse message overhead when traffic is light

In a mobile network (max speed 20 m/s):
- Our delay performance is slightly better
- Better PDR (87% vs. 75% for most of the range)
- Worse message overhead (5 times worse)

Page 18: Beneficial Caching in Mobile Ad Hoc Networks
Page 19: Beneficial Caching in Mobile Ad Hoc Networks

Conclusions

- We propose and design a benefit-based caching paradigm for wireless ad hoc networks.
- A centralized algorithm for static networks is given with a provable bound under a per-node memory constraint.
- The distributed version performs very close to the centralized one.
- Compared with the latest published work in a mobile environment, our scheme performs better over some ranges of parameters.

Page 20: Beneficial Caching in Mobile Ad Hoc Networks

Ongoing and future work

- We are currently working on mobility-based caching techniques
- We plan to reduce the message overhead of our scheme