Department of Computer Science, University of Massachusetts, Amherst
PRESTO: Feedback-driven Data Management in Sensor Networks
Ming Li, Deepak Ganesan, and Prashant Shenoy
University of Massachusetts, Amherst
(*PREdictive STOrage)
UNIVERSITY OF MASSACHUSETTS, AMHERST
Emerging large-scale sensor networks
◊ Hierarchical wireless networks composed of low-power sensors.
◊ Enable dense, close-range monitoring of physical phenomena.
Tracking
Surveillance
Structure/Machinery Monitoring
Hierarchical Sensor Network Architecture
Internet
Clients: data browsing, querying, and processing
Mesh Network
Base-station
Sensor Proxy
Remote Sensors
Sensor Proxy
Remote Sensors
Approaches to Proxy-Sensor Interaction
Sensor-centric Architecture vs. Proxy-centric Architecture
Proxy-Centric Architecture
◊ Overview: The proxy decides when to pull data, which sensor to query, and what data to pull, using rich modeling and query-processing mechanisms.
◊ Pros: Intelligence is placed where resources are available; more complex algorithms become feasible.
◊ Cons: Cannot capture anomalies between pulls; lower energy efficiency; greater query error.
BBQ [Deshpande04]
Sensor-Centric Architecture
◊ Overview: Queries are forwarded into the sensor network; data fusion, query processing, and filtering are performed in-network.
◊ Pros: Greater query accuracy; better energy efficiency.
◊ Cons: Greater sensor complexity; higher query latency.
Directed Diffusion [Heidemann01]
Key Ideas in PRESTO
◊ Steal from the rich (proxy) and give to the poor (sensors).
◊ Exploit predictable structure in sensor data when possible.
◊ Adapt to data & query dynamics to minimize energy usage.
◊ Exploit low-power storage for efficient archival querying.
Outline
◊ Motivation
◊ Key Ideas
◊ Example
◊ ARIMA Model
◊ Evaluation
◊ Summary & Future Work
Example - Modeling
Sensor → Proxy: raw data
Proxy: build model  X_t = X_{t−1} + θ·e_{t−1}
Proxy → Sensor: model parameters
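The model-building step on the proxy can be sketched as a simple parameter fit. This is an illustrative estimator, not the paper's: it grid-searches θ for the slide's model X_t = X_{t−1} + θ·e_{t−1} by minimizing one-step prediction error, and the names `one_step_sse` and `fit_theta` are made up for this sketch.

```python
# Illustrative sketch: fit theta for X_t = X_{t-1} + theta * e_{t-1}
# by minimizing the sum of squared one-step prediction errors.
# This runs on the proxy, where compute is cheap.

def one_step_sse(data, theta):
    """Sum of squared one-step prediction errors for a given theta."""
    err_prev = 0.0
    sse = 0.0
    for i in range(1, len(data)):
        pred = data[i - 1] + theta * err_prev   # model's one-step forecast
        err_prev = data[i] - pred               # realized prediction error
        sse += err_prev ** 2
    return sse

def fit_theta(data, grid=None):
    """Pick the theta on a coarse grid with the smallest one-step error."""
    grid = grid or [i / 100 for i in range(-99, 100)]
    return min(grid, key=lambda th: one_step_sse(data, th))
```

In PRESTO the fitted parameters are then shipped down to the sensor, so sensor and proxy make identical predictions.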
Example - Model-Driven Push
Sensor: sample T_t and predict X_t = X_{t−1} + θ·e_{t−1}
Sensor: if |T_t − X_t| > δ, push T_t to the proxy
Proxy: predict X_t and confidence conf_t using the same model (from X_{t−1}, e_{t−1})
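The sensor-side push rule above fits in a few lines. A hedged sketch: the sensor computes the same prediction the proxy will make, and transmits the real sample only when the prediction misses by more than δ (function names are illustrative).

```python
# Sensor-side sketch of model-driven push: transmit only on prediction misses.

def predict(x_prev, e_prev, theta):
    """One-step forecast of the slide's model X_t = X_{t-1} + theta * e_{t-1}."""
    return x_prev + theta * e_prev

def should_push(t_actual, x_prev, e_prev, theta, delta):
    """Push the real sample T_t only if the shared model misses by > delta."""
    x_pred = predict(x_prev, e_prev, theta)
    return abs(t_actual - x_pred) > delta
```

Because sensor and proxy run the same model, silence is informative: no push means the proxy's prediction is within δ of the truth.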
Example - Query
Client → Proxy: "What is the reading at time t with confidence c?"
Proxy: predict X_t, conf_t
If conf_t ≤ c: answer with X_t; otherwise pull T_t from the sensor
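The proxy-side query path can be sketched directly from this slide: answer from the cached prediction when its confidence meets the query's bound c, else fall back to an expensive pull (`sensor_pull` is a placeholder callback, not a PRESTO API).

```python
# Proxy-side sketch of query handling with a pull fallback.

def answer_query(x_cached, conf_cached, c, sensor_pull):
    """Return the cached estimate if it is precise enough, else pull."""
    if conf_cached <= c:
        return x_cached      # cached prediction satisfies the error bound
    return sensor_pull()     # expensive path: wake the sensor's radio
```

Most queries hit the cheap branch, which is why query load barely touches the sensors.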
Example - Feedback
Proxy: rebuild the model from accumulated data: X_t = X_{t−1} + θ'·e_{t−1}
Proxy → Sensor: updated parameter θ'
Example - Update Cache after Push
Sensor → Proxy: push T_t
Both sensor and proxy repair their cached state by interpolation:
X'_t = T_t,  e'_t = T_t − X_t
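One plausible reading of the interpolation step, sketched as code: spread the revealed prediction error linearly across the cached samples made since the last confirmed value. The indexing scheme and the name `repair_cache` are illustrative assumptions, not the paper's exact update rule.

```python
# Illustrative cache repair after a push: the true sample t_actual arrives
# for index t_idx; distribute the revealed error linearly over the cached
# predictions made since the last confirmed sample.

def repair_cache(cache, last_confirmed_idx, t_idx, t_actual):
    error = t_actual - cache[t_idx]          # how far the model drifted
    span = t_idx - last_confirmed_idx
    for i in range(last_confirmed_idx + 1, t_idx + 1):
        # correction ramps from 0 at the confirmed sample to `error` at t
        cache[i] += error * (i - last_confirmed_idx) / span
    return cache
```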
Example - Update Cache after Pull
Proxy → Sensor: pull request; Sensor → Proxy: T_t
Both sides repair the cache by interpolation, then re-predict subsequent values.
Outline
◊ Motivation
◊ Key Ideas
◊ Example
◊ ARIMA Model
◊ Evaluation
◊ Summary & Future Work
Data Trends
◊ The temperature data trace shows a clear temporal trend.
◊ It exhibits both a long-term (seasonal) trend and a short-term trend.
(figure: temperature trace with the seasonal period marked)
Data Trends
◊ A seasonal ARIMA model captures both trends:
Φ_P(B^S) · φ_p(B) · (1 − B^S)^D · (1 − B)^d · X_t = Θ_Q(B^S) · θ_q(B) · e_t
(the seasonal factors model the long-term trend; the non-seasonal factors the short-term trend)
Computation
◊ Prediction is cheap: five additions and three multiplications per sample (with θΘ precomputed).
X_t = X_{t−1} + X_{t−S} − X_{t−S−1} − θ·e_{t−1} − Θ·e_{t−S} + θΘ·e_{t−S−1}
(previous samples X, previous prediction errors e)
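The prediction step above, written out as one step of the seasonal ARIMA(0,1,1)×(0,1,1)_S forecast (a sketch; `arima_predict` is an illustrative name). The three-multiplication count assumes θΘ is precomputed; it is written inline here for readability.

```python
# One-step seasonal ARIMA(0,1,1)x(0,1,1)_S forecast, as on the slide:
# five additions/subtractions and three multiplications per sample
# (theta * Theta would be cached on a real sensor).

def arima_predict(x, e, t, S, theta, Theta):
    """x: past samples, e: past one-step errors, t: current index, S: season."""
    return (x[t - 1] + x[t - S] - x[t - S - 1]
            - theta * e[t - 1]
            - Theta * e[t - S]
            + theta * Theta * e[t - S - 1])
```

With all past errors zero this degenerates to X_{t−1} + X_{t−S} − X_{t−S−1}, i.e., "yesterday's value plus today's seasonal change".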
Outline
◊ Motivation
◊ Key Ideas
◊ Example
◊ ARIMA Model
◊ Evaluation
◊ Summary & Future Work
Evaluations
◊ Both numerical simulation and a real testbed deployment
◊ Testbed: 1 Stargate (proxy) and 20 Tmotes (sensors); a Stargate also acts as an emulator for larger scales
◊ Data trace: James Reserve
Micro Benchmark

Component          Operation               Energy (nJ)
NAND Flash         20 B read + 8 B write   152
MSP430 Processor   Predict 1 sample        24
CC2420 Radio       Transmit 1 byte         2000

Model Asymmetry

Component          Operation               Energy (nJ)
Stargate           Model building          11000
Telos Mote         Predict 1 sample        24

The cost of model building is ~500x that of prediction.
The total cost of prediction and storage is 10x less than communication.

(figure: breakdown of energy costs)
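The two ratio claims can be recomputed directly from the micro-benchmark numbers; this back-of-envelope script just restates the arithmetic (the per-sample message size is not given on the slide, so only the per-byte comparison is shown).

```python
# Back-of-envelope check of the slide's claims, all values in nJ.
predict_nj = 24            # MSP430: predict 1 sample
flash_nj = 152             # NAND flash: 20 B read + 8 B write
radio_per_byte_nj = 2000   # CC2420: transmit 1 byte
model_build_nj = 11000     # Stargate: model building

# "Model building costs ~500x more than prediction"
build_vs_predict = model_build_nj / predict_nj   # ~458x

# "Prediction + storage is 10x less than communication":
# even a single transmitted byte costs >10x the local predict+store path.
local_cost = predict_nj + flash_nj               # 176 nJ
radio_vs_local = radio_per_byte_nj / local_cost  # ~11.4x
```

This asymmetry is exactly why PRESTO pushes model building to the proxy and leaves only prediction on the mote.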
Model-driven Push Performance
◊ Matlab simulation shows that model-driven push outperforms model-driven pull.
Scalability
◊ Impact of system scale: an emulator provides large network scales
◊ Supports up to 100 sensor nodes per proxy
Scalability
◊ Impact of query frequency: the system adapts to high query rates, though query latency does increase with query frequency
◊ Most queries are answered from the proxy cache
Adaptation
◊ Adapting to query dynamics reduces query latency by 50% compared to the non-adaptive case
◊ The system adapts to a lower query tolerance after a short period
◊ A lower average query tolerance triggers more pulls
Adaptation
◊ Adapting to data dynamics reduces communication by 30% compared to the non-adaptive scheme
Failure Detection
◊ Sensor failures are detected via pull messages
◊ Detection latency decreases with the query interval, as well as with query tolerance
◊ The longest observed detection latency is under 2 hours