Transcript ppt

Sensor & Computing
Infrastructure for
Environmental Risks
Integrated Platform for
Autonomic Computing
Vassilis Papataxiarhis
[email protected]
Department of Informatics and Telecommunications
University of Athens – Greece
"WSNs in the Real-World" Workshop,
ZigBee Alliance Fall 2011 Members Meeting, October 2011, Barcelona
Sector of Computer Systems and Applications
Pervasive Computing Research Group
(http://p-comp.di.uoa.gr)
Coordinator: Stathes Hadjiefthymiades
(3 Faculty Members, 4 Post-Doc Researchers, 7 Ph.D. and 10 M.Sc. Students)
Research interests:
Pervasive Computing, Mobile Computing,
Wireless Sensor Networks, Context- and Situation-Aware Computing
Information Fusion, Distributed Computing, Semantic Web, Intelligent Multimedia
Activities:
Multi-layered Data Fusion, Information Dissemination, Distributed Intelligence,
Context Prediction, Quality of Context, Optimal Stopping, Context Discovery
Publications:
Ph.D. dissertations: 7
IEEE / ACM Transactions: TMC, TAAS, TSMC, TITB
Top Conf.: WWW, MDM, COMPSAC, MobiDE, ICPADS, Globecom
175 publications, 14 book chapters, 1050 citations
Collaborators/Projects:
ICT/IST (IDIRA, IPAC, SCIER, PoLoS), GSRT (Polysema, Mnisiklis, Pythagoras)
CSEM, Uni. Geneva, Frequentis, Ministry of Defense, Fraunhofer, FIAT, Uni.Cyprus
Sensor & Computing Infrastructure for Environmental Risks (SCIER)
SCIER Objectives
• Sensor network infrastructures for the detection and
monitoring of disastrous natural hazards.
• Advanced sensor fusion and management schemes.
• Risk evolution models simulated on GRID.
• Multi-risk platform.
• Public-private sector cooperation.
SCIER architecture
[Architecture diagram: a central Computing System (public infrastructure) linked to multiple Local Alerting Control Units (LACUs) deployed on private infrastructure.]
SCIER Sensing Subsystem
• Sensor Infrastructure
– In-field sensor nodes (humidity, temperature, wind speed/direction)
– Out-of-field vision sensors (cameras)
• Sensor Data Fusion
SCIER Computing Subsystem
• Computation and Storage
• Environmental models
– Flash Floods (FL), forest fires (FF)
– GIS Infrastructure
– Storage, analysis and visualization of monitored data,
spatial calibration and event localization
• Predictive Modeling
• Front-End Subsystem
Local Alerting Control Unit
[Diagram, SCIER site @ UoA: the LACU software modules connect the sensor and alerting infrastructure to the Computing Subsystem through a sensing-system proxy (XML). The grid site comprises a monitoring node (MON: ctb30.gridctb.uoa.gr), a CE/SiteBDII node (ctb31.gridctb.uoa.gr), worker nodes (WN: ctb33.gridctb.uoa.gr, ctb34.gridctb.uoa.gr) and a VO storage disk pool (SE: ctb32.gridctb.uoa.gr). The LACU modules run on OSGi and access a database via JDBC; a remote administration console manages the site. The diagram distinguishes data flow from control flow.]
LACU Fusion Component (FF)
• Receives sensor data and executes fusion
algorithms.
• Generates fused data with degree of
reliability.
• Fused data is fed to the Computing Subsystem (see the fusion sketch below).
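To make the idea concrete, here is a minimal sketch of such a fusion step, purely illustrative and not the SCIER fusion algorithm: several readings are combined into one estimate, and a reliability degree is derived from how closely the sensors agree (the 'spread' tolerance is an assumed parameter).

// Illustrative only: not the SCIER fusion algorithm.
// Combines several readings into one value plus a reliability degree in [0, 1].
public final class SimpleFusion {

    public static final class FusedValue {
        public final double value;        // fused estimate
        public final double reliability;  // 0 = no confidence, 1 = full agreement
        FusedValue(double value, double reliability) {
            this.value = value;
            this.reliability = reliability;
        }
    }

    // 'spread' is an assumed tolerance (e.g. 5 degrees C) beyond which readings are deemed inconsistent.
    public static FusedValue fuse(double[] readings, double spread) {
        double mean = 0;
        for (double r : readings) mean += r;
        mean /= readings.length;

        double variance = 0;
        for (double r : readings) variance += (r - mean) * (r - mean);
        variance /= readings.length;

        // Reliability decays as the sample spread approaches the tolerance.
        double reliability = Math.max(0.0, 1.0 - Math.sqrt(variance) / spread);
        return new FusedValue(mean, reliability);
    }
}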
2nd Level Fusion Process (FF) in CS
• Camera data and fused sensor data from LACUs are processed.
• Algorithms:
– Voting algorithm
– Dempster-Shafer Theory of Evidence (a combination sketch follows below)
• Triggers simulations according to the final probability of fire, flood, etc.
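The slide names Dempster-Shafer evidence combination; the following is a minimal sketch of Dempster's rule over the frame {fire, no fire}, with mass also allowed on the full frame (unknown). The mass values and class names are invented for the example; this is not the SCIER code.

// Illustrative sketch of Dempster's rule of combination over the frame {FIRE, NO_FIRE},
// with mass also allowed on the full frame (UNKNOWN). Not the SCIER implementation.
public final class DempsterShafer {

    /** Basic mass assignment: fire + noFire + unknown must equal 1. */
    public static final class Mass {
        final double fire, noFire, unknown;
        public Mass(double fire, double noFire, double unknown) {
            this.fire = fire; this.noFire = noFire; this.unknown = unknown;
        }
    }

    /** Combine two independent evidence sources with Dempster's rule. */
    public static Mass combine(Mass a, Mass b) {
        // Conflict: one source says FIRE while the other says NO_FIRE.
        double k = a.fire * b.noFire + a.noFire * b.fire;
        double norm = 1.0 - k;
        double fire    = (a.fire * b.fire + a.fire * b.unknown + a.unknown * b.fire) / norm;
        double noFire  = (a.noFire * b.noFire + a.noFire * b.unknown + a.unknown * b.noFire) / norm;
        double unknown = (a.unknown * b.unknown) / norm;
        return new Mass(fire, noFire, unknown);
    }

    public static void main(String[] args) {
        // Hypothetical masses: camera evidence vs. fused LACU sensor evidence.
        Mass camera = new Mass(0.6, 0.1, 0.3);
        Mass lacu   = new Mass(0.7, 0.2, 0.1);
        Mass fused  = combine(camera, lacu);
        // Belief and plausibility of FIRE bound the final fire probability.
        System.out.printf("P(fire) in [%.2f, %.2f]%n", fused.fire, fused.fire + fused.unknown);
    }
}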
FF simulation modeling
• Simulation of several possible futures through the GRID infrastructure.
• GRID used to simulate many possible future situations (1-100) under different propagation conditions.
• Results analyzed to identify the size and shape of the resulting burned area, and to provide probabilities for each of the simulated futures (see the sketch below).
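As a rough sketch of the idea (not the SCIER GRID submission code), the platform fans out one fire-spread run per sampled propagation condition and aggregates the outcomes; here a local thread pool stands in for the GRID worker nodes, and the propagation model is a toy placeholder.

// Illustrative only: a local thread pool stands in for GRID worker nodes.
// Each task is one fire-spread simulation under a sampled propagation condition.
import java.util.*;
import java.util.concurrent.*;

public final class EnsembleRunner {

    public static void main(String[] args) throws Exception {
        int futures = 100;                                      // 1-100 simulated futures, as on the slide
        ExecutorService pool = Executors.newFixedThreadPool(8);
        Random rnd = new Random(42);

        List<Callable<Double>> tasks = new ArrayList<>();
        for (int i = 0; i < futures; i++) {
            final double windSpeed = 2 + rnd.nextDouble() * 18; // one sampled propagation condition
            tasks.add(() -> simulateBurnedArea(windSpeed));     // one simulation per future
        }

        double total = 0;
        for (Future<Double> f : pool.invokeAll(tasks)) {
            total += f.get();
        }
        System.out.printf("Mean burned area over %d futures: %.1f ha%n", futures, total / futures);
        pool.shutdown();
    }

    /** Placeholder for the real fire propagation model executed on the GRID. */
    static double simulateBurnedArea(double windSpeed) {
        return 10.0 + 3.0 * windSpeed;                          // toy relation, illustration only
    }
}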
FL Modeling
• Conditions stored in a metadata catalog.
• Engine for parsing and evaluating conditions based on incoming data.
• Interface with the Simulation subsystem, triggering model execution based on fusion-result conditions (see the sketch after the diagram below).
[Diagram: sensor input data and the fusion decision feed a condition evaluation engine backed by the metadata catalog.]
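A minimal sketch of such a condition evaluation engine is shown below; the catalog entries, threshold values and names are assumed for illustration and are not taken from the SCIER metadata catalog.

// Illustrative condition evaluation engine; catalog entries, thresholds and names are assumed.
import java.util.*;
import java.util.function.DoublePredicate;

public final class ConditionEngine {

    /** One catalog entry: which quantity to watch and the predicate that triggers the model. */
    record Condition(String quantity, DoublePredicate trigger, String modelToRun) {}

    private final List<Condition> catalog = List.of(
        new Condition("precipitation_mm_per_h", v -> v > 30.0, "1D-flood-model"),
        new Condition("river_level_m",          v -> v > 2.5,  "1D-flood-model")
    );

    /** Evaluate one incoming reading; returns the model to trigger, if any. */
    public Optional<String> evaluate(String quantity, double value) {
        return catalog.stream()
                .filter(c -> c.quantity().equals(quantity) && c.trigger().test(value))
                .map(Condition::modelToRun)
                .findFirst();
    }

    public static void main(String[] args) {
        ConditionEngine engine = new ConditionEngine();
        engine.evaluate("precipitation_mm_per_h", 42.0)
              .ifPresent(model -> System.out.println("Trigger simulation: " + model));
    }
}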
SCIER GRID and FL with web-services
• Sensors: collect data (location + time + value): precipitation, temperature, humidity, wind
• Fusion
• User interface
• SCIER central point: forwards data to storage, issues simulation jobs (see the illustrative call below), runs the web server with the UI
• GRID: executes fire modelling jobs
• ArcGIS Services: storage for fire model executables, model input data, model structural data, model output data, and pre-prepared WS + CS scenarios
• Simulation PC(s): executes 1D flood modelling jobs, incorporates pre-calculated flood maps lookup
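Purely as an illustration of the web-service style of interaction (the endpoint URL, payload and field names are invented for this sketch, not the SCIER web-service API), the central point could issue a fire-modelling job with a plain HTTP POST:

// Illustrative only: endpoint, payload and field names are hypothetical, not the SCIER web-service API.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public final class SubmitFireJob {
    public static void main(String[] args) throws Exception {
        String payload = """
            {"model": "forest-fire", "ignition": {"lat": 38.13, "lon": 23.86},
             "wind_mps": 9.5, "futures": 100}""";

        HttpRequest request = HttpRequest.newBuilder(URI.create("http://example.org/scier/simulations"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Job submission status: " + response.statusCode());
    }
}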
System Validation & Evaluation
• Testing includes both fires and flooding
– Gestosa, Portugal (experimental and controlled
burns)
– Stamata, Attica, Greece (fires, system deployed)
– Aubagne, Bouches du Rhone, S. France (fires and
floods)
– Brno, Czech Republic (floods, system deployed)
System Validation & Evaluation
• Gestosa, Portugal (experimental and
controlled burns)
System Validation & Evaluation
• Stamata, Attica, Greece (fires, system
deployed)
System Validation & Evaluation
• Aubagne, Bouches du Rhone, S. France
(fires and floods)
IPAC
IPAC Objectives
• Integrated Platform for Autonomic Computing
• Main goals:
• Middleware for autonomic computing
• Application Creation Environment
[Diagram: the Application Creation Environment provides the developer with a visual editor, a textual editor, code generation, an emulator and a debugger. IPAC node stack (bottom-up): H/W, OS, JVM; OSGi Platform; short-range communication interfaces (WiseMAC, WiFi); IPAC Middleware Services; IPAC Applications. Sensing elements attached to the node: GPS, SunSPOTs, visual sensors.]
IPAC Node
[Middleware diagram: the Application Layer hosts IPAC applications (Alarm, Chatting, Monitoring, Querying, ...). The Service Layer, built on the OSGi Framework (Service Registry, Event Admin, Service Tracking) over H/W, OS and JVM, includes: Wireless Network Interfaces, Public Segment Storage, Private Segment Storage, SEC Proxy (towards the Sensing Elements, SEC), SRCC Proxy & Information Dissemination, Application Manager, Reconfiguration Service, Reasoner, Scheduler, Event Checker Service, and User Interaction Service.]
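As a hedged illustration of how services in such an OSGi-based middleware are typically published and consumed (the FusionService interface below is a hypothetical stand-in, not an IPAC API), one bundle registers a service in the Service Registry and another tracks it:

// Illustrative OSGi wiring only; FusionService is a hypothetical stand-in for an IPAC
// middleware service, not an actual IPAC API.
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.util.tracker.ServiceTracker;

public final class ExampleActivator implements BundleActivator {

    /** Hypothetical service interface published in the OSGi Service Registry. */
    public interface FusionService {
        double fuse(double[] readings);
    }

    private ServiceTracker<FusionService, FusionService> tracker;

    @Override
    public void start(BundleContext context) {
        // Register an implementation so other bundles (e.g. a Reasoner) can discover it.
        context.registerService(FusionService.class,
                readings -> java.util.Arrays.stream(readings).average().orElse(Double.NaN),
                null);

        // Track the service from the consumer side.
        tracker = new ServiceTracker<>(context, FusionService.class, null);
        tracker.open();
        FusionService fusion = tracker.getService();
        if (fusion != null) {
            System.out.println("Fused: " + fusion.fuse(new double[] {21.0, 22.5, 21.7}));
        }
    }

    @Override
    public void stop(BundleContext context) {
        tracker.close();
    }
}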
IPAC Embedded System
Light-weight IPAC node
• A lean version of the middleware (WiseMAC case only)
• On an embedded wireless sensor node platform (WiseNode)
• Targeted functionality
• IPAC-compatible communication-wise
• A single, customized application
• To be used as relay node, simple sensor node, beacon, ... where the full IPAC complexity is not necessary
WiseNode -> more nodes... -> cheaper...
IEEE1451 in IPAC
• The IEEE1451 standard has inspired the implementation of the Sensing Element Components as "smart sensors".
• The philosophy on which IEEE1451 is based matches one of the features of the IPAC system, namely the uniform treatment of all IPAC sensors.
• The standard is still under development and some parts are not well defined.
• Commercial products (sensors, dev kits or adapters) are not available, only partially available, or have very short lifetimes.
• A Java implementation of IEEE1451 has been developed based on the SunSPOT platform.
IEEE1451 software architecture
NCAP component:
- "soft NCAP": an SEC Proxy OSGi module that provides the NCAP functionality
- embedded in the SEC Proxy service
- new sensor discovery and sensor removal
- sensor data retrieval
- integration with the Reasoner, Storage and ECS services
TIM component (SunSPOT board):
- SEC midlet on the SunSPOT that provides the TIM functionality
- physical sensor reading
- responds to discovery queries
- responds to transducer access requests
- handles transducer management tasks
- supports TEDS management functions
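To make the NCAP/TIM split concrete, here is a minimal hypothetical Java sketch; the interface and method names are invented for illustration and do not follow the IEEE1451 reference API or the IPAC code. The TIM exposes its transducer channels with their TEDS, and the soft NCAP retrieves readings from discovered TIMs.

// Hypothetical sketch of the NCAP/TIM split; names are invented, not the IEEE1451 reference API.
import java.util.List;
import java.util.Map;

public final class Ieee1451Sketch {

    /** TIM side: one transducer channel with its TEDS (self-description) and a read operation. */
    interface TransducerChannel {
        Map<String, String> teds();   // e.g. {"name": "temperature", "unit": "Celsius"}
        double read();                // physical sensor reading
    }

    /** TIM side: a module (e.g. the SEC midlet on the SunSPOT) exposing its channels. */
    interface Tim {
        String id();
        List<TransducerChannel> channels();   // answers discovery queries
    }

    /** NCAP side: the "soft NCAP" embedded in the SEC Proxy retrieves data from discovered TIMs. */
    static void pollAll(List<Tim> discoveredTims) {
        for (Tim tim : discoveredTims) {
            for (TransducerChannel ch : tim.channels()) {
                System.out.printf("%s/%s = %.2f %s%n",
                        tim.id(), ch.teds().get("name"), ch.read(), ch.teds().get("unit"));
            }
        }
    }
}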
SEC hardware platform
Hardware:
- Dimensions 41 x 23 x 70 mm, 54 grams
- 180 MHz 32-bit ARM920T core, 512K RAM / 4M Flash
- 2.4 GHz IEEE 802.15.4 radio with integrated antenna
- USB interface
- 3.7V rechargeable 720 mAh lithium-ion battery
- 32 uA deep sleep mode
- General Purpose Sensor Board
  - 2G/6G 3-axis accelerometer
  - Temperature sensor
  - 8 tri-color LEDs
  - 6 analog inputs
  - 2 momentary switches
  - 5 general purpose I/O pins and 4 high-current output pins
Software:
- Squawk Virtual Machine
  - Fully capable J2ME CLDC 1.1 Java VM with OS functionality
  - VM executes directly out of flash memory
  - Device drivers written in Java
  - Automatic battery management
- Developer Tools
  - Use standard IDEs, e.g. NetBeans, to create Java code
  - Integrates with J2SE applications
  - A Sun SPOT wired via USB to a computer acts as a base station
IPAC - Platooning
• Two main scenarios: Road Condition & Road Availability
• 8 applications
• Applications have specific business logic
• Applications react when specific events are triggered
• Events are based on: messages (data, etc.) or sensor values
Scenario 1: Road Condition
• A convoy should avoid an unsafe area (e.g. ice on the road)
• Applications used:
– First Vehicle
• the node has a vision sensor attached to it but no temperature sensor
• reacts to an ICE event; the event is triggered based on the vision sensor indication and other vehicles' temperature indications (see the event-handler sketch after this list)
• in case of an ICE event it sends a warning message to the rest of the vehicles
– Convoy Vehicle
• has a temperature sensor attached to it
• reacts to a warning message by presenting the ICE warning in the application interface
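As a hedged sketch of how such an event-driven application could be wired on the OSGi platform the middleware runs on (the topic names, property keys and warning payload are invented for the example and are not the IPAC application code), an EventHandler reacts to an ICE event and posts a warning to the rest of the convoy:

// Illustrative OSGi EventAdmin usage; topic names, property keys and the warning payload
// are invented for this sketch and are not the IPAC application code.
import java.util.Dictionary;
import java.util.Hashtable;
import java.util.Map;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.service.event.Event;
import org.osgi.service.event.EventAdmin;
import org.osgi.service.event.EventConstants;
import org.osgi.service.event.EventHandler;

public final class IceWarningActivator implements BundleActivator, EventHandler {

    private EventAdmin eventAdmin;

    @Override
    public void start(BundleContext context) {
        eventAdmin = context.getService(context.getServiceReference(EventAdmin.class));

        // Subscribe to hypothetical ICE events raised by the Event Checker Service.
        Dictionary<String, Object> props = new Hashtable<>();
        props.put(EventConstants.EVENT_TOPIC, "ipac/road/ICE");
        context.registerService(EventHandler.class, this, props);
    }

    @Override
    public void handleEvent(Event event) {
        // First-vehicle logic: on an ICE event, warn the rest of the convoy.
        Object location = event.getProperty("location");
        eventAdmin.postEvent(new Event("ipac/convoy/WARNING",
                Map.of("type", "ICE", "location", String.valueOf(location))));
    }

    @Override
    public void stop(BundleContext context) {
        // Service registrations are released automatically when the bundle stops.
    }
}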
Scenario 2: Road Availability
• Two convoys have intersecting routes
and should avoid simultaneous
use of a road junction.
• Applications used:
– Head Vehicle (for both convoys)
• sends a ‘data’ message containing the node ID as the convoy moves
• stops or continues its route according to the message sent by the route scheduler
– Tail Vehicle (for both convoys)
• sends a ‘data’ message containing the node ID as the convoy moves
– Route scheduler
• accepts 'data' messages (data events) and, based on the RSSI values, decides which convoy should proceed first
RSSI-based logic
• Thorough handling of RSSI measurements from convoy vehicles (see the sketch after this list).
• The route scheduler assesses the absolute RSSI
value to roughly determine the distance of the
approaching vehicle and the time derivative to
determine its speed.
• Similar approach is followed for the departure
from the junction.
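A small numeric sketch of the idea follows; the path-loss constants are assumed for illustration, not values measured in the project. The absolute RSSI gives a rough distance via a log-distance path-loss model, and the change of that distance over time gives the approach speed.

// Illustrative RSSI-based distance/approach-rate estimation; path-loss constants are
// assumed for the example, not values from the IPAC trials.
public final class RssiScheduler {

    // Log-distance path-loss model: rssi = rssiAt1m - 10 * n * log10(d).
    private static final double RSSI_AT_1M = -45.0;  // assumed reference RSSI at 1 m (dBm)
    private static final double PATH_LOSS_N = 2.2;   // assumed path-loss exponent

    /** Rough distance estimate (metres) from an absolute RSSI value (dBm). */
    static double estimateDistance(double rssiDbm) {
        return Math.pow(10.0, (RSSI_AT_1M - rssiDbm) / (10.0 * PATH_LOSS_N));
    }

    /** Approach speed estimate (m/s) from two consecutive RSSI samples taken dtSeconds apart. */
    static double approachSpeed(double previousRssi, double currentRssi, double dtSeconds) {
        // Positive value means the vehicle is getting closer to the junction.
        return (estimateDistance(previousRssi) - estimateDistance(currentRssi)) / dtSeconds;
    }

    public static void main(String[] args) {
        double d = estimateDistance(-70.0);
        double v = approachSpeed(-72.0, -70.0, 1.0);
        System.out.printf("Estimated distance: %.1f m, approach speed: %.1f m/s%n", d, v);
    }
}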
Thank you!
IPAC website: http://ipac.di.uoa.gr
SCIER website: http://www.scier.eu