Platform Based Design Student presentations


Diana Cristina Albu
Embedded Systems
Michael Bramberger, Andreas Doblander, Arnold Maier, Bernhard Rinner
Graz University of Technology
Helmut Schwabach
Austrian Research Centers, Seibersdorf



Smart cameras are fun
Surveillance tasks are important in today's traffic
I wanted something related to the subject but not something mentioned in class


The paper appeared as a Research Feature in IEEE Computer, February 2006 (Volume 39, Issue 2)
It presents the design of a smart camera as a fully embedded system, with applications in traffic surveillance

Smart cameras are equipped with a high-performance onboard computing and communication infrastructure, combining in a single embedded device:
◦ video sensing
◦ processing
◦ communications

Networks of embedded cameras can potentially support more complex and challenging applications:
◦ Smart rooms
◦ Surveillance
◦ Tracking
◦ Motion analysis




“From Analog to Digital Cameras”:
1st generation surveillance: analog equipment (closed-circuit TV cameras transmitted the video signal over analog lines)
2nd generation: digital back-end components allow real-time automated analysis of incoming data
3rd generation: complete digital transformation; video is converted to the digital domain at the camera and transmitted via a computer network; cameras can also compress video to save bandwidth

4th generation: intelligent cameras; perform low-level image processing operations on the captured frames onboard to improve video compression and intelligent host efficiency
◦ however, most of the processing is done at a central unit

But “smart cameras”:
◦ directly perform highly sophisticated video analysis
◦ video sensing
◦ video processing
◦ communication
◦ designed as reconfigurable and flexible processing nodes with self-reconfiguration, self-monitoring, and self-diagnosis capabilities

Shift from a central to a distributed control surveillance system
◦ increases the surveillance system's functionality, availability, and autonomy
◦ can react autonomously to changes in the system's environment
◦ can detect events in the monitored scenes

A static surveillance system configuration is no longer feasible!

A scalable, embedded, high-performance multiprocessor platform consisting of:
◦ a network processor
◦ a variable number of digital signal processors (DSPs)

A commercial off-the-shelf software/hardware architecture was chosen to:
◦ support fast prototype development
◦ achieve flexibility and performance at a reasonable price

Sensing unit
◦ Monochrome CMOS image sensor
◦ delivers images with VGA resolution at up to 30 fps
◦ transfers images via a first-in, first-out (FIFO) memory to the PU

Processing unit (PU)
◦ Up to 10 Texas Instruments TMS320C64x DSPs
 can deliver an aggregate performance of up to 80 GIPS while keeping the power consumption low
◦ PCI bus couples the DSPs and connects them to the network processor

Communication unit
◦ network processor: Intel XScale IXP425
◦ establishes the connection between the processing and communication units
◦ controls internal and external communication
◦ currently supports two interfaces for IP-based external communication:
 wired Ethernet
 wireless Global System for Mobile Communications / General Packet Radio Service (GSM/GPRS)

DSP framework – runs on every DSP
◦ provides an abstraction of the hardware and communication channels
◦ supports dynamic loading and unloading of application tasks
◦ manages the DSP's on-chip and off-chip resources
◦ algorithms on different DSPs use the service management facilities to dynamically establish connections to each other (a rough sketch of this idea follows below)
◦ the DSP framework was built on Texas Instruments' DSP/BIOS operating system
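
To make the service-management idea concrete, here is a minimal C sketch of how a loaded task could publish itself under a name and how another task could look it up to establish a connection; all names and structures (dsp_task, service_publish, service_lookup) are invented for illustration and are not the actual DSP/BIOS-based framework API.

```c
/* Minimal sketch of the DSP framework's service-management idea.
 * All names here are illustrative assumptions, not the real API. */
#include <stdio.h>
#include <string.h>

#define MAX_SERVICES 16

/* Descriptor for a dynamically loaded application task. */
typedef struct {
    const char *name;      /* service name, e.g. "mpeg4_encode" */
    int         dsp_id;    /* DSP the task is loaded on         */
    size_t      mem_bytes; /* on-/off-chip memory it reserves   */
} dsp_task;

/* Tiny registry: tasks publish a name, other tasks connect by name. */
static dsp_task registry[MAX_SERVICES];
static int      n_services;

int service_publish(dsp_task t)
{
    if (n_services >= MAX_SERVICES)
        return -1;                    /* no registry slot left */
    registry[n_services++] = t;
    return 0;
}

const dsp_task *service_lookup(const char *name)
{
    for (int i = 0; i < n_services; ++i)
        if (strcmp(registry[i].name, name) == 0)
            return &registry[i];
    return NULL;                      /* service not (yet) loaded */
}

int main(void)
{
    /* A compression task on DSP 1 publishes itself ... */
    service_publish((dsp_task){ "mpeg4_encode", 1, 2 * 1024 * 1024 });

    /* ... and an analysis task elsewhere connects to it by name. */
    const dsp_task *peer = service_lookup("mpeg4_encode");
    if (peer)
        printf("connect to %s on DSP %d\n", peer->name, peer->dsp_id);
    return 0;
}
```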

SmartCam framework - runs on the network processor
◦ an abstraction of the DSPs to ensure the application layer's platform independence
◦ application layer uses the provided communication methods to exchange information (sketched below)
 internal messaging to the DSPs
 external IP-based communication
◦ supports application development through high-level interfaces to DSP algorithms and the DSP framework's functions

XScale processor runs standard Linux
◦ the only customization of the Linux kernel is the DSP kernel module
 the processor uses it to establish the connection to the DSPs via the PCI bus
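
As a rough sketch of how the application layer might use the framework's two communication paths without caring where a message goes, the following C fragment routes a message either to a DSP (in the real system via the DSP kernel module and the PCI bus) or to another camera over IP; the sc_message type and the send functions are hypothetical, not the actual SmartCam framework API.

```c
/* Sketch of the SmartCam framework's unified messaging idea.
 * Names (sc_message, sc_send, ...) are hypothetical. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

enum msg_dest { DEST_DSP, DEST_REMOTE_CAMERA };

typedef struct {
    enum msg_dest dest;
    int           dsp_id;         /* used if dest == DEST_DSP           */
    char          remote_ip[16];  /* used if dest == DEST_REMOTE_CAMERA */
    uint8_t       payload[256];
    size_t        len;
} sc_message;

/* Real framework: routed through the DSP kernel module over the PCI bus. */
static int send_to_dsp(const sc_message *m)
{
    printf("-> DSP %d: %zu bytes\n", m->dsp_id, m->len);
    return 0;
}

/* Real framework: an IP socket over Ethernet or GSM/GPRS. */
static int send_to_camera(const sc_message *m)
{
    printf("-> camera %s: %zu bytes\n", m->remote_ip, m->len);
    return 0;
}

/* The application layer sees a single send call; routing stays hidden. */
int sc_send(const sc_message *m)
{
    return (m->dest == DEST_DSP) ? send_to_dsp(m) : send_to_camera(m);
}

int main(void)
{
    sc_message cfg = { .dest = DEST_DSP, .dsp_id = 0, .len = 10 };
    memcpy(cfg.payload, "SET_FPS 25", 10);     /* internal message */
    sc_send(&cfg);

    sc_message hello = { .dest = DEST_REMOTE_CAMERA,
                         .remote_ip = "10.0.0.2", .len = 6 };
    memcpy(hello.payload, "STATUS", 6);        /* external IP message */
    sc_send(&hello);
    return 0;
}
```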



Use the smart cameras to implement a
distributed intelligent video surveillance
system (IVS)
Partition IVS into distributed logical groups
(surveillance clusters)
IVS
◦ requires an assignment of cameras to a specific
cluster
◦ dynamically and autonomously maps surveillance
tasks into individual cameras depending on their
resources and the system’s current state

Tasks are deployed onto cameras using a mobile agent system (MAS) built atop the SmartCam framework
Changes in the environment trigger a task migration

Quality of Service (QoS)
◦ parameters include frame rate, transfer delay, image resolution, and video-compression rate
◦ levels can change over time due to user interactions or changes in the monitored environment (so novel IVS systems must include dedicated QoS management mechanisms)

Power awareness
◦ the camera supports combined power and QoS management (PoQoS) for distributed IVS systems
◦ PoQoS dynamically configures the power and QoS level of the camera's hardware and software to adapt to user requests and changes in the environment (a toy sketch follows below)
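
As a toy illustration of the PoQoS idea (pick the best quality level that still fits the current power budget), here is a small C sketch; the levels, numbers, and function names are invented for illustration and do not reproduce the paper's actual PoQoS policy.

```c
/* Toy PoQoS illustration: choose the highest QoS level whose estimated
 * power draw fits the current budget. All values are invented. */
#include <stdio.h>

typedef struct {
    const char *name;
    int         fps;            /* frame rate                */
    int         width, height;  /* image resolution          */
    double      power_w;        /* estimated power draw in W */
} qos_level;

static const qos_level levels[] = {
    { "high",   30, 640, 480, 9.0 },
    { "medium", 15, 640, 480, 6.5 },
    { "low",    10, 320, 240, 4.0 },
};

/* Return the best level that fits the budget, or the lowest level if
 * even that one exceeds it. */
static const qos_level *poqos_select(double power_budget_w)
{
    const int n = (int)(sizeof levels / sizeof levels[0]);
    for (int i = 0; i < n; ++i)
        if (levels[i].power_w <= power_budget_w)
            return &levels[i];
    return &levels[n - 1];
}

int main(void)
{
    /* e.g. a user request or an environment change lowers the budget */
    const qos_level *q = poqos_select(7.0);
    printf("selected %s: %d fps @ %dx%d\n", q->name, q->fps, q->width, q->height);
    return 0;
}
```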


Commercial off-the-shelf hardware components were used to test and evaluate the video surveillance system
One camera consists of:
◦ a network processor
◦ several DSPs
◦ a CMOS image sensor

Platform: Intel IXDP425
Processor: IXP425 XScale
◦ 533 MHz
◦ 256 Mbytes of external memory
On-chip support:
◦ four PCI slots
◦ Ethernet access
◦ multiple serial ports
◦ PCI host controller

DSP platform
PCI boards plugged into the baseboard consist of:
◦ Ateme's network video development kits (NVDK)
◦ each NVDK board offers 264 Mbytes of memory accessible via two different DSP external memory interfaces
◦ Texas Instruments TMS320C6416 DSPs, 600 MHz

Eastman Kodak's monochrome sensor LM-9618 captures the images
◦ high dynamic range of up to 110 decibels at VGA resolution
◦ FIFO memory connects the sensor to one of the DSPs

Network processor: Linux (kernel 2.6.8.1)
◦ access to a broad variety of open-source software modules

The SmartCam framework executes atop Linux
◦ ensures interoperability with the DSPs

Java also runs atop Linux, supporting platform-wide applications

DSP operating system: DSP/BIOS real-time operating system
◦ the DSP framework runs atop it and serves as the counterpart of the SmartCam framework on the network processor







Mobile Agent System (MAS) supports autonomous operation of the surveillance tasks
Each task is encapsulated in a mobile agent, which can migrate between hosts
DSP agents consist of:
◦ a module that manages the agent's integration into its environment
◦ a DSP binary representing the agent's functionality
◦ an optional set of intermediate data
◦ a set of DSP resource requirements
The task-allocation mechanism requires these parameters to autonomously allocate surveillance tasks to smart cameras
SmartCam agents:
◦ perform status information and communication tasks
◦ are executed on the network processor and can access the DSPs
◦ don't include resource requirements or DSP binaries
Additional agents provide system functionality
◦ e.g. the task-allocation system
The system exploits mobile SmartCam agents to determine, in a distributed manner, how to optimally allocate surveillance tasks to the cluster's SmartCams (a toy sketch of the allocation idea follows below)
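
The following C sketch illustrates the kind of information a DSP agent might carry and how a cluster could use it to pick a camera with enough free resources; the field names and the simple first-fit loop are assumptions made for illustration, since the real system is a distributed, Java-based mobile agent system rather than a local search.

```c
/* Illustrative data a DSP agent could carry, plus a first-fit allocation.
 * Names and numbers are invented; the real allocation is distributed. */
#include <stdio.h>
#include <stddef.h>

typedef struct {
    const char *task_name;
    size_t      binary_size;   /* DSP binary carried by the agent  */
    size_t      state_size;    /* optional intermediate data       */
    int         mips_required; /* DSP resource requirement         */
    size_t      mem_required;  /* DSP resource requirement (bytes) */
} dsp_agent;

typedef struct {
    int    id;
    int    mips_free;
    size_t mem_free;
} smart_camera;

/* Pick the first camera with enough free DSP resources for the agent. */
static int allocate_task(const dsp_agent *a, const smart_camera *cams, int n)
{
    for (int i = 0; i < n; ++i)
        if (cams[i].mips_free >= a->mips_required &&
            cams[i].mem_free  >= a->mem_required)
            return cams[i].id;
    return -1;   /* no camera in the cluster can host the task */
}

int main(void)
{
    smart_camera cluster[] = {
        { 1, 1200, 16u << 20 },    /* camera 1: mostly busy     */
        { 2, 4800, 64u << 20 },    /* camera 2: lightly loaded  */
    };
    dsp_agent tracker = { "klt_tracker", 300u << 10, 8u << 10,
                          2000, 32u << 20 };

    printf("tracker allocated to camera %d\n",
           allocate_task(&tracker, cluster, 2));
    return 0;
}
```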





Two identical SmartCam prototypes
Integrated up to three additional PCs (Pentium III at 1 GHz, running Linux) to evaluate larger SmartCam networks
The complete SmartCam framework and the MAS could execute on the PCs without any modification
Used DIET Agents running under Java as the MAS and the JamVM Java virtual machine on the smart camera prototype
Compared the SmartCam prototype's Java performance with that of a standard PC
◦ the results showed that the interpreter-based JamVM is about 20 times slower than the Sun Java runtime environment (JRE) 1.4.2 on the PCs
 the native computing performance of a Pentium III PC and the SmartCam (XScale) differs only by a factor of two







Multicamera object-tracking application
The multicamera system instantiates only a single tracker (agent) task
The agent follows the tracked object, migrating to the SmartCam that should observe the object next
The tracking agent is based on a Kanade-Lucas-Tomasi feature tracker
Its main advantage is a short initialization time
◦ applicable for multicamera object tracking by mobile agents
◦ tracking agents control the handover process, using predefined migration regions
◦ when the tracked object enters a migration region, the tracker initiates handover to the next SmartCam (a toy sketch of this decision follows below)
◦ each migration region is assigned to one or more possible next SmartCams
◦ motion vectors help distinguish among several SmartCams assigned to the same migration region
◦ motion vectors check whether the object moves in the correct direction
◦ a master-slave approach handles the tracked-object handover
The tracking agent's migration between SmartCams takes up to 1 second
The task-allocation system's setup time is approximately 190 milliseconds
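
To make the handover decision concrete, here is a toy C sketch: when the tracked object lies inside a predefined migration region, the candidate next camera whose expected motion direction best matches the object's motion vector is chosen; the geometry, names, and two-candidate limit are invented for illustration and are not the paper's actual implementation.

```c
/* Toy handover decision using migration regions and motion vectors.
 * All names and geometry are invented for illustration. */
#include <stdio.h>

typedef struct { double x, y; } vec2;

typedef struct {
    double x0, y0, x1, y1;  /* region rectangle in image coordinates */
    int    next_cam[2];     /* candidate next SmartCams              */
    vec2   expected_dir[2]; /* expected motion direction per camera  */
} migration_region;

static int inside(const migration_region *r, vec2 p)
{
    return p.x >= r->x0 && p.x <= r->x1 && p.y >= r->y0 && p.y <= r->y1;
}

/* Return the candidate camera whose expected direction best matches the
 * object's motion vector, or -1 if the object is not in the region. */
static int handover_target(const migration_region *r, vec2 pos, vec2 motion)
{
    if (!inside(r, pos))
        return -1;
    double best = -1e9;
    int cam = -1;
    for (int i = 0; i < 2; ++i) {
        /* similarity via dot product with the expected direction */
        double dot = motion.x * r->expected_dir[i].x +
                     motion.y * r->expected_dir[i].y;
        if (dot > best) { best = dot; cam = r->next_cam[i]; }
    }
    return cam;
}

int main(void)
{
    migration_region exit_left = {
        0, 0, 80, 480,             /* strip along the left image border */
        { 3, 4 },                  /* cameras 3 and 4 cover that exit   */
        { { -1.0, 0.0 }, { -1.0, -1.0 } }
    };
    vec2 pos = { 40, 200 }, motion = { -5, -4 };
    printf("hand over to camera %d\n",
           handover_target(&exit_left, pos, motion));
    return 0;
}
```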

Keys to successful deployment of smart cameras are:
◦ the integration of sensing, computing, and communication in a small, power-aware embedded device
◦ the availability of high-level image/video processing algorithms or libraries for the embedded target processors (the DSPs)
◦ a lightweight software framework supporting glueless intra- and intercamera communication
◦ the availability of various system-level services, such as task mapping and QoS adaptation, to allow autonomous and dynamic operation of the overall multicamera system

System usage:
◦ traffic surveillance
◦ detection of stationary vehicles
◦ detection of wrong-way drivers
◦ computation of traffic statistics such as
 average speed
 lane occupancy
 vehicle classification

The approach is good considering they are using off-the-shelf products
◦ the memory footprint and power dissipation are higher than the design would require
◦ it is good for testing and research but not suitable for real-world deployments

The JamVM seems to be slowing down tracker migration (about 1 second)
◦ maybe try another virtual machine (e.g. Kaffe)

Migration times are rather long, partly because of the master-slave architecture, which also increases resource utilization since two or more trackers are active at the same time






"An Integrated Visualization of a Smart Camera Based Distributed Surveillance System" - Sven Fleck, Christian Vollrath, Florian Walter, Wolfgang Straßer, WSI/GRIS, University of Tübingen
"A Mobile Agent-Based System for Dynamic Task Allocation in Clusters of Embedded Smart Cameras" - Michael Bramberger, Bernhard Rinner, and Helmut Schwabach
"An Embedded Smart Camera on a Scalable Heterogeneous Multi-DSP System" - Michael Bramberger, Bernhard Rinner, and Helmut Schwabach
"Embedded Smart Cameras as Key Components in Reactive Sensor Systems" - Michael Bramberger, Bernhard Rinner, and Helmut Schwabach
"Decentralized Object Tracking in a Network of Embedded Smart Cameras" - M. Quaritsch, M. Kreuzthaler, B. Rinner, B. Strobl
"Autonomous Multicamera Tracking on Embedded Smart Cameras" - Markus Quaritsch, Markus Kreuzthaler, Bernhard Rinner, Horst Bischof, and Bernhard Strobl






1. W. Wolf, B. Ozer, and T. Lv, “Smart Cameras as Embedded
Systems,” Computer, Sept. 2002, pp. 48-53.
2. G.L. Foresti, C. Mahonen, and C.S. Regazzoni, Multimedia
Video-Based Surveillance Systems, Kluwer Academic Publishers,
2000.
3. M. Bramberger, B. Rinner, and H. Schwabach, “A Method for
Dynamic Allocation of Tasks in Clusters of Embedded Smart
Cameras,” Proc. Int’l Conf. Systems, Man and Cybernetics, IEEE
Press, 2005, pp. 2595-2600.
4. R. Steinmetz and K. Nahrstedt, Multimedia Systems, Springer,
2004.
5. A. Maier, B. Rinner, and H. Schwabach, “A Hierarchical
Approach for Energy-Aware Distributed Embedded Intelligent
Video Surveillance,” Proc. IEEE/IFIP Int’l Workshop Parallel and
Distributed Embedded Systems, IEEE Press, 2005, pp. 12-16.
6. J. Shi and C. Tomasi, “Good Features to Track,” Proc. IEEE Int’l
Conf. Computer Vision and Pattern Recognition, IEEE Press,
1994, pp. 593-600.