ViSEvAl: ViSualisation and EvAluation
http://team.inria.fr/stars/fr/2012/02/02/viseval-software/
Overview
What is evaluation?
Evaluation process
Metric definition
ViSEvAl
Description
Installation
Configuration
Functionalities
Evaluation process
General overview
Metric definition
Metric = distance + filter + criteria
Distance: associate detected and annotated objects
Spatial: compare bounding box areas
Temporal: compare time intervals
Filter: selects which objects to evaluate
Specific type, distance to the camera, ...
Criteria: how similar are the properties of detected and
annotated objects?
4 tasks: detection, classification, tracking,
event detection
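The distance/filter/criterion decomposition above can be sketched as follows. This is an illustrative stand-in, not the actual ViSEvAl plugin API: the names (BBox, dice_coefficient, type_filter, detection_criterion) and the 0.5 threshold are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BBox:
    # Hypothetical bounding box: top-left corner, width, height, object type.
    x: float
    y: float
    w: float
    h: float
    obj_type: str = "Person"

def dice_coefficient(a: BBox, b: BBox) -> float:
    """Spatial distance: 2*|A inter B| / (|A| + |B|) over bounding-box areas."""
    ix = max(0.0, min(a.x + a.w, b.x + b.w) - max(a.x, b.x))
    iy = max(0.0, min(a.y + a.h, b.y + b.h) - max(a.y, b.y))
    inter = ix * iy
    return 2.0 * inter / (a.w * a.h + b.w * b.h)

def type_filter(obj: BBox, wanted: str = "Person") -> bool:
    """Filter: select which objects enter the evaluation (here, by type)."""
    return obj.obj_type == wanted

def detection_criterion(det: BBox, gt: BBox, threshold: float = 0.5) -> bool:
    """Criterion: a detected/annotated pair matches if the distance passes a threshold."""
    return dice_coefficient(det, gt) >= threshold
```

Swapping the distance (e.g. a temporal interval overlap) or the filter (e.g. distance to the camera) yields the other metric variants without touching the criterion logic.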
ViSEvAl platform description
[Architecture diagram] The two tools, ViSEvAlGUI (interfaces and all
functionalities: synchronisation, display, ...) and ViSEvAlEvaluation,
are built on the ViSEvAl core library and its plugin interfaces:
Loading Video, Distance, Frame Metric, Temporal Metric, Event Metric,
Object Filter
ViSEvAl plugins 1/2
Loading video
ASF-Videos, Caviar-Images, JPEG-Images, Kinect-Images (hospital),
OpenCV-Videos (Vanaheim), PNG-Images
Distance
Bertozzi, Dice coefficient, overlapping (plus 3D variants)
Filter
CloseTo, FarFrom, Identity, TypeGroup
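As an illustration of what the distance-based filters decide, a CloseTo-style check might look like this; the threshold, coordinate convention and function name are assumptions, not the plugin's real parameters:

```python
import math

def close_to_camera(obj_pos, camera_pos, max_dist=5.0):
    """Keep an object only if its 3D distance to the camera is below
    max_dist (illustrative CloseTo-style filter; FarFrom would invert it)."""
    return math.dist(obj_pos, camera_pos) <= max_dist

# Identity would accept every object; TypeGroup would compare object types.
```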
ViSEvAl plugins 2/2
Criteria
Detection: M1.X
2 criteria (M1.1: area, M1.2: silhouette)
Classification: M2.X
2 criteria (M2.1: type, M2.2: sub-type)
Tracking: M3.X
6 criteria (M3.1: F2F, M3.2: persistence, M3.4:
tracking time, M3.5: confusion, M3.6, M3.7:
confusion + tracking time, M3.8: frame detection
accuracy)
Event: M4.X
4 criteria (M4.1, M4.2: begin and end time, M4.3:
common frame, M4.4: common time)
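The event criteria compare the time intervals of detected and annotated events. A sketch in the spirit of the common-time criterion (M4.4); the exact normalisation ViSEvAl uses is an assumption here:

```python
def common_time_ratio(det, gt):
    """Ratio of overlapping time to ground-truth duration for two
    (begin, end) event intervals; illustrative M4.4-style criterion."""
    begin = max(det[0], gt[0])
    end = min(det[1], gt[1])
    overlap = max(0.0, end - begin)
    return overlap / (gt[1] - gt[0])
```

A common-frame criterion (M4.3) would count shared frame indices instead of continuous time.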
ViSEvAl: inputs
A set of XML files
Detection: XML1 file -> sup platform
Recognised event: XML3 file -> sup platform
Ground truth: XGTF file -> ViPER tool
Time stamp file for time synchronisation: XML file ->
createTimeStampFile.sh script provided by ViSEvAl
ViSEvAl installation
Get the sources
sup svn repository
cd sup/evaluation/ViSEvAl/
Run install.sh at the root of the ViSEvAl folder
Dependencies:
Libraries: Qt4 (graphical user interface, plugin
management), GL and GLU (OpenGL 3D view), xerces-c
(XML parsing), OpenCV (video reading)
Tool: xsdcxx (automatically generates C++ classes for
reading XML files)
cd bin/appli; setenv LD_LIBRARY_PATH ../../lib
Run ./ViSEvAlGUI chu.conf
ViSEvAl folder organisation
src: appli, plugins (Cdistance, CeventMetric, CframeMetric,
CloadingVideoInterface, CobjectFilter, CTemporalMetric)
include: header files
install.sh, clean.sh
doc: documentation
lib: core library, plugins
scripts: createTimeStampFile.sh, makeVideoFile.sh, splitxml12-3file.sh
bin: ViSEvAlGUI, ViSEvAlEvaluation
tools: CaviarToViseval, QuasperToViseval
xsd: XML schemas
ViSEvAl: configuration file
Configuration file based on keyword-parameter pairs
SequenceLoadMethod "JPEG-Images" #"ASF-Videos"
SequenceLocation "0:../../example/CHU/Scenario_02.vid"
TrackingResult "0:../../example/CHU/Scenario_02_Global_XML1.xml"
EventResult "../../example/CHU/Scenario_02_Global_XML3.xml"
GroundTruth "0:../../example/CHU/gt_2011-11-15a_mp.xgtf"
XMLCamera "0:../../example/CHU/jai4.xml"
MetricTemporal "Mono:M3.4:M3.4:DiceCoefficient:0.5:TypeGroup"
MetricEvent "M4.2:M4.2.1:Duration:10"
ViSEvAl run trace
Mon, 11:15> ./ViSEvAlGUI
Load all the plugins
-----------------------------------Loading video interfaces:
ASF-Videos
Caviar-Images
JPEG-Images
Kinect-Images
OpenCV-Videos
PNG-Images
-----------------------------------Loading distance:
3DBertozzi
3DDiceCoefficient
3DOverlapping
Bertozzi
DiceCoefficient
Overlapping
-----------------------------------Loading object filter:
CloseTo
FarFrom
Identity
TypeGroup
-----------------------------------Loading frame metric:
M1.1
M1.2
M2.1
M2.2
M3.1
-----------------------------------Loading temporal metric:
M3.2
M3.4
M3.5
M3.6
M3.7
M3.8
-----------------------------------Loading event metric:
M4.1
M4.2
M4.3
M4.4
------------------------------------
ViSEvAl: two tools
ViSEvAlGUI
Graphical user interface
Visualise detection and ground truth on the images
User can easily select parameters (e.g. distance,
threshold,...)
Frame metric results are computed live
ViSEvAlEvaluation
Generate a .res file containing the results of the metrics
Frame, temporal and event metrics are computed
User can evaluate several experiments
The same configuration file is used by both tools
ViSEvAl: result file (.res)
camera: 0
Tracking result file: /user/bboulay/home/work/svnwork/sup/evaluation/ViSEvAl/example/vanaheim/res.xml1.xml
Fusion result file:
Event result file:
Ground truth file: /user/bboulay/home/work/svnwork/sup/evaluation/ViSEvAl/example/vanaheim/Tornelli-2011-01-28T07_00_01_groups.xgtf
Common frames with ground-truth:
Detection results:
7978 7979 7980 7981 7983 7984 7985
*****
====================================================
Metric M1.1.1
====================================================
Frame;Precision;Sensitivity 0;True Positive;False Positive;False Negative 0;Couples
8004;0.500000;1.000000;1;1;0;(100;170;0.737438)
8005;0.500000;1.000000;1;1;0;(100;170;0.721577)
8006;0.500000;1.000000;1;1;0;(100;170;0.706809)
8007;0.500000;1.000000;1;1;0;(100;170;0.713584)
====================================================
Final Results:
Global results:
Number of True Positives : 1789
Number of False Positives : 1597
Number of False Negatives 0: 2254
Precision (mean by frame) : 0.523071
Sensitivity 0 (mean by frame) : 0.477763
Precision (global) : 0.528352
Sensitivity 0 (global) : 0.442493
--------------------------------------------Results for GT Object 2
Number of True Positives : 0
Number of False Positives : 0
Number of False Negatives 0: 0
Precision (global) : 0.000000
Sensitivity 0 (global) : 0.000000
---------------------------------------------
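The global figures in the excerpt follow from the standard precision and sensitivity formulas applied to the listed counts, which can be checked directly:

```python
def precision(tp, fp):
    # Precision (global) = TP / (TP + FP)
    return tp / (tp + fp)

def sensitivity(tp, fn):
    # Sensitivity (global) = TP / (TP + FN)
    return tp / (tp + fn)

# Counts from the M1.1.1 global results above.
tp, fp, fn = 1789, 1597, 2254
print(round(precision(tp, fp), 6))    # 0.528352
print(round(sensitivity(tp, fn), 6))  # 0.442493
```

The "mean by frame" variants in the file average the per-frame ratios instead of pooling the counts, which is why the two pairs of numbers differ.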