TAXISAT PROJECT
Low Cost GNSS and Computer Vision based data fusion solution for driverless vehicles
Marc POLLINA
[email protected]
Outline
• Importance of ITS
• In-vehicle systems: Future Technologies
• System Architecture
• Results Analysis
• Conclusions
Importance of ITS
• The global market for ITS technologies is estimated to grow to €50BN by 2020.
• The automotive industry is one of the most innovative sectors. Its safety systems fall into three categories:
– Active: continuously monitor an aspect of the user, vehicle, environment or transport network, and alert the user to potential danger or intervene in the driving task to avoid it.
– Passive: crash mitigation or minimisation technologies that enhance the safety of the driver or other road users by minimising the severity of a crash.
– Combined active and passive systems (CAPS): monitor the environment, vehicle or driver for potential danger, then apply passive safety measures if a crash is deemed unavoidable.
GNSS Sensor in Urban Area
Example test case (GUIDE Laboratory, Toulouse)
[Figure: recorded trajectories. Blue: GNSS; green: reference (PPK + high-grade IMU).]
Future Technologies
• Sensor fusion is essential: no single positioning sensor covers all requirements and constraints.
• The combination of computer vision, 3D maps and GNSS technologies is fostering new solutions, not only for driving assistance but also for unmanned vehicles.
Future Technologies
• GNSS: new constellations and new frequencies
New GNSS satellite constellations, signals and the associated frequency diversity are stimulating innovations in user equipment design, leading to improved positioning capabilities.
• 3D maps: city mapping
3D city mapping has the potential to revolutionize positioning in challenging urban areas. Adding height information to street maps can aid GNSS positioning for land vehicle and pedestrian navigation, as sketched after this list.
• Computer vision: intelligent camera
The major new navigation sensor of the next decade could well be the camera. Visual odometry is a form of dead reckoning.
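A hedged sketch of how height information can aid GNSS: a "skyline" derived from a 3D city model gives, for each azimuth, the elevation of the highest building edge seen from the receiver, and satellites that do not clear it are rejected as likely non-line-of-sight. The SKYLINE table and satellite_visible helper below are invented illustrations, not TAXISAT code.

```python
# Skyline visibility test against a 3D city model (values invented):
# buildings to the south (azimuth 90-270 deg) mask up to 25 deg elevation.
SKYLINE = {az: 25.0 if 90 <= az < 270 else 5.0 for az in range(360)}

def satellite_visible(azimuth_deg, elevation_deg, skyline=SKYLINE):
    """Line-of-sight test: the satellite must clear the building skyline."""
    return elevation_deg > skyline[int(azimuth_deg) % 360]

# Keep only line-of-sight satellites before computing the position fix,
# screening out urban-canyon multipath/NLOS signals.
sats = [("G05", 140.0, 18.0), ("G12", 300.0, 42.0), ("G17", 200.0, 55.0)]
print([s for s in sats if satellite_visible(s[1], s[2])])  # G05 is masked
```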
Architecture
[Diagram: GNSS, 3D maps, computer vision and other sensors fused into a single position solution.]
Architecture
Traditional Sensors (cost/accuracy trade-off)
Odometers for:
- wheel speed
- front axle orientation
Gyros:
- optical
- MEMS
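As a minimal sketch of what these sensors provide, the dead-reckoning loop below integrates wheel-odometer speed and gyro yaw rate into a 2D pose; the model, names and 100 Hz step are illustrative assumptions, not the project's implementation.

```python
import math

# Dead reckoning from odometer + gyro: propagate a 2D pose at 100 Hz.
def propagate(pose, speed_mps, yaw_rate_radps, dt=0.01):
    x, y, heading = pose
    heading += yaw_rate_radps * dt           # gyro gives heading change
    x += speed_mps * math.cos(heading) * dt  # odometer gives distance
    y += speed_mps * math.sin(heading) * dt
    return (x, y, heading)

pose = (0.0, 0.0, 0.0)
for _ in range(100):          # 1 s at 5 m/s with a gentle 0.1 rad/s turn
    pose = propagate(pose, 5.0, 0.1)
print(pose)  # sensor biases make this drift, hence fusion with GNSS/vision
```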
Architecture
Position Sensors (cost/accuracy trade-off)
Trimble Bullet III: compact antenna
- low cost and good gain
u-blox LEA-6T: GPS/EGNOS receiver
- accurate, reliable
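For illustration, receivers of this class typically stream NMEA text. The simplified parser below decodes a made-up GGA sentence; it is not the TAXISAT decoding software and skips checksum handling.

```python
# Decode a (made-up) NMEA GGA sentence into latitude, longitude and fix quality.
def parse_gga(sentence):
    f = sentence.split(",")
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0   # ddmm.mm -> degrees
    if f[3] == "S":
        lat = -lat
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0   # dddmm.mm -> degrees
    if f[5] == "W":
        lon = -lon
    quality = int(f[6])  # 0 = no fix, 1 = GPS, 2 = differential (e.g. EGNOS)
    return lat, lon, quality

print(parse_gga("$GPGGA,120000.00,4331.20,N,00159.07,W,2,08,1.0,25.0,M,49.0,M,,*47"))
```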
Architecture
Computer Vision (cost/accuracy trade-off)
FLEA3 (Point Grey) stereo pair

SLAM
- Enhances the performance level compared to a usual INS
- Provides transversal displacements and estimates of velocity and orientation
- Matches a live map of the scene structure against each newly acquired image

FOLLOW THE LANE
- Improves safety, reliability and the possibility of 24/7 operation
- Extra feature derived from ADAS to assist the car's control loops continuously
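As a hedged illustration of the follow-the-lane idea, the OpenCV sketch below extracts lane-marking segments and turns them into a lateral-offset measurement for the control loops; the thresholds, region of interest and offset model are invented, not the project's detector.

```python
import cv2
import numpy as np

# Find lane-marking segments in the lower half of a frame and estimate the
# lateral offset of the markings from the image centre, in pixels.
def lane_offset_px(frame_bgr):
    h, w = frame_bgr.shape[:2]
    roi = frame_bgr[h // 2:, :]                # road occupies the lower half
    edges = cv2.Canny(cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY), 60, 180)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=20)
    if lines is None:
        return None                            # self-assessment: no lane found
    xs = [(x1 + x2) / 2 for x1, y1, x2, y2 in lines[:, 0]]
    return float(np.mean(xs)) - w / 2          # >0: markings right of centre

frame = cv2.imread("road.png")                 # placeholder test image
if frame is not None:
    print(lane_offset_px(frame))
```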
Architecture
EDAS Connection Module
Local server
- hosts the EDAS (EGNOS Data Access Service) client software (EDAS server connection software)
- filtering routine
3G communication
- link between the local server and the vehicle
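A minimal sketch of such a relay, with invented hosts, ports and framing: the local server reads the correction stream from its EDAS client on one socket and forwards it to the vehicle over the 3G link.

```python
import socket

EDAS_CLIENT = ("127.0.0.1", 7777)  # where the EDAS client software publishes
VEHICLE = ("0.0.0.0", 8888)        # port the vehicle connects to over 3G

def relay():
    # Bridge the EDAS correction stream to the vehicle's 3G connection.
    with socket.create_connection(EDAS_CLIENT) as edas, \
         socket.create_server(VEHICLE) as srv:
        car, _ = srv.accept()
        with car:
            while True:
                msg = edas.recv(4096)
                if not msg:
                    break          # EDAS stream closed
                # The filtering routine would drop or reshape messages here.
                car.sendall(msg)

if __name__ == "__main__":
    relay()
```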
Architecture
• Tight hybridization module composed of:
– an Inertial Navigation System (INS), which integrates the gyrometer/odometer data (100 Hz)
– a navigation filter, which updates and corrects the INS according to the measurements from the vision or GNSS modules when they are available and valid
• 3 platforms → time synchronisation of the measurements is required
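A minimal 1-D sketch of this hybridization, assuming a standard Kalman filter (all models and noise values invented, not the TAXISAT tuning): the state is propagated at 100 Hz from dead reckoning and corrected whenever a valid GNSS or vision position arrives.

```python
import numpy as np

dt = 0.01                                  # 100 Hz INS integration
F = np.array([[1, dt], [0, 1]])            # constant-velocity model
Q = np.diag([1e-4, 1e-3])                  # process noise
H = np.array([[1.0, 0.0]])                 # GNSS/vision measure position
R = np.array([[4.0]])                      # ~2 m measurement sigma

x = np.array([0.0, 5.0])                   # state: [position, velocity]
P = np.eye(2)

for k in range(500):                       # 5 s of driving
    x = F @ x                              # predict (dead reckoning)
    P = F @ P @ F.T + Q
    if k % 100 == 99:                      # a 1 Hz GNSS fix, when valid
        z = np.array([5.0 * (k + 1) * dt + np.random.randn() * 2.0])
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

print(x)  # fused position/velocity estimate after 5 s
```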
Real Time Scenario GeoPositioning
[Diagram: given the camera/vehicle position and orientation in real time (GNSS position device, orientation sensors) and a measured reference (x0,y0) - (lat0,lon0), real-world locations (lat, lon) are mapped to image pixels (x, y) through a known relation plus depth information, even in an a-priori unknown scenario with no geo-referenced information. This enables precise map building, future GIS hybridization capabilities, and predictive information usable by the control loops.]
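The "known relation" of the diagram can be illustrated with a standard pinhole camera model, sketched below; the intrinsics, pose, ground-plane assumption and degree-to-metre conversion are all illustrative choices, not the project's calibration.

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],     # fx, cx (invented intrinsics)
              [0.0, 800.0, 240.0],     # fy, cy
              [0.0, 0.0, 1.0]])

def geo_to_local(lat, lon, lat0, lon0):
    """Degrees -> metres near the reference, equirectangular approximation.
    Axes follow camera convention: x right (east), y down, z forward (north)."""
    m_per_deg = 111_320.0
    east = (lon - lon0) * m_per_deg * np.cos(np.radians(lat0))
    north = (lat - lat0) * m_per_deg
    return np.array([east, 0.0, north])  # point on the ground plane

def project(p_world, R=np.eye(3), t=np.array([0.0, 1.5, 0.0])):
    """The 'known relation': camera 1.5 m above ground, looking north."""
    p_cam = R @ p_world + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]              # perspective division -> pixels

pt = geo_to_local(43.3202, -1.98445, 43.3200, -1.9845)  # near San Sebastián
print(project(pt))   # pixel where this geo-point appears in the image
```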
Vision Sensor: FtL Results
Follow the Lane
• Tx: lateral translation in x
• Vx: linear velocity in x
• Wx: width of the lane
• dWx: rate of change of the lane width
• Self-assessment
• Active control of light conditions
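For illustration only, the sketch below mirrors this output state together with one possible self-assessment gate; the field names follow the slide, but the thresholds are invented.

```python
from dataclasses import dataclass

@dataclass
class FtlState:
    tx: float    # lateral translation in x (m)
    vx: float    # linear velocity in x (m/s)
    wx: float    # width of the lane (m)
    dwx: float   # rate of change of the lane width (m/s)

    def plausible(self) -> bool:
        """Self-assessment: reject physically implausible lane estimates
        before they reach the control loops (thresholds invented)."""
        return 2.0 <= self.wx <= 6.0 and abs(self.dwx) < 1.0

print(FtlState(tx=0.3, vx=-0.1, wx=3.5, dwx=0.02).plausible())  # True
```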
Vision Sensor: SLAM
SLAM (Simultaneous Localization and Mapping): visual odometry + mapping
• Visual odometry: estimation of the ego-motion (the 6D camera/vehicle pose) in real time
• Real-time 3D scene map generation
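The visual-odometry step can be illustrated with standard OpenCV building blocks, sketched below for the monocular case (the project's stereo pair additionally fixes the metric scale); the intrinsics and image files are placeholders.

```python
import cv2
import numpy as np

K = np.array([[718.0, 0.0, 607.0],
              [0.0, 718.0, 185.0],
              [0.0, 0.0, 1.0]])           # KITTI-like intrinsics (placeholder)

img0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
assert img0 is not None and img1 is not None, "supply two consecutive frames"

# Match ORB features between consecutive frames.
orb = cv2.ORB_create(2000)
kp0, des0 = orb.detectAndCompute(img0, None)
kp1, des1 = orb.detectAndCompute(img1, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des0, des1)
pts0 = np.float32([kp0[m.queryIdx].pt for m in matches])
pts1 = np.float32([kp1[m.trainIdx].pt for m in matches])

# RANSAC on the essential matrix rejects outlier matches, then the relative
# rotation R and (unit-scale) translation t of the camera are recovered.
E, mask = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=mask)
print(R, t)
```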
Evaluation
FtL evaluation
• Recorded video sequences: 337 minutes
SLAM module: two-step evaluation
• On a laboratory computer, using the KITTI odometry evaluation dataset with ground truth
– 22 sequences of images recorded with a stereo pair of cameras embedded in a car
• In the field, in San Sebastián
– running predefined paths
Evaluation
Accuracy and precision of the odometry
• Maximum translation error: 0.29 %
• Rotational error: 0.0122 deg/m
• Runtime: 9.0 ms
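For context, the KITTI odometry benchmark reports exactly these two quantities: translation error as a percentage of distance travelled and rotation error in degrees per metre, computed over trajectory segments. The sketch below shows the idea under simplifying assumptions (straight-line segment length, 4x4 homogeneous poses); it is not the official devkit.

```python
import numpy as np

def segment_errors(gt_poses, est_poses, first, last):
    """KITTI-style relative errors over one segment of the trajectory."""
    seg_len = np.linalg.norm(gt_poses[last][:3, 3] - gt_poses[first][:3, 3])
    rel_gt = np.linalg.inv(gt_poses[first]) @ gt_poses[last]
    rel_est = np.linalg.inv(est_poses[first]) @ est_poses[last]
    err = np.linalg.inv(rel_gt) @ rel_est                    # residual motion
    t_err = np.linalg.norm(err[:3, 3]) / seg_len * 100.0     # percent
    angle = np.arccos(np.clip((np.trace(err[:3, :3]) - 1) / 2, -1, 1))
    return t_err, np.degrees(angle) / seg_len                # %, deg/m

# Toy check: a 0.3 m error over a straight 100 m segment -> 0.3 %, 0 deg/m.
I = np.eye(4)
T = np.eye(4); T[:3, 3] = [100.0, 0.0, 0.0]                  # ground truth
E_ = np.eye(4); E_[:3, 3] = [100.3, 0.0, 0.0]                # estimate
print(segment_errors([I, T], [I, E_], 0, 1))
```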
Conclusions
– Computer vision is a key sensor for enabling autonomous driving.
– It enables autonomous or semi-autonomous driving even in situations where the GNSS signal is unreliable or not available at all (e.g. indoors, in tunnels, under dense vegetation).
– The vehicle's position stays known even when no GNSS reception is available.
– Position precision and reliability improve considerably compared to GNSS-only solutions.
– Availability improves compared to GNSS-only solutions: SLAM is possible 24/7, while GNSS reception might be unreliable or not available at all for several minutes.
– A map is created in real time, and all the points of an image are geo-located in real time.
THANK YOU!
Marc POLLINA
[email protected]