
Geometric Representations & Collision Detection
Kris Hauser
I400/B659: Intelligent Robotics
Spring 2014
3D models in robotics
• Design
• Simulation
• Robot collision detection (i.e. prediction)
• Proximity calculation
• Map building
• Object recognition
• Grasp planning
• Etc.
Common Representations
• Primitives
• Raw data
• Point clouds
• Depth image
• Polygon soup
• Surfaces
• Polygon mesh
• Parametric curves
• Subdivision surfaces
• Implicit surface
• Volumes
• Voxels
• Distance transforms
• Quad/octree
Aspects to consider
• How accurately does the geometry need to be represented?
• What operations need to be performed on the geometry?
(Recognition? Matching? Simulation? Collision detection?
Visualization? Distance computation?) How quickly? How
accurately?
• Storage and transmission limitations?
• How easily / reliably / accurately / quickly can the source data
(e.g., raw sensor data, CAD models) be converted to the
desired format?
Visualization
• Primitives
• Raw data
• Point clouds
• Depth image
• Polygon soup
• Surfaces
• Polygon mesh
• Parametric curves
• Subdivision surfaces
• Implicit surface
Easy to convert to polygons
• Volumes
• Voxels
• Distance transforms
• Quad/octree
Convert to polygons (somewhat expensive)
Use specialized rasterization techniques
Easy Conversions
• From primitives/surfaces to:
• Polygon soups: discretization
• Point clouds: sampling
• Depth images: rasterization
• From volumes to surfaces:
• Simple: output a box for each occupied cell (coarse)
• Marching cubes: walk along the volume until a surface is hit,
output a piece of the surface in the volume (accurate)
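
As a concrete illustration of the volume-to-surface direction, the following sketch extracts a triangle mesh from a signed-distance volume with marching cubes. The use of scikit-image (skimage.measure.marching_cubes) and the toy sphere volume are assumptions for the example, not part of the lecture.

# Hedged sketch: volume-to-mesh conversion via marching cubes (assumes scikit-image).
import numpy as np
from skimage import measure

# Toy volume: signed distance to a sphere of radius 8, sampled on a 32^3 grid.
x, y, z = np.mgrid[-16:16, -16:16, -16:16]
volume = np.sqrt(x**2 + y**2 + z**2) - 8.0

# Extract the zero level set as a polygon mesh (vertex array + triangle index array).
verts, faces, normals, values = measure.marching_cubes(volume, level=0.0)
print(verts.shape, faces.shape)
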
Harder Conversions
• From point clouds / soups to surfaces / volumes: a challenge
• Fill cells that contain points (coarse)
• Implicit function fitting
• Silhouette carving
• Lots of other methods… no “perfect” way to deal with missing data
• From surfaces to volumes:
• Fill cells that contain surface (coarse)
• Euclidean distance transform: wavefront propagation from surface cells (closed surfaces, accurate)
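
The distance-transform conversion just mentioned can be sketched as follows; this uses SciPy's exact Euclidean distance transform rather than explicit wavefront propagation, and the toy occupancy grid is an assumption for the example.

# Hedged sketch: Euclidean distance transform of a voxel grid (assumes SciPy).
import numpy as np
from scipy import ndimage

# Toy occupancy grid: 1 = free space, 0 = surface/occupied cells.
grid = np.ones((64, 64, 64))
grid[30:34, 30:34, 30:34] = 0          # a small occupied block

# Distance (in cells) from every voxel to the nearest occupied cell.
dist = ndimage.distance_transform_edt(grid)
print(dist.max())
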
Resolution issues
• Accuracy, space, & computation complexity determined by
resolution h
• Suppose object size W
• Point clouds, polygon meshes: O((W/h)^2) space
• Voxels: O((W/h)^3) space
• Downsampling
• Mesh simplification
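• Worked example: W = 1 m at h = 1 mm gives (W/h)^2 = 10^6 samples for a surface representation but (W/h)^3 = 10^9 cells for a voxel grid, which is why downsampling or mesh simplification is often needed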
Geometric queries
• Collision detection:
• When A and B are at hypothetical poses T_A and T_B, would they collide?
• Distance calculation:
• When A and B are at hypothetical poses T_A and T_B, how far apart are they?
• Time of first contact:
• When A and B move along paths T_A(t) and T_B(t), at what time do they first collide?
Geometric primitives
• Points, Rays, Lines, Segments, Triangles, Spheres, Ellipses,
Boxes
• Collision detection and distance computation between primitives are fast, O(1) operations
• Ex: segment – sphere collision detection / distance calculation
[Figure: segment with endpoints a and b, and a sphere with center c and radius r]
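
A minimal sketch of the segment-sphere query, assuming the segment has endpoints a, b and the sphere has center c and radius r as in the figure; the function names are illustrative.

# Hedged sketch: distance and collision test between a segment and a sphere.
import numpy as np

def segment_sphere_distance(a, b, c, r):
    """Signed distance between segment ab and the sphere (center c, radius r).
    Zero or negative means the segment touches or penetrates the sphere."""
    a, b, c = map(np.asarray, (a, b, c))
    ab = b - a
    denom = float(np.dot(ab, ab))
    # Parameter of the closest point on the line through a, b, clamped to the segment.
    t = 0.0 if denom == 0.0 else float(np.clip(np.dot(c - a, ab) / denom, 0.0, 1.0))
    closest = a + t * ab
    return float(np.linalg.norm(c - closest)) - r

def segment_sphere_collides(a, b, c, r):
    return segment_sphere_distance(a, b, c, r) <= 0.0

print(segment_sphere_distance([0, 0, 0], [1, 0, 0], [0.5, 2.0, 0.0], 1.0))  # 1.0
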
Managing large scenes
• O(n^2) pairs of objects – how to check collisions?
Collision Detection Methods
• Many different methods
• In particular:
• Grid method: good for many simple moving objects of about the
same size (e.g., many moving discs with similar radii)
• Closest-feature tracking: good for moving polyhedral objects
• Bounding Volume Hierarchy (BVH) method: good for few moving
objects with complex and diverse geometry
Grid Method
[Figure: regular grid with bin size d]
 Subdivide space into a regular grid of cubic (3D) or square (2D) bins
 Index each object in a bin
Grid Method
Running time is proportional to
number of moving objects
Useful also to compute pairs of
objects within some distance (vision,
sound, …)
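
A hedged sketch of the grid method for 2D discs, assuming the bin size d is at least the largest disc diameter so only the 3x3 neighborhood of bins must be examined; the function names and the (x, y, r) object layout are illustrative.

# Hedged sketch: uniform-grid broad phase for many moving discs.
from collections import defaultdict
import math

def candidate_pairs(objects, d):
    """objects: list of (x, y, r) discs; d: bin size (at least the largest diameter)."""
    grid = defaultdict(list)
    for i, (x, y, r) in enumerate(objects):
        grid[(math.floor(x / d), math.floor(y / d))].append(i)
    pairs = set()
    for i, (x, y, r) in enumerate(objects):
        cx, cy = math.floor(x / d), math.floor(y / d)
        # Only the object's own bin and the 8 neighboring bins can contain contacts.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid.get((cx + dx, cy + dy), []):
                    if j > i:
                        pairs.add((i, j))
    return pairs

def colliding_pairs(objects, d):
    out = []
    for i, j in candidate_pairs(objects, d):
        xi, yi, ri = objects[i]
        xj, yj, rj = objects[j]
        if (xi - xj) ** 2 + (yi - yj) ** 2 <= (ri + rj) ** 2:  # exact disc-disc test
            out.append((i, j))
    return out

Since each object touches only a constant number of bins, and each bin holds a bounded number of similar-size objects, the expected work grows linearly with the number of moving objects, matching the running-time claim above.
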
Closest-Feature Tracking
(M. Lin and J. Canny. A Fast Algorithm for Incremental Distance Calculation. Proc. IEEE Int. Conf. on
Robotics and Automation, 1991)
 The closest pair of features (vertex, edge, face) between two polyhedral objects is computed at the objects’ start configurations
 During motion, this pair is updated at each small increment of the motion
 Efficiency derives from two observations:
 The pair of closest features changes relatively infrequently
 When it changes, the new closest features will usually be on a boundary of the previous closest features
Closest-Feature Test for Vertex-Vertex
[Figure: a vertex on each object, used in the vertex-vertex closest-feature test]
Application: Detecting Self-Collision
in Humanoid Robots
(J. Kuffner et al. Self-Collision Detection and Prevention for Humanoid Robots. Proc. IEEE Int. Conf. on Robotics and Automation, 2002)
Bounding Volume Hierarchy
Method
BVH with spheres:
S. Quinlan. Efficient Distance Computation Between Non-Convex Objects. Proc. IEEE
Int. Conf. on Robotics and Automation, 1994.
BVH with Oriented Bounding Boxes:
S. Gottschalk, M. Lin, and D. Manocha. OBB-Tree: A Hierarchical Structure for Rapid
Interference Detection. Proc. ACM SIGGRAPH '96, 1996.
Combination of BVH and feature-tracking:
S.A. Ehmann and M.C. Lin. Accurate and Fast Proximity Queries Between Polyhedra Using Convex Surface Decomposition. Proc. 2001 Eurographics, Vol. 20, No. 3, pp. 500-510, 2001.
Adaptive bisection in dynamic collision checking:
F. Schwarzer, M. Saha, J.C. Latombe. Adaptive Dynamic Collision Checking for Single
and Multiple Articulated Robots in Complex Environments, manuscript, 2003.
Bounding Volume Hierarchy Method
 Enclose objects into bounding volumes (spheres or boxes)
 Check the bounding volumes first
 Decompose an object into two
 Proceed hierarchically
Bounding Volume Hierarchy
Method
• BVH is pre-computed for each object
BVH in 3D
Collision Detection
[Figure: two objects, each described by a precomputed BVH with root A, children B and C, and leaves D, E, F, G]
Two objects described by their precomputed BVHs
Collision Detection
[Figure: animation of the search tree of BV pairs, rooted at (A,A) and expanded to pairs such as (B,B), (B,C), (C,B), (C,C) and leaf pairs such as (G,D) and (G,E); pairs whose bounding volumes do not overlap are pruned]
If two leaves of the BVHs overlap (here, G and D), check their contents for collision
Variant
[Figure: an alternative search tree that pairs each child of one BVH with the root of the other, e.g. (B,A) and (C,A), before descending into the second hierarchy]
Collision Detection
• Pruning discards subsets of the two objects that are separated
by the BVs
• Each path is followed until pruning or until two leaves overlap
• When two leaves overlap, their contents are tested for overlap
Search Strategy and Heuristics
 If there is no collision, all paths must eventually be
followed down to pruning or a leaf node
 But if there is a collision, it is desirable to detect it as quickly as possible
 ⇒ Greedy best-first search strategy with f(N) = d / (r_X + r_Y)
[Expand the node XY with the largest relative overlap (most likely to contain a collision)]
[Figure: BVs X and Y with radii r_X and r_Y, separated by distance d]
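
A hedged sketch of this best-first strategy, not the lecture's code: pairs of BVs are kept in a priority queue ordered by f = d / (r_X + r_Y), where d is taken here to be the signed distance between the two bounding volumes (negative when they overlap); that reading of d, and the node interface bv.is_leaf / bv.children / bv.radius / bv.distance / bv.leaves_collide, are assumptions for illustration.

# Hedged sketch: greedy best-first traversal of two BVHs using a priority queue.
import heapq
import itertools

def best_first_collision(rootX, rootY, bv):
    tie = itertools.count()                 # tie-breaker so BV nodes are never compared
    def f(x, y):
        return bv.distance(x, y) / (bv.radius(x) + bv.radius(y))
    heap = [(f(rootX, rootY), next(tie), rootX, rootY)]
    while heap:
        _, _, x, y = heapq.heappop(heap)
        if bv.distance(x, y) > 0:
            continue                        # the BVs are separated: prune this pair
        if bv.is_leaf(x) and bv.is_leaf(y):
            if bv.leaves_collide(x, y):     # exact test on the leaves' contents
                return True
            continue
        if bv.is_leaf(x) or (not bv.is_leaf(y) and bv.radius(y) > bv.radius(x)):
            x, y = y, x                     # expand the larger, non-leaf node
        for child in bv.children(x):
            heapq.heappush(heap, (f(child, y), next(tie), child, y))
    return False
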
Recursive (Depth-First)
Collision Detection Algorithm
Test(A,B)
1. If A and B do not overlap, then return 1
2. If A and B are both leaves, then return 0 if their contents overlap and 1 otherwise
3. Switch A and B if A is a leaf, or if B is bigger and not a leaf
4. Set A1 and A2 to be A’s children
5. If Test(A1,B) = 1 then return Test(A2,B) else return 0
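
A hedged Python rendering of Test(A,B) above for a binary BVH whose nodes are bounding spheres; the SphereNode layout and the leaf_overlap callback are illustrative assumptions, and the return convention follows the pseudocode (1 = no collision, 0 = collision).

# Hedged sketch: recursive depth-first BVH collision test with sphere BVs.
import math
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class SphereNode:
    center: tuple                                   # bounding-sphere center
    radius: float                                   # bounding-sphere radius
    children: List["SphereNode"] = field(default_factory=list)  # 0 or 2 children
    contents: Any = None                            # leaf geometry (e.g. a triangle)

def bv_overlap(A, B):
    return math.dist(A.center, B.center) <= A.radius + B.radius

def test(A, B, leaf_overlap):
    if not bv_overlap(A, B):                        # 1. BVs separated: nothing below collides
        return 1
    if not A.children and not B.children:           # 2. two leaves: exact content test
        return 0 if leaf_overlap(A.contents, B.contents) else 1
    if not A.children or (B.children and B.radius > A.radius):
        A, B = B, A                                 # 3. descend into the bigger non-leaf node
    A1, A2 = A.children                             # 4.
    return test(A2, B, leaf_overlap) if test(A1, B, leaf_overlap) == 1 else 0  # 5.
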
Performance
• Several thousand collision checks per second for two three-dimensional objects, each described by 500,000 triangles, on a 1-GHz PC
Distance Computation
[Figure: BV pairs whose minimum distance exceeds the current upper bound M are pruned]
Greedy Distance Computation
M (upper bound on distance) is initialized to infinity
Greedy-Distance(A,B,M)
1. If min-dist(A,B) > M, then return M
2. If A and B are both leaves, then return distance between their contents
3. Switch A and B if A is a leaf, or if B is bigger and not a leaf
4. Set A1 and A2 to be A’s children
5. M ← min(max-dist(A1,B), max-dist(A2,B), M)
6. d1 ← Greedy-Distance(A1,B,M)
7. d2 ← Greedy-Distance(A2,B,M)
8. Return min(d1,d2)
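
A hedged sketch of Greedy-Distance for the same sphere-tree layout (SphereNode) used in the Test(A,B) sketch above; min-dist and max-dist are the conservative sphere-to-sphere bounds, and leaf_dist is an assumed exact leaf-to-leaf distance routine.

# Hedged sketch: greedy BVH distance computation with sphere BVs.
import math

def min_dist(A, B):       # lower bound on the distance between the enclosed geometry
    return math.dist(A.center, B.center) - A.radius - B.radius

def max_dist(A, B):       # upper bound on the distance between the enclosed geometry
    return math.dist(A.center, B.center) + A.radius + B.radius

def greedy_distance(A, B, leaf_dist, M=math.inf):
    if min_dist(A, B) > M:                                  # 1. prune against the bound M
        return M
    if not A.children and not B.children:                   # 2. exact leaf-leaf distance
        return leaf_dist(A.contents, B.contents)
    if not A.children or (B.children and B.radius > A.radius):
        A, B = B, A                                         # 3. descend into the bigger node
    A1, A2 = A.children                                     # 4.
    M = min(max_dist(A1, B), max_dist(A2, B), M)            # 5. tighten the upper bound
    d1 = greedy_distance(A1, B, leaf_dist, M)               # 6.
    d2 = greedy_distance(A2, B, leaf_dist, M)               # 7.
    return min(d1, d2)                                      # 8.
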
Approximate Distance
M (upper bound on distance) is initialized to infinity
Approx-Greedy-Distance(A,B,M,a)
1. If (1+a)·min-dist(A,B) > M, then return M
2. If A and B are both leaves, then return distance between their contents
3. Switch A and B if A is a leaf, or if B is bigger and not a leaf
4. Set A1 and A2 to be A’s children
5. M ← min(max-dist(A1,B), max-dist(A2,B), M)
6. d1 ← Approx-Greedy-Distance(A1,B,M,a)
7. d2 ← Approx-Greedy-Distance(A2,B,M,a)
8. Return min(d1,d2)
Desirable Properties of
BVs and BVHs
BVs:
• Tightness
• Efficient testing
• Invariance
BVH:
 Separation
 Balanced tree
Spheres
• Invariant
• Efficient to test
• But tight?
Axis-Aligned Bounding Box (AABB)
 Not invariant
 Efficient to test
 Not tight
Oriented Bounding Box (OBB)
 Invariant
 Less efficient to test
 Tight
Comparison of BVs
              Sphere   AABB   OBB
Tightness       -       --     +
Testing         +       +      o
Invariance     yes      no    yes
No type of BV is optimal for all situations
Desirable Properties of
BVs and BVHs
BVs:
• Tightness
• Efficient testing
• Invariance
BVH:
 Separation
 Balanced tree
Construction of a BVH
• Top-down construction
• At each step, create the two children of a BV
• Example:
For OBB, split longest side at midpoint
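
A hedged sketch of this top-down construction; the slide splits an OBB along its longest side, while for brevity this sketch uses axis-aligned boxes and splits the point set at the midpoint of the box's longest axis. The AABBNode and build_bvh names are illustrative.

# Hedged sketch: top-down BVH construction, splitting the longest box side at its midpoint.
import numpy as np
from dataclasses import dataclass
from typing import Optional

@dataclass
class AABBNode:
    lo: np.ndarray                          # minimum corner of the bounding box
    hi: np.ndarray                          # maximum corner of the bounding box
    left: Optional["AABBNode"] = None
    right: Optional["AABBNode"] = None
    points: Optional[np.ndarray] = None     # geometry stored only at leaves

def build_bvh(points, leaf_size=8):
    lo, hi = points.min(axis=0), points.max(axis=0)
    node = AABBNode(lo, hi)
    if len(points) <= leaf_size:
        node.points = points
        return node
    axis = int(np.argmax(hi - lo))          # longest side of the box
    mid = 0.5 * (lo[axis] + hi[axis])       # split it at the midpoint
    mask = points[:, axis] < mid
    if mask.all() or not mask.any():        # degenerate split: fall back to a median split
        order = np.argsort(points[:, axis])
        mask = np.zeros(len(points), dtype=bool)
        mask[order[: len(points) // 2]] = True
    node.left = build_bvh(points[mask], leaf_size)
    node.right = build_bvh(points[~mask], leaf_size)
    return node

# Usage: tree = build_bvh(np.random.default_rng(0).random((1000, 3)))
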
Computation of an OBB
[Gottschalk, Lin, and Manocha, 96]
 N points a_i = (x_i, y_i, z_i)^T, i = 1, …, N
 SVD of A = (a_1 a_2 … a_N):
 A = U D V^T, where
 D = diag(s_1, s_2, s_3) such that s_1 ≥ s_2 ≥ s_3 ≥ 0
 U is a 3x3 rotation matrix that defines the principal axes of variance of the a_i’s
 The OBB’s directions are given by the rotation described by matrix U
 The OBB is defined by the max and min coordinates of the a_i’s along these directions
 Possible improvements: use vertices of the convex hull of the a_i’s, or a dense uniform sampling of the convex hull
[Figure: point set with world axes x, y and OBB axes X, Y obtained from the rotation U]
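
A minimal NumPy sketch of this computation: the SVD of the centered point matrix gives the rotation U, and the box extents come from the min/max coordinates of the points in U's frame. Centering the points first, and the compute_obb name, are assumptions made here for illustration.

# Hedged sketch: OBB from the SVD of the (centered) point matrix.
import numpy as np

def compute_obb(points):
    """points: (N, 3) array. Returns (center, U, half_extents)."""
    mean = points.mean(axis=0)
    A = (points - mean).T                             # 3 x N matrix of centered points
    U, s, Vt = np.linalg.svd(A, full_matrices=False)  # A = U diag(s) Vt, s1 >= s2 >= s3 >= 0
    # Note: U may be a reflection (det = -1); a column sign flip would make it a
    # proper rotation, omitted here for brevity.
    local = (points - mean) @ U                       # coordinates along the OBB directions
    lo, hi = local.min(axis=0), local.max(axis=0)
    center = mean + U @ ((lo + hi) / 2.0)             # box center back in world coordinates
    return center, U, (hi - lo) / 2.0

# Toy usage: an elongated point cloud.
pts = np.random.default_rng(0).normal(size=(500, 3)) * np.array([5.0, 1.0, 0.2])
center, U, half_extents = compute_obb(pts)
print(half_extents)                                   # roughly ordered from longest to shortest axis
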