Transcript PPT
Approximate Nearest Subspace Search with Applications to Pattern Recognition
Ronen Basri, Tal Hassner (Weizmann Institute), Lihi Zelnik-Manor (Caltech)

Subspaces in Computer Vision
• Illumination, faces (Basri & Jacobs, PAMI'03)
• Objects, viewpoint (Nayar et al., IUW'96)
• Motion, dynamic textures (Zelnik-Manor & Irani, PAMI'06)
• …

Nearest Subspace Search
• Query: which of the database subspaces is nearest?
• Database: n subspaces in d dimensions, each of dimension k.
• Sequential search: O(ndk). Too slow!! Is there a sublinear solution?

A Related Problem: Nearest Neighbor Search
• Database: n points in d dimensions. Sequential search: O(nd).
• There is a sublinear solution: approximate NN.
 – Tree search (kd-trees)
 – Locality Sensitive Hashing: return a point within (1+ε)r when the true nearest neighbor is within r
• Query: logarithmic. Preprocessing: O(dn). Fast!!
• Is it possible to speed up nearest subspace search? Existing point-based methods (LSH, tree search) cannot be applied directly.

Our Suggested Approach
• Reduction of subspaces to points.
• Works for both linear and affine subspaces.
• (Plot: run time vs. database size, our method vs. sequential search.)

Problem Definition
• S: a subspace of dimension k; q: a query point.
• Find independent mappings u = f(S) and v = g(q) such that ‖u − v‖² is a linear, and hence monotonic, function of dist²(q,S).
• Then apply standard point ANN to u and v.

Finding a Reduction
• dist²(q,S) = ‖SS^T q − q‖².
• Take u = vec(SS^T − I) and v = vec(qq^T). Then ‖u − v‖² = ‖u‖² + ‖v‖² + 2·dist²(q,S).
• Feeling lucky? Are the extra terms constants? We are lucky!! ‖u‖² = d − k is a constant, and ‖v‖² = ‖q‖⁴ depends only on the query.
• (A derivation and a numeric check of this identity are given after the summary.)

Basic Reduction
• u = vec(SS^T − I), v = vec(qq^T).
• ‖u − v‖² = α·dist²(q,S) + β. Want: minimize β/α.

Geometry of the Basic Reduction
• Query: v = vec(qq^T) lies on a cone, with ‖v‖² = ‖q‖⁴.
• Database: u = vec(SS^T − I) lies on a sphere of radius √(d − k) intersected with a hyperplane.

Improving the Reduction / Final Reduction
• Rescale the basic maps u = vec(SS^T − I) and v = vec(qq^T) by constants chosen to minimize the ratio β/α of the additive to the multiplicative constant.

Can We Do Better?
• If the additive constant were β = 0, then any query q lying on a subspace S (dist(q,S) = 0) would have to satisfy u = v; every point of S would then map to the same point f(S), so the mapping would be trivial.
• The additive constant is therefore inherent.

Final Mapping Geometry
• (Figure: geometry of the final mapping.)

ANS Complexities
• Preprocessing: O(nkd²) — linear in n.
• Query: O(d²) + T_ANN(n, d²) — logarithmic in n.

Dimensionality May Be Large
• Embedding in d² dimensions.
• Might need to use a small ε.
• Current solution:
 – Use random projections (Johnson-Lindenstrauss lemma).
 – Repeat several times and select the nearest.
• (A code sketch of this pipeline is given after the summary.)

Synthetic Data
• (Plots: run time vs. dimension with n = 5000, k = 4, and run time vs. database size with d = 60, k = 4; our method vs. sequential search.)

Face Recognition (YaleB)
• Database: faces under 64 illuminations each, modeled by k = 9 dimensional subspaces.
• Query: a face under a new illumination.

Face Recognition Result
• (Figure: queries with their true NS and approximate NS matches; wrong matches and wrong-person errors are marked.)

Retiling with Patches
• Approximate a wanted image by retiling it with patches retrieved from a patch database using point ANN (figure: wanted image, query, patch database, approximated image).

Retiling with Subspaces
• Approximate the wanted image by querying a subspace database with ANS (figure: wanted image, query, subspace database, approximated image).
• Run time: patches + ANN ~0.6 sec; subspaces + ANS ~1.2 sec.

Summary
• Fast, approximate nearest subspace search.
• Reduction to point ANN.
• Useful applications in computer vision.
• Disadvantages:
 – Embedding in d²
 – Additive constant
• Other methods? Additional applications? A lot more to be done…

THANK YOU
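
The "Finding a Reduction" slide states the key identity without its derivation. The following short derivation, using the notation above (S a d×k matrix with orthonormal columns spanning the subspace, q ∈ R^d, and ⟨·,·⟩ the inner product of the vectorized matrices), fills in the missing step:

```latex
\begin{aligned}
u &= \operatorname{vec}(SS^T - I), \qquad v = \operatorname{vec}(qq^T),\\
\langle u, v\rangle &= \operatorname{tr}\big((SS^T - I)\,qq^T\big)
                    = q^T SS^T q - q^T q = -\operatorname{dist}^2(q,S),\\
\|u-v\|^2 &= \|u\|^2 + \|v\|^2 - 2\langle u, v\rangle
           = (d-k) + \|q\|^4 + 2\operatorname{dist}^2(q,S),
\end{aligned}
```

using that SS^T − I has eigenvalue −1 with multiplicity d − k and eigenvalue 0 with multiplicity k (so ‖SS^T − I‖_F² = d − k), and that ‖qq^T‖_F² = ‖q‖⁴.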
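A minimal numeric check of the same identity in NumPy. The values d = 60, k = 4 are simply the ones quoted in the synthetic experiment above; the code is an illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 60, 4  # ambient dimension and subspace dimension (values from the synthetic experiment)

# A random k-dimensional linear subspace of R^d, represented by an orthonormal basis S (d x k).
S, _ = np.linalg.qr(rng.standard_normal((d, k)))
q = rng.standard_normal(d)  # query point

# Exact point-to-subspace distance: dist^2(q, S) = ||q - S S^T q||^2.
dist2 = np.sum((q - S @ (S.T @ q)) ** 2)

# Basic reduction: subspace -> u = vec(S S^T - I), query -> v = vec(q q^T).
u = (S @ S.T - np.eye(d)).ravel()
v = np.outer(q, q).ravel()

# Identity from the slides: ||u - v||^2 = 2 dist^2(q, S) + (d - k) + ||q||^4.
lhs = np.sum((u - v) ** 2)
rhs = 2 * dist2 + (d - k) + np.linalg.norm(q) ** 4
print(lhs, rhs, np.isclose(lhs, rhs))
```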
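And a sketch of the pipeline the later slides describe: map each database subspace to a point, reduce the d²-dimensional embedding with a Gaussian random projection (the Johnson-Lindenstrauss step), and answer queries with an off-the-shelf point structure. The kd-tree (SciPy's cKDTree), the projection dimension m, the database parameters, and the helper names ans_query / exact_query are illustrative assumptions rather than the paper's implementation, and the scaling constants of the final reduction are omitted.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
n, d, k = 500, 20, 4   # database size, ambient dimension, subspace dimension (illustrative)
m = 200                # Johnson-Lindenstrauss target dimension (illustrative)

# Database: n random k-dimensional subspaces, each given by an orthonormal d x k basis.
bases = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]

# Preprocessing: basic reduction to points in d^2 dimensions,
# followed by a Gaussian random projection down to m dimensions.
R = rng.standard_normal((m, d * d)) / np.sqrt(m)
db = np.stack([R @ (S @ S.T - np.eye(d)).ravel() for S in bases])
tree = cKDTree(db)     # stand-in for any point (approximate) nearest-neighbor structure

def ans_query(q):
    """Approximate nearest subspace via the point reduction."""
    v = R @ np.outer(q, q).ravel()
    return int(tree.query(v)[1])   # index of the retrieved subspace

def exact_query(q):
    """O(ndk) sequential search, for comparison."""
    d2 = [np.sum((q - S @ (S.T @ q)) ** 2) for S in bases]
    return int(np.argmin(d2))

q = rng.standard_normal(d)
print("ANS:", ans_query(q), " exact:", exact_query(q))
```

Because of the additive constant and the distortion introduced by the projection, the retrieved index can occasionally differ from the exact one; the remedy suggested in the slides is to repeat the random projection several times and keep the nearest of the returned candidates.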