Jensen-Bregman LogDet Divergence for Efficient Similarity Computations on Positive Definite Tensors

Date of Submission: 
May 2, 2012
Report Number: 
12-013
Abstract: 
Covariance matrices provide a compact means of fusing multiple features and, as a result, have found immense success in several computer vision applications, including activity recognition, visual surveillance, and diffusion tensor imaging. An important task in all of these applications is to compare covariance matrices using a (dis)similarity function, for which the natural choice is the Riemannian metric corresponding to the manifold these matrices inhabit. Because this Riemannian manifold is not flat, the dissimilarities must account for its curvature; as a result, such distance computations tend to be slow, especially when the matrix dimensions are large or gradients are required. Further, a metric that enables efficient nearest-neighbor retrieval is an important requirement in the current era of big-data analytics. To alleviate these difficulties, this paper proposes a novel dissimilarity measure for covariances, the Jensen-Bregman LogDet Divergence (JBLD). This divergence enjoys several desirable theoretical properties while being computationally less demanding than standard measures. To address the problem of efficient nearest-neighbor retrieval on large covariance datasets, we propose a metric tree framework using k-means clustering on JBLD. We demonstrate the superior performance of JBLD on covariance datasets from several computer vision applications.
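For symmetric positive definite (SPD) matrices X and Y, the JBLD has the closed form J(X, Y) = log det((X + Y)/2) - (1/2) log det(XY), which requires only determinants (no eigendecompositions or matrix logarithms, unlike the Riemannian metric). A minimal NumPy sketch of this formula follows; the function name is illustrative, and `slogdet` is used for numerical stability:

```python
import numpy as np

def jbld(X, Y):
    """Jensen-Bregman LogDet Divergence between SPD matrices X and Y.

    J(X, Y) = log det((X + Y) / 2) - (1/2) * log det(X Y)

    Assumes X and Y are symmetric positive definite, so all the
    log-determinants below are well defined.
    """
    # slogdet returns (sign, log|det|); for SPD inputs the sign is +1.
    _, logdet_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, logdet_x = np.linalg.slogdet(X)
    _, logdet_y = np.linalg.slogdet(Y)
    # log det(XY) = log det(X) + log det(Y)
    return logdet_mid - 0.5 * (logdet_x + logdet_y)
```

The divergence is symmetric, nonnegative, and zero exactly when X = Y; e.g., `jbld(np.eye(3), np.eye(3))` evaluates to 0.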