October 19, 2023, 2:30 pm ET
Learning
LGRT 1621

This work extends notions of Differential Privacy (DP) to Riemannian manifolds. Many DP methods for data that naturally live in non-linear spaces operate extrinsically and then rely on projection, or on tricks akin to projection, to return to the space; all of the methods presented in this work are entirely intrinsic to the manifold. That is, privatization and estimation retain the structure of the original data. In particular, we consider a mechanism referred to as the K-norm Gradient Mechanism (KNG), an instantiation of the exponential mechanism. These mechanisms require an energy function (utility, score, loss), which we set to be the Fréchet variance in order to privately release the mean. We show that intrinsic methods add less noise than extrinsic methods under the same privacy constraints. Further, we present empirical results on Kendall's shape space, which is effectively complex projective space.
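As a point of reference, here is a sketch of the quantities the abstract names, in my own notation rather than the talk's: on a manifold M with geodesic distance d, the Fréchet variance serves as the KNG loss, and KNG releases a draw from the density it induces. The exact form below (with privacy budget ε and sensitivity bound Δ) follows the standard KNG construction and is an assumption about how the talk instantiates it.

```latex
% Sketch (my notation): Frechet variance as the KNG loss on (M, d).
\[
  \ell(\theta; x_{1:n}) = \frac{1}{2n} \sum_{i=1}^{n} d(\theta, x_i)^2,
  \qquad
  \bar{x} = \operatorname*{arg\,min}_{\theta \in M} \ell(\theta; x_{1:n}).
\]
% KNG releases a draw from a density driven by the norm of the
% (here Riemannian) gradient of the loss; epsilon is the privacy budget
% and Delta bounds the sensitivity:
\[
  f(\theta) \propto \exp\!\left( -\frac{\epsilon}{2\Delta}
    \left\lVert \nabla_{\theta}\, \ell(\theta; x_{1:n}) \right\rVert \right).
\]
```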
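To make "entirely intrinsic" concrete, the following is a minimal illustrative sketch on the unit sphere (a stand-in manifold; the talk's experiments use Kendall's shape space). It computes the Fréchet mean using only exp/log maps, then perturbs it with tangent-space noise. The noise step is a simple stand-in for a manifold-valued mechanism, not the KNG sampler from the talk, and all function names here are my own.

```python
import numpy as np

def sphere_log(p, q):
    """Log map on the unit sphere: the tangent vector at p pointing to q."""
    v = q - np.dot(p, q) * p                      # component of q orthogonal to p
    nv = np.linalg.norm(v)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))  # geodesic distance
    return np.zeros_like(p) if nv < 1e-12 else (theta / nv) * v

def sphere_exp(p, v):
    """Exp map on the unit sphere: follow the geodesic from p along v."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * (v / nv)

def frechet_mean(points, steps=100, lr=0.5):
    """Intrinsic Fréchet mean via Riemannian gradient descent.

    With loss (1/2n) * sum_i d(mu, x_i)^2, the Riemannian gradient at mu
    is -(1/n) * sum_i Log_mu(x_i), so each step averages the log maps.
    """
    mu = points[0]
    for _ in range(steps):
        step = np.mean([sphere_log(mu, x) for x in points], axis=0)
        mu = sphere_exp(mu, lr * step)
    return mu

def intrinsic_perturb(mu, scale, rng):
    """Illustrative intrinsic noise: draw a tangent vector at mu and push
    it back onto the sphere with the exp map. This is a stand-in, NOT the
    KNG sampler described in the talk."""
    v = rng.normal(size=mu.shape) * scale
    v -= np.dot(v, mu) * mu                       # project into tangent space at mu
    return sphere_exp(mu, v)

# Usage: a cluster of points on S^2, its intrinsic mean, a perturbed release.
rng = np.random.default_rng(0)
raw = rng.normal(scale=0.2, size=(50, 3)) + np.array([0.0, 0.0, 1.0])
data = raw / np.linalg.norm(raw, axis=1, keepdims=True)
mu = frechet_mean(data)
print(mu, intrinsic_perturb(mu, scale=0.05, rng=rng))
```

Note that nothing here ever leaves the sphere: the output is a point on the manifold by construction, which is the structural guarantee the abstract contrasts with extrinsic noise-then-project approaches.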