Hi all,
Our next meeting in the Information Theory and Applications seminar will take place on Monday, February 26 at 10:00, in room A500.
The speaker is Dor Tsur, who will tell us about estimating the mutual information between "good" projections of random vectors, and why this is useful.
See you there,
Or, Oron, Yuval and Alex
---------------------------------------------
*Title*: Max-Sliced Mutual Information: Analysis, Applications and Interpretations

*Abstract*: Quantifying dependence between high-dimensional random variables is central to statistical learning and inference. Two classical methods are canonical correlation analysis (CCA), which identifies maximally correlated projected versions of the original variables, and Shannon’s mutual information, which is a universal dependence measure that also captures higher-order dependencies. However, CCA only accounts for linear dependence, while mutual information is often infeasible to compute/estimate in high dimensions. We propose max-sliced mutual information (mSMI), which equals the maximal mutual information between low-dimensional projections of the high-dimensional variables and enjoys the best of both worlds: capturing intricate dependencies in the data while being amenable to fast computation and scalable estimation from samples. We show that mSMI retains favorable structural properties of Shannon’s mutual information and propose an efficiently computable neural estimator, which we couple with formal non-asymptotic error bounds. We present experiments that demonstrate the utility of mSMI for several contemporary learning tasks and draw connections to fundamental problems in statistics and information theory.
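For those who want a concrete picture before the talk, the quantity described in the abstract can plausibly be written as follows (the notation here is ours, not necessarily the speaker's): for a projection dimension k,

    mSMI_k(X; Y) = sup_{A, B} I(A^T X ; B^T Y),

where the supremum runs over d_x-by-k and d_y-by-k matrices A and B with orthonormal columns, and I(.;.) denotes Shannon's mutual information. Already for k = 1 this contrasts nicely with CCA, which maximizes correlation rather than mutual information over the same one-dimensional projections.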
Reminder: this is happening tomorrow at 10:00.