Learning sparse polynomials from random samples is a notoriously hard problem in learning theory. We show that if the coefficients are in general position, such functions can be learned efficiently. We show how this problem
has applications to learning the structure of unknown graphs by observing the values of graph cuts. We
further discuss our ongoing work on learning variable interactions (quadratic polynomials) very efficiently.
Our techniques are coding-theoretic and boil down to solving noisy linear equations over a finite field.
Based on joint work with Murat Kocaoglu, Karthik Shanmugam, and Adam Klivans.
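As a toy illustration of the cut-based reduction mentioned above (a sketch of the general idea, not the talk's actual algorithm): the cut value of a vertex subset is a linear function of the unknown 0/1 edge indicators, so observing cuts on random subsets yields a linear system whose solution reveals the graph. The graph, sample count, and use of real-valued least squares (rather than a finite field) are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: a sparse graph on n vertices whose edge set
# we recover purely from cut values on random vertex subsets.
n = 6
pairs = list(itertools.combinations(range(n), 2))
true_edges = {(0, 1), (1, 2), (2, 4), (3, 5)}  # 4 of the 15 possible edges

def cut_value(x, edges):
    """Number of edges crossing the cut defined by indicator vector x."""
    return sum(x[i] != x[j] for (i, j) in edges)

# Each random subset x gives one linear equation in the unknown edge
# indicators w_{ij}:  cut(x) = sum_{(i,j)} (x_i XOR x_j) * w_{ij}.
m = 40
X = rng.integers(0, 2, size=(m, n))
A = np.array([[x[i] ^ x[j] for (i, j) in pairs] for x in X], dtype=float)
b = np.array([cut_value(x, true_edges) for x in X], dtype=float)

# The system is consistent by construction, so once A has full column
# rank, least squares returns the true 0/1 indicators exactly.
w, *_ = np.linalg.lstsq(A, b, rcond=None)
recovered = {p for p, wi in zip(pairs, np.round(w)) if wi == 1}
print(recovered == true_edges)
```

With noisy cut observations, plain least squares no longer suffices, which is where the coding-theoretic machinery for noisy linear equations comes in.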
Alex Dimakis is an Assistant Professor in the Department of Electrical and Computer Engineering at the University of Texas at Austin. From 2009 until 2012 he was with the Viterbi School of Engineering, University of Southern California. He received his Ph.D. in 2008 and his M.S. in 2005, both in electrical engineering and computer sciences from UC Berkeley, and his Diploma from the National Technical University of Athens in 2003. During 2009 he was a CMI postdoctoral scholar at Caltech.
He received an NSF CAREER award in 2011, a Google Faculty Research Award in 2012, and the Eli Jury dissertation award in 2008. He is the co-recipient of several best paper awards, including the joint Information Theory and Communications Society Best Paper Award in 2012. He is currently serving as an associate editor for IEEE Signal Processing Letters and IEEE Transactions on Information Theory. His research interests include information theory, coding theory, and machine learning, with a current focus on distributed storage and graph analytics.