Random Function Classes for Machine Learning

If you have a question about this talk, please contact dsu21.

Random function classes offer an extremely versatile tool for describing the nonlinearities commonly employed in machine learning, ranging from compact summaries of distributions to nonlinear function expansions. We show that Bloom filters, the Count-Min sketch, and a new family of semidefinite sketches can all be viewed as attempts at finding the most conservative solution of a convex optimization problem (with matching guarantees) when querying properties of a distribution. Moreover, the sketches themselves prove useful, e.g. when representing high-dimensional functions, leading to the hash kernel for generalized linear models and recommender systems. Next we discuss random kitchen sinks and their accelerated variants: Fastfood, À la Carte, and Deep Fried Convnets. They offer memory-efficient alternatives to incomplete matrix factorization and decomposition for kernel functions. Finally, we combine this approach with sketches, using Pagh's compressed matrix multiplication construction, yielding computationally efficient two-sample and independence tests.
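
The "most conservative solution" view can be made concrete with the Count-Min sketch under the so-called conservative update rule: an update raises only the counters that must move, so every stored value stays the smallest one consistent with the stream seen so far. The sketch below is a minimal illustration, not code from the talk; the seeded use of Python's built-in hash merely stands in for the pairwise-independent hash family that the formal guarantees assume.

```python
import numpy as np

class CountMinSketch:
    """Count-Min sketch with the conservative update rule."""

    def __init__(self, width=1024, depth=4, seed=0):
        self.width, self.depth = width, depth
        self.table = np.zeros((depth, width), dtype=np.int64)
        # One random seed per row; Python's built-in hash is a stand-in
        # for the pairwise-independent hash family used in the analysis.
        self.seeds = np.random.default_rng(seed).integers(0, 2**31, size=depth)

    def _cells(self, item):
        return [(row, hash((int(s), item)) % self.width)
                for row, s in enumerate(self.seeds)]

    def update(self, item, count=1):
        cells = self._cells(item)
        # Conservative update: the new estimate is (current min + count),
        # and no counter is raised above that estimate.
        estimate = min(self.table[r, c] for r, c in cells) + count
        for r, c in cells:
            self.table[r, c] = max(self.table[r, c], estimate)

    def query(self, item):
        # Point query: the minimum over the item's cells,
        # which is never an underestimate of the true count.
        return min(self.table[r, c] for r, c in self._cells(item))

cms = CountMinSketch()
for word in ["sketch", "kernel", "sketch"]:
    cms.update(word)
print(cms.query("sketch"))  # 2 here; in general an upper bound
```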
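
On the function-expansion side, random kitchen sinks (Rahimi and Recht's random Fourier features) approximate a shift-invariant kernel by drawing frequencies from its spectral density and using cosine features whose inner products match the kernel in expectation; Fastfood accelerates the same map with structured Hadamard-based matrices. Below is a minimal NumPy sketch for the Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2), offered as an illustration under these assumptions rather than the speaker's implementation:

```python
import numpy as np

def random_fourier_features(X, n_features=256, gamma=1.0, seed=0):
    """Map X into a random feature space whose inner products
    approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density (Gaussian with
    # variance 2*gamma per dimension, by Bochner's theorem); phases uniform.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: compare the random-feature Gram matrix to the exact kernel.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = random_fourier_features(X, n_features=4096, gamma=0.5)
approx = Z @ Z.T
exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
print(np.abs(approx - exact).max())  # small for large n_features
```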

This talk is part of the Machine Learning @ CUED series.
