Convex Deep Modeling
Abstract: "Training deep predictive models with latent hidden layers poses a hard computational problem: because the model parameters must be trained jointly with inference over the latent variables, the convexity of the training problem is usually destroyed. In this talk, we first present a novel reformulation of supervised training of a two-layer architecture that introduces a latent feature kernel, which captures a rich set of latent feature representations while still admitting useful convex formulations via semidefinite relaxation. To tackle the resulting computational problem, we develop efficient training algorithms that exploit the specific structure of the problem.
In practice, deeper models have been essential for obtaining state-of-the-art results. We therefore show that the two-layer approach can be extended to handle an arbitrary number of latent layers. To achieve this extension, we propose a novel layer loss that is jointly convex in the adjacent normalized latent feature kernels. We then develop an efficient algorithmic approach for this extended formulation, yielding promising empirical results. These results demonstrate the first fully convex formulation of training a deep architecture with an arbitrary number of hidden layers."
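The central device in the abstract, replacing an explicit latent representation with the kernel it induces, can be sketched in a few lines. The snippet below is our own toy illustration, not the speakers' algorithm: `Phi`, the kernel construction, and the diagonal normalization are illustrative assumptions. It shows why the substitution helps: a discrete latent feature matrix `Phi` lives in a nonconvex set, but the induced kernel `K = Phi.T @ Phi` is always positive semidefinite, and the PSD matrices form a convex set over which one can optimize (the semidefinite relaxation). Normalizing the diagonal gives the kind of normalized latent feature kernel a layer loss could be stated over.

```python
import numpy as np

rng = np.random.default_rng(0)

# Explicit latent features: h binary latent units over t training
# examples. This discrete set of matrices is nonconvex.
Phi = rng.integers(0, 2, size=(5, 8)).astype(float)
Phi[0, :] = 1.0  # ensure every example activates at least one unit,
                 # so the kernel diagonal is strictly positive

# The latent feature kernel. Optimizing over K instead of Phi is the
# relaxation step: K ranges over positive semidefinite matrices (a
# convex set), whereas Phi ranges over a discrete set.
K = Phi.T @ Phi

# K is symmetric positive semidefinite by construction.
eigvals = np.linalg.eigvalsh(K)
print(np.all(eigvals >= -1e-9))  # True

# A diagonal-normalized kernel (illustrative): rescale so every
# example has unit self-similarity.
d = np.sqrt(np.diag(K))
K_norm = K / np.outer(d, d)
print(np.allclose(np.diag(K_norm), 1.0))  # True
```

In the relaxed problem one would then optimize directly over PSD matrices with a fixed diagonal, dropping the requirement that the kernel factor through any particular discrete `Phi`.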
AI Seminar Series
On Fridays at noon, Amii and the Department of Computing Science host AI Seminars: engaging presentations on topics across the broad field of artificial intelligence. With speakers from the University of Alberta and other world-leading groups, the talks give AI enthusiasts a friendly way to engage with the latest trends and topics in research and development.
Seminars are open to the public, and no registration is required, though seating is limited and available on a first-come, first-served basis. Topics range from foundational theoretical work to innovative applications of artificial intelligence technologies.
If you would like to present at an upcoming AI Seminar, please contact Colin Bellinger.
Join the AI Seminar mailing list to stay up-to-date on all the latest presentations.