News
Now that the 2020 Tea Time Talks are on YouTube, you can always have time for tea with Amii and the RLAI Lab! Hosted by Amii’s Chief Scientific Advisor, Dr. Richard S. Sutton, these 20-minute talks on technical topics are delivered by students, faculty and guests. The talks are a relaxed and informal way of hearing leaders in AI discuss future lines of research they may explore, with topics ranging from ideas just starting to take root to fully finished projects.
Week nine of the Tea Time Talks features:
To effectively perform the task of next-word prediction, long short-term memory networks (LSTMs) keep track of many types of information. Some of this information is directly related to the next word's identity, but some is more secondary: for example, discourse-level features or features of downstream words. Correlates of secondary information appear in LSTM representations even though they are not part of an explicitly supervised prediction task. In contrast, reinforcement learning (RL) has found success with techniques that explicitly supervise representations to predict secondary information. In this talk, Qingfeng proposes predictive representation learning (PRL), which explicitly constrains LSTMs to encode specific predictions, such as those that might otherwise be learned implicitly. He shows that PRL significantly improves two strong language modeling methods, converges more quickly and performs better when data is limited. This fusion of RL ideas with LSTMs shows that explicitly encoding a simple predictive task can facilitate the search for a more effective language model.
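The core idea described above, adding explicit auxiliary predictions alongside the primary next-word objective, can be sketched in a few lines. The function name, the auxiliary heads, and the weighting scheme below are illustrative assumptions, not the actual implementation from the talk:

```python
def prl_loss(lm_loss, aux_losses, aux_weight=0.5):
    """Combine the primary language-modeling loss with auxiliary
    predictive losses that explicitly constrain the representation.

    lm_loss: next-word prediction loss (the primary task)
    aux_losses: losses for secondary predictions (e.g. features of
                downstream words), each hypothetically computed from
                a small prediction head on the LSTM hidden state
    aux_weight: trade-off between the primary and auxiliary objectives
    """
    return lm_loss + aux_weight * sum(aux_losses)

# Toy example: primary loss 2.0 plus two auxiliary losses
total = prl_loss(2.0, [0.4, 0.6], aux_weight=0.5)
print(total)  # 2.5
```

The appeal of this kind of combined objective is that the auxiliary terms shape the hidden state during training but can be discarded at inference time, leaving the language model itself unchanged.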
In this talk, Banafsheh introduces classical conditioning testbeds for studying the problem of state construction. These testbeds are modelled after tasks in psychology where an animal is exposed to a sequence of stimuli and has to construct an understanding of its state in order to predict what will happen next. The testbeds are proposed to study online multi-step prediction. Banafsheh provides results on the first testbed, characterizing a multitude of approaches including the common modern approaches as well as simpler methods inspired by models in animal learning.
The Tea Time Talks have now concluded for the year, but stay tuned as we will be uploading the remaining talks in the weeks ahead. In the meantime, you can rewatch or catch up on previous talks on our YouTube playlist.
Nov 22nd 2023
On October 17, Dr. Deepa Krishnaswamy, from Boston's Brigham and Women's Hospital, presented “AI-derived annotations for lung cancer collections using NCI Imaging Data Commons” at the AI Seminar.
Nov 16th 2023
Thinking of pursuing a PhD? Amii researcher Alona Fyshe explores how undertaking a PhD is like training for the Olympics, and how you can develop an Olympian state of mind.
Nov 16th 2023
Read our November monthly update on Alberta’s AI ecosystem. Featured this month: Impact Report & New U of A Faculty Positions.