News
Mark Schmidt, Amii Canada CIFAR AI chair and Associate Professor of Computing Science at the University of British Columbia, has been awarded an Arthur B. McDonald Fellowship.
The fellowship, granted by the Natural Sciences and Engineering Research Council of Canada (NSERC), offers two-year grants to support early-career researchers "so that they can become leaders in their field and inspire others."
The $250,000 fellowship includes funding to relieve recipients of teaching and administrative duties.
"It's amazing. It's not so easy to get teaching relief, so having time to focus on research is really special," Schmidt says.
Much of that research has focused on optimization in machine learning, computer vision and other applications to improve the speed and efficiency of machine learning models. With the support offered by the McDonald grant, Schmidt says he and his students plan to "double down" on that work.
"We're trying to develop methods that let you train machine learning models a lot cheaper, in terms of both time and money."
Schmidt's proposal to NSERC focused heavily on optimizing hyperparameters in machine learning training. Hyperparameters are the variables that control how an ML model learns, and they are generally set before training on data begins. Choosing them typically involves trial and error: researchers train a model, tweak the hyperparameters, and then train again, often many times over before they find a set they are happy with.
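The train-tweak-retrain loop described above can be sketched in a few lines. This is an illustrative toy example, not Schmidt's method: a one-dimensional model stands in for a real network, and each candidate learning rate requires a complete training run from scratch.

```python
# Toy illustration of hyperparameter trial and error (not Schmidt's method):
# every candidate learning rate triggers a full training run from scratch.

def train(learning_rate, steps=100):
    """Toy 1-D gradient descent on the loss f(w) = (w - 3)^2."""
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)        # derivative of the loss at w
        w -= learning_rate * grad # gradient descent step
    return (w - 3) ** 2           # final loss

# Each candidate value means restarting training from the beginning.
candidates = [0.001, 0.01, 0.1]
losses = {lr: train(lr) for lr in candidates}
best_lr = min(losses, key=losses.get)
```

Even in this toy setting, most of the compute is spent on runs whose results are thrown away, which is the waste Schmidt describes.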
Schmidt says this process of constantly starting over from scratch wastes a lot of time, computing power and money. Some of the recent work Schmidt and his students have done involves one hyperparameter in particular: the learning rate, which controls how quickly a model updates its parameters as it is being trained.
Their work has shown that instead of restarting the training process over and over with different learning rates, it can be possible to adjust the learning rate during training and still end up with results similar to those achieved when the ideal rate is set from the beginning. A paper detailing their work is set to be presented at this year's Neural Information Processing Systems conference in December.
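The general idea of adapting the learning rate within a single run can be sketched as follows. This is a hypothetical illustration, not the algorithm from the paper: here the rate is simply halved whenever the loss stops improving, so one run recovers from a bad initial guess instead of being restarted.

```python
# Hypothetical sketch of on-the-fly learning-rate adjustment
# (not the paper's algorithm): one training run, with the rate
# halved whenever the loss fails to improve.

def train_adaptive(lr=1.0, steps=100):
    """Toy 1-D gradient descent on f(w) = (w - 3)^2, adapting lr as it goes."""
    w, prev_loss = 0.0, float("inf")
    for _ in range(steps):
        grad = 2 * (w - 3)
        w -= lr * grad
        loss = (w - 3) ** 2
        if loss >= prev_loss:  # no progress: shrink the step size
            lr *= 0.5
        prev_loss = loss
    return (w - 3) ** 2

final_loss = train_adaptive()
```

With a fixed rate of 1.0 this toy problem never converges (the iterate oscillates forever), but the single adaptive run still reaches a near-zero loss, which is the kind of restart-free behaviour the research aims for.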
Making more efficient, optimized models is an important step in advancing artificial intelligence. It allows the use of larger datasets to solve more complex problems than would otherwise be possible. Creating more efficient models is also vital to real-world applications of AI, and to making the technology more accessible.
"It's a matter of: what did it used to take to train a model? If you had to do it on a [computer] cluster, maybe now you can do it on a workstation. What you used to do on a workstation, you can maybe do on a laptop. And what you did on a laptop, you can maybe do on your phone. And what used to be impossible, well, maybe now you can solve that if you have better algorithms."
Jul 24th 2024
News
How do we get the best results when AI and human beings work together? In this episode of Approximately Correct, we’re looking into Human-In-The-Loop AI with Amii Fellow and Canada CIFAR AI Chair Matt Taylor.
Jul 22nd 2024
News
Read our monthly update on Alberta’s growing machine intelligence ecosystem and exciting opportunities to get involved.
Jul 18th 2024
News
Amii announces work with Communitech, a Waterloo Region innovation hub, to empower startup founders with the tools and resources they need to successfully integrate AI and build in-house capabilities. The collaboration will leverage Amii's leading AI expertise and resources and be centred around Amii's Machine Learning Exploration (ML Exploration) program.