I'm Maximilian Golub, a master's student at the University of British Columbia, advised by Guy Lemieux and Mieszko Lis.
I'm currently researching ways to accelerate the training and inference of reduced-precision neural networks on FPGAs while cutting memory requirements by up to 10x. After graduating, I'll be working at Mercedes Research and Development in Seattle.
I recently submitted a paper to NIPS 2018 called "DropBack: Continuous Pruning During Training", which reduces the runtime memory consumption of deep neural networks during training. You can find it here, or on arXiv. Using DropBack, I achieved state-of-the-art pruning results on WRN-28-10 trained on CIFAR-10.
Things I love include field-programmable gate arrays, neural networks, Python, automation, photography, skiing, and cars.
Right now, I'm working on learning more about Generative Adversarial Networks and Long Short-Term Memory (LSTM) networks.