I'm Maximilian Golub, an associate software engineer at Mercedes-Benz Research and Development.
I'm currently working on secret projects that are cool and neat, but can't be talked about publicly.
Previously, I was a Master's student at UBC under the supervision of Mieszko Lis and Guy Lemieux, where I researched ways to accelerate the training and inference of reduced-precision neural networks on FPGAs while cutting memory requirements by up to 10x.
I recently submitted a paper to SysML 2018 called "DropBack: Continuous Pruning During Training", which can reduce the runtime memory consumption of deep neural networks during training. You can find it here, or on arXiv. Using DropBack, I achieved state-of-the-art results in pruning WRN-28-10 on CIFAR-10.
Things I love include field-programmable gate arrays, neural networks, Python, automation, photography, skiing, and cars.
Right now, I'm working on learning more about Generative Adversarial Networks and Long Short-Term Memory (LSTM) networks.