I am a Research Assistant at the Idiap Research Institute and a PhD student at EPFL, Switzerland. My advisor is François Fleuret. My research is in the area of deep learning. I am currently investigating the local linear structure (i.e., input-output gradients) of deep neural networks, and developing tools that leverage this structure to improve learning. So far, I have worked on problems in distillation and interpretability.
Prior to this, I completed my Master's (by Research) at the Indian Institute of Science (IISc), where I was advised by Prof. Venkatesh Babu. During this time I worked on adaptively controlling the computational complexity of deep neural networks using a multiplicative gating mechanism, sparsifying these gates during training. Using this technique, I was able to learn the number of neurons in a layer, train sparse neural networks, and learn dropout rates. My master's thesis is available here.
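The core idea above can be sketched in a few lines: each neuron's output is multiplied by a learnable gate, and an L1 penalty on the gates pushes many of them to exactly zero during training, so the effective layer width is learned. This is a minimal illustrative sketch, not the thesis's exact formulation; the variable names and the penalty weight `l1_strength` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 8
W = rng.normal(size=(4, n_neurons))   # layer weights (input dim 4)
gates = np.ones(n_neurons)            # multiplicative gates, initialised at 1

def gated_forward(x, W, gates):
    # ReLU activations scaled element-wise by the per-neuron gates.
    return np.maximum(0.0, x @ W) * gates

def l1_penalty(gates, l1_strength=0.1):
    # Sparsity-inducing regulariser added to the task loss; gradient
    # descent on this term drives gates toward zero.
    return l1_strength * np.abs(gates).sum()

x = rng.normal(size=(2, 4))
out = gated_forward(x, W, gates)

# A gate driven to zero switches its neuron off entirely,
# which is how the number of active neurons is learned.
gates[::2] = 0.0
sparse_out = gated_forward(x, W, gates)
```

In the thesis, variants of this scheme recover sparse networks and learned dropout rates by changing what the gates multiply and how they are regularised.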
I am broadly interested in issues relating to robustness, adaptability (transfer learning/continual learning) and interpretability of deep neural networks.
My CV is here. (Last Updated: September 2019)