whoami

I am a PhD student at the University of Utah, advised by Prof. Saday Sadayappan. My research interests lie broadly in High Performance Computing (HPC). In the past, I worked at IBM Research in New Delhi on making AI workloads faster. As a student, I did research on languages and middleware for HPC at ETH Zurich and Inria. I went to college at the Birla Institute of Technology and Science (BITS) in Pilani, a small village in India. While powerlifting gives me my daily adrenaline fix, I like to hike and ski when on vacation.

Research interests

Currently, I am working on accelerating sparse tensor decompositions in both shared- and distributed-memory settings. Overall, this problem is vastly multidimensional (no pun intended), primarily because the data (non-zero values) are not always structured. As a result, load imbalance and high data movement make performance optimisation hard. I see tremendous applications of this research in several domains, including large-scale training of neural networks, simulation of quantum circuits, and computational physics and chemistry. Towards the end of my PhD, I aim to tailor these performance optimisation techniques to such domain-specific workloads.
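The load-imbalance point above can be seen with a toy sketch (all names, sizes, and the skew model here are my own illustration, not from any particular system): when the non-zeros of a sparse tensor cluster in a few slices, a naive slice-to-worker partition leaves some workers with far more work than others.

```python
# Toy illustration of load imbalance from unstructured sparsity.
# The skewed distribution and the 4-worker split are illustrative assumptions.
import random
from collections import Counter

random.seed(0)

I, J, K = 8, 8, 8   # dimensions of a small 3-way sparse tensor
nnz = 200           # number of non-zero entries

# Skewed placement: most non-zeros concentrate in the low-index mode-0
# slices, as often happens with real-world sparse data.
coords = [(min(int(random.expovariate(0.5)), I - 1),
           random.randrange(J),
           random.randrange(K))
          for _ in range(nnz)]

# Naive partitioning: mode-0 slice i goes to worker i % 4.
workers = 4
load = Counter(i % workers for (i, _, _) in coords)

# An even split would give each worker nnz / workers = 50 non-zeros;
# with skewed data the actual loads differ substantially.
print(sorted(load.values()))
```

Balancing this well (e.g. by partitioning non-zeros rather than slices) is one of the knobs that makes sparse tensor kernels interesting to optimise.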

In the past, I have enjoyed working on:

  • model pruning for NLP and graph neural nets
  • Kvik: task splitting middleware for the Rust language
  • DACE: a domain-specific language that enables dataflow-based performance optimisation
  • flight software for a nanosatellite!

Updates

  • July, 2020: Our paper on making the BERT model (in natural language processing) faster was presented at ICML '20.
  • February, 2021: My first patent application (as the primary inventor) was filed by IBM Research with the USPTO.
  • March, 2021: Received a PhD offer from the University of Utah, under Prof. Sadayappan! Looking forward to joining in Fall '21.
  • June, 2021: Received the Outstanding Technical Achievement Award at IBM Research worldwide.
  • July, 2021: My 4th patent application was filed by IBM Research with the USPTO.
  • August, 2021: Officially a student again! Super excited to work on accelerating sparse computations.