I am a pre-doctoral Research Fellow at Microsoft Research India, where I work with Navin Goyal. I am broadly interested in the analysis and interpretability of models in Natural Language Processing. Our work analyzes neural networks to understand their ability to capture behaviors and properties relevant to modeling syntax in language. We also work on building more robust and explainable methods for semantic parsing.

Before I joined MSR, I spent a wonderful semester working with Partha Talukdar at the Machine and Language Learning (MALL) Lab at IISc. Prior to that, I spent a summer at A*STAR in Singapore, where I worked with Anders Skanderup. In the summer of 2016, I worked with Chris Mungall and Dan Keith as a Google Summer of Code student. I (used to) occasionally solve problems on websites like Kaggle and answer questions on the stats.stackexchange forum.

I graduated with a B.E. (Hons.) in Computer Science and an Int. M.Sc. (Hons.) in Biological Sciences from BITS Pilani, India, in 2019. For more details, refer to my CV or drop me an email.

Publications

Google Scholar | Semantic Scholar

On the Ability and Limitations of Transformers to Recognize Formal Languages
Satwik Bhattamishra, Kabir Ahuja, Navin Goyal
EMNLP'20 | Conference on Empirical Methods in Natural Language Processing
pdf code abstract

On the Practical Ability of RNNs to Recognize Hierarchical Languages
Satwik Bhattamishra, Kabir Ahuja, Navin Goyal
COLING'20 | International Conference on Computational Linguistics
[Recipient of the Best Short Paper Award]
pdf code abstract

On the Computational Power of Transformers and Its Implications in Sequence Modeling
Satwik Bhattamishra, Arkil Patel, Navin Goyal
CoNLL'20 | Conference on Computational Natural Language Learning
pdf code abstract

Unsung Challenges of Building and Deploying Language Technologies for Low Resource Language Communities
Pratik Joshi, Christian Barnes, Sebastin Santy, Simran Khanuja, Sanket Shah, Anirudh Srinivasan, Satwik Bhattamishra, Sunayana Sitaram, Monojit Choudhury, Kalika Bali
ICON'19 | International Conference on Natural Language Processing
pdf abstract cite

Submodular Optimization-based Diverse Paraphrasing and its Effectiveness in Data Augmentation
Ashutosh Kumar*, Satwik Bhattamishra*, Manik Bhandari, Partha Talukdar
NAACL'19 | North American Chapter of the Association for Computational Linguistics
pdf code abstract

Tools

LibNMF
An easy-to-use Python library with implementations of a set of tested optimization and regularization methods for NMF. Implemented algorithms include graph-regularized NMF, probabilistic NMF, a first-order primal-dual algorithm, and more. A short illustrative sketch of the basic factorization follows below.
Github
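For intuition only, here is a minimal NumPy sketch of the classic multiplicative-update rule for NMF (Lee & Seung), the simplest member of the family of algorithms the library covers. This is an illustrative example, not LibNMF's actual API; the function name and defaults below are made up for this sketch.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Factorize a non-negative matrix V (m x n) into W (m x rank) and H (rank x n)
    using Lee & Seung's multiplicative updates for the Frobenius-norm objective."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Alternate the two updates; eps guards against division by zero.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage: approximate a random non-negative 50 x 40 matrix with rank 5.
V = np.random.default_rng(1).random((50, 40))
W, H = nmf_multiplicative(V, rank=5)
print("relative error:", np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```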

PyDPP
A Python package, available on pip, with modules for sampling from Determinantal Point Processes (DPPs). Contains implementations of algorithms to sample from DPPs that encourage diversity when selecting a subset of points from a ground set. An illustrative sketch of the underlying sampler is shown below.
Github
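For intuition only, below is a minimal NumPy sketch of the standard spectral DPP sampling algorithm (Hough et al.; Kulesza & Taskar), the kind of sampler such a package provides. It does not reflect PyDPP's actual interface; the function name and the toy RBF kernel are assumptions made for this example.

```python
import numpy as np

def sample_dpp(L, rng=None):
    """Draw one exact sample from a DPP with symmetric PSD kernel L,
    following the spectral (eigendecomposition-based) algorithm."""
    rng = np.random.default_rng() if rng is None else rng
    eigvals, eigvecs = np.linalg.eigh(L)
    # Phase 1: keep each eigenvector independently with prob lambda / (1 + lambda).
    keep = rng.random(len(eigvals)) < eigvals / (eigvals + 1.0)
    V = eigvecs[:, keep]
    sample = []
    # Phase 2: pick items one at a time, projecting out the chosen coordinate.
    while V.shape[1] > 0:
        probs = np.sum(V**2, axis=1) / V.shape[1]
        probs /= probs.sum()
        i = rng.choice(len(probs), p=probs)
        sample.append(i)
        # Eliminate the i-th coordinate using a column with nonzero entry there.
        j = np.argmax(np.abs(V[i, :]))
        Vj = V[:, j]
        V = V - np.outer(Vj, V[i, :] / Vj[i])
        V = np.delete(V, j, axis=1)
        # Re-orthonormalize the remaining basis for numerical stability.
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sorted(sample)

# Toy usage: an RBF similarity kernel over 1-D points; nearby (similar) points
# rarely co-occur in the sample, which is the diversity-promoting behavior.
x = np.linspace(0, 1, 20)
L = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
print(sample_dpp(L, np.random.default_rng(0)))
```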

Service

Reviewer: EMNLP 2020, NAACL 2021
Sub-reviewer: CoNLL 2019
Timeline

BITS Pilani (2014 - 2019)
Google Summer of Code (Summer 2016)
A*STAR, Singapore (Summer 2017)
Indian Institute of Science (Fall 2018)
Microsoft Research India (2019 - Present)
  Template: Sebastin