October 12, 2021
We are looking for summer research interns to join the AMR parsing team.
The main features of the internship are:
- a research project topic aligned with the student's research area
- a high success rate in getting papers published at top venues (ACL, NAACL, EMNLP)
- work can be open-sourced through IBM's GitHub (MIT or Apache 2.0 license)
- we work as a team: even if the student is the lead author, there are plenty of extra hands
- other potentially useful experiences for an intern: patents, algorithms deployed in production
Some of our previous interns' papers (4/5 are intern-led):
GPT-too: A language-model-first approach for AMR-to-text generation
https://aclanthology.org/2020.acl-main.167/
Bootstrapping Multilingual AMR with Contextual Word Alignments
https://aclanthology.org/2021.eacl-main.30/
Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing
(2021.emnlp-main) https://openreview.net/forum?id=qjDQCHLXCNj
The range of topics for next year is wide:
- structured synthetic data generation for improved parsing/generation
- natural-to-formal (code, AMR), formal-to-formal transformations
- document-level or cross-lingual AMR parsing
No experience with AMR is needed. AMR is a good conduit for thinking about the relationship between natural and formal descriptions, as well as the role of inductive biases and structure in the neural models that handle them. That is the space where we hope to contribute.
Contact Ramon Fernandez Astudillo <ramon.astudillo@ibm.com>