The Conversational AI Language team at Facebook is looking for PhD research interns in NLP for this fall, with a preferred (but not exclusive) focus on question answering, summarization, and on-device modeling. If you are enrolled in a PhD program and interested, please email your resume to me at sonal...@fb.com.
Recent research from the team in 2020:
- A VentureBeat article on a new training paradigm
- Won Google's EfficientQA competition in the unrestricted track, based on human evaluation
- New SoTA on several tasks (e.g., summarization, XNLI) via R3F, an improved fine-tuning method
- A new unified QA approach over structured and unstructured text, with new SoTA on Natural Questions and other QA datasets
- Won the WebNLG 2020 shared task in two human-evaluation categories
- A new explainable paradigm for summarization
- A COLING 2020 outstanding paper award
- Five papers at EMNLP/ICLR/EACL on semantic parsing (scaling to new languages and domains, code-switching, conversational semantic parsing)
The team also works actively on efficient model architectures for on-device ML and on privacy-preserving ML, such as federated learning and federated personalization.
The team collaborates closely with researchers and engineers both inside and outside Facebook to develop novel ML/NLP systems. We are a team of strong research scientists and engineers, and we are always on the lookout for people who are passionate about advancing the AI technologies listed above.