Seeking U.S. citizen postdoc for next fall (time-sensitive)

Matt Lease

Feb 1, 2022, 12:35:07 PM
to Crowdsourcing and Human Computation
I am seeking 1-2 qualified candidates (U.S. citizens only) for a time-sensitive postdoc funding proposal due at the end of February.  If the funding proposal were selected, the postdoc could start in the fall (October 1 or possibly sooner), and the position would pay 75-80K per year (plus benefits) for up to 2 years. 

The research would be conducted in one of two broad areas, with opportunity for the candidate to help define the topic to best match their interests and skills:

1. resolving disagreement between human answers or annotations via statistical aggregation and/or interface/interaction designs for data collection or discussion (e.g., collection of annotator rationales and/or use of multi-stage workflows, though many other directions could be pursued). Some related work from my lab:

Mucahid Kutlu, Tyler McDonnell, Tamer Elsayed, and Matthew Lease. Annotator Rationales for Labeling Tasks in Crowdsourcing. Journal of Artificial Intelligence Research (JAIR), 69:143-189, 2020. Award Winning Papers Track. [ bib | pdf | blog-post from original best conference paper ]

Alexander Braylan and Matthew Lease. Modeling and Aggregation of Complex Annotations via Annotation Distances. In Proceedings of the Web Conference, pages 1807--1818, 2020. [ pdf | sourcecode | video | slides ]
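To give a flavor of the aggregation problem in area 1, here is a minimal sketch of resolving annotator disagreement by majority vote (toy data and function names of my own; this is the simplest baseline, not the methods from the papers above):

```python
from collections import Counter

def aggregate_majority(annotations):
    """Resolve disagreement per item by majority vote over annotator labels.

    annotations: dict mapping item_id -> list of labels from different annotators.
    Returns dict mapping item_id -> (winning label, agreement fraction).
    """
    resolved = {}
    for item_id, labels in annotations.items():
        label, count = Counter(labels).most_common(1)[0]
        resolved[item_id] = (label, count / len(labels))
    return resolved

# Hypothetical crowd labels for three items
votes = {
    "doc1": ["relevant", "relevant", "not_relevant"],
    "doc2": ["relevant", "relevant", "relevant"],
    "doc3": ["not_relevant", "relevant", "not_relevant"],
}
resolved = aggregate_majority(votes)  # e.g., doc1 -> ("relevant", 2/3)
```

The research itself goes well beyond this baseline, e.g., weighting annotators by estimated reliability or modeling distances between complex annotations.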

2. interpretable and explainable natural language processing (NLP) -- Prof. Jessy Li and I will co-advise this postdoc. While neural models now achieve dominant performance across many text classification tasks, they are typically far harder to interpret than simpler, traditional models, making them difficult for people to use, trust, and adopt in practice. While we are broadly interested in this area, our ongoing work includes developing novel neural architectures for text classification that integrate classic case-based reasoning with pre-trained language models (e.g., BERT, BART, etc.) to enable intuitive and faithful explanations for model predictions.
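As a rough illustration of the case-based-reasoning idea in area 2 (a toy sketch of my own, not the actual architecture): predict the label of the nearest labeled case in embedding space, and return that case as the explanation ("this looks like example X"). In practice the embeddings would come from a pre-trained encoder such as BERT; here they are hand-made vectors.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def classify_by_case(query_vec, cases):
    """Case-based prediction: label the query with its most similar labeled
    case, returning that case's id as the (faithful) explanation.

    cases: list of (case_id, embedding, label) tuples.
    """
    best_id, _, best_label = max(cases, key=lambda c: cosine(query_vec, c[1]))
    return best_label, best_id

# Toy labeled cases with hypothetical 3-d embeddings
cases = [
    ("ex1", [0.9, 0.1, 0.0], "positive"),
    ("ex2", [0.1, 0.8, 0.1], "negative"),
]
label, evidence = classify_by_case([0.85, 0.2, 0.05], cases)
# label is the prediction; evidence names the training case that justifies it
```

The explanation here is faithful by construction: the retrieved case is literally the basis for the prediction, which is the appeal of case-based reasoning over post-hoc explanation methods.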

Given that this funding opportunity is due in February, please contact me sooner rather than later with a brief expression of interest, expected availability date, and your curriculum vitae.

Matt Lease
Associate Professor
School of Information 
University of Texas at Austin
Voice: (512) 471-9350 · Fax: (512) 471-3971 · Office: UTA 5.536