Post-doc positions at the Getulio Vargas Foundation (Rio de Janeiro, Brazil)


Diego Parente

Nov 13, 2022, 10:18:31 AM11/13/22
to MLCSB COSI

I'm recruiting post-docs to work on machine learning with me at the Getulio Vargas Foundation (FGV) in Rio de Janeiro.

Topic. My lab is currently investing in four fronts: i) graph neural networks; ii) probabilistic machine learning; iii) physics-informed neural networks; and iv) causal machine learning. We are also working on applications in software engineering. In principle, you will work on one (or more) of these broad directions. We can discuss this more thoroughly during the selection process.


Excellence. Our lab's work has been featured in prestigious ML conferences (e.g., NeurIPS, AISTATS, UAI) and in reputable journals (e.g., TNNLS). In recent years, we have i) pushed the state of the art in Bayesian inference for big data [1,2,3] and ii) done empirical research to understand core principles of graph neural nets [4,5], in both temporal and static settings.


The place. You will work with me at the School of Applied Mathematics (EMAp) at the Getulio Vargas Foundation (FGV). Our department is diverse, with many excellent computer scientists, statisticians, and mathematicians. FGV has also set out to become a reference center for data science, hiring top-notch faculty and investing heavily in computing infrastructure.


Funding. A successful candidate will receive a scholarship of approximately BRL 10,000. This is almost double what most institutions in Brazil offer -- and more than enough to live comfortably.


How to apply? Just send me an email (diego.m...@fgv.br) with your CV, pointers to your favorite work, and a short description of your interests. Please add "post-doc" to the email subject. PhDs in, e.g., computer science, mathematics, statistics, and engineering are welcome. It is also fine if you have not graduated yet but have a graduation date in sight. I am committed to diversity and encourage folks from different backgrounds (e.g., nationality, sex, gender, and religion) to apply.


References

[1] Embarrassingly parallel MCMC using deep invertible transformations, UAI 2019

[2] Parallel MCMC without embarrassing failures, AISTATS 2022

[3] Learning GPLVM with arbitrary kernels using the unscented transformation, AISTATS 2021

[4] Rethinking pooling in graph neural networks, NeurIPS 2020

[5] Provably expressive temporal graph networks, NeurIPS 2022



See https://weakly-informative.github.io/ for more about me.


Best,

Diego Mesquita

contact: diego dot mesquita at fgv dot br

Assistant Prof., FGV EMAp
