eval on test


amingf...@gmail.com

Apr 9, 2019, 2:26:00 AM
to Visual Commonsense Reasoning
Hello,

If I do not use your eval_for_leaderboard.py, can I submit two 'preds.npy' files?

Rowan Zellers

Apr 9, 2019, 12:30:06 PM
to Visual Commonsense Reasoning
No, sorry :( You don't need to use my script, but you do need to submit the prediction file in the specified format.

Yanan Wang

Sep 9, 2021, 2:42:39 AM
to Visual Commonsense Reasoning
Hi,

Could I submit predictions for the Q->A task only? We want to confirm our model's performance on the Q->A subtask first.

I look forward to your reply.

Best,
Yanan

On Wednesday, April 10, 2019 at 1:30:06 AM UTC+9, Rowan Zellers wrote:

Rowan Zellers

Sep 10, 2021, 2:03:07 PM
to Visual Commonsense Reasoning
Sure! Just make up dummy predictions for the rationale task and submit those. You'll then need to wait a week before submitting QA->R, per the rules. Note that it might be better to double-check that things are working on the validation set before submitting to test :)
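A minimal sketch of what "dummy rationale predictions" could look like. The exact column layout of `preds.npy` is whatever `eval_for_leaderboard.py` expects, so the shapes below (4 answer scores plus 4×4 = 16 answer-conditioned rationale scores per question, and the `num_questions` value) are assumptions for illustration, not the confirmed submission format:

```python
import numpy as np

# Hypothetical test-set size; the real leaderboard set is larger.
num_questions = 100

# Your real Q->A predictions would go here; random scores stand in for
# a model's output. Normalize each row into a probability distribution.
qa_preds = np.random.rand(num_questions, 4)
qa_preds /= qa_preds.sum(axis=1, keepdims=True)

# Dummy rationale predictions: a uniform distribution over the
# 4 answers x 4 rationales = 16 choices, so every rationale score
# is equally (un)informative.
rationale_preds = np.full((num_questions, 16), 1.0 / 16.0)

# Concatenate into one row per question and save in .npy format.
preds = np.concatenate([qa_preds, rationale_preds], axis=1)
np.save("preds.npy", preds)
```

Uniform scores keep the file well-formed while making it obvious the rationale entries carry no signal, so only the Q->A numbers are meaningful.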

thanks,
Rowan