Competition winners

João Lages

May 9, 2017, 8:37:20 AM5/9/17
to STS SemEval
Hi there, I am a student doing my thesis on "Computational Fact Checking", and I have been trying to find the best techniques to determine whether two texts are semantically related.
This task addresses the same problem, although I am only working in English for now.

Is there any way I can find out what kinds of algorithms/features the winners of this competition used? Especially in the EN-EN task.

Best regards,
João Lages

Daniel Cer

Jul 6, 2017, 10:43:16 PM7/6/17
to STS SemEval
Hi João,

For the best-performing methods on the 2017 STS shared task, see the task description paper: SemEval-2017 Task 1: Semantic Textual Similarity Multilingual and Cross-lingual Focused Evaluation, to appear in Proceedings of the 11th International Workshop on Semantic Evaluations (SemEval-2017).

Section 6.6, Methods, provides a hopefully useful summary of the techniques used by top-performing systems. Additionally, all of the state-of-the-art baseline semantic similarity models from section 8, STS Benchmark, are now open source. Links to the open source implementations are provided in the paper for everything except InferSent, which was open sourced after the preparation of the task paper (InferSent on GitHub).
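[Editor's note: not part of the original thread.] For readers who want a concrete starting point before digging into the task paper, the simplest STS baselines score sentence pairs with a lexical-overlap measure such as cosine similarity over bag-of-words vectors. The sketch below (pure Python, function name is illustrative) shows that idea; the systems summarized in section 6.6 layer far richer features and learned models on top of signals like this.

```python
from collections import Counter
import math

def bow_cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two texts.

    A deliberately simple lexical baseline for semantic textual
    similarity: it captures word overlap only, not meaning, so
    paraphrases with little shared vocabulary score near zero.
    """
    # Build term-frequency vectors from lowercased whitespace tokens.
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    # Dot product over the shared vocabulary.
    dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
    # Product of the two Euclidean norms.
    norm = math.sqrt(sum(c * c for c in va.values())) \
         * math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0
```

The STS Benchmark baselines in section 8 essentially replace these sparse count vectors with dense sentence embeddings (e.g. averaged word vectors, or InferSent encodings) before taking the cosine, which is what lets them score paraphrases that share few surface words.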
