Wikipedia Dump filtering, preprocessing and TF-IDF creation.
Karsten
Aug 31, 2012, 9:39:52 AM
to gen...@googlegroups.com
Hi,
I am about to implement Explicit Semantic Analysis (http://www.cs.technion.ac.il/~gabr/resources/code/esa/esa.html) for gensim for my master's thesis. There is already a Python implementation, but gensim's structure is more flexible and would allow TF-IDF, LSI or LDA transformations of Wikipedia as input.
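For the TF-IDF side, gensim's WikiCorpus should cover that part; roughly something like the sketch below (the dump and output filenames are placeholders, and building the corpus from the full English dump takes several hours):

    from gensim.corpora import WikiCorpus, MmCorpus
    from gensim.models import TfidfModel

    # parse the compressed dump into a streamed bag-of-words corpus
    # (filename is a placeholder -- point it at whatever dump you downloaded)
    wiki = WikiCorpus('enwiki-latest-pages-articles.xml.bz2')

    # serialize the bag-of-words vectors so the dump only has to be parsed once
    MmCorpus.serialize('wiki_bow.mm', wiki)
    wiki.dictionary.save_as_text('wiki_wordids.txt')

    # fit TF-IDF weights on the serialized corpus
    bow = MmCorpus('wiki_bow.mm')
    tfidf = TfidfModel(bow, id2word=wiki.dictionary)
    tfidf.save('wiki_tfidf.model')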
Unfortunately, I am having some problems filtering the Wikipedia dump to reduce the number of articles based on their inter-article references. There is a Perl preprocessor called Wikiprep (http://sourceforge.net/apps/mediawiki/wikiprep/index.php?title=Main_Page), but it is no longer maintained and crashes on my machine. I am looking into it.
So does anybody know of a program that takes the Wikipedia dump as input, resolves templates, and produces extra output such as inter-article references?
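A stopgap might be to pull the outgoing [[wiki link]] targets straight out of the dump XML and filter articles on those counts, as in the rough sketch below. The namespace URI, filenames and the minimum-link threshold are assumptions, and unlike Wikiprep this does not resolve templates at all:

    import bz2
    import re
    from xml.etree import cElementTree

    LINK_RE = re.compile(r'\[\[([^|\]#]+)')  # target part of a [[wiki link]]
    NS = '{http://www.mediawiki.org/xml/export-0.6/}'  # adjust to the dump's schema version

    def iter_pages(dump_path):
        """Yield (title, wikitext) for each <page> in a pages-articles dump."""
        with bz2.BZ2File(dump_path) as handle:
            for _, elem in cElementTree.iterparse(handle):
                if elem.tag == NS + 'page':
                    title = elem.findtext(NS + 'title')
                    text = elem.findtext(NS + 'revision/' + NS + 'text') or ''
                    yield title, text
                    elem.clear()  # free memory while streaming

    # count distinct outgoing links per article ...
    out_links = {}
    for title, text in iter_pages('enwiki-latest-pages-articles.xml.bz2'):
        out_links[title] = len(set(LINK_RE.findall(text)))

    # ... and keep only articles that reference at least 5 others (threshold is a guess)
    keep = set(t for t, n in out_links.items() if n >= 5)

That only gives outgoing references, of course; incoming counts would need a second pass that inverts the link map.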
Thx a lot,
Karsten
Radim Řehůřek
Aug 31, 2012, 11:58:06 AM