How to do sparse vector product in Spark?


Xi Shen

Mar 13, 2015, 3:34:37 AM
to Shanghai Linux User Group
Hi,

I have two RDD[Vector]; both Vectors are sparse and of the form:

    (id, value)

"id" indicates the position of the value in the vector space. I want to apply dot product on two of such RDD[Vector] and get a scale value. The none exist values are treated as zero.

Is there a convenient tool to do this in Spark?
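One common way to do this with pair RDDs is an inner join on the id, followed by a map and a sum: ids missing from either side contribute zero, so only ids present in both vectors matter, which is exactly what an inner join produces. Below is a minimal pure-Python sketch of that logic (the function and variable names are mine, and the one-line Spark equivalent in the comment is an assumption, not tested here):

```python
# Sketch of a sparse dot product over two collections of (id, value) pairs.
# In Spark this would roughly be:
#   rdd1.join(rdd2).map(lambda kv: kv[1][0] * kv[1][1]).sum()
# Here we mimic the inner join with a plain dict to show the logic.

def sparse_dot(a, b):
    """Dot product of two sparse vectors given as iterables of (id, value)."""
    lookup = dict(a)                      # id -> value for the first vector
    return sum(v * lookup.get(i, 0.0)     # multiply only matching positions
               for i, v in b)             # iterate the second vector

v1 = [(0, 1.0), (3, 2.0), (7, 4.0)]
v2 = [(3, 5.0), (7, 0.5), (9, 8.0)]
print(sparse_dot(v1, v2))  # 2.0*5.0 + 4.0*0.5 = 12.0
```

Note that if either RDD could contain duplicate ids, you would want to `reduceByKey` (sum) each side first so the join sees one value per position.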


Thanks,
David

wormwang

Mar 13, 2015, 3:39:04 AM
to sh...@googlegroups.com
Consider switching to Greenplum DB or HAWQ instead; the latest versions can be downloaded from network.pivotal.io.

YiZhi Liu

Mar 18, 2015, 10:23:05 AM
to shlug
Try the Vector classes in spark/mllib (org.apache.spark.mllib.linalg).
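MLlib's sparse vector representation stores a vector size plus two parallel arrays of sorted indices and values. The dot product then reduces to a two-pointer merge over the sorted index arrays. A pure-Python sketch of that merge (the function name and sample data are mine; this illustrates the idea rather than MLlib's actual implementation):

```python
# Dot product of two sparse vectors in (sorted indices, values) form,
# the layout MLlib's SparseVector uses. We walk both index arrays in
# lockstep and multiply values only where the indices coincide.

def sparse_dot(idx_a, val_a, idx_b, val_b):
    i = j = 0
    total = 0.0
    while i < len(idx_a) and j < len(idx_b):
        if idx_a[i] == idx_b[j]:          # position present in both vectors
            total += val_a[i] * val_b[j]
            i += 1
            j += 1
        elif idx_a[i] < idx_b[j]:         # position only in the first vector
            i += 1
        else:                             # position only in the second vector
            j += 1
    return total

print(sparse_dot([0, 3, 7], [1.0, 2.0, 4.0],
                 [3, 7, 9], [5.0, 0.5, 8.0]))  # 2.0*5.0 + 4.0*0.5 = 12.0
```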

2015-03-13 15:43 GMT+08:00 'wormwang' via Shanghai Linux User Group
<sh...@googlegroups.com>:
--
Yizhi Liu
Software Engineer / Data Mining
www.mvad.com, Shanghai, China