Spark Users (Google Groups)
Conversations 1–30 of 1676

Matei Zaharia · 1/3/14 · Announcement (locked)
IMPORTANT: Spark mailing lists have moved to Apache
As part of Spark's move to Apache, we have now made this Google Group read-only. To keep …

seanm, …, Tathagata Das · 8 messages · 1/3/14
Exception in thread "DAGScheduler" java.lang.IllegalArgumentException: Shuffle ID 71056 registered twice
Does it stop with an error? Also after how long? Can you give an idea of the different values of …

egraldlo · 1/3/14
examples do not build successfully
hi, I use Maven to build Spark 0.7.2, and I successfully built spark-core, streaming, and bagel. But …

ngoc linh, Tathagata Das · 5 messages · 1/3/14
Spark streaming, sliding window example and explanation
I got it all, your posts are really helpful. Thanks very much. On Friday, January 3, 2014 4:34:21 PM …

Archit Thakur · 2 messages · 1/3/14
Issue with sortByKey.
I saw the code of sortByKey: def sortByKey(ascending: Boolean = true, numPartitions: Int = self.…

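For context, sortByKey comes from Spark's pair-RDD API. A minimal usage sketch, assuming a Spark 0.8-era Scala setup (exact default arguments vary by version):

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._  // implicit conversions that add sortByKey to pair RDDs

    object SortByKeyExample {
      def main(args: Array[String]) {
        val sc = new SparkContext("local", "SortByKeyExample")
        val pairs = sc.parallelize(Seq(("b", 2), ("a", 1), ("c", 3)))
        // Ascending by key is the default; pass false for descending,
        // and an explicit partition count as the second argument.
        println(pairs.sortByKey().collect().mkString(", "))
        println(pairs.sortByKey(false, 2).collect().mkString(", "))
        sc.stop()
      }
    }
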
Damien Dubé, Jey Kottalam · 3 messages · 1/2/14
Unable to connect spark 0.8.1 (built for hadoop 2.2.0) to mesos 0.14.2
I've tried building spark with mesos 0.14.2 and I have the exact same error. Stack: [ …

尹志军 · 1/2/14
No Response when running program on Spark on EC2 Cluster
Hi, I followed instructions in the document of 0.8.1 to run a program on EC2 cluster. 1. I can setup …

Richard Conway · 1/2/14
HDFS, Hadoop 2.2 and Spark error
Hi all, Wonder if someone can point me in the right direction ... I'm using the 0.8.1 binaries …

Archit Thakur · 2 messages · 1/2/14
Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
Need not mention Workers could be seen on the UI. On Thu, Jan 2, 2014 at 5:01 PM, Archit Thakur <…

Archit Thakur, Christopher Nguyen · 3 messages · 1/2/14
Not able to understand Exception.
Yes, I am using My Custom Data Structures (for Key and Value) and have registered different …

Archit Thakur · 3 messages · 12/30/13
NPE while reading broadcast variable.
I am still getting it. I googled and found a similar open problem on stackoverflow: http://…

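For readers landing on this thread, the standard broadcast pattern being debugged looks like the sketch below (Spark 0.8-era API; the lookup table and its values are invented for illustration):

    import org.apache.spark.SparkContext

    object BroadcastExample {
      def main(args: Array[String]) {
        val sc = new SparkContext("local", "BroadcastExample")
        // Ship a read-only lookup table to every worker once.
        val lookup = sc.broadcast(Map(1 -> "one", 2 -> "two"))
        // Inside closures, always reach the broadcast data through .value.
        val named = sc.parallelize(Seq(1, 2, 3)).map(x => lookup.value.getOrElse(x, "?"))
        named.collect().foreach(println)
        sc.stop()
      }
    }
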
Izhar ul Hassan · 12/29/13
running sparkPi example on spark-0.8.1 with yarn 2.2.0
Command: SPARK_JAR=./assembly/target/scala-2.9.3/spark-assembly-0.8.1-incubating-hadoop2.2.0.jar \ ./…

Izhar ul Hassan, Patrick Wendell · 3 messages · 12/29/13
Errors with Spark-0.8.1 and hadoop-yarn 2.2.0
Hi, Sorry that was my mistake. I corrected it. On Friday, December 27, 2013 6:52:20 PM UTC+1, Patrick …

Scott Langevin, …, Max · 5 messages · 12/27/13
Trouble deploying Java Standalone Spark Job
What is the difference between including my app (dependencies) jars in SPARK_CLASSPATH and putting …

zxz · 12/26/13
Spark: what is yarn-client mode exactly?
Apache Spark recently updated the version to 0.8.1, in which yarn-client mode is available. My …

Xiaozhe Wang, Cheng Lian · 2 messages · 12/26/13
How can we print some output in the map function
Hi Kaden. Are you running your application on a Spark cluster? If so, `f' is executed by the …

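A small sketch of the behavior described in the reply: the function passed to map runs on the workers, so println output lands in each executor's stdout log rather than on the driver console (values are illustrative):

    import org.apache.spark.SparkContext

    object PrintInMapExample {
      def main(args: Array[String]) {
        val sc = new SparkContext("local", "PrintInMapExample")
        val doubled = sc.parallelize(1 to 4).map { x =>
          // On a real cluster this executes in the worker JVMs, so the
          // output appears in each worker's stdout log, not on the driver.
          println("mapping " + x)
          x * 2
        }
        // To see results on the driver, collect them back first.
        doubled.collect().foreach(println)
        sc.stop()
      }
    }
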
ngoc linh · 12/25/13
Fail to bind address when creating SparkContext
Hi, I'm deploying Spark on YARN. Spark is 0.8.1, Hadoop is 2.2.0. But when I run the driver program, it …

luanxue...@gmail.com, Kay Ousterhout · 3 messages · 12/25/13
spark worker web UI
To configure the web UI port on the worker, you need to set SPARK_WORKER_WEBUI_PORT appropriately in …

zxz · 2 messages · 12/24/13
run spark on YARN with yarn-client mode, got an error: SparkContext: Master yarn-client does not match expected format
Never mind, I found out I followed the wrong version of documentation. yarn-client mode is …

Matei Zaharia, …, Patrick Wendell · 34 messages · 12/24/13
IMPORTANT: Spark mailing lists moving to Apache by September 1st
Hey Andy - these Nabble groups look great! Thanks for setting them up. On Tue, Dec 24, 2013 at 10:49 …

mingle · 12/24/13
How can I achieve the elasticsearch and shark(spark) Integration
I have used the elasticsearch-hadoop successfully, mapreduce run over elasticsearch instead of hdfs …

Debasish Das, …, Kevin Moulart · 14 messages · 12/23/13
Spark compilation with CDH 4.5.0
Hey thanks again, that's what I feared anyway, but I had hope they would assure …

Patrick Wendell · 12/23/13
Re: Failed To Launch spark-shell on EC2
Ya this can be confusing (the logging output gets interpolated with the shell). Just hitting " …

Archit Thakur, Gary Malouf · 2 messages · 12/23/13
ADD_JARS doubt!
I would not recommend putting your text files in via ADD_JARS. The better thing to do is to put those …

chi zhang · 12/21/13
Question about parallelism of saveAsNewHadoopFile
Hi, I'm testing my spark program recently, in which I have 4 RDD of different data, after a …

nazar buzun, Mark Hamstra · 3 messages · 12/21/13
What does this exception mean?
private var nodes: RDD[(Int, StaticNode)] = ....some code for init.... ....... val tmp = nodes nodes …

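A frequent cause of opaque exceptions in code of exactly this shape: referencing a class field such as nodes inside an RDD closure captures the whole enclosing object, which may not be serializable. Copying the field into a local val first (the val tmp = nodes visible in the post) avoids that. A hedged sketch of the idiom, with an invented stand-in for StaticNode:

    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    class NodeStore(sc: SparkContext) {  // hypothetical class, not from the thread
      private var nodes: RDD[(Int, String)] =
        sc.parallelize(Seq((1, "a"), (2, "b")))

      def relabel() {
        val tmp = nodes  // local copy: the closure below captures tmp, not `this`
        nodes = tmp.map { case (id, label) => (id, label.toUpperCase) }
      }
    }
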
Xicheng Dong, Josh Rosen · 2 messages · 12/20/13
could python program in spark run on hadoop-yarn?
It might work using the `yarn-client` mode (https://spark.incubator.apache.org/docs/latest/running-on…

richard...@gmail.com, Reynold Xin · 2 messages · 12/20/13
matrix streaming algorithm in Spark
Definitely you can stream it. You just need to do val rdd = sc.textFile("hdfs://....") rdd.…

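Reynold's truncated suggestion, expanded into a hedged sketch: read the matrix one row per line with textFile and process it in a single pass, so the full matrix is never materialized on one machine (the HDFS path and the space-separated row format are assumptions):

    import org.apache.spark.SparkContext

    object MatrixRowStream {
      def main(args: Array[String]) {
        val sc = new SparkContext("local", "MatrixRowStream")
        // One dense matrix row per line, space-separated (format assumed).
        val rows = sc.textFile("hdfs://namenode:9000/data/matrix.txt")  // hypothetical path
          .map(_.split(" ").map(_.toDouble))
        // Example single-pass computation: per-row sums.
        rows.map(_.sum).collect().foreach(println)
        sc.stop()
      }
    }
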
Zhitao Yan, Matei Zaharia · 4 messages · 12/20/13
java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.Client when running Spark on Yarn Sample
Thanks for the reply. The second issue has also been resolved, by rebuilding with the same version as our …