How to set the number of cores per executor in standalone cluster mode


shingh...@gmail.com

Feb 14, 2016, 4:01:32 PM2/14/16
to spark-notebook-user
Hi,
   I am using
    spark-notebook-0.6.2-scala-2.10.4-spark-1.5.2-hadoop-2.6.0-with-parquet

  I have created a notebook with the following metadata:

{
  "name": "testStandalone",
  "user_save_timestamp": "1970-01-01T01:00:00.000Z",
  "auto_save_timestamp": "1970-01-01T01:00:00.000Z",
  "language_info": {
    "name": "scala",
    "file_extension": "scala",
    "codemirror_mode": "text/x-scala"
  },
  "trusted": true,
  "customLocalRepo": null,
  "customRepos": null,
  "customDeps": null,
  "customImports": null,
  "customArgs": null,
  "customSparkConf": {
    "spark.app.name": "Notebook",
    "spark.master": "spark://gauss:7077",
    "spark.executor.memory": "1G",
    "spark.deploy.defaultCores": "4"
  },
  "kernelspec": {
    "name": "spark",
    "display_name": "Scala [2.10.4] Spark [1.5.2] Hadoop [2.6.0]   {Parquet ✓}"
  }
}
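
In case it is relevant: from the Spark standalone-mode documentation, my understanding is that `spark.deploy.defaultCores` is a master-side setting (a cluster-wide default applied via the master's configuration), not a per-application property, which may be why it has no effect here. The per-application keys appear to be `spark.executor.cores` (cores per executor) and `spark.cores.max` (total cores the application may claim). This is the `customSparkConf` variant I would have expected to work, though it is untested on my side:

```json
{
  "customSparkConf": {
    "spark.app.name": "Notebook",
    "spark.master": "spark://gauss:7077",
    "spark.executor.memory": "1G",
    "spark.executor.cores": "2",
    "spark.cores.max": "4"
  }
}
```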

My notebook can connect to the standalone cluster at spark://gauss:7077,
but the number of cores shown on the Spark dashboard is 0. Please see the attached screenshot.

May I know how to set the number of cores per executor?

Thanks in advance for your assistance !

Shing

Attachment: sparkDashboard.png