Hello team,
I am running a Spark Streaming job on Dataproc. My cluster has 1 master and 4 worker nodes:
master -> 2 vCores and 7.5 GB memory
worker -> 8 vCores and 64 GB memory per node
I am submitting the Spark job in cluster mode with 16 executors and 11000 MB of memory per executor. The cluster is allowing me to use 192 GB of memory (including overhead) out of the 256 GB available across the workers.
So my question is: where is the remaining 64 GB being used, and if I want to make use of that 64 GB, how can I do that? yarn.scheduler.minimum-allocation-mb is 1024 MB and yarn.scheduler.maximum-allocation-mb is 49152 MB.
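For context, here is how I believe the 192 GB figure arises. This is a rough sketch, assuming the default Spark-on-YARN memory overhead of max(384 MB, 10% of executor memory) and YARN rounding each container up to a multiple of yarn.scheduler.minimum-allocation-mb:

```python
import math

# Assumptions (not confirmed from my cluster's config, just defaults):
# - overhead = max(384 MB, 10% of spark.executor.memory)
# - YARN rounds each container up to a multiple of
#   yarn.scheduler.minimum-allocation-mb
executor_memory_mb = 11000
min_allocation_mb = 1024
num_executors = 16

overhead_mb = max(384, int(0.10 * executor_memory_mb))  # 1100 MB
requested_mb = executor_memory_mb + overhead_mb          # 12100 MB

# Round the request up to the next multiple of the minimum allocation.
container_mb = math.ceil(requested_mb / min_allocation_mb) * min_allocation_mb

print(container_mb)                  # 12288 MB = 12 GB per container
print(num_executors * container_mb)  # 196608 MB = 192 GB total
```

So each executor effectively occupies a 12 GB container, and 16 of them account for exactly the 192 GB the cluster reports, which makes me suspect YARN itself is only exposing 48 GB per worker node rather than the full 64 GB.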
Thanks and regards,
Nagin Narbag.