How can I scale my Spark cluster to test my deep learning distribution performance?

Hamza Saaidia

Apr 9, 2023, 8:13:53 PM
to User Group for BigDL
Hi team, 

I am studying how distribution affects the performance of a deep learning algorithm. I am currently using only three machines, but I want to scale my Hadoop/Spark/BigDL cluster to many nodes, and I wonder whether there are efficient cloud providers (free or low-cost) for scaling such a cluster.

Thanks in advance,

glor...@gmail.com

Apr 9, 2023, 9:51:18 PM
to User Group for BigDL
Hi, you can use virtualization frameworks such as OpenStack, libvirt, etc. to create many VMs on your 3 machines, or you can deploy Kubernetes on your 3 machines.
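
For example, here is a rough sketch of what the Kubernetes route could look like with BigDL's Orca API. The API server address, container image, and resource sizes below are placeholders to replace with your own, and I'm assuming a recent BigDL 2.x install:

from bigdl.orca import init_orca_context, stop_orca_context

# Connect to an existing Kubernetes cluster in client mode and request
# one Spark executor per node you want in the experiment.
sc = init_orca_context(
    cluster_mode="k8s-client",                          # driver runs locally, executors on k8s
    master="k8s://https://<k8s-apiserver-host>:6443",   # placeholder API server address
    container_image="intelanalytics/bigdl-k8s:latest",  # placeholder BigDL image
    num_nodes=20,   # vary this between runs to measure the distribution effect
    cores=4,        # cores per executor
    memory="8g",    # memory per executor
)

# ... run your distributed BigDL training here ...

stop_orca_context()

Varying num_nodes between runs then gives you the scaling curve you are after without changing the training code itself.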

Hamza Saaidia

Apr 10, 2023, 12:06:36 AM
to User Group for BigDL
Thank you for answering my question.

Actually, I mean that my hardware is not enough for the tests I want to run. I want to scale my cluster to 20-50 nodes, for example, so I am looking for a cloud provider (free or low-cost) where I can create a larger cluster and run my code on it.

Hamza,