What is the minimum hardware infrastructure required for Spark to run in standalone cluster mode?


I am running Spark in standalone cluster mode on my local computer. This is the computer's hardware information:

Intel Core i5
Number of processors: 1; total number of cores: 2
Memory: 4 GB

I am trying to run a Spark program from Eclipse on the Spark standalone cluster. This is part of the code:

    String logFile = "/users/bigdinosaur/downloads/spark-2.0.1-bin-hadoop2.7 2/readme.md";
    SparkConf conf = new SparkConf().setAppName("Simple Application").setMaster("spark://bigdinosaur.local:7077");

After running the program in Eclipse, I get the following warning message:

Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Here is a screenshot of the web UI.

After going through other people's answers to similar problems, it seems that a hardware resource mismatch is the root cause.

I want more information on the following:

What minimum hardware infrastructure is required for a Spark standalone cluster to run an application?

As far as I know, Spark allocates whatever memory is available when the Spark job starts.

You may want to try explicitly providing the number of cores and the executor memory when starting the job, as in the sketch below.
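
For example, a minimal sketch of setting these values through SparkConf. The master URL is taken from the question; the specific core and memory numbers are illustrative assumptions sized for a 2-core / 4 GB machine, not recommendations:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SimpleApp {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("Simple Application")
                    .setMaster("spark://bigdinosaur.local:7077")
                    // Cap the total number of cores this application may claim
                    // from the standalone cluster (illustrative value).
                    .set("spark.cores.max", "2")
                    // Request executors small enough to fit on a 4 GB worker;
                    // the worker must have been started with at least this much memory.
                    .set("spark.executor.memory", "1g");

            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... run the job here ...
            sc.stop();
        }
    }

The same settings can also be passed on the command line when submitting with spark-submit (e.g. --total-executor-cores 2 --executor-memory 1g) or set as defaults in conf/spark-defaults.conf. If the application asks for more cores or memory than the registered workers offer, the "Initial job has not accepted any resources" warning appears and the job never starts.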

