Spark Standalone Mode
What happens if the resources required by a Spark cluster deployment are not available? By Spark cluster deployment I mean starting the master and workers for a single standalone cluster with certain resource requirements. Will the cluster start with whatever resources are currently available and acquire the rest later, or will it return an error?
These are a few cases I can think of (I am sure there are many more, but I could not find any information on this topic); a configuration sketch follows the list.
1. The cluster needs 4 nodes, but only 3 are available.
2. The deployment requires 3 workers with 2 cores each, but only 2 workers can get 2 cores; the third can only get one core at the moment.
3. The deployment requires 3 workers with 500 MB each, but only two workers can get 500 MB; the third cannot.
It would also help if you could provide a source for the answer.
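For concreteness, here is a minimal sketch (using Spark's Scala API) of how I would express the kind of per-executor core and memory requirements from cases 2 and 3; the master URL, app name, and exact numbers are placeholders rather than my actual setup:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical requirements matching cases 2 and 3: 2 cores and 500 MB per
// executor, 6 cores in total across the standalone cluster.
val conf = new SparkConf()
  .setAppName("resource-demo")            // placeholder app name
  .setMaster("spark://master-host:7077")  // placeholder standalone master URL
  .set("spark.executor.cores", "2")       // cores requested per executor
  .set("spark.executor.memory", "500m")   // memory requested per executor
  .set("spark.cores.max", "6")            // cap on total cores for the application

val sc = new SparkContext(conf)
// The question is what happens here if the cluster cannot satisfy all of the above.
```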