apache spark - How to configure automatic restart of the application driver on Yarn


From the Spark programming guide:

To automatically recover from a driver failure, the deployment infrastructure that is used to run the streaming application must monitor the driver process and relaunch the driver if it fails. Different cluster managers have different tools to achieve this.

  • Spark Standalone - A Spark application driver can be submitted to run within the Spark Standalone cluster (see cluster deploy mode), that is, the application driver itself runs on one of the worker nodes. Furthermore, the Standalone cluster manager can be instructed to supervise the driver, and relaunch it if the driver fails either due to a non-zero exit code, or due to failure of the node running the driver. See cluster mode and supervise in the Spark Standalone guide for more details (a submission sketch follows this list).
  • YARN - YARN supports a similar mechanism for automatically restarting an application. Please refer to the YARN documentation for more details. ....
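As an illustration of the Standalone case, a minimal spark-submit invocation might look like the sketch below; the master URL, class name, and jar path are hypothetical placeholders, not taken from the guide.

    # Submit the driver into a Spark Standalone cluster (cluster deploy mode)
    # and ask the cluster manager to supervise it, i.e. relaunch it on failure.
    # spark://master-host:7077, com.example.MyStreamingApp and the jar path
    # are placeholders for your own cluster and application.
    spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode cluster \
      --supervise \
      --class com.example.MyStreamingApp \
      /path/to/my-streaming-app.jar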

    So, the question is how to support auto-restart of Spark Streaming on YARN.

Best regards,

Tao

What you are looking for is the set of instructions to launch your application in YARN "cluster mode": https://spark.apache.org/docs/latest/running-on-yarn.html

This means your driver application runs on the cluster on YARN (not on your local machine). As such, it can be restarted by YARN if it fails.
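As a rough sketch (the class name, jar path, and attempt count below are placeholders, not part of the original answer), a YARN cluster-mode submission looks like this; spark.yarn.maxAppAttempts bounds how many times YARN will re-attempt the application, and is itself capped by YARN's own yarn.resourcemanager.am.max-attempts setting.

    # Run the driver inside the YARN application master (cluster deploy mode),
    # so YARN can restart it if the driver or its node fails.
    # com.example.MyStreamingApp and the jar path are hypothetical placeholders.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.yarn.maxAppAttempts=4 \
      --class com.example.MyStreamingApp \
      /path/to/my-streaming-app.jar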

