Uncle's Experience Sharing (14): spark-submit process keeps waiting after a Spark on YARN job is submitted to the cluster
2023-09-14 09:00:08
When a Spark on YARN job is submitted with --deploy-mode cluster, the application starts running on YARN, but the spark-submit process stays alive until the application finishes, and only then does it exit. This can be inconvenient, and if you are not careful it can tie up a lot of resources, for example when submitting Spark Streaming applications.
I recently found a Spark configuration that changes this behavior; just add one conf when submitting the job:
`--conf spark.yarn.submit.waitAppCompletion=false`
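For example, a full submission might look like the following (the class name and jar are placeholders, not from the original post):

```shell
# Submit in cluster mode; with waitAppCompletion=false the launcher
# process exits as soon as YARN accepts the application, instead of
# blocking until the job finishes.
# (com.example.MyStreamingApp and my-streaming-app.jar are placeholders.)
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.submit.waitAppCompletion=false \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```

With this flag set, spark-submit returns right after the application is accepted by YARN; you can then track its status with `yarn application -list` or the ResourceManager UI.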
The configuration is defined in org.apache.spark.deploy.yarn.config:

```scala
private[spark] val WAIT_FOR_APP_COMPLETION = ConfigBuilder("spark.yarn.submit.waitAppCompletion")
  .doc("In cluster mode, whether to wait for the application to finish before exiting the " +
    "launcher process.")
  .booleanConf
  .createWithDefault(true)
```