Fixing the Spark on YARN error: File /tmp/hadoop-root/nm-local-dir/filecache does not exist
2023-09-14 09:02:04
During testing, an error similar to the following occurred (in the path /tmp/hadoop-root/, "root" is the username running the job):

Application application_xxxxxxxxx_yyyy failed 2 times due to AM Container for application_xxxxxxxxx_yyyy
exited with exitCode: -1000 due to: java.io.FileNotFoundException: File /tmp/hadoop-root/nm-local-dir/filecache does not exist
Creating a filecache folder under /tmp/hadoop-root/nm-local-dir resolves the error:
mkdir /tmp/hadoop-root/nm-local-dir/filecache
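Since the missing directory sits under the NodeManager's local dir (by default derived from hadoop.tmp.dir, i.e. /tmp/hadoop-${USER}), it must exist on every NodeManager node, and it can disappear again whenever /tmp is cleaned up. A minimal sketch of the fix, assuming the default local-dir layout shown in the error message:

```shell
# Recreate the NodeManager file cache directory.
# Run this on EVERY NodeManager node, as the user that runs the NodeManager.
NM_LOCAL_DIR=/tmp/hadoop-root/nm-local-dir   # adjust if yarn.nodemanager.local-dirs is customized

# -p creates parent directories as needed and is a no-op if the path exists
mkdir -p "${NM_LOCAL_DIR}/filecache"

# The NodeManager user needs read/write/execute on the cache directory
chmod 755 "${NM_LOCAL_DIR}/filecache"
```

A more durable option is to point yarn.nodemanager.local-dirs (in yarn-site.xml) at a path outside /tmp, so periodic /tmp cleanup cannot delete the cache again.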
Reference: https://www.cnblogs.com/luogankun/p/4191796.html