It runs fine in local mode, but once deployed to the cluster it fails with the following error:
Exception in thread "main" java.lang.NullPointerException
at com.quell.spark.pdfData$.main(pdfData.scala:27)
at com.quell.spark.pdfData.main(pdfData.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The relevant part of the program is below. The error points at this line: while (i < tempList.length). I can't spot the problem myself; any help from the experts would be appreciated.
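For context (an assumption, since the full program isn't shown): an NPE at `tempList.length` in the driver's `main` usually means `tempList` itself is null, and a common cause is listing a local directory with `java.io.File.listFiles`, which returns null (not an empty array) when the path doesn't exist on the machine running the driver. That works during local development and breaks on the cluster, where the path is absent. A minimal sketch of this failure mode, with a guard (the directory name is hypothetical):

```scala
import java.io.File

// Hypothetical reproduction: File.listFiles returns null, not an empty
// array, when the directory does not exist, e.g. on a cluster node that
// lacks a local path used during development.
def countEntries(dir: String): Int = {
  val tempList = new File(dir).listFiles() // null if `dir` is missing
  if (tempList == null) 0                  // guard that prevents the NPE
  else tempList.length                     // would throw NPE without it
}
```

If this is the cause, the real fix is to read the data through the Hadoop FileSystem API (or `sc.textFile`) rather than local file I/O, so the same path resolves wherever the driver runs.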
Solutions »
I think it's something like this:
// List via the Hadoop FileSystem API, which resolves paths on the cluster:
val fs = fileSystem.listStatus(new Path(path))
val listPath = FileUtil.stat2Paths(fs)
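Expanding the snippet above into a self-contained sketch (the `path` argument and the HDFS URI are assumptions; adjust them to your setup). Unlike `java.io.File.listFiles`, `FileSystem.listStatus` throws a `FileNotFoundException` for a missing path instead of silently returning null, so the error surfaces where it happens rather than as a later NPE:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, FileUtil, Path}

object ListHdfsPaths {
  def main(args: Array[String]): Unit = {
    val path = args(0) // e.g. "hdfs:///user/quell/data" -- an assumption

    // Resolve the FileSystem from the path's URI so the same code works
    // for hdfs://, file://, etc.
    val conf = new Configuration()
    val fileSystem = FileSystem.get(new Path(path).toUri, conf)

    // Throws FileNotFoundException if the path does not exist,
    // rather than returning null like java.io.File.listFiles.
    val fs = fileSystem.listStatus(new Path(path))
    val listPath = FileUtil.stat2Paths(fs)

    listPath.foreach(p => println(p.toString))
  }
}
```

This sketch needs the `hadoop-common` dependency on the classpath and a reachable filesystem, so it is meant as a shape to follow, not something runnable in isolation.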