System.setProperty("user.name", "root")
val spark = new SparkContext("spark://miluo1:7077", "Spark Pi", "/usr/spark-1.3.1")
spark.addJar("C:\\Users\\root\\Desktop\\io.jar")
val sc = spark.textFile("file:/root/2txt")
var sss = sc.first()
println(sss)
spark.stop()

The code above is what I run directly from Eclipse (with the Scala plugin) on Windows, so I suppose it counts as remote submission. But it cannot read the file. If I replace spark://miluo1:7077 with local (local mode), everything works fine.
Here are the errors:
1. Error reported in Eclipse:
15/04/29 10:45:59 INFO SparkContext: Created broadcast 0 from textFile at SparkJava.java:21
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/root/2txt
at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:285)
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:203)

2. Error in the logs (on the worker node):
15/04/29 10:23:49 ERROR FileAppender: Error writing stream to file /usr/spark-1.3.1/work/app-20150429102347-0046/0/stderr
java.io.IOException: Stream closed
at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at java.io.FilterInputStream.read(FilterInputStream.java:107)
at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1618)
at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
15/04/29 10:23:49 INFO Worker: Executor app-20150429102347-0046/0 finished with state KILLED exitStatus 143
15/04/29 10:23:49 INFO Worker: Cleaning up local directories for application app-20150429102347-0046

Has anyone done remote submission like this? I previously got Hadoop remote job submission working and integrated it with a web project. I'd appreciate any pointers.
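A likely cause of the InvalidInputException: with a `spark://` master, the driver still runs on the submitting machine (the Windows box), and `textFile` checks the input path on the driver when computing splits (`HadoopRDD.getPartitions`, as in the stack trace above). A `file:` URI is therefore resolved against the Windows local filesystem, where `/root/2txt` does not exist; in `local` mode the driver and the file happen to be on the same machine, which is why it works there. A common fix is to put the file on HDFS so that the driver and every executor see the same path. A minimal sketch, assuming the cluster has HDFS running and the namenode address `miluo1:9000` is a placeholder to be replaced with your own:

```scala
import org.apache.spark.SparkContext

object RemoteRead {
  def main(args: Array[String]): Unit = {
    val spark = new SparkContext("spark://miluo1:7077", "Spark Pi", "/usr/spark-1.3.1")
    spark.addJar("C:\\Users\\root\\Desktop\\io.jar")

    // An HDFS path is visible to the driver and to every executor alike,
    // unlike file: URIs, which each JVM resolves against its own local disk.
    // The namenode address (miluo1:9000) is an assumption -- use your own.
    val lines = spark.textFile("hdfs://miluo1:9000/user/root/2txt")
    println(lines.first())

    spark.stop()
  }
}
```

Upload the file first on the cluster, e.g. `hadoop fs -put /root/2txt /user/root/2txt`. Alternatively, if you want to keep the `file:` URI, the file must exist at the same path on the driver machine and on every worker node.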