Question: I installed Spark 1.5.1, but it fails on startup. Could anyone help me figure this out? Error output:
[root@Master sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
failed to launch org.apache.spark.deploy.master.Master:
/usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../bin/spark-class: line 78: cygpath: command not found
Error: Could not find or load main class org.apache.spark.deploy.master.Master
full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
Slave2.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Master.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Master.Hadoop.out
Slave1.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
Slave2.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave2.Hadoop: Error: Could not find or load main class org.apache.spark.launcher.Main
Slave2.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Master.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Master.Hadoop: /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../bin/spark-class: line 78: cygpath: command not found
Master.Hadoop: Error: Could not find or load main class org.apache.spark.deploy.worker.Worker
Master.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Master.Hadoop.out
Slave1.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave1.Hadoop: Error: Could not find or load main class org.apache.spark.launcher.Main
Slave1.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out

Environment:
export JAVA_HOME=/usr/jdk1.7.0_05
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export JAVA_OPTS="-Xms512M -Xmx1024M"
export CATALINA_OPTS="-Djava.awt.headless=true"
HADOOP_HOME=/opt/spark/hadoop-2.7.1
SCALA_HOME=/usr/local/scala/scala-2.11.7
SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
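On Linux, `cygpath: command not found` from `bin/spark-class` usually means the scripts picked up Windows CRLF line endings somewhere in transfer, which trips the script's Cygwin handling. This is a guess from the symptom, not confirmed by the log; a minimal sketch of detecting and stripping the carriage returns (demonstrated on a throwaway file, since the real paths can't be checked here):

```shell
# Sketch: detect and strip Windows CRLF line endings, a common cause of
# "cygpath: command not found" when Spark scripts pass through Windows.
# Demo on a throwaway file; on the cluster, run the same sed over
# "$SPARK_HOME"/bin/* and "$SPARK_HOME"/sbin/* instead.
tmpfile=$(mktemp)
printf '#!/bin/bash\r\necho hello\r\n' > "$tmpfile"
echo "CR count before: $(grep -c $'\r' "$tmpfile" || true)"
sed -i 's/\r$//' "$tmpfile"        # dos2unix works too, if it is installed
echo "CR count after:  $(grep -c $'\r' "$tmpfile" || true)"
rm -f "$tmpfile"
```

If `grep -c $'\r'` reports a nonzero count for any script under `bin/` or `sbin/`, convert that file on every node, then rerun `./start-all.sh`.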
Solution »
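Not a confirmed fix, but the worker-side `Could not find or load main class org.apache.spark.launcher.Main` typically means that node's `$SPARK_HOME/lib/spark-assembly-*.jar` is missing, or that `SPARK_HOME` resolves to a different path on that node. A hedged sketch of the check (the helper name and demo layout are hypothetical; run it against the real `$SPARK_HOME` on the master and on each slave):

```shell
# Sketch: verify that each node has the assembly jar spark-class needs.
# In the Spark 1.5.x binary distribution it lives under $SPARK_HOME/lib.
check_spark_home() {
  if ls "$1"/lib/spark-assembly-*.jar >/dev/null 2>&1; then
    echo "OK: assembly jar present under $1/lib"
  else
    echo "MISSING: no spark-assembly jar under $1/lib"
  fi
}

# Demo against a throwaway layout; on the cluster, call it with the real
# /usr/local/spark/spark-1.5.1-bin-hadoop2.6 path instead.
demo=$(mktemp -d)
mkdir -p "$demo/lib"
check_spark_home "$demo"     # MISSING: jar not created yet
touch "$demo/lib/spark-assembly-1.5.1-hadoop2.6.0.jar"
check_spark_home "$demo"     # OK
rm -rf "$demo"
```

If the jar is present everywhere, also confirm `SPARK_HOME` is the same path on every slave, since `start-all.sh` launches the workers over SSH using the master's paths.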