While setting up a Spark cluster: after the cluster starts, running jps on a node initially shows both the Master and Worker processes, but when I run jps again about a minute later, both processes are gone. The log reads:
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /usr/lib/java/jdk1.7.0_71/bin/java -cp ::/usr/local/spark/spark-1.0.0-bin-hadoop1/conf:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/datanucleus-rdbms-3.2.1.jar:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/datanucleus-api-jdo-3.2.1.jar:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/datanucleus-core-3.2.2.jar:/usr/local/hadoop/hadoop-1.2.1/conf -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip 192.168.81.128 --port 7077 --webui-port 8080
========================================
14/12/03 11:34:55 INFO spark.SecurityManager: Changing view acls to: root
14/12/03 11:34:55 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root)
14/12/03 11:35:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/12/03 11:35:48 INFO Remoting: Starting remoting
Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at akka.remote.Remoting.start(Remoting.scala:173)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
at org.apache.spark.deploy.master.Master$.startSystemAndActor(Master.scala:785)
at org.apache.spark.deploy.master.Master$.main(Master.scala:765)
at org.apache.spark.deploy.master.Master.main(Master.scala)
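For reference, conf/spark-env.sh on the master node looks roughly like this (a sketch pieced together from the launch command shown above, so treat the exact contents as approximate):

# conf/spark-env.sh (sketch; values inferred from the launch command above)
export JAVA_HOME=/usr/lib/java/jdk1.7.0_71
export SPARK_MASTER_IP=192.168.81.128          # master bind address (--ip in the command)
export SPARK_MASTER_PORT=7077                  # --port in the command
export SPARK_MASTER_WEBUI_PORT=8080            # --webui-port in the command
export SPARK_DAEMON_MEMORY=512m                # matches -Xms512m -Xmx512m in the command
export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-1.2.1/conf   # appears on the classpath above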
Any help would be greatly appreciated.