Spark only starts the Master; the Workers won't start

Posted: 2017-05-18 00:16:28

There are four virtual machines: one master and three slaves. Spark starts normally on the master, but the logs on each slave show the following error:

Spark Command: /usr/jdk1.8/bin/java -cp /usr/hadoop/spark-2.0.2/conf/:/usr/hadoop/spark-2.0.2/jars/*:/usr/hadoop/hadoop2.7.3/etc/hadoop/ -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077 
======================================== 
17/01/16 16:12:22 INFO Worker: Started daemon with process name: 3001@master 
17/01/16 16:12:22 INFO SignalUtils: Registered signal handler for TERM 
17/01/16 16:12:22 INFO SignalUtils: Registered signal handler for HUP 
17/01/16 16:12:22 INFO SignalUtils: Registered signal handler for INT 
17/01/16 16:12:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
17/01/16 16:12:22 INFO SecurityManager: Changing view acls to: hadoop 
17/01/16 16:12:22 INFO SecurityManager: Changing modify acls to: hadoop 
17/01/16 16:12:22 INFO SecurityManager: Changing view acls groups to: 
17/01/16 16:12:22 INFO SecurityManager: Changing modify acls groups to: 
17/01/16 16:12:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoop); groups with view permissions: Set(); users  with modify permissions: Set(hadoop); groups with modify permissions: Set() 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1. 
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkWorker' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkWorker' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries. 
        at sun.nio.ch.Net.bind0(Native Method) 
        at sun.nio.ch.Net.bind(Net.java:433) 
        at sun.nio.ch.Net.bind(Net.java:425) 
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
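The root cause here is usually name resolution, not a busy port: the Worker binds its RPC endpoint to whatever address the slave's hostname resolves to, and if /etc/hosts (or DNS) maps that hostname to an address that no local network interface actually owns, every bind attempt fails with EADDRNOTAVAIL ("Cannot assign requested address") until the 16 retries (the default of spark.port.maxRetries) are exhausted. The repeated "port 0 ... Attempting port 1" lines appear to be just how Spark logs retries when the requested port is 0 (an ephemeral port), so the port numbers themselves are not meaningful. A minimal sketch of the failure mode, assuming 203.0.113.9 (a TEST-NET documentation address) is not configured on the machine running it:

```python
import errno
import socket

def can_bind(host: str, port: int = 0) -> bool:
    """Try to open a listening TCP socket on host:port (port 0 = any free port)."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return True
    except OSError as e:
        # EADDRNOTAVAIL is the errno behind "Cannot assign requested address"
        print(f"bind on {host} failed: {errno.errorcode[e.errno]}")
        return False
    finally:
        s.close()

print(can_bind("127.0.0.1"))    # loopback is always a local address
print(can_bind("203.0.113.9"))  # assumed NOT assigned to any local interface
```

If the second bind fails the same way on a slave when given that slave's own hostname's resolved address, the hostname mapping in /etc/hosts is the thing to fix.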


The Spark configuration file spark-env.sh reads:

export JAVA_HOME=/usr/jdk1.8 
export SCALA_HOME=/usr/hadoop/scala-2.11.4 
export HADOOP_HOME=/usr/hadoop/hadoop2.7.3 
export HADOOP_CONF_DIR=/usr/hadoop/hadoop2.7.3/etc/hadoop 
export SPARK_MASTER_IP=192.168.9.200 
export SPARK_WORKER_MEMORY=1g 
export SPARK_WORKER_CORES=1 
export SPARK_HOME=/usr/hadoop/spark-2.0.2


After appending SPARK_LOCAL_IP at the end of spark-env.sh, all three Workers started normally:

export SPARK_LOCAL_IP=127.0.0.1
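A hedged note on this fix: 127.0.0.1 lets the Worker daemon start because loopback is always bindable, but it binds the Worker to the loopback interface only. On a multi-node cluster it is generally preferable to either repair the hostname entries in /etc/hosts or point SPARK_LOCAL_IP on each node at that node's own LAN address, so the Worker stays reachable from the master at 192.168.9.200. A sketch, with hypothetical per-slave addresses in this cluster's 192.168.9.x range (adjust per machine):

```shell
# In spark-env.sh on each slave, bind to that node's own LAN address
# rather than loopback (192.168.9.201 is a hypothetical example IP).
export SPARK_LOCAL_IP=192.168.9.201
```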

