Spark Exercise: Submitting a Job to the Cluster (submit job via cluster)


Created by Wang, Jerry, last modified on Sep 12, 2015

Run start-master.sh (in the sbin folder).

Then confirm the master process via ps -aux:
7334 5.6 0.6 1146992 221652 pts/0 Sl 12:34 0:05 /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/sbin/…/conf/:/root/devExpert/spar
Monitor the master node via the web UI: http://10.128.184.131:8080
Start two workers by running the following command once per worker:

./spark-class org.apache.spark.deploy.worker.Worker spark://NKGV50849583FV1:7077 (in the bin folder)
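The spark://NKGV50849583FV1:7077 URL that the workers register against is the same master URL a driver application targets. It is normally supplied to spark-submit via --master, as below, but an application can also hard-code it when building its SparkConf. A minimal sketch, with a hypothetical class name:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class ConnectToStandaloneMaster {
  public static void main(String[] args) {
    // Hard-coding the master URL; the --master flag of spark-submit
    // is the more common way to supply it.
    SparkConf conf = new SparkConf()
        .setAppName("ConnectTest")
        .setMaster("spark://NKGV50849583FV1:7077");
    JavaSparkContext sc = new JavaSparkContext(conf);
    System.out.println("Connected; default parallelism = " + sc.defaultParallelism());
    sc.stop();
  }
}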

Submit the job to the cluster:

./spark-submit --class "org.apache.spark.examples.JavaWordCount" --master spark://NKGV50849583FV1:7077 /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
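The --class target is the word-count example bundled with Spark, rebuilt here into JavaWordCount-1.jar. For reference, a minimal sketch of such a program against the Spark 1.4 Java API, in the Java 7 anonymous-class style of the time (not the verbatim example source):

import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public final class JavaWordCount {
  public static void main(String[] args) throws Exception {
    if (args.length < 1) {
      System.err.println("Usage: JavaWordCount <file>");
      System.exit(1);
    }

    // The master URL is supplied by spark-submit via --master,
    // so only the application name is set here.
    SparkConf conf = new SparkConf().setAppName("JavaWordCount");
    JavaSparkContext sc = new JavaSparkContext(conf);

    JavaRDD<String> lines = sc.textFile(args[0]);

    // Split each line into words.
    JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
      @Override
      public Iterable<String> call(String line) {
        return Arrays.asList(line.split(" "));
      }
    });

    // Map each word to (word, 1), then sum the counts per word.
    JavaPairRDD<String, Integer> counts = words
        .mapToPair(new PairFunction<String, String, Integer>() {
          @Override
          public Tuple2<String, Integer> call(String word) {
            return new Tuple2<String, Integer>(word, 1);
          }
        })
        .reduceByKey(new Function2<Integer, Integer, Integer>() {
          @Override
          public Integer call(Integer a, Integer b) {
            return a + b;
          }
        });

    // Bring the result back to the driver and print it.
    List<Tuple2<String, Integer>> output = counts.collect();
    for (Tuple2<String, Integer> tuple : output) {
      System.out.println(tuple._1() + ": " + tuple._2());
    }

    sc.stop();
  }
}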

The job executes successfully. The full console output of the submission (the "added by Jerry" lines are debug output from the author's instrumented launch scripts):

./spark-submit --class "org.apache.spark.examples.JavaWordCount" --master spark://NKGV50849583FV1:7077 /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
added by Jerry: loading load-spark-env.sh !!!1
added by Jerry:…
/root/devExpert/spark-1.4.1/conf
added by Jerry, number of Jars: 1
added by Jerry, launch_classpath: /root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar
added by Jerry,RUNNER:/usr/jdk1.7.0_79/bin/java
added by Jerry, printf argument list: org.apache.spark.deploy.SparkSubmit --class org.apache.spark.examples.JavaWordCount --master spark://NKGV50849583FV1:7077 /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
added by Jerry, I am in if-else branch: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master spark://NKGV50849583FV1:7077 --class org.apache.spark.examples.JavaWordCount /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/15 14:08:02 INFO SparkContext: Running Spark version 1.4.1
15/08/15 14:08:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/15 14:08:03 WARN Utils: Your hostname, NKGV50849583FV1 resolves to a loopback address: 127.0.0.1; using 10.128.184.131 instead (on interface eth0)
15/08/15 14:08:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/08/15 14:08:03 INFO SecurityManager: Changing view acls to: root
15/08/15 14:08:03 INFO SecurityManager: Changing modify acls to: root
15/08/15 14:08:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/15 14:08:04 INFO Slf4jLogger: Slf4jLogger started
15/08/15 14:08:04 INFO Remoting: Starting remoting
15/08/15 14:08:04 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.128.184.131:44792]
15/08/15 14:08:04 INFO Utils: Successfully started service 'sparkDriver' on port 44792.
15/08/15 14:08:04 INFO SparkEnv: Registering MapOutputTracker
15/08/15 14:08:04 INFO SparkEnv: Registering BlockManagerMaster
15/08/15 14:08:04 INFO DiskBlockManager: Created local directory at /tmp/spark-6fc6b901-3ac8-4acd-87aa-352fd22cf8d4/blockmgr-4c660a56-0014-4b1f-81a9-7ac66507b9fa
15/08/15 14:08:04 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/08/15 14:08:05 INFO HttpFileServer: HTTP File server directory is /tmp/spark-6fc6b901-3ac8-4acd-87aa-352fd22cf8d4/httpd-b4344651-dbd8-4ba4-be1a-913ae006d839
15/08/15 14:08:05 INFO HttpServer: Starting HTTP Server
15/08/15 14:08:05 INFO Utils: Successfully started service 'HTTP file server' on port 46256.
15/08/15 14:08:05 INFO SparkEnv: Registering OutputCommitCoordinator
15/08/15 14:08:05 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/08/15 14:08:05 WARN QueuedThreadPool: 2 threads could not be stopped
15/08/15 14:08:05 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
15/08/15 14:08:05 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
15/08/15 14:08:06 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
15/08/15 14:08:06 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
15/08/15 14:08:06 INFO Utils: Successfully started service 'SparkUI' on port 4045.
15/08/15 14:08:06 INFO SparkUI: Started SparkUI at http://10.128.184.131:4045
15/08/15 14:08:06 INFO SparkContext: Added JAR file:/root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar at http://10.128.184.131:46256/jars/JavaWordCount-1.jar with timestamp 1439618886415
15/08/15 14:08:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@NKGV50849583FV1:7077/user/Master...
15/08/15 14:08:06 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150815140806-0003
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor added: app-20150815140806-0003/0 on worker-20150815125648-10.128.184.131-53710 (10.128.184.131:53710) with 8 cores
15/08/15 14:08:06 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150815140806-0003/0 on hostPort 10.128.184.131:53710 with 8 cores, 512.0 MB RAM
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor added: app-20150815140806-0003/1 on worker-20150815125443-10.128.184.131-34423 (10.128.184.131:34423) with 8 cores
15/08/15 14:08:06 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150815140806-0003/1 on hostPort 10.128.184.131:34423 with 8 cores, 512.0 MB RAM
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/0 is now LOADING
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/1 is now LOADING
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/0 is now RUNNING
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/1 is now RUNNING
… (intermediate word-count job log garbled in the source; the surviving fragment reads "…ock broadcast_0_piece0 stored a…") …
15/08/15 14:08:16 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/08/15 14:08:16 INFO SparkContext: Successfully stopped SparkContext
15/08/15 14:08:16 INFO Utils: Shutdown hook called
15/08/15 14:08:16 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/08/15 14:08:16 INFO Utils: Deleting directory /tmp/spark-6fc6b901-3ac8-4acd-87aa-352fd22cf8d4
15/08/15 14:08:16 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
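One detail worth noting in the log above: each executor was granted 8 cores but only 512.0 MB of RAM, the default executor memory in this Spark version. A job that needs more can request it at submit time with --executor-memory, or in code when the SparkConf is built. A minimal sketch, where the class name and the 1g figure are illustrative only:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class SubmitWithMoreMemory {
  public static void main(String[] args) {
    // Equivalent to passing --executor-memory 1g to spark-submit;
    // "1g" is an illustrative value, not taken from the original post.
    SparkConf conf = new SparkConf()
        .setAppName("JavaWordCount")
        .set("spark.executor.memory", "1g");
    JavaSparkContext sc = new JavaSparkContext(conf);
    // ... job body as in JavaWordCount ...
    sc.stop();
  }
}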

If one of the workers is shut down:

For more of Jerry's original articles, follow the WeChat official account "汪子熙".