What to do when a Spark client program fails with "Error initializing SparkContext" (2016.05.27)


Run: python3 DataSetCreator.py -i s2rdf/data/sparql.in -s 0.25

See http://stackoverflow.com/questions/27792839/spark-fail-when-running-pi-py-example-with-yarn-client-mode for the underlying cause and fix.
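In the log below, the YARN application is accepted and reaches RUNNING, then exits with state FINISHED before the driver finishes initializing, which is what triggers the NullPointerException in the SparkContext constructor. To dig into why, the ApplicationMaster logs are usually the next stop: `yarn logs -applicationId <id>`. A minimal, hypothetical helper that scrapes the application id and the last reported state out of client-side spark-submit output (the function and variable names are illustrative, not part of DataSetCreator.py):

```python
import re

def parse_yarn_app(log_text):
    """Extract the YARN application id and the last reported state
    from spark-submit client-side log output."""
    app_ids = re.findall(r"application_\d+_\d+", log_text)
    states = re.findall(r"\(state: ([A-Z]+)\)", log_text)
    return (app_ids[-1] if app_ids else None,
            states[-1] if states else None)

# Sample lines taken from the log in this post.
sample = """
16/05/27 18:23:01 INFO YarnClientImpl: Submitted application application_1464337540213_0018
16/05/27 18:23:02 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:05 INFO Client: Application report for application_1464337540213_0018 (state: RUNNING)
"""
app_id, state = parse_yarn_app(sample)
print(app_id, state)
```

With the id in hand, fetch the AM-side logs with `yarn logs -applicationId application_1464337540213_0018` to see why the application exited.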

----------------------------- Log output -----------------------------
Input RDF file ->"
16/05/27 18:22:57 INFO SparkContext: Running Spark version 1.6.1
16/05/27 18:22:57 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/27 18:22:57 WARN SparkConf: Detected deprecated memory fraction settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended).
16/05/27 18:22:57 INFO SecurityManager: Changing view acls to: hadoop
16/05/27 18:22:57 INFO SecurityManager: Changing modify acls to: hadoop
16/05/27 18:22:57 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/27 18:22:57 INFO Utils: Successfully started service 'sparkDriver' on port 56181.
16/05/27 18:22:58 INFO Slf4jLogger: Slf4jLogger started
16/05/27 18:22:58 INFO Remoting: Starting remoting
16/05/27 18:22:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@XXX.XXX.XXX.43:34384]
16/05/27 18:22:58 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 34384.
16/05/27 18:22:58 INFO SparkEnv: Registering MapOutputTracker
16/05/27 18:22:58 INFO SparkEnv: Registering BlockManagerMaster
16/05/27 18:22:58 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-cdc351b1-92b1-405c-9127-fca2f798daf3
16/05/27 18:22:58 INFO MemoryStore: MemoryStore started with capacity 1247.3 MB
16/05/27 18:22:58 INFO SparkEnv: Registering OutputCommitCoordinator
16/05/27 18:22:58 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/05/27 18:22:58 INFO SparkUI: Started SparkUI at http://XXX.XXX.XXX.43:4040
16/05/27 18:22:58 INFO HttpFileServer: HTTP File server directory is /tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74/httpd-8faa7605-d0e3-44b9-ba73-d18ce63fe8f1
16/05/27 18:22:58 INFO HttpServer: Starting HTTP Server
16/05/27 18:22:58 INFO Utils: Successfully started service 'HTTP file server' on port 49921.
16/05/27 18:22:58 INFO SparkContext: Added JAR file:/home/hadoop/DataSetCreator/./datasetcreator_2.10-1.1.jar at http://XXX.XXX.XXX.43:49921/jars/datasetcreator_2.10-1.1.jar with timestamp 1464340978585
16/05/27 18:22:58 WARN YarnClientSchedulerBackend: NOTE: SPARK_WORKER_CORES is deprecated. Use SPARK_EXECUTOR_CORES or --executor-cores through spark-submit instead.
16/05/27 18:22:58 INFO ConfiguredRMFailoverProxyProvider: Failing over to rm2
16/05/27 18:22:58 INFO Client: Requesting a new application from cluster with 4 NodeManagers
16/05/27 18:22:58 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (19288 MB per container)
16/05/27 18:22:58 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
16/05/27 18:22:58 INFO Client: Setting up container launch context for our AM
16/05/27 18:22:58 INFO Client: Setting up the launch environment for our AM container
16/05/27 18:22:58 INFO Client: Preparing resources for our AM container
16/05/27 18:22:59 INFO Client: Uploading resource file:/home/gooper/svc/apps/sda/bin/hadoop/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar -> hdfs://mycluster/user/hadoop/.sparkStaging/application_1464337540213_0018/spark-assembly-1.6.1-hadoop2.6.0.jar
16/05/27 18:23:01 INFO Client: Uploading resource file:/tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74/__spark_conf__2857474168024892319.zip -> hdfs://mycluster/user/hadoop/.sparkStaging/application_1464337540213_0018/__spark_conf__2857474168024892319.zip
16/05/27 18:23:01 INFO SecurityManager: Changing view acls to: hadoop
16/05/27 18:23:01 INFO SecurityManager: Changing modify acls to: hadoop
16/05/27 18:23:01 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/27 18:23:01 INFO Client: Submitting application 18 to ResourceManager
16/05/27 18:23:01 INFO YarnClientImpl: Submitted application application_1464337540213_0018
16/05/27 18:23:02 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:02 INFO Client: 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: root.hadoop
         start time: 1464340977670
         final status: UNDEFINED
         tracking URL: http://sda2:8088/proxy/application_1464337540213_0018/
         user: hadoop
16/05/27 18:23:03 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:04 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:04 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/05/27 18:23:04 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sda1, PROXY_URI_BASES -> http://sda1:8088/proxy/application_1464337540213_0018), /proxy/application_1464337540213_0018
16/05/27 18:23:04 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/05/27 18:23:05 INFO Client: Application report for application_1464337540213_0018 (state: RUNNING)
16/05/27 18:23:05 INFO Client: 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: XXX.XXX.XXX.44
         ApplicationMaster RPC port: 0
         queue: root.hadoop
         start time: 1464340977670
         final status: UNDEFINED
         tracking URL: http://sda2:8088/proxy/application_1464337540213_0018/
         user: hadoop
16/05/27 18:23:05 INFO YarnClientSchedulerBackend: Application application_1464337540213_0018 has started running.
16/05/27 18:23:05 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44676.
16/05/27 18:23:05 INFO NettyBlockTransferService: Server created on 44676
16/05/27 18:23:05 INFO BlockManagerMaster: Trying to register BlockManager
16/05/27 18:23:05 INFO BlockManagerMasterEndpoint: Registering block manager XXX.XXX.XXX.43:44676 with 1247.3 MB RAM, BlockManagerId(driver, XXX.XXX.XXX.43, 44676)
16/05/27 18:23:05 INFO BlockManagerMaster: Registered BlockManager
16/05/27 18:23:05 INFO EventLoggingListener: Logging events to hdfs://mycluster/user/hadoop/spark/application_1464337540213_0018
16/05/27 18:23:08 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/05/27 18:23:08 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sda1, PROXY_URI_BASES -> http://sda1:8088/proxy/application_1464337540213_0018), /proxy/application_1464337540213_0018
16/05/27 18:23:08 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/05/27 18:23:09 ERROR YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
16/05/27 18:23:09 INFO SparkUI: Stopped Spark web UI at http://XXX.XXX.XXX.43:4040
16/05/27 18:23:09 INFO YarnClientSchedulerBackend: Shutting down all executors
16/05/27 18:23:09 INFO YarnClientSchedulerBackend: Asking each executor to shut down
16/05/27 18:23:09 INFO YarnClientSchedulerBackend: Stopped
16/05/27 18:23:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/27 18:23:09 INFO MemoryStore: MemoryStore cleared
16/05/27 18:23:09 INFO BlockManager: BlockManager stopped
16/05/27 18:23:09 INFO BlockManagerMaster: BlockManagerMaster stopped
16/05/27 18:23:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/27 18:23:09 INFO SparkContext: Successfully stopped SparkContext
16/05/27 18:23:09 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/27 18:23:09 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/27 18:23:09 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/05/27 18:23:28 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
16/05/27 18:23:28 ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
        at dataCreator.Settings$.loadSparkContext(Settings.scala:69)
        at dataCreator.Settings$.<init>(Settings.scala:17)
        at dataCreator.Settings$.<clinit>(Settings.scala)
        at runDriver$.main(runDriver.scala:12)
        at runDriver.main(runDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/05/27 18:23:28 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.ExceptionInInitializerError
        at runDriver$.main(runDriver.scala:12)
        at runDriver.main(runDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
        at dataCreator.Settings$.loadSparkContext(Settings.scala:69)
        at dataCreator.Settings$.<init>(Settings.scala:17)
        at dataCreator.Settings$.<clinit>(Settings.scala)
        ... 11 more
16/05/27 18:23:28 INFO ShutdownHookManager: Shutdown hook called
16/05/27 18:23:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74/httpd-8faa7605-d0e3-44b9-ba73-d18ce63fe8f1
16/05/27 18:23:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74



^CTraceback (most recent call last):
  File "DataSetCreator.py", line 128, in <module>
    main(sys.argv[1:])
  File "DataSetCreator.py", line 125, in main
    generateDatsets()
  File "DataSetCreator.py", line 83, in generateDatsets
    delay()
  File "DataSetCreator.py", line 45, in delay
    time.sleep(delTime)
KeyboardInterrupt
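The Ctrl-C traceback above shows that DataSetCreator.py was still sitting in its delay() sleep loop even though the Spark application had already died, so it had to be killed by hand. A hedged sketch of a bounded wait (the delay/delTime names come from the traceback above; the rest is illustrative and not S2RDF's actual code):

```python
import time

def delay(check_done, delTime=5, max_wait=600):
    """Poll check_done() every delTime seconds, but give up after
    max_wait seconds instead of sleeping forever."""
    waited = 0
    while not check_done():
        if waited >= max_wait:
            raise TimeoutError("gave up after %d seconds" % max_wait)
        time.sleep(delTime)
        waited += delTime

# Illustrative usage: pretend the job finishes on the third poll.
calls = {"n": 0}
def fake_done():
    calls["n"] += 1
    return calls["n"] >= 3

delay(fake_done, delTime=0.01, max_wait=1)
print("job finished after", calls["n"], "polls")
```

A real check_done() would inspect the spark-submit subprocess or the YARN application state, so a failed job ends the wait instead of hanging until someone presses Ctrl-C.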