

If the error below appears when running ./spark-sql, the JDBC driver jar that Hive uses for its metastore database is missing from Spark's classpath. Copy that driver jar into $HOME/spark/lib and add export CLASSPATH=$CLASSPATH:$HOME/hadoop/spark/lib/mysql-connector-java-5.1.39-bin.jar to ./conf/spark-env.sh.
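
As a concrete sketch, assuming Spark is installed under $HOME/spark and the metastore driver is the MySQL connector jar shipped with Hive under $HOME/hive/lib (both paths are assumptions; adjust them to your installation), the fix amounts to:

        # copy the JDBC driver used by the Hive metastore into Spark's lib directory
        cp $HOME/hive/lib/mysql-connector-java-5.1.39-bin.jar $HOME/spark/lib/

        # make the jar visible to spark-sql by extending the classpath in spark-env.sh
        echo 'export CLASSPATH=$CLASSPATH:$HOME/spark/lib/mysql-connector-java-5.1.39-bin.jar' >> $HOME/spark/conf/spark-env.sh

After restarting ./spark-sql, the driver-not-found error shown below should no longer occur.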



---------------------------------Error details---------------------------------------

16/06/09 13:23:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/06/09 13:23:50 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore

16/06/09 13:23:50 INFO ObjectStore: ObjectStore, initialize called

16/06/09 13:23:50 INFO Persistence: Property datanucleus.schema.validateColumns unknown - will be ignored

16/06/09 13:23:50 INFO Persistence: Property datanucleus.schema.validateConstraints unknown - will be ignored

16/06/09 13:23:50 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored

16/06/09 13:23:50 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored

16/06/09 13:23:50 INFO Persistence: Property datanucleus.schema.autoCreateAll unknown - will be ignored

16/06/09 13:23:50 INFO Persistence: Property datanucleus.schema.validateTables unknown - will be ignored

16/06/09 13:23:50 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)

        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:101)

        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

        at java.lang.reflect.Method.invoke(Method.java:497)

        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)

        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)

        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)

        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)

        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)

        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)

        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)

        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)

        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)

        ... 11 more

Caused by: java.lang.reflect.InvocationTargetException

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)

        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)

        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)

        ... 16 more

Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory

NestedThrowables:

java.lang.reflect.InvocationTargetException

        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)

        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)

        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)

        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

        at java.lang.reflect.Method.invoke(Method.java:497)

        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)

        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)

        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)

        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)

        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)

        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)

        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)

        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)

        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)

        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)

        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)

        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)

        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)

        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)

        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)

        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)

        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)

        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)

        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)

        at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)

        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)

        ... 21 more

Caused by: java.lang.reflect.InvocationTargetException

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)

        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)

        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)

        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)

        at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)

        at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)

        at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)

        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)

        at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)

        at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)

        at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)

        at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)

        at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)

        ... 50 more

Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "dbcp-builtin" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)

        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)

        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)

        ... 68 more

Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.

        at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)

        at org.datanucleus.store.rdbms.connectionpool.DBCPBuiltinConnectionPoolFactory.createConnectionPool(DBCPBuiltinConnectionPoolFactory.java:49)

        at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)

        ... 70 more
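
The root cause is the last exception in the trace: DataNucleus cannot find com.mysql.jdbc.Driver on the classpath when it opens the metastore connection. As an alternative to editing spark-env.sh, the driver jar can also be supplied on the command line, since spark-sql forwards its options to spark-submit; a minimal sketch, assuming the jar location used above:

        ./spark-sql --driver-class-path $HOME/spark/lib/mysql-connector-java-5.1.39-bin.jar

Either way, the point is the same: the MySQL JDBC driver must be visible to the JVM that talks to the Hive metastore.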
