

If you run beeline and execute `!connect jdbc:hive2://localhost:10000 scott tiger`, you may get the error "User: root is not allowed to impersonate scott". To work around it, change the following property in hive-site.xml to false and restart HiveServer2.


  <property>
    <name>hive.server2.enable.doAs</name>
    <!-- <value>true</value> -->
    <value>false</value>
    <description>
      Setting this property to true will have HiveServer2 execute
      Hive operations as the user making the calls to it.
    </description>
  </property>
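Note that setting hive.server2.enable.doAs to false makes all queries run as the HiveServer2 process user instead of the connecting user. If you want to keep impersonation enabled, the usual alternative is to allow the user running HiveServer2 (root, in the log below) to act as a Hadoop proxy user via core-site.xml. The wildcard values here are a permissive illustration, not a recommendation; restrict hosts and groups in production, and restart (or refresh the configuration of) the NameNode and HiveServer2 afterwards.

```xml
<!-- core-site.xml: allow the HiveServer2 process user (root in this setup)
     to impersonate other users. "*" is an illustrative, permissive value;
     narrow it to specific hosts/groups in production. -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```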



-----------------------------Error output-------------

-bash-4.1# ./beeline
which: no hbase in (/opt/jdk1.8.0_66/bin:/usr/local/rvm/gems/ruby-2.2.3/bin:/usr/local/rvm/gems/ruby-2.2.3@global/bin:/usr/local/rvm/rubies/ruby-2.2.3/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/rvm/bin:/svc/apps/sda/bin/hadoop/hadoop/bin:/svc/apps/sda/bin/hadoop/elasticsearch/bin:/opt/apache-maven-3.3.9/bin:/svc/apps/sda/bin/hadoop/hive/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/gooper/svc/apps/sda/bin/hadoop/apache-hive-2.0.1-bin/lib/hive-jdbc-2.0.1-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/gooper/svc/apps/sda/bin/hadoop/apache-hive-2.0.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/gooper/svc/apps/sda/bin/hadoop/spark-1.3.1-bin-hadoop2.6/lib/spark-assembly-1.3.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/gooper/svc/apps/sda/bin/hadoop/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Beeline version 2.0.1 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000 scott tiger
Connecting to jdbc:hive2://localhost:10000
Error: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate scott (state=,code=0)
beeline>
