Hive HBase integration failure


I am using Hadoop 2.7.0, Hive 1.2.0, and HBase 1.0.1.1.

I have created a simple table in HBase:

hbase(main):021:0> create 'hbasetohive', 'colfamily'
0 row(s) in 0.2680 seconds

=> Hbase::Table - hbasetohive

hbase(main):022:0> put 'hbasetohive', '1s', 'colfamily:val', '1strowval'
0 row(s) in 0.0280 seconds

hbase(main):023:0> scan 'hbasetohive'
ROW                                  COLUMN+CELL
 1s                                  column=colfamily:val, timestamp=1434644858733, value=1strowval
1 row(s) in 0.0170 seconds

Now I have tried to access the HBase table through a Hive external table. When I select from the external table, I get the error below.

hive (default)> create external table hbase_hivetable_k(key string, value string)
              > stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
              > with serdeproperties ("hbase.columns.mapping" = "colfamily:val")
              > tblproperties("hbase.table.name" = "hbasetohive");
OK
Time taken: 1.688 seconds
hive (default)> select * from hbase_hivetable_k;
OK
hbase_hivetable_k.key	hbase_hivetable_k.value
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html#release for an explanation.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Scan.setCaching(I)V
	at org.apache.hadoop.hive.hbase.HiveHBaseInputFormatUtil.getScan(HiveHBaseInputFormatUtil.java:123)
	at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:99)
	at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:673)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:323)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:445)
	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:414)
	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:140)
	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1667)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

It drops me out of the Hive prompt entirely.

Can anyone please tell me what the issue is here?

I am using the below .hiverc in the hive/conf directory:

set hive.cli.print.header=true;
set hive.cli.print.current.db=true;
set hive.auto.convert.join=true;
set hbase.scan.cacheblock=0;
set hbase.scan.cache=10000;
set hbase.client.scanner.cache=10000;
add jar /usr/lib/hive/auxlib/zookeeper-3.4.6.jar;
add jar /usr/lib/hive/auxlib/hive-hbase-handler-1.2.0.jar;
add jar /usr/lib/hive/auxlib/guava-14.0.1.jar;
add jar /usr/lib/hive/auxlib/hbase-common-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-client-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-hadoop2-compat-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-hadoop-compat-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/commons-configuration-1.6.jar;
add jar /usr/lib/hive/auxlib/hadoop-common-2.7.0.jar;
add jar /usr/lib/hive/auxlib/hbase-annotations-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-it-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-prefix-tree-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-protocol-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-rest-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-server-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-shell-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/hbase-thrift-1.0.1.1.jar;
add jar /usr/lib/hive/auxlib/high-scale-lib-1.1.1.jar;
add jar /usr/lib/hive/auxlib/hive-serde-1.2.0.jar;
add jar /usr/lib/hbase/lib/commons-beanutils-1.7.0.jar;
add jar /usr/lib/hbase/lib/commons-beanutils-core-1.8.0.jar;
add jar /usr/lib/hbase/lib/commons-cli-1.2.jar;
add jar /usr/lib/hbase/lib/commons-codec-1.9.jar;
add jar /usr/lib/hbase/lib/commons-collections-3.2.1.jar;
add jar /usr/lib/hbase/lib/commons-compress-1.4.1.jar;
add jar /usr/lib/hbase/lib/commons-digester-1.8.jar;
add jar /usr/lib/hbase/lib/commons-el-1.0.jar;
add jar /usr/lib/hbase/lib/commons-io-2.4.jar;
add jar /usr/lib/hbase/lib/htrace-core-3.1.0-incubating.jar;
add jar /usr/local/src/spark/lib/spark-assembly-1.3.1-hadoop2.6.0.jar;
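As a quick sanity check of which versions are actually in play, the standard version commands can be run alongside a listing of the handler and client jars that this .hiverc adds (the paths are simply the ones shown above):

# Print the versions installed on the node
hadoop version
hive --version
hbase version
# Handler and client jars that .hiverc puts on the Hive classpath
ls /usr/lib/hive/auxlib/hive-hbase-handler-*.jar /usr/lib/hive/auxlib/hbase-client-*.jar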

I was having the same issue. The issue is that Hive 1.2.0 is not compatible with HBase 1.x: in HBase 1.x the return type of Scan.setCaching() changed, so the Hive 1.2.0 HBase handler, which was built against an older HBase client, fails with the NoSuchMethodError shown above.

As mentioned in the HBaseIntegration wiki page:

Version information: As of Hive 0.9.0 the HBase integration requires at least HBase 0.92; earlier versions of Hive were working with HBase 0.89/0.90.

Version information: Hive 1.x will remain compatible with HBase 0.98.x and lower versions. Hive 2.x will be compatible with HBase 1.x and higher. (See HIVE-10990 for details.) Consumers wanting to work with HBase 1.x using Hive 1.x will need to compile Hive 1.x stream code themselves.

So to make Hive 1.x work with HBase 1.x, you have to download the source code of the Hive 2.0 branch from the Hive repository on GitHub and build it. After building, replace the hive-hbase-handler jar file with the newer version and it will work.
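For reference, here is a minimal sketch of that build-and-replace step, assuming Git and Maven are available and that Hive's auxiliary jars live under /usr/lib/hive/auxlib as in the .hiverc above. The branch name, build profile, and the built jar's version string are assumptions, so verify them against the repository before running this:

# Clone the Apache Hive mirror and switch to the 2.0 branch (branch name is an assumption)
git clone https://github.com/apache/hive.git
cd hive
git checkout branch-2.0

# Build without running tests; the dist profile name may differ between Hive versions
mvn clean package -DskipTests -Pdist

# Copy the freshly built HBase handler next to the 1.2.0 one that .hiverc adds, then remove the old jar
# (the 2.0.0-SNAPSHOT version string is illustrative; use whatever the build actually produced)
cp hbase-handler/target/hive-hbase-handler-2.0.0-SNAPSHOT.jar /usr/lib/hive/auxlib/
rm /usr/lib/hive/auxlib/hive-hbase-handler-1.2.0.jar

Remember to also update the corresponding add jar line in .hiverc so it points at the new jar name.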

