ItGo.me - Focused on IT Technology Sharing


Hive Specified key was too long; max key length is 767 bytes

Date: 2015-04-26 | Source: community contribution
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:javax.jdo.JDODataStoreException: An exception was thrown while adding/validating class(es) : Specified key was too long; max key length is 767 bytes
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
    at sun.reflect.GeneratedConstructorAccessor30.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.Util.getInstance(Util.java:386)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1054)

This is a database character-set problem. Fix: change the character set of the Hive metastore database (MetaStore) in MySQL to latin1.
alter database hive_metadata character set latin1;
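The arithmetic behind the error can be sketched as follows. By default, InnoDB limits an index key to 767 bytes, and MySQL's `utf8` charset reserves up to 3 bytes per character, so an index on a `VARCHAR(256)` column (a width the Hive metastore schema commonly uses) needs 768 bytes and fails, while `latin1` at 1 byte per character fits comfortably. This is an illustrative sketch, not MySQL code; the column width of 256 is an assumption based on typical metastore DDL.

```python
# Illustrative arithmetic: why utf8-encoded columns overflow InnoDB's
# 767-byte index-key limit while latin1 columns do not.
INNODB_KEY_LIMIT = 767                      # default InnoDB index-key prefix limit, in bytes
BYTES_PER_CHAR = {"latin1": 1, "utf8": 3}   # MySQL's utf8 reserves up to 3 bytes per character

def index_key_bytes(column_chars: int, charset: str) -> int:
    """Worst-case bytes InnoDB reserves for an index over this column."""
    return column_chars * BYTES_PER_CHAR[charset]

# Assumed example: an indexed VARCHAR(256) column in the metastore schema.
print(index_key_bytes(256, "utf8"))    # 768 -> exceeds 767, table creation fails
print(index_key_bytes(256, "latin1"))  # 256 -> fits
```

Note that `ALTER DATABASE ... CHARACTER SET latin1` only changes the default for tables created afterwards, which is why it works best when applied before (or right after) the metastore schema is initialized.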
