Common Hive Issues

  • When writing to a Hive partitioned table, the following error is reported:
 Fatal error occurred when node tried to create too many dynamic partitions. The maximum number of dynamic partitions is controlled by hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode. Maximum was set to

Set the following properties in spark-sql or Hive to enable dynamic partitioning:

-- enable dynamic partitioning
set hive.exec.dynamic.partition=true;
-- allowed values are [strict, nonstrict]; the default is strict. strict requires at least one partition field to have a static value; nonstrict allows all partition values to be dynamic
set hive.exec.dynamic.partition.mode=nonstrict;
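Note that the error message itself points at two limit properties, so enabling dynamic partitioning alone may not clear it; the limits usually need to be raised as well. A minimal sketch (the values 5000 and 2000 are illustrative assumptions; size them to your actual partition count):

```sql
-- maximum dynamic partitions across the whole job (Hive default: 1000)
set hive.exec.max.dynamic.partitions=5000;
-- maximum dynamic partitions created per node (Hive default: 100)
set hive.exec.max.dynamic.partitions.pernode=2000;
```

The per-node limit is the one most often hit, since a single reducer may end up creating partitions for many distinct key values.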
  • Creating a Hive table succeeds, but selecting from it reports:
RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException 
org.apache.hadoop.hive.hbase.HBaseSerDe: 
columns has 273 elements while hbase.columns.mapping has 204 elements (counting the key if implicit))

Solution: the hbase.columns.mapping string is stored in the metastore table SERDE_PARAMS, whose PARAM_VALUE column is too narrow by default and silently truncates long mappings, which is why the mapping ends up with fewer entries than the column list. Widen the column:

1. Log in to the MySQL database backing the Hive metastore: mysql -uroot -p123456

2. Run the following statements:

use hive;
-- the VARCHAR maximum is 65535
alter table SERDE_PARAMS MODIFY PARAM_VALUE VARCHAR(60000);
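For reference, the error means the count of Hive columns and the count of entries in hbase.columns.mapping must match exactly (the :key entry counts toward the total). A minimal sketch of a correctly matched Hive-on-HBase table (table and column names are illustrative):

```sql
CREATE EXTERNAL TABLE hbase_user (
  rowkey STRING,   -- maps to the HBase row key (:key)
  name   STRING,   -- maps to info:name
  age    STRING    -- maps to info:age
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  -- 3 Hive columns above, 3 mapping entries here
  "hbase.columns.mapping" = ":key,info:name,info:age"
)
TBLPROPERTIES ("hbase.table.name" = "user");
```

After widening PARAM_VALUE, the table may need to be dropped and recreated so the full mapping string is stored without truncation.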