
Hive: the default maximum number of dynamic partitions is 100; exceeding it throws an exception

Posted on 2021-04-18 17:53:17

After creating a partitioned table and running a dynamic-partition insert into it, the job failed with the error below (a sketch of the kind of statement that triggers this follows the stack trace):

Caused by: org.apache.hadoop.hive.ql.metadata.HiveFatalException: [Error 20004]: Fatal error occurred when node tried to create too many dynamic partitions. The maximum number of dynamic partitions is controlled by hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode. Maximum was set to: 100
2017-11-06 14:54:31,381 FATAL [IPC Server handler 9 on 27102] org.apache.hadoop.mapred.TaskAttemptListenerImpl:
Task: attempt_1500969698103_8078701_m_000278_0 - exited : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException:
Hive Runtime Error while processing row {"gridid":"148420952","longitude":"120.315241543934","latitude":"30.191598257425056","objectid":"888665533","cellname":"7777建设_1","scrsrp":"-91.67163276672363","gridx":"12001","gridy":"7825","doortype":null,"biuldingid":null,"calibrategridid":"309","nobjectid":"193619203","ncrsrp":"-122.38408386707306","height":"0","p_group":0}
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:180)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1711)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:174)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"gridid":"148420952","longitude":"120.315241543934","latitude":"30.191598257425056","objectid":"998867665","cellname":"777建设_1","scrsrp":"-91.67163276672363","gridx":"12001","gridy":"7825","doortype":null,"biuldingid":null,"calibrategridid":"309","nobjectid":"193619203","ncrsrp":"-122.38408386707306","height":"0","p_group":0}
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:562)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
    ... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveFatalException: [Error 20004]: Fatal error occurred when node tried to create too many dynamic partitions. The maximum number of dynamic partitions is controlled by hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode. Maximum was set to: 100
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.getDynOutPaths(FileSinkOperator.java:936)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:713)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:111)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.process(FilterOperator.java:122)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:167)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:552)
    ... 9 more
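
For reference, here is a minimal sketch of the sort of DDL and insert that can hit this limit. The table and column names (grid_data_part, grid_data_src) are illustrative, not from the original job; the point is simply that the source data holds more than 100 distinct values of the dynamic partition column:

    -- Hypothetical partitioned table; names are illustrative.
    CREATE TABLE grid_data_part (
        gridid    STRING,
        longitude STRING,
        latitude  STRING
    )
    PARTITIONED BY (p_group INT);

    -- Dynamic-partition insert: every distinct p_group value in the
    -- source becomes its own partition. With more than 100 distinct
    -- values, the default per-node limit is exceeded and Error 20004
    -- is thrown.
    INSERT OVERWRITE TABLE grid_data_part PARTITION (p_group)
    SELECT gridid, longitude, latitude, p_group
    FROM grid_data_src;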

Solution:

When running the insert into the partitioned table, first add the following session-level settings:

-- Enable dynamic partitioning (on by default in recent Hive versions).
set hive.exec.dynamic.partition=true;
-- nonstrict: all partition columns may be determined dynamically.
set hive.exec.dynamic.partition.mode=nonstrict;
-- Max dynamic partitions a single mapper/reducer node may create (default 100).
set hive.exec.max.dynamic.partitions.pernode=10000;
-- Max dynamic partitions across the whole statement (default 1000).
set hive.exec.max.dynamic.partitions=10000;
-- Max total files created by all nodes in the job (default 100000).
set hive.exec.max.created.files=10000;
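
Raising the limits makes the job pass, but note that the limit actually tripped here is the per-node one. A complementary trick, sketched below with the same illustrative names, is to DISTRIBUTE BY the partition column so that all rows for a given partition land on the same reducer and no single node has to create every partition:

    INSERT OVERWRITE TABLE grid_data_part PARTITION (p_group)
    SELECT gridid, longitude, latitude, p_group
    FROM grid_data_src
    -- Route all rows sharing a p_group value to one reducer, so each
    -- reducer only writes its own subset of the partitions.
    DISTRIBUTE BY p_group;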

     
