Spark: setting hive.exec.max.dynamic.partitions does not take effect
Setting hive.exec.max.dynamic.partitions inside a Spark SQL script (via SET) or in application code does not take effect: Spark's embedded Hive client builds its HiveConf from the Hadoop configuration when the session is created, so changes made afterwards at runtime are ignored.
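For reference, here is the in-job pattern that is usually tried first and that, in my experience, does not take effect for these limits. A minimal Scala sketch (e.g., in spark-shell), assuming a Hive-enabled session:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Both statements run without error but are typically ignored for the
// dynamic-partition limits: the embedded Hive client already built its
// HiveConf from the Hadoop configuration when the session was created.
spark.sql("SET hive.exec.max.dynamic.partitions=10000")
spark.conf.set("hive.exec.max.dynamic.partitions", "10000")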
The correct approach is to pass the properties to spark-submit with the spark.hadoop. prefix, which places them in the Hadoop configuration before the session starts (the final your-app.jar argument below is a placeholder for your own application):
spark-submit \
--master yarn \
--deploy-mode client \
--driver-memory 1G \
--executor-memory 12G \
--num-executors 8 \
--executor-cores 4 \
--conf spark.hadoop.hive.exec.max.dynamic.partitions=10000 \
--conf spark.hadoop.hive.exec.max.dynamic.partitions.pernode=10000 \
your-app.jar   # placeholder: replace with your application jar or script
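If you create the SparkSession in your own program instead of relying on spark-submit flags, the same spark.hadoop.-prefixed keys can be set on the builder, as long as this happens before the session (and its Hive client) is created. A minimal Scala sketch; the app name and table names are hypothetical placeholders:

import org.apache.spark.sql.SparkSession

object DynamicPartitionJob {
  def main(args: Array[String]): Unit = {
    // The spark.hadoop. prefix makes Spark copy these keys into the Hadoop
    // Configuration that the embedded Hive client reads at initialization,
    // so they must be set before getOrCreate() builds the session.
    val spark = SparkSession.builder()
      .appName("dynamic-partition-job")
      .config("spark.hadoop.hive.exec.max.dynamic.partitions", "10000")
      .config("spark.hadoop.hive.exec.max.dynamic.partitions.pernode", "10000")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical dynamic-partition write; table and column names are placeholders.
    spark.sql(
      "INSERT OVERWRITE TABLE target_table PARTITION (dt) " +
      "SELECT col1, col2, dt FROM source_table")

    spark.stop()
  }
}

Either way, the key point is the same: the limits must reach the Hadoop configuration before the Hive client initializes, which is why the spark.hadoop. prefix works and a runtime SET does not.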