
Spark log4j logging configuration

1. Spark launch parameters

First, upload the log4j configuration file to HDFS, e.g. hdfs://R2/projects/log4j-debug.properties, and then reference it in the spark-submit options below. The #log4j-first.properties fragment tells YARN to localize the distributed file into each container's working directory under that alias, which is why the JVM options can point at it as file:log4j-first.properties.

--conf spark.yarn.dist.files=hdfs://R2/projects/log4j-debug.properties#log4j-first.properties \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j-first.properties" \
--conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heapdump.hprof -Dlog4j.configuration=file:log4j-first.properties" \

2. log4j.properties (INFO logging)

# Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to INFO. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=INFO

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=ERROR
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache=WARN
log4j.logger.parquet=ERROR
log4j.logger.org.apache.spark.deploy.yarn=INFO

log4j.logger.org.apache.hudi=INFO

log4j.logger.org.apache.hadoop.hive.metastore.HiveMetaStoreClient=INFO
log4j.logger.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient=INFO
log4j.logger.hive.metastore=INFO

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
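To verify which properties file log4j actually loaded (for example, when the localized alias path is wrong and the default configuration silently wins), log4j 1.x can print its own configuration diagnostics to stderr. A small sketch, assuming the same launch options as in section 1 with -Dlog4j.debug added:

--conf "spark.driver.extraJavaOptions=-Dlog4j.debug -Dlog4j.configuration=file:log4j-first.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.debug -Dlog4j.configuration=file:log4j-first.properties" \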

3. log4j-debug.properties (DEBUG logging)

# Set everything to be logged to the console
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to INFO. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=INFO

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=ERROR
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache=WARN
log4j.logger.parquet=ERROR
log4j.logger.org.apache.spark.deploy.yarn=INFO

log4j.logger.org.apache.hudi=INFO

log4j.logger.org.apache.hadoop.hive.metastore.HiveMetaStoreClient=INFO
log4j.logger.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient=INFO
log4j.logger.hive.metastore=INFO

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
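DEBUG-level output on YARN goes to the container logs rather than the submitting console, and it can be large. A sketch for collecting and inspecting it after the application finishes; the application id below is a placeholder:

yarn logs -applicationId application_1680000000000_0001 > app.log
grep -c ' DEBUG ' app.log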

