Fixing "org.apache.logging.slf4j.Log4jMarker is not public"
# Background

During big-data platform adaptation and secondary integration, Hive is a core component, and upgrading its version or adapting it across components (e.g., to Spark 3.5.5) frequently runs into conflicts between old and new dependencies. Under the Bigtop RPM build chain in particular, environment dependencies, component interactions, and the underlying logging implementation can easily trigger compilation failures.

**Compatibility challenge**: Upgrading the Spark version often surfaces access-modifier and API differences in the underlying dependencies (such as log4j and slf4j), which require targeted compatibility fixes.
# 1. Build command and parameters

The core command for this Hive 3.1.3 build:
```shell
mvn -Dhbase.version=2.4.13 \
    -Dzookeeper.version=3.5.9 \
    -Dhadoop.version=3.3.4 \
    -DskipTests \
    -Dtez.version=0.10.1 \
    -Dspark.version=3.5.5 \
    -Dscala.binary.version=2.12 \
    -Dscala.version=2.12.13 \
    -Dguava.version=27.0-jre \
    -Dcurator.version=4.2.0 \
    clean install -Pdist
```
For targeted debugging, append `-rf :hive-llap-server -X` to rebuild only the failed module with full debug logging.
# 2. Error symptoms

While compiling Hive, the `hive-llap-server` module fails and all downstream modules are skipped. The key log excerpt:
```
[INFO] Hive Serde ......................................... SUCCESS [ 16.187 s]
[INFO] Hive Standalone Metastore .......................... SUCCESS [ 35.562 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 19.423 s]
[INFO] Hive Vector-Code-Gen Utilities ..................... SUCCESS [  1.366 s]
[INFO] Hive Llap Common ................................... SUCCESS [ 15.151 s]
[INFO] Hive Llap Client ................................... SUCCESS [ 15.258 s]
[INFO] Hive Llap Tez ...................................... SUCCESS [ 18.428 s]
[INFO] Hive Spark Remote Client ........................... SUCCESS [ 22.602 s]
[INFO] Hive Query Language ................................ SUCCESS [01:15 min]
[INFO] Hive Llap Server ................................... FAILURE [ 36.957 s]
[INFO] Hive Service ....................................... SKIPPED
[INFO] Hive Accumulo Handler .............................. SKIPPED
[INFO] Hive JDBC .......................................... SKIPPED
[INFO] Hive Beeline ....................................... SKIPPED
[INFO] Hive CLI ........................................... SKIPPED
[INFO] Hive Contrib ....................................... SKIPPED
[INFO] Hive Druid Handler ................................. SKIPPED
[INFO] Hive HBase Handler ................................. SKIPPED
[INFO] Hive JDBC Handler .................................. SKIPPED
[INFO] Hive HCatalog ...................................... SKIPPED
[INFO] Hive HCatalog Core ................................. SKIPPED
[INFO] Hive HCatalog Pig Adapter .......................... SKIPPED
[INFO] Hive HCatalog Server Extensions .................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client .................. SKIPPED
[INFO] Hive HCatalog Webhcat .............................. SKIPPED
[INFO] Hive HCatalog Streaming ............................ SKIPPED
[INFO] Hive HPL/SQL ....................................... SKIPPED
[INFO] Hive Streaming ..................................... SKIPPED
[INFO] Hive Llap External Client .......................... SKIPPED
[INFO] Hive Shims Aggregator .............................. SKIPPED
[INFO] Hive Kryo Registrator .............................. SKIPPED
[INFO] Hive TestUtils ..................................... SKIPPED
[INFO] Hive Packaging ..................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:39 min
[INFO] Finished at: 2025-06-07T06:54:06Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-llap-server: Compilation failure
[ERROR] /opt/modules/bigtop/build/hive/rpm/BUILD/apache-hive-3.1.3-src/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/QueryTracker.java:[30,32] org.apache.logging.slf4j.Log4jMarker is not public in org.apache.logging.slf4j; cannot be accessed from outside package
[ERROR]
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hive-llap-server
error: Bad exit status from /var/tmp/rpm-tmp.51OilM (%build)
```
**Warning**: Despite the compiler message, this error is caused by an incompatible combination of log4j and slf4j versions.
# 3. Root-cause analysis

- Root cause: Hive 3.1.3 defaults to Spark 3.2.3 and its matching logging libraries. After forcing Spark 3.5.5, the logging stack moves from log4j 1.x to log4j 2.x, and some classes gain access restrictions.
- Offending class: `org.apache.logging.slf4j.Log4jMarker` is not public in log4j 2.x, so the direct reference to it in the `hive-llap-server` module fails to compile.

**Note**: Simply bumping the Spark version without also aligning the log4j 2.x configuration easily triggers this kind of API incompatibility.
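The compiler error itself is a plain Java visibility rule: a package-private class cannot be referenced from outside its package. The following self-contained sketch checks a class's visibility via reflection; it uses the package-private JDK class `java.util.regex.ASCII` as a stand-in, since inspecting the real `Log4jMarker` would require log4j on the classpath:

```java
import java.lang.reflect.Modifier;

public class VisibilityCheck {
    public static void main(String[] args) throws Exception {
        // java.util.regex.ASCII is a package-private JDK class, used here as a
        // stand-in for org.apache.logging.slf4j.Log4jMarker, which is likewise
        // not public inside its package in log4j 2.x.
        Class<?> c = Class.forName("java.util.regex.ASCII");
        boolean isPublic = Modifier.isPublic(c.getModifiers());
        System.out.println(c.getName() + " public? " + isPublic);
        // Any direct source-level reference to such a class from another
        // package fails at compile time with:
        // "X is not public in Y; cannot be accessed from outside package"
    }
}
```

Reflection can *load* the class, but javac rejects any source-level reference from another package, which is exactly what happens in `QueryTracker.java`.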
# 4. Fix: dependency corrections

# 4.1 Add dependencies to pom.xml

To stay compatible with log4j 2.x, explicitly add the log4j dependencies to Hive's pom.xml, and exclude the conflicting log4j-slf4j2-impl binding at the same time:
```diff
+    <dependency>
+      <groupId>org.apache.logging.log4j</groupId>
+      <artifactId>log4j-api</artifactId>
+      <version>${log4j2.version}</version>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.logging.log4j</groupId>
+      <artifactId>log4j-core</artifactId>
+      <version>${log4j2.version}</version>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.logging.log4j</groupId>
+      <artifactId>log4j-web</artifactId>
+      <version>${log4j2.version}</version>
+    </dependency>
```
# 4.2 Exclude the conflict in spark-client/pom.xml

To avoid a second round of dependency conflicts, also exclude log4j-slf4j2-impl under spark-client:
```diff
+        <exclusion>
+          <groupId>org.apache.logging.log4j</groupId>
+          <artifactId>log4j-slf4j2-impl</artifactId>
+        </exclusion>
```
**Note**: In practice, it is advisable to also prune redundant dependencies such as slf4j-log4j12 and commons-logging, so that only a single logging chain remains and the chance of later conflicts drops.
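One quick way to confirm the logging chain is unique is to count SLF4J 1.x bindings on the runtime classpath: each binding (log4j-slf4j-impl, slf4j-log4j12, ...) ships an `org/slf4j/impl/StaticLoggerBinder.class` resource, so more than one hit means an ambiguous chain. A small diagnostic sketch (the class name `BindingScan` is ours; run it with the application's actual classpath):

```java
import java.net.URL;
import java.util.Collections;
import java.util.List;

public class BindingScan {
    public static void main(String[] args) throws Exception {
        // Every SLF4J 1.x binding provides this resource; multiple hits on
        // the classpath mean conflicting logging implementations.
        List<URL> hits = Collections.list(
                BindingScan.class.getClassLoader()
                        .getResources("org/slf4j/impl/StaticLoggerBinder.class"));
        System.out.println("SLF4J 1.x bindings found: " + hits.size());
        for (URL u : hits) {
            System.out.println("  " + u);
        }
    }
}
```

This is the same multiple-binding detection SLF4J itself performs at startup; on a clean JVM with no SLF4J jars the count is simply 0.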
# 4.3 Reference patch

The following patch excerpt can be integrated directly into a big-data component customization pipeline:
```diff
Subject: [PATCH] feature: support Spark 3.5.5
---
diff --git a/pom.xml b/pom.xml
+++ b/pom.xml
@@ -365,6 +365,21 @@
+    <dependency>
+      <groupId>org.apache.logging.log4j</groupId>
+      <artifactId>log4j-api</artifactId>
+      <version>${log4j2.version}</version>
+    </dependency>
+    ...
diff --git a/spark-client/pom.xml b/spark-client/pom.xml
+++ b/spark-client/pom.xml
@@ -86,6 +86,10 @@
+          <groupId>org.apache.logging.log4j</groupId>
+          <artifactId>log4j-slf4j2-impl</artifactId>
+        </exclusion>
```
# 5. Verification

After applying the fixes above and rebuilding, the Hive 3.1.3 + Spark 3.5.5 combination compiles cleanly, and the dependency chain no longer reports the log4j Marker access error.