JaneTTR
2023-03-02

Flink 1.17.2 Build (1.0.7+)

# 1. Build Environment Preparation

The whole build flow relies on the packaging machinery provided by Bigtop and requires the following base tooling:

| Environment component | Recommended version | Installation reference |
| --- | --- | --- |
| JDK | 1.8 | ONEKEY: Install JDK 1.8 |
| Maven | 3.8.4 | ONEKEY: Install Maven 3.8.4 |
| Gradle | Bundled with Bigtop | ONEKEY: Install Gradle 5.6.4 |
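A quick sanity check of the toolchain before starting can save a failed run later. A minimal sketch, assuming the tools are already on PATH and JAVA_HOME points at the JDK 8 installation:

```bash
# Verify the base toolchain before kicking off the Bigtop build.
java -version                          # expect 1.8.x
mvn -version                           # expect 3.8.x; also shows which JDK Maven resolves
echo "JAVA_HOME=$JAVA_HOME"
gradle -version 2>/dev/null || echo "no system gradle (fine if using Bigtop's bundled Gradle)"
```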

# 2. Running the Build Command

Start the Flink packaging process with the following command:

gradle flink-rpm -PparentDir=/usr/bigtop -Dbuildwithdeps=true -PpkgSuffix

The options mean the following:

  • -PparentDir sets the parent installation directory for the packaged Bigtop stack (here /usr/bigtop)
  • -Dbuildwithdeps=true forces the dependency components that are required to be built as well
  • -PpkgSuffix appends the stack version suffix to the package name (optional)
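For reference, a sketch of how the command is typically launched from the Bigtop source tree while keeping a full copy of the output for later troubleshooting. The checkout path /opt/modules/bigtop matches the build log in section 4; the log file name is arbitrary:

```bash
# Run from the Bigtop source root so relative paths (dl/, build/, output/) resolve correctly;
# tee keeps a full copy of the build output.
cd /opt/modules/bigtop
gradle flink-rpm -PparentDir=/usr/bigtop -Dbuildwithdeps=true -PpkgSuffix \
  2>&1 | tee flink-build.log
```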

# 3. Downloading Dependencies

After the command is launched, the build first checks whether the required Flink source archive already exists under the dl/ directory:

(Screenshot: downloading the source archive)

If it is not there, it is downloaded from the network automatically. Placing the archive there manually ahead of time is recommended, both to speed up compilation and to support fully offline builds.
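If you want to pre-seed the archive, a minimal sketch is shown below. The exact file name and download URL are assumptions; confirm them against the flink entry in bigtop.bom (Bigtop also exposes per-component download tasks, typically `gradle flink-download`):

```bash
# Pre-seed the Flink source archive so the build does not need network access.
# File name and URL are assumptions -- check bigtop.bom for the authoritative values.
cd /opt/modules/bigtop
mkdir -p dl
wget -O dl/flink-1.17.2.tar.gz \
  https://archive.apache.org/dist/flink/flink-1.17.2/flink-1.17.2-src.tgz
```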

# 4. Build Process Walkthrough

The relevant portion of the build log:

+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd flink-1.17.2
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ exit 0
Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.fOpstl
+ umask 022
+ cd /opt/modules/bigtop/build/flink/rpm//BUILD
+ cd flink-1.17.2
+ bash /opt/modules/bigtop/build/flink/rpm//SOURCES/do-component-build
++ dirname /opt/modules/bigtop/build/flink/rpm//SOURCES/do-component-build
+ . /opt/modules/bigtop/build/flink/rpm//SOURCES/bigtop.bom
++ ZOOKEEPER_VERSION=3.5.9
++ HADOOP_VERSION=3.3.4
++ HBASE_VERSION=2.4.13
++ HIVE_VERSION=3.1.3
++ TEZ_VERSION=0.10.1
++ OOZIE_VERSION=5.2.1
++ SOLR_VERSION=8.11.2
++ SPARK_VERSION=3.5.5
++ FLINK_VERSION=1.17.2
++ PHOENIX_VERSION=5.1.2
++ BIGTOP_GROOVY_VERSION=2.5.4
++ BIGTOP_UTILS_VERSION=3.2.0-SNAPSHOT
++ BIGTOP_SELECT_VERSION=3.2.0-SNAPSHOT
++ BIGTOP_JSVC_VERSION=1.2.4
++ ALLUXIO_VERSION=2.8.0
++ KAFKA_VERSION=2.8.1
++ YCSB_VERSION=0.17.0
++ ZEPPELIN_VERSION=0.10.1
++ GPDB_VERSION=5.28.5
++ AMBARI_VERSION=2.7.5
++ BIGTOP_AMBARI_MPACK_VERSION=2.7.5
++ LIVY_VERSION=0.7.1
++ RANGER_VERSION=2.4.0
++ SQOOP_VERSION=1.4.7
++ REDIS_VERSION=7.4.0
++ DOLPHINSCHEDULER_VERSION=3.2.2
++ DORIS_VERSION=2.1.7
++ NIGHTINGALE_VERSION=7.7.2
++ CATEGRAF_VERSION=0.4.1
++ VICTORIAMETRICS_VERSION=1.109.1
++ CLOUDBEAVER_VERSION=24.3.3
++ CELEBORN_VERSION=0.5.3
++ OZONE_VERSION=1.4.1
++ IMPALA_VERSION=4.4.1
++ TRINO_VERSION=474
++ HUDI_VERSION=1.0.1
++ PAIMON_VERSION=1.1.0
++ JDK_VERSION=1.8
++ SCALA_VERSION=2.12.13
+ '[' x86_64 = powerpc64le ']'
+++ dirname /opt/modules/bigtop/build/flink/rpm//SOURCES/do-component-build
++ cd /opt/modules/bigtop/build/flink/rpm//SOURCES/../../../..
++ pwd
+ git_path=/opt/modules/bigtop
+ cmd_from='cd ../.. && husky install flink-runtime-web/web-dashboard/.husky'
++ sed -e 's/[&\\/]/\\&/g; s/$/\\/' -e '$s/\\$//'
+ repl_from='cd ..\/.. \&\& husky install flink-runtime-web\/web-dashboard\/.husky'
+ [[ /opt/modules/bigtop/build/flink/rpm//SOURCES/do-component-build == *rpm* ]]
+ package_json_path=build/flink/rpm/BUILD/flink-1.17.2/flink-runtime-web/web-dashboard
+ cmd_to='cd /opt/modules/bigtop && husky install build/flink/rpm/BUILD/flink-1.17.2/flink-runtime-web/web-dashboard/.husky'
++ sed -e 's/[&\\/]/\\&/g; s/$/\\/' -e '$s/\\$//'
+ repl_to='cd \/opt\/modules\/bigtop \&\& husky install build\/flink\/rpm\/BUILD\/flink-1.17.2\/flink-runtime-web\/web-dashboard\/.husky'
+ sed -i 's/cd ..\/.. \&\& husky install flink-runtime-web\/web-dashboard\/.husky/cd \/opt\/modules\/bigtop \&\& husky install build\/flink\/rpm\/BUILD\/flink-1.17.2\/flink-runtime-web\/web-dashboard\/.husky/
' flink-runtime-web/web-dashboard/package.json
+ mvn -X install -Drat.skip=true -DskipTests -Dhadoop.version=3.3.4
Apache Maven 3.9.9 (8e8579a9e76f7d015ee5ec7bfcdc97d260186937)
Maven home: /opt/modules/apache-maven-3.9.9
Java version: 1.8.0_202, vendor: Oracle Corporation, runtime: /opt/modules/jdk1.8.0_202/jre
Default locale: en_US, platform encoding: ANSI_X3.4-1968
OS name: "linux", version: "5.15.167.4-microsoft-standard-wsl2", arch: "amd64", family: "unix"
[DEBUG] Created new class realm maven.api
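Worth calling out from the log above: before Maven runs, do-component-build patches flink-runtime-web/web-dashboard/package.json so that the husky install hook resolves against the Bigtop build tree instead of the original Flink source layout. A standalone sketch of that substitution, with paths taken from this particular build (the script generates the sed expression dynamically; this literal form is an assumption):

```bash
# Rewrite the husky install hook path inside package.json (mirrors what
# do-component-build does via sed in the log above).
cd /opt/modules/bigtop/build/flink/rpm/BUILD/flink-1.17.2
sed -i 's|cd ../.. && husky install flink-runtime-web/web-dashboard/.husky|cd /opt/modules/bigtop \&\& husky install build/flink/rpm/BUILD/flink-1.17.2/flink-runtime-web/web-dashboard/.husky|' \
  flink-runtime-web/web-dashboard/package.json
```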



From the log we can extract the Maven command that is ultimately executed:

mvn install -Drat.skip=true -DskipTests -Dhadoop.version=3.3.4


# Extracted npm invocation for the flink-runtime-web web-dashboard module
/opt/modules/bigtop/dl/flink-1.17.2/flink-runtime-web/web-dashboard/node/node \
/opt/modules/bigtop/dl/flink-1.17.2/flink-runtime-web/web-dashboard/node/node_modules/npm/bin/npm-cli.js \
install \
--unsafe-perm \
--verbose \
--save \
--progress \
--registry=https://registry.npmmirror.com/ \
 --ignore-scripts
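If a single module fails, it can be convenient to re-run the extracted Maven command by hand inside the unpacked source tree instead of restarting the whole Gradle pipeline. A sketch using the paths from this log:

```bash
# Re-run the same Maven invocation manually (paths follow the build log above).
cd /opt/modules/bigtop/build/flink/rpm/BUILD/flink-1.17.2
mvn install -Drat.skip=true -DskipTests -Dhadoop.version=3.3.4
# Resume from a specific module after a failure, e.g.:
# mvn install -Drat.skip=true -DskipTests -Dhadoop.version=3.3.4 -rf :flink-dist
```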

(Screenshot: successful Maven build)

[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ flink-ci-tools ---
[INFO] Installing /opt/modules/bigtop/dl/flink-1.17.2/tools/ci/flink-ci-tools/target/flink-ci-tools-1.17.2.jar to /root/.m2/repository/org/apache/flink/flink-ci-tools
[INFO] Installing /opt/modules/bigtop/dl/flink-1.17.2/tools/ci/flink-ci-tools/dependency-reduced-pom.xml to /root/.m2/repository/org/apache/flink/flink-ci-tools/1.17.
[INFO] Installing /opt/modules/bigtop/dl/flink-1.17.2/tools/ci/flink-ci-tools/target/bom.xml to /root/.m2/repository/org/apache/flink/flink-ci-tools/1.17.2/flink-ci-t
[INFO] Installing /opt/modules/bigtop/dl/flink-1.17.2/tools/ci/flink-ci-tools/target/bom.json to /root/.m2/repository/org/apache/flink/flink-ci-tools/1.17.2/flink-ci-
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Flink : Runtime web 1.17.2:
[INFO]
[INFO] Flink : Runtime web ................................ SUCCESS [ 48.150 s]
[INFO] Flink : Connectors : Datagen ....................... SUCCESS [ 10.215 s]
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SUCCESS [ 19.038 s]
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SUCCESS [ 29.147 s]
[INFO] Flink : Connectors : SQL : Hive 2.3.9 .............. SUCCESS [ 42.383 s]
[INFO] Flink : Connectors : SQL : Hive 3.1.3 .............. SUCCESS [ 53.997 s]
[INFO] Flink : Formats : Sequence file .................... SUCCESS [ 24.574 s]
[INFO] Flink : Formats : Compress ......................... SUCCESS [ 22.993 s]
[INFO] Flink : Formats : Protobuf ......................... SUCCESS [ 25.876 s]
[INFO] Flink : Formats : SQL Csv .......................... SUCCESS [  3.232 s]
[INFO] Flink : Formats : SQL Json ......................... SUCCESS [  3.130 s]
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SUCCESS [  9.993 s]
[INFO] Flink : Formats : SQL Protobuf ..................... SUCCESS [  3.401 s]
[INFO] Flink : Examples : ................................. SUCCESS [  4.252 s]
[INFO] Flink : Examples : Batch ........................... SUCCESS [ 19.066 s]
[INFO] Flink : Examples : Streaming ....................... SUCCESS [ 25.199 s]
[INFO] Flink : Examples : Table ........................... SUCCESS [ 24.726 s]
[INFO] Flink : Examples : Build Helper : .................. SUCCESS [  4.018 s]
[INFO] Flink : Examples : Build Helper : Streaming State machine SUCCESS [ 11.947 s]
[INFO] Flink : Container .................................. SUCCESS [ 10.148 s]
[INFO] Flink : Queryable state : Runtime .................. SUCCESS [ 14.559 s]
[INFO] Flink : Tests ...................................... SUCCESS [01:09 min]
[INFO] Flink : Dist-Scala ................................. SUCCESS [  7.922 s]
[INFO] Flink : Kubernetes ................................. SUCCESS [ 31.863 s]
[INFO] Flink : Yarn ....................................... SUCCESS [ 40.409 s]
[INFO] Flink : Table : API Java Uber ...................... SUCCESS [ 15.061 s]
[INFO] Flink : External resources : ....................... SUCCESS [  2.760 s]
[INFO] Flink : External resources : GPU ................... SUCCESS [  5.762 s]
[INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [  5.495 s]
[INFO] Flink : Metrics : Graphite ......................... SUCCESS [  5.261 s]
[INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [ 11.976 s]
[INFO] Flink : Metrics : Prometheus ....................... SUCCESS [  9.563 s]
[INFO] Flink : Metrics : StatsD ........................... SUCCESS [  5.461 s]
[INFO] Flink : Metrics : Datadog .......................... SUCCESS [  5.955 s]
[INFO] Flink : Metrics : Slf4j ............................ SUCCESS [  5.466 s]
[INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 22.587 s]
[INFO] Flink : Libraries : State processor API ............ SUCCESS [ 14.682 s]
[INFO] Flink : Dist ....................................... SUCCESS [01:08 min]
[INFO] Flink : Yarn Tests ................................. SUCCESS [ 53.017 s]
[INFO] Flink : E2E Tests : ................................ SUCCESS [  2.762 s]
[INFO] Flink : E2E Tests : CLI ............................ SUCCESS [  6.705 s]
[INFO] Flink : E2E Tests : Parent Child classloading program SUCCESS [  6.814 s]
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SUCCESS [  2.870 s]
[INFO] Flink : E2E Tests : Dataset allround ............... SUCCESS [  4.070 s]
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SUCCESS [  5.574 s]
[INFO] Flink : E2E Tests : Datastream allround ............ SUCCESS [ 14.401 s]
[INFO] Flink : E2E Tests : Batch SQL ...................... SUCCESS [  7.477 s]
[INFO] Flink : E2E Tests : Stream SQL ..................... SUCCESS [  7.469 s]
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SUCCESS [  6.718 s]
[INFO] Flink : E2E Tests : High parallelism iterations .... SUCCESS [ 20.344 s]
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SUCCESS [  7.939 s]
[INFO] Flink : E2E Tests : Queryable state ................ SUCCESS [ 12.381 s]
[INFO] Flink : E2E Tests : Local recovery and allocation .. SUCCESS [  8.286 s]
[INFO] Flink : Quickstart : ............................... SUCCESS [  6.743 s]
[INFO] Flink : Quickstart : Java .......................... SUCCESS [  4.077 s]
[INFO] Flink : E2E Tests : Quickstart : Dummy-Dependency .. SUCCESS [  2.894 s]
[INFO] Flink : E2E Tests : Quickstart ..................... SUCCESS [  7.706 s]
[INFO] Flink : E2E Tests : Confluent schema registry ...... SUCCESS [ 12.088 s]
[INFO] Flink : E2E Tests : Stream state TTL ............... SUCCESS [ 15.647 s]
[INFO] Flink : E2E Tests : Common ......................... SUCCESS [ 20.588 s]
[INFO] Flink : E2E Tests : SQL client ..................... SUCCESS [ 27.093 s]
[INFO] Flink : E2E Tests : SQL Gateway .................... SUCCESS [ 49.349 s]
[INFO] Flink : E2E Tests : File sink ...................... SUCCESS [  6.933 s]
[INFO] Flink : E2E Tests : State evolution ................ SUCCESS [  8.122 s]
[INFO] Flink : E2E Tests : RocksDB state memory control ... SUCCESS [  9.458 s]
[INFO] Flink : E2E Tests : Metrics availability ........... SUCCESS [  9.407 s]
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SUCCESS [  9.884 s]
[INFO] Flink : E2E Tests : Heavy deployment ............... SUCCESS [ 24.206 s]
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [  7.240 s]
[INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [ 19.956 s]
[INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [  2.662 s]
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [  3.860 s]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [  3.779 s]
[INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [  4.977 s]
[INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [01:21 min]
[INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [ 11.021 s]
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [  6.560 s]
[INFO] Flink : E2E Tests : Python ......................... SUCCESS [ 15.574 s]
[INFO] Flink : E2E Tests : HBase .......................... SUCCESS [ 35.204 s]
[INFO] Flink : E2E Tests : Scala .......................... SUCCESS [ 24.078 s]
[INFO] Flink : E2E Tests : SQL ............................ SUCCESS [ 21.772 s]
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [  9.302 s]
[INFO] Flink : Contrib : .................................. SUCCESS [  2.822 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [  9.462 s]
[INFO] Flink : FileSystems : Tests ........................ SUCCESS [ 22.368 s]
[INFO] Flink : Docs ....................................... SUCCESS [ 35.736 s]
[INFO] Flink : Walkthrough : .............................. SUCCESS [  2.817 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [  9.269 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [  2.918 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [  5.121 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS


# 5. Build Time and Resource Consumption

The full Flink build takes a long time. Reserve at least 4 CPU cores, 8 GB of RAM, and more than 10 GB of free disk space; the build usually finishes within 1 to 2 hours.
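A quick pre-flight check before kicking off the build, assuming standard Linux tooling and the checkout location used in this article:

```bash
# Confirm CPU, memory, and free disk space before starting the long build.
nproc                  # want at least 4 cores
free -h                # want at least 8 GB of RAM
df -h /opt/modules     # want more than 10 GB free where the Bigtop checkout lives
```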

(Screenshot: build result)

# 6. Build Output

After the build finishes, the generated RPM packages can be found in the following directory:

/opt/modules/bigtop/output/flink/noarch/

As shown below:

(Screenshot: the generated RPM packages)
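A small sketch for inspecting the output; the exact RPM file names depend on the version and the pkgSuffix, so the glob below is an assumption:

```bash
# List the generated Flink RPMs
ls -lh /opt/modules/bigtop/output/flink/noarch/
# Peek inside one package without installing it (file name pattern is an assumption)
rpm -qpl /opt/modules/bigtop/output/flink/noarch/flink*.noarch.rpm | head
```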

# 7. Related Modifications

Apply the following modifications in order:

  • Flink version adaptation rework (Part 3)
  • Flink version adaptation rework (Part 4)
  • Flink version adaptation rework (Part 5)
  • B-Flink version adaptation rework (Part 1)
  • B-Flink version adaptation rework (Part 2)
  • B-Flink version adaptation rework (Part 3)
#Flink #Bigtop #RPM Packaging #Distributed Computing
