
Submitting Flink Operator Jobs Quickly with Dinky

Official documentation: K8s Integration | Dinky

1. Versions Used

Dinky 1.2.0, Flink 1.18.1, Flink Operator 0.10.0

2. Building the Image

2.1 Create the Dockerfile

ARG FLINK_VERSION=1.18.1
FROM flink:${FLINK_VERSION}-scala_2.12

RUN mkdir -p /opt/flink/usrlib

# The Dinky app jar goes to usrlib; supporting jars go to Flink's lib directory
COPY commons-cli-1.3.1.jar                                /opt/flink/lib/
COPY dinky-app-1.18-1.2.0-jar-with-dependencies.jar       /opt/flink/usrlib/
COPY flink-metrics-prometheus-1.18.1.jar                  /opt/flink/lib/
COPY flink-table-planner_2.12-1.18.1.jar                  /opt/flink/lib/
COPY mysql-connector-java-8.0.30.jar                      /opt/flink/lib/
COPY flink-shaded-hadoop-3-uber-3.1.1.7.2.1.0-327-9.0.jar /opt/flink/lib/
COPY commons-math3-3.6.1.jar                              /opt/flink/lib/

# flink-table-planner and flink-table-planner-loader cannot coexist;
# remove the loader since the full planner was copied in above
RUN rm -f ${FLINK_HOME}/lib/flink-table-planner-loader-*.jar

2.2 Build the Image and Push It to a Private Registry

docker build --no-cache -t dinky-flink:1.18.1 .
docker tag dinky-flink:1.18.1 registry.cn-hangzhou.aliyuncs.com/dinkyhub/dinky-flink:1.18.1
docker push registry.cn-hangzhou.aliyuncs.com/dinkyhub/dinky-flink:1.18.1
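Before pushing, it can be worth a quick sanity check that the jars landed where the Dockerfile intended (a minimal sketch; the local image tag is the one built above, and no Flink process is started):

```shell
# List the Dinky app jar and the extra libs baked into the image
# by running a throwaway container with ls as the entrypoint.
docker run --rm --entrypoint ls dinky-flink:1.18.1 /opt/flink/usrlib /opt/flink/lib
```

You should see dinky-app-1.18-1.2.0-jar-with-dependencies.jar under /opt/flink/usrlib and the copied dependency jars under /opt/flink/lib.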

2.3 Create the ServiceAccount and Related Resources

kubectl create namespace flink-apps
kubectl -n flink-apps create serviceaccount flink-serviceaccount
kubectl -n flink-apps create clusterrolebinding flink-role-binding --clusterrole=cluster-admin --serviceaccount=flink-apps:flink-serviceaccount
# Note: --clusterrole=cluster-admin grants very broad permissions; the built-in "edit" role is usually enough.

kubectl create secret docker-registry flink-apps-secret \
--docker-server=registry.cn-hangzhou.aliyuncs.com \
--docker-username=xx \
--docker-password=xxxx \
-n flink-apps

kubectl patch serviceaccount flink-serviceaccount -p '{"imagePullSecrets": [{"name": "flink-apps-secret"}]}' -n  flink-apps
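As noted above, cluster-admin is broader than this setup needs. A least-privilege alternative is to bind the built-in edit ClusterRole with a namespaced RoleBinding instead (a sketch; the binding name is arbitrary):

```shell
# Bind the built-in "edit" ClusterRole only within the flink-apps namespace.
# Using a RoleBinding (not a ClusterRoleBinding) keeps the grant namespaced.
kubectl -n flink-apps create rolebinding flink-role-binding \
  --clusterrole=edit \
  --serviceaccount=flink-apps:flink-serviceaccount
```

The edit role covers creating pods, deployments, services, and configmaps inside the namespace, which is what the Flink client and operator need for submission.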

3. Configuration in Dinky

3.1 Configuration in the UI

3.2 Flink SQL Job

set 'taskmanager.numberOfTaskSlots' = '2';
set 'parallelism.default' = '2';
set 'kubernetes.container.image' = 'registry.cn-hangzhou.aliyuncs.com/dinkyhub/dinky-flink:1.18.1';
set 'kubernetes.service-account' = 'flink-serviceaccount';
set 'job.autoscaler.enabled' = 'true';
set 'job.autoscaler.metrics.window' = '20s';
set 'job.autoscaler.target.utilization' = '0.30';
set 'job.autoscaler.scale.up.threshold' = '0.05';
set 'job.autoscaler.scale.down.threshold' = '0.1';
set 'job.autoscaler.stabilization.interval' = '5s';
set 'job.autoscaler.cooldown.period' = '5s';
set 'job.autoscaler.scale.up.max.factor' = '1.5';
set 'job.autoscaler.scale.down.max.factor' = '0.5';

set 'metrics.reporters' = 'prometheus';
set 'metrics.reporter.prometheus.factory.class' = 'org.apache.flink.metrics.prometheus.PrometheusReporterFactory';
set 'metrics.reporter.prometheus.port' = '9249';

set 'jobmanager.scheduler' = 'adaptive';
set 'state.backend' = 'rocksdb';
set 'jobmanager.archive.fs.dir'='file:///tmp';
set 'state.checkpoints.dir' = 'file:///tmp/checkpoints';
set 'state.savepoints.dir' = 'file:///tmp/savepoints';
set 'execution.checkpointing.interval' = '10000';
set 'execution.checkpointing.mode' = 'EXACTLY_ONCE';
set 'execution.checkpointing.timeout' = '600000';
set 'execution.checkpointing.min.pause' = '10000';
set 'execution.checkpointing.max.concurrent.checkpoints' = '1';
set 'metrics.latency.granularity' = 'operator';
set 'web.backpressure.refresh-interval' = '1000';
set 'metrics.backpressure.enabled' = 'true';
set 'metrics.backpressure.interval' = '1000';
set 'metrics.backpressure.timeout' = '60000';
set 'kubernetes.service.exposed.type' = 'NodePort';
set 'kubernetes.rest-service.exposed.type' = 'NodePort';
set 'kubernetes.jobmanager.service-account' = 'flink-serviceaccount';


-- Create the source table datagen_source
CREATE TABLE datagen_source
(
    id BIGINT,
    name STRING
)
WITH ( 'connector' = 'datagen');
-- Create the sink table blackhole_sink
CREATE TABLE blackhole_sink
(
    id BIGINT,
    name STRING
)
WITH ( 'connector' = 'blackhole');
-- Insert the source data into the sink
INSERT INTO blackhole_sink
SELECT id,
       name
FROM datagen_source;

3.3 Jar Job

Upload the job jar, then right-click it to copy its address, e.g. rs:/flink-test-1.0-SNAPSHOT.jar.

set 'taskmanager.numberOfTaskSlots' = '2';
set 'parallelism.default' = '2';
set 'kubernetes.container.image' = 'registry.cn-hangzhou.aliyuncs.com/dinkyhub/dinky-flink:1.18.1';
set 'kubernetes.service-account' = 'flink-serviceaccount';

set 'job.autoscaler.enabled' = 'true';
set 'job.autoscaler.metrics.window' = '20s';
set 'job.autoscaler.target.utilization' = '0.30';
set 'job.autoscaler.scale.up.threshold' = '0.05';
set 'job.autoscaler.scale.down.threshold' = '0.1';
set 'job.autoscaler.stabilization.interval' = '5s';
set 'job.autoscaler.cooldown.period' = '5s';
set 'job.autoscaler.scale.up.max.factor' = '1.5';
set 'job.autoscaler.scale.down.max.factor' = '0.5';

set 'jobmanager.scheduler' = 'adaptive';
set 'metrics.reporters' = 'prometheus';
set 'metrics.reporter.prometheus.port' = '9249';
set 'metrics.reporter.prometheus.factory.class' = 'org.apache.flink.metrics.prometheus.PrometheusReporterFactory';


set 'state.checkpoints.dir' = 'file:///tmp/checkpoints';
set 'state.savepoints.dir' = 'file:///tmp/savepoints';
set 'execution.checkpointing.interval' = '100000';
set 'execution.checkpointing.mode' = 'EXACTLY_ONCE';
set 'execution.checkpointing.timeout' = '600000';
set 'execution.checkpointing.min.pause' = '10000';
set 'execution.checkpointing.max.concurrent.checkpoints' = '1';


-- REST service configuration
SET 'kubernetes.rest-service.exposed.type' = 'NodePort';

Notes:

1. Update the Dinky address in the configuration here; otherwise downloading the jar will fail with an error.

2. When filling in the jar path in the cluster configuration:

local:///opt/flink/usrlib/dinky-app-1.18-1.2.0-jar-with-dependencies.jar

Note that local: is followed by three slashes; writing only two causes an error.

3. Use a static IP in the database configuration rather than 127.0.0.1, or the job will fail to reach the database.
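After submission, the deployment can be checked directly with kubectl (a sketch; flinkdeployments is the CRD installed by the Flink Kubernetes Operator, and the resource names are whatever Dinky generated for your job):

```shell
# FlinkDeployment custom resources created by the operator
kubectl -n flink-apps get flinkdeployments

# JobManager/TaskManager pods for the job
kubectl -n flink-apps get pods

# Services, including the NodePort exposing the Flink web UI
kubectl -n flink-apps get svc
```

If a pod is stuck in ImagePullBackOff, re-check the imagePullSecrets patch on the service account from section 2.3.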

