Yes, you can use Flink's JDBC connector (also called the JDBC sink) to both write data into the database and delete it. For records that belong to the same object, you can inspect the value of a field to decide whether that record should be inserted or deleted.
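A minimal sketch of this pattern with the DataStream API and flink-connector-jdbc, assuming a simple POJO with an "op" field that marks the action; the table and column names (test_demo, ID, NAME) are taken from the example further below, while the connection details, field names and "op" values are assumptions:

import java.sql.PreparedStatement;

import org.apache.flink.connector.jdbc.JdbcConnectionOptions;
import org.apache.flink.connector.jdbc.JdbcExecutionOptions;
import org.apache.flink.connector.jdbc.JdbcSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class JdbcInsertOrDeleteJob {

    // Hypothetical record type: "op" decides which statement is executed.
    public static class Record {
        public int id;
        public String name;
        public String op; // e.g. "insert" or "delete"
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Replace with the real source (e.g. a Kafka source).
        DataStream<Record> records = env.fromElements(new Record());

        JdbcConnectionOptions connOptions = new JdbcConnectionOptions.JdbcConnectionOptionsBuilder()
                .withUrl("jdbc:mysql://xxx.xxx.xxx.xxx:3306/baseqx?useUnicode=true&characterEncoding=utf8")
                .withDriverName("com.mysql.cj.jdbc.Driver")
                .withUsername("user")
                .withPassword("password")
                .build();

        JdbcExecutionOptions execOptions = JdbcExecutionOptions.builder()
                .withBatchSize(100)
                .build();

        // Records flagged as "delete" are removed from the table ...
        records.filter(r -> "delete".equals(r.op))
               .addSink(JdbcSink.sink(
                       "DELETE FROM test_demo WHERE ID = ?",
                       (PreparedStatement ps, Record r) -> ps.setInt(1, r.id),
                       execOptions,
                       connOptions));

        // ... all other records are written to the table.
        records.filter(r -> !"delete".equals(r.op))
               .addSink(JdbcSink.sink(
                       "INSERT INTO test_demo (ID, NAME) VALUES (?, ?)",
                       (PreparedStatement ps, Record r) -> {
                           ps.setInt(1, r.id);
                           ps.setString(2, r.name);
                       },
                       execOptions,
                       connOptions));

        env.execute("jdbc-insert-or-delete");
    }
}

Splitting the stream with two filters keeps each JdbcSink bound to a single SQL statement, which is how JdbcSink.sink is designed to be used.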
"connection.url": "jdbc:mysql://xxx.xxx.xxx.xxx:3306/baseqx?useUnicode=true&characterEncoding=utf8&allowMultiQueries=true" } } 2.kafka-connect创建主题中的默认数据格式为 {"schema":{"type":"struct","fields":[{"type":"int32","optional":false,"field":"ID"},{"type":"string","optional...
"optional":false,"name":"org.apache.kafka.connect.data.Timestamp","version":1,"field":"CREATE_TIME"}],"optional":false,"name":"test_demo"},"payload":{"ID":1,"NAME":"prestoEtl","CREATE_TIME":1606902182000
2.kafka-connect创建主题中的默认数据格式为 {"schema":{"type":"struct","fields":[{"type":"int32","optional":false,"field":"ID"},{"type":"string","optional":false,"field":"NAME"},{"type":"int64","optional":false,"name":"org.apache.kafka.connect.data.Timestamp","vers...
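If the Flink job consumes this topic directly, the business columns live under "payload". A minimal sketch of extracting them with Jackson, which could feed the Record type from the sketch above; the OP flag is an assumption and is not part of the default envelope (it would have to be added by the producer, or a delete could instead be signalled some other way, e.g. by a null payload):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ConnectEnvelopeParser {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Hypothetical record type, mirroring the one used in the sink sketch.
    public static class Record {
        public int id;
        public String name;
        public String op;
    }

    // Pulls the business fields out of the "payload" part of the envelope.
    public static Record parse(String value) throws Exception {
        JsonNode payload = MAPPER.readTree(value).path("payload");
        Record r = new Record();
        r.id = payload.path("ID").asInt();
        r.name = payload.path("NAME").asText();
        // Assumed operation flag; defaults to "insert" when absent.
        r.op = payload.path("OP").asText("insert");
        return r;
    }
}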