Flink download: https://flink.apache.org/downloads/
Connector downloads: https://repo.maven.apache.org/maven2/org/apache/flink/
flink-1.17.0-bin-scala_2.12.tgz -- the Flink installation package
flink-sql-connector-postgres-cdc-2.4.2.jar -- Flink SQL CDC connector for PostgreSQL
flink-connector-kafka-1.17.0.jar -- fli...
https://flink.apache.org/downloads/
Other required JARs (CDC, JDBC, and the MySQL, Oracle, and other driver packages) are also needed.
After downloading Flink, simply extract it to the target directory:
tar zxvf flink-1.20.0-bin-scala_2.12.tgz
Put all required JARs in the lib directory; in my setup this is /u01/flink-1.20.0/lib.
Start Flink:
[root@gcv-b-test-gmes-oracle ...
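The extract-and-populate-lib step above can be sketched end to end. The archive and paths below are local stand-ins created just for the demo, not the real Flink tarball; the `tar` flags are the same ones used in the text:

```shell
# Build a stand-in tarball that mimics the Flink layout (bin/, lib/),
# then extract it the same way as the real flink-*.tgz.
mkdir -p demo-flink-1.20.0/bin demo-flink-1.20.0/lib
echo 'echo started' > demo-flink-1.20.0/bin/start-cluster.sh
tar zcf demo-flink.tgz demo-flink-1.20.0

# z = gunzip, x = extract, v = verbose, f = archive file;
# -C picks the target directory, as in the Doris example later.
tar zxvf demo-flink.tgz -C /tmp

# Connector JARs would then be copied into the lib directory, e.g.:
# cp flink-sql-connector-*.jar /tmp/demo-flink-1.20.0/lib/
ls /tmp/demo-flink-1.20.0
```

With a real install, the cluster is then started from the extracted directory (for standalone mode, `./bin/start-cluster.sh`).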
Download the Flink CDC tar, unzip it, and put the pipeline connector JARs into Flink's lib directory. Create a YAML file to describe the data source and data sink; the following example synchronizes all tables under the MySQL app_db database to Doris:

source:
  type: mysql
  name: MySQL Source
  hostname: ...
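A fuller sketch of such a pipeline YAML is below. All hostnames, ports, credentials, and the parallelism value are placeholder assumptions for illustration; the `source` block mirrors the fragment above:

```yaml
# Hypothetical Flink CDC pipeline definition: MySQL -> Doris.
# Every concrete value here is a stand-in for your environment.
source:
  type: mysql
  name: MySQL Source
  hostname: mysql.example.com
  port: 3306
  username: flinkcdc
  password: "******"
  tables: app_db.\.*

sink:
  type: doris
  name: Doris Sink
  fenodes: doris-fe.example.com:8030
  username: root
  password: "******"

pipeline:
  name: Sync MySQL Database to Doris
  parallelism: 2
```

The `tables` pattern selects every table under app_db, which is what "synchronizes all tables" above refers to.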
Docker static binaries: https://download.docker.com/linux/static/stable/x86_64/
Oceanus Sink Kudu write-up: https://cloud.tencent.com/developer/article/1845785
Oracle CDC official docs: https://ververica.github.io/flink-cdc-connectors/master/content/connectors/oracle-cdc.html
Note: if Hadoop is not installed, you can skip YARN and just use a Flink standalone deployment.
2. Download the following dependency JARs
Download Flink's dependency JARs from the two addresses below and put them in the lib directory.
flink-sql-connector-hive-2.2.0_2.11-1.13.5.jar
If your Flink is a different version, you can download the matching JAR there.
Note: my Hive version is 2.1.1, so why did I pick version 2.2.0 here? This is the official...
CDC is short for Change Data Capture. The core idea is to monitor and capture changes to a database (including INSERTs, UPDATEs, and DELETEs of rows or tables), record those changes completely in the order they occur, and write them to a message broker for other services to subscribe to and consume.
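As a toy illustration of that idea, the sketch below replays an ordered change log to rebuild the final table state. The file names and the event format are invented for this example; real CDC tools emit structured events (e.g. Debezium-style JSON), but the ordering principle is the same:

```shell
# Hypothetical CDC change log: one event per line, oldest first.
# Format: OP<TAB>KEY<TAB>VALUE
printf 'INSERT\t1\talice\nINSERT\t2\tbob\nUPDATE\t1\talice_v2\nDELETE\t2\t-\n' > changelog.tsv

# Replaying events in the order they occurred reproduces the table's
# final state; replaying them out of order generally would not.
awk -F'\t' '
  $1 == "INSERT" || $1 == "UPDATE" { state[$2] = $3 }
  $1 == "DELETE"                   { delete state[$2] }
  END { for (k in state) print k "\t" state[k] }
' changelog.tsv > state.tsv
cat state.tsv
```

After replay, key 1 holds its updated value and key 2 is gone, exactly the state a downstream consumer subscribed to the change stream would converge to.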
https://downloads.apache.org/flink/flink-1.13.6/flink-1.13.6-bin-scala_2.12.tgz
Download the database connectors:
wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.12/1.13.6/flink-sql-connector-elasticsearch7_2.12-1.13.6.jar
wget https://repo1.maven.org/mav...
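Since a missing connector JAR is a common cause of factory-not-found errors when a job starts, a quick sanity check of the lib directory can help. The directory and JAR names below are stand-ins created locally for the demo (the second connector name is an assumption, since the download URL above is truncated):

```shell
# Create a stand-in lib directory holding the connectors the text downloads.
mkdir -p demo-lib
touch demo-lib/flink-sql-connector-elasticsearch7_2.12-1.13.6.jar
touch demo-lib/flink-sql-connector-placeholder-1.13.6.jar

# Verify every required connector is present before starting Flink.
missing=0
for jar in flink-sql-connector-elasticsearch7 flink-sql-connector-placeholder; do
  ls demo-lib/${jar}*.jar >/dev/null 2>&1 || { echo "missing: $jar"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "all connector jars present" > check.txt
```

For a real install, point the loop at Flink's actual lib directory and list the connectors your jobs need.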
https://doris.apache.org/downloads/downloads.html
Extract to the target directory:
tar zxvf apache-doris-1.1.0-bin.tar.gz -C doris-1.1
The extracted directory structure looks like this:
.
├── apache_hdfs_broker
│   ├── bin
│   ├── conf
│   └── lib
├── be
https://www.mongodb.com/try/download/shell
Then start it:
./mongosh
You also need to initialize a replSet, otherwise MongoDB Server will keep reporting errors:
rs.initiate()
Step 3: Download Flink; get the latest Flink from the official site:
https://www.apache.org/dyn/closer.lua/flink/flink-1.18.0/flin...