Here, Config.max_routine_load_task_concurrent_num is the system's default upper limit on task concurrency. It is an FE configuration and can be adjusted by modifying that configuration; the default is 5. partition num refers to the number of partitions of the subscribed Kafka topic, and alive_backend_num is the number of BE nodes that are currently alive.
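As a worked illustration (the figures are assumed, and the rule applied is the commonly documented one that the actual concurrency is the minimum of these limits): with "desired_concurrent_number" = "3", a subscribed topic of 10 partitions, 4 alive BE nodes, and the default FE limit of 5, the job would run Min(10, 3, 4, 5) = 3 concurrent tasks.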
"max_batch_interval" = "10", "max_batch_rows" = "1000000", "max_batch_size" = "109715200", "strict_mode" = "false", "format" = "json" ) FROM KAFKA ( "kafka_broker_list" = "host:port", "kafka_topic" = "log_topic", "property.group.id" = "your_group_id", "property.sec...
2.1.4 max_batch_interval/max_batch_rows/max_batch_size
These three parameters control the execution time of a single task: as soon as any one of the thresholds is reached, the task ends. max_batch_rows counts the number of rows read from Kafka, and max_batch_size counts the amount of data read from Kafka, in bytes. At present a single task consumes data at roughly 5-10 MB/s. Now suppose a row...
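As a rough worked calculation (the numbers are assumed for illustration): if one row is about 500 bytes and a task consumes about 10 MB/s, a 100 MB max_batch_size is filled in roughly 10 seconds and corresponds to roughly 200,000 rows. The three thresholds are therefore best chosen so that they trip on a comparable scale, rather than one of them always ending the task long before the others matter.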
max_batch_size: the maximum number of bytes each sub-task may read. The unit is bytes, the range is 100 MB to 1 GB, and the default is 100 MB.
Together, these three parameters control how long a single sub-task runs and how much data it processes; when any one of them reaches its threshold, the task ends. Usage example (a complete sketch follows below):
    "max_batch_interval" = "20",
    "max_batch_rows" = "300000",
    "max_batch_size" = "...
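To show where these properties sit in a full statement, here is a minimal sketch; the job name batch_demo_job, the database and table example_db.example_tbl, the broker address broker1:9092, and the topic demo_topic are placeholders assumed for illustration, not values from the original text.

CREATE ROUTINE LOAD example_db.batch_demo_job ON example_tbl
PROPERTIES
(
    "max_batch_interval" = "20",     -- end the sub-task after at most 20 seconds
    "max_batch_rows" = "300000",     -- or after 300,000 rows have been read
    "max_batch_size" = "209715200"   -- or after 200 MB have been read
)
FROM KAFKA
(
    "kafka_broker_list" = "broker1:9092",
    "kafka_topic" = "demo_topic"
);

Whichever of the three thresholds is hit first ends the current sub-task and commits what has been read so far.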
"max_batch_interval"="10", "max_batch_rows"="1000000", "max_batch_size"="109715200", "strict_mode"="false", "format"="json" ) FROMKAFKA ( "kafka_broker_list"="host:port", "kafka_topic"="log_topic", "property.group.id"="your_group_id", ...
CREATE ROUTINE LOAD example_db.test1 ON example_tbl
[WITH MERGE|APPEND|DELETE]
COLUMNS(k1, k2, k3, v1, v2, label),
WHERE k1 > 100 and k2 like "%doris%"
[DELETE ON label=true]
PROPERTIES
(
    "desired_concurrent_number" = "3",
    "max_batch_interval" = "20",
    "max_batch_rows" = "300000",
    "max_batch_size" = "209715200",
    "strict...
( "desired_concurrent_number"="3", "max_batch_interval" = "20", "max_batch_rows" = "300000", "max_batch_size" = "209715200", "strict_mode" = "false", "format" = "json", "jsonpath" = "{\"jsonpath\":[{\"column\":\"category\",\"value\":\"$.store.book.category\"},{...
"max_batch_interval" = "20", "max_batch_rows" = "300000", "max_batch_size" = "209715200", "strict_mode" = "false", "format" = "json" )FROM KAFKA ( "kafka_broker_list"= "10.150.20.12:9092", "kafka_topic" = "bigDataSensorAnalyse", ...
"max_batch_interval" = "20", "max_batch_rows" = "200000", "max_batch_size" = "104857600", "strict_mode" = "false", "strip_outer_array" = "true", "format" = "json", "json_root" = "$.data", "jsonpaths" = "[\"$.ACCOUNT_LINE_ID\",\"$.update_time\",\"$.type\"]...