However, when the code above is executed, the following exception is thrown, meaning that Person cannot be used as a key: Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: This type (GenericType<com.xueai8.ch03.TransformerKeyBy2.Person>) cannot be used as key. at org.apache.flink.api.common.operators.Keys$Express...
Notes on a Flink error: while testing a Flink program locally, the following exception was raised: org.apache.flink.api.common.InvalidProgramException: This type (GenericType<com.bart.flink.datasource.WordWithCount>) cannot be used as key. Following the exception stack trace leads to the keyBy call in the code; the offending code looks like this: KeyedStream<WordWithCount, Tuple> word =...
Flink throws an exception while executing a keyBy operation: Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: This type (GenericType<com.hmh.entity.SensorEntity>) cannot be used as key. at org.apache.flink.api.common.operators.Keys$ExpressionKeys.<init>(Keys.java:330) at org.apache.flink....
When calling Flink's keyBy with a POJO as the key, the error org.apache.flink.api.common.InvalidProgramException: This type XXX cannot be used as key is reported. Cause: for Flink to recognize the class as a POJO (rather than falling back to GenericType), it must satisfy the POJO rules: the class must be public; it must have a public no-argument constructor; and field names must not start with "is" (fields also need to be public or exposed through matching getters/setters).
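For reference, a minimal sketch of a POJO that satisfies these rules and can therefore be used directly as a keyBy key (the class, field, and job names are illustrative, not taken from the articles above):

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyByPojoExample {

    // Flink treats this as a POJO: public (and static, when nested) class,
    // public no-arg constructor, public fields
    public static class WordWithCount {
        public String word;
        public long count;

        public WordWithCount() {}

        public WordWithCount(String word, long count) {
            this.word = word;
            this.count = count;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements(new WordWithCount("flink", 1L), new WordWithCount("kafka", 2L))
           // with a valid POJO, keyBy no longer fails with "GenericType<...> cannot be used as key"
           .keyBy(wc -> wc.word)
           .sum("count")
           .print();

        env.execute("keyBy POJO example");
    }
}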
The second way is to specify the key by field name. There are certain requirements on this field name, which we will explain in detail later.

/**
 * It creates a new {@link KeyedStream} that uses the provided key for partitioning
 * its operator states.
 *
 * @param key
 *            The KeySelector to be used for extracting the key for partitioning ...
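As a sketch of the two keying styles, assuming the WordWithCount POJO from the previous example and a DataStream<WordWithCount> named stream (both illustrative); note that the field-name overload keyBy(String...) is deprecated in newer Flink releases in favor of a KeySelector:

import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.streaming.api.datastream.KeyedStream;

// field-name form: only works for Tuple types or classes Flink accepts as POJOs
KeyedStream<WordWithCount, Tuple> byFieldName = stream.keyBy("word");

// KeySelector form: the key is extracted explicitly, so the input type
// does not have to be a POJO
KeyedStream<WordWithCount, String> bySelector =
        stream.keyBy(new KeySelector<WordWithCount, String>() {
            @Override
            public String getKey(WordWithCount value) {
                return value.word;
            }
        });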
If you need to read nested JSON data, define the JSON object with a ROW type in the source table DDL, declare the keys you want to extract in the sink table DDL, and specify how those keys are accessed in the DML statement; you can then obtain the values of the corresponding nested keys. A code example follows. Test data: { "a":"abc", "b":1, "c":{ "e":["1","2","3","4"], "f":{"m":"567"} ...
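A minimal sketch of the idea using the open-source Flink 1.11+ Table API (the connector options, topic, and field names are assumptions for illustration, not taken from the original example): the nested object c is declared as a ROW in the source DDL and its inner field is read with dot notation in the DML.

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class NestedJsonExample {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // source DDL: the nested JSON object "c" is declared as a ROW type
        tableEnv.executeSql(
            "CREATE TABLE source_table (" +
            "  a STRING," +
            "  b INT," +
            "  c ROW<e ARRAY<STRING>, f ROW<m STRING>>" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'input_topic'," +
            "  'properties.bootstrap.servers' = 'broker:9092'," +
            "  'format' = 'json'" +
            ")");

        // sink DDL: flat columns for the nested values we want to extract
        tableEnv.executeSql(
            "CREATE TABLE sink_table (" +
            "  a STRING," +
            "  m STRING" +
            ") WITH (" +
            "  'connector' = 'print'" +
            ")");

        // DML: nested keys are read with dot notation, e.g. c.f.m
        tableEnv.executeSql(
            "INSERT INTO sink_table SELECT a, c.f.m FROM source_table");
    }
}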
Network connectivity between Flink and Kafka by itself does not mean data can be read: Flink can only reach Kafka and read its data through the endpoints described in the cluster metadata that the Kafka brokers return during the bootstrap process; for details, see Flink-cannot-connect-to-Kafka. To check this: log in to the ZooKeeper used by Kafka with the zkCli.sh or zookeeper-shell.sh tool.
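Not from the original article, but a hedged sketch of where this matters in Java (legacy FlinkKafkaConsumer API; addresses and names are illustrative): bootstrap.servers is only used for the initial metadata request, and the broker endpoints advertised in that metadata are what Flink actually has to be able to reach.

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaConnectivityExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        // used only for the initial metadata bootstrap; the brokers' advertised
        // listeners returned in that metadata must also be reachable from every TaskManager
        props.setProperty("bootstrap.servers", "kafka-broker:9092");   // illustrative address
        props.setProperty("group.id", "connectivity-check");

        env.addSource(new FlinkKafkaConsumer<>("input_topic", new SimpleStringSchema(), props))
           .print();

        env.execute("Kafka connectivity example");
    }
}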
// we need to make sure that any triggers scheduled in open() cannot be
// executed before all operators are opened
synchronized (lock) {
    // both the following operations are protected by the lock
    // so that we avoid race conditions in the case that initializeState() ...
A Flink SQL job submitted on DLI fails to run, and its job log contains the following error: connect to DIS failed java.lang.IllegalArgumentException: Access key cannot be null. When configuring the job's runtime parameters, this Flink SQL job had "save job logs" selected or Checkpoint enabled, and...
Flink CDC: does anyone know what this error is? The job is submitted after uploading the jar from the web page and immediately fails with: Caused by: org.apache.flink.api.common.InvalidProgramException: The LocalStreamEnvironment cannot be used when submitting a program through a client, or running in a TestEnvironment context. ...
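This error typically means the program builds a LocalStreamEnvironment explicitly; a minimal sketch of the usual fix (class and job names are illustrative) is to call getExecutionEnvironment(), which returns a local environment when run from the IDE and the cluster environment when the jar is submitted through a client:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EnvironmentFix {
    public static void main(String[] args) throws Exception {
        // StreamExecutionEnvironment.createLocalEnvironment() triggers
        // "The LocalStreamEnvironment cannot be used when submitting a program through a client"
        // once the jar runs on a cluster; getExecutionEnvironment() adapts to both cases
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("a", "b", "c").print();

        env.execute("environment fix example");
    }
}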