| validate | The ultimate transformation of a check with a dataframe input for validation | agnostic |

Controls (pyspark)

| Check | Description | DataType |
| --- | --- | --- |
| completeness | Zero nulls | agnostic |
| information | Zero nulls and cardinality > 1 | agnostic |
| intelligence | Zero nulls, zero empty strings and cardinality > 1 | agnostic |
| percentage_fill | ... | |
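The control descriptions above map directly onto plain DataFrame aggregations. Below is a minimal, hypothetical PySpark sketch of what each listed control verifies; the column names and sample data are invented for illustration, and the actual library API may differ.

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "")], ["id", "name"])

def completeness(df, col):
    # Zero nulls in the column.
    return df.filter(F.col(col).isNull()).count() == 0

def information(df, col):
    # Zero nulls and cardinality > 1.
    return completeness(df, col) and df.select(col).distinct().count() > 1

def intelligence(df, col):
    # Zero nulls, zero empty strings and cardinality > 1.
    return information(df, col) and df.filter(F.col(col) == "").count() == 0

print(completeness(df, "id"))    # True: no nulls
print(information(df, "name"))   # True: no nulls, 3 distinct values
print(intelligence(df, "name"))  # False: one row holds an empty string
```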
Similar to spark-shell, create a DataFrame from the source data and then call the connector to write it (via `df2.write.format("hologres").option(...)`).

Start pyspark and load the connector:

```shell
spark-sql --jars hologres-connector-spark-3.x-1.4.1-SNAPSHOT-jar-with-dependencies.jar
```
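For reference, here is a hedged PySpark sketch of the write path hinted at by `df2.write.format("hologres")`. The option keys and values are placeholders, not the connector's authoritative configuration; consult the connector documentation for the real option names.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Build a DataFrame from the source data (sample rows for illustration).
df2 = spark.createDataFrame([(1, "foo"), (2, "bar")], ["id", "name"])

# Write through the connector loaded via --jars. The option keys below are
# assumed placeholders, not the connector's documented parameter names.
(df2.write
    .format("hologres")
    .option("username", "<ACCESS_ID>")          # placeholder
    .option("password", "<ACCESS_KEY>")         # placeholder
    .option("jdbcurl", "<HOLOGRES_JDBC_URL>")   # placeholder
    .option("table", "<SCHEMA.TABLE>")          # placeholder
    .mode("append")
    .save())
```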