```python
from pyspark.sql.functions import when, col

df.withColumn("new_col", when(col("col1") == val1, val2).otherwise(val3))
```

In the above syntax, the withColumn() function adds a new column to the DataFrame. The when() function takes a Boolean expression as its first argument and a value as its second; that value is used for rows where the expression evaluates to true, while otherwise() supplies the value for all remaining rows.
Like the SQL "CASE WHEN" statement and the switch statement found in popular programming languages, the Spark SQL DataFrame API supports similar conditional logic using "when otherwise", or we can also use a "case when" expression directly.