SPRKSCL1136

org.apache.spark.sql.functions.min

This issue code has been deprecated since Spark Conversion Core 4.3.2.

Message: org.apache.spark.sql.functions.min has a workaround, see documentation for more info.

Category: Warning

Description

This issue appears when the SMA detects a use of the org.apache.spark.sql.functions.min function, which has a workaround in Snowpark.

Scenario

Input

Below is an example of the org.apache.spark.sql.functions.min function, first used with a column name as an argument and then with a column object.

val df = Seq(1, 3, 10, 1, 3).toDF("value")
val result1 = df.select(min("value"))
val result2 = df.select(min(col("value")))

Output

The SMA adds the EWI SPRKSCL1136 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.

val df = Seq(1, 3, 10, 1, 3).toDF("value")
/*EWI: SPRKSCL1136 => org.apache.spark.sql.functions.min has a workaround, see documentation for more info*/
val result1 = df.select(min("value"))
/*EWI: SPRKSCL1136 => org.apache.spark.sql.functions.min has a workaround, see documentation for more info*/
val result2 = df.select(min(col("value")))

Recommended fix

Snowpark has an equivalent min function that receives a column object as an argument. Therefore, the Spark overload that takes a column object is directly supported by Snowpark and does not require any changes.

For the overload that takes a string argument, you can convert the string into a column object with the com.snowflake.snowpark.functions.col function as a workaround.

val df = Seq(1, 3, 10, 1, 3).toDF("value")
val result1 = df.select(min(col("value")))
val result2 = df.select(min(col("value")))
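As a fuller illustration, the workaround can be sketched as a self-contained Snowpark program. This is a minimal sketch, not the SMA's output: the session setup, the `profile.properties` file name, and the `show()` call at the end are assumptions added for completeness, and the code requires a configured Snowflake connection to run.

```scala
// Hedged sketch of the workaround in a standalone Snowpark Scala program.
// Assumes Snowpark for Scala is on the classpath and that connection
// parameters live in a local "profile.properties" file (an assumption).
import com.snowflake.snowpark.Session
import com.snowflake.snowpark.functions.{col, min}

object MinWorkaroundSketch {
  def main(args: Array[String]): Unit = {
    val session = Session.builder.configFile("profile.properties").create
    import session.implicits._

    val df = Seq(1, 3, 10, 1, 3).toDF("value")

    // Spark's min("value") string overload has no direct Snowpark
    // equivalent; wrap the column name in col(...) so that min receives
    // a Column object, which Snowpark supports directly.
    val result = df.select(min(col("value")))
    result.show()

    session.close()
  }
}
```

The only change relative to the Spark code is wrapping the column name in `col(...)`; the rest of the query is unaffected.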

Additional recommendations
