The SMA adds the EWI SPRKSCL1134 to the output code to let you know that this function is not fully supported by Snowpark, but it has a workaround.
```scala
val df = Seq(10.0, 20.0, 30.0, 40.0).toDF("value")
/*EWI: SPRKSCL1134 => org.apache.spark.sql.functions.log has a workaround, see documentation for more info*/
val result1 = df.withColumn("log_value", log(10, "value"))
/*EWI: SPRKSCL1134 => org.apache.spark.sql.functions.log has a workaround, see documentation for more info*/
val result2 = df.withColumn("log_value", log(10, col("value")))
/*EWI: SPRKSCL1134 => org.apache.spark.sql.functions.log has a workaround, see documentation for more info*/
val result3 = df.withColumn("log_value", log("value"))
/*EWI: SPRKSCL1134 => org.apache.spark.sql.functions.log has a workaround, see documentation for more info*/
val result4 = df.withColumn("log_value", log(col("value")))
```
Recommended fix
Below are the workarounds for each overload of the log function.
3. def log(columnName: String): Column

You can pass lit(Math.E) as the first argument, convert the column name into a column object using the com.snowflake.snowpark.functions.col function, and pass it as the second argument.
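A minimal sketch of this workaround, assuming the Snowpark log overload that accepts a base column and the `df` defined above (running it requires an active Snowpark session):

```scala
import com.snowflake.snowpark.functions.{col, lit, log}

// log("value") computes the natural logarithm, so Math.E becomes the
// explicit base and the column name is wrapped with col(...).
val result3 = df.withColumn("log_value", log(lit(Math.E), col("value")))
```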
4. def log(e: Column): Column
You can pass lit(Math.E) as the first argument and the column object as the second argument.
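A minimal sketch of this workaround, assuming the Snowpark log overload that accepts a base column and the `df` defined above (running it requires an active Snowpark session):

```scala
import com.snowflake.snowpark.functions.{col, lit, log}

// log(e: Column) computes the natural logarithm, so Math.E becomes the
// explicit base and the existing column object is passed through unchanged.
val result4 = df.withColumn("log_value", log(lit(Math.E), col("value")))
```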