SPRKPY1079

pyspark.context.SparkContext.setLogLevel

Message: The argument of the pyspark.context.SparkContext.setLogLevel function is not a valid PySpark log level

Category: Warning

Description

This issue appears when the SMA detects the use of the pyspark.context.SparkContext.setLogLevel function with an argument that is not a valid log level in PySpark, and therefore an equivalent could not be determined in Snowpark.
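For reference, here is a minimal sketch of a call that would not raise this warning, using one of the log levels PySpark accepts (ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN); the session setup is illustrative only.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("log-level-example").getOrCreate()

# "WARN" is a valid PySpark log level, so the SMA can map it to an
# equivalent Python logging level during conversion.
spark.sparkContext.setLogLevel("WARN")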

Scenario

Input

Here, the log level argument is "INVALID_LOG_LEVEL", which is not a valid PySpark log level.

sparkSession.sparkContext.setLogLevel("INVALID_LOG_LEVEL")

Output

The SMA cannot recognize the log level "INVALID_LOG_LEVEL". Although the SMA still performs the conversion, it adds the EWI SPRKPY1079 to indicate a possible problem.

#EWI: SPRKPY1079 => INVALID_LOG_LEVEL is not a valid PySpark log level, therefore an equivalent could not be determined in Snowpark. Valid PySpark log levels are: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
logging.basicConfig(stream = sys.stdout, level = logging.INVALID_LOG_LEVEL)
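For context, a self-contained version of the converted code (imports added here for illustration) makes the problem visible: Python's logging module has no INVALID_LOG_LEVEL attribute, so the converted line would fail at runtime.

import logging
import sys

# logging.INVALID_LOG_LEVEL does not exist in Python's logging module,
# so this line raises AttributeError when executed.
logging.basicConfig(stream = sys.stdout, level = logging.INVALID_LOG_LEVEL)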

Recommended fix

Make sure that the log level passed to the pyspark.context.SparkContext.setLogLevel function is a valid log level in PySpark or in Snowpark, and try again.

logging.basicConfig(stream = sys.stdout, level = logging.DEBUG)
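As a sketch, the fixed code with its imports could look like the following. DEBUG is used here because it is both a valid PySpark log level and an attribute of Python's logging module (as are INFO, WARN, ERROR, and FATAL), whereas levels such as ALL, OFF, and TRACE have no direct logging equivalent.

import logging
import sys

# DEBUG exists both as a PySpark log level and in Python's logging module,
# so the converted code configures logging without errors.
logging.basicConfig(stream = sys.stdout, level = logging.DEBUG)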

Additional recommendations
