SPRKPY1047
pyspark.context.SparkContext.setLogLevel has a workaround
This issue appears when the tool detects the usage of pyspark.context.SparkContext.setLogLevel, which has a workaround.
Input code:
sparkSession.sparkContext.setLogLevel("WARN")
Output code:
#EWI: SPRKPY1047 => pyspark.context.SparkContext.setLogLevel has a workaround, see documentation for more info
sparkSession.sparkContext.setLogLevel("WARN")
The signature of the PySpark method is:
setLogLevel(self, logLevel: str) -> None
Action: Replace the usage of the "setLogLevel" function with "logging.basicConfig", which provides a set of convenience functions for simple logging usage. To use it, import the "logging" and "sys" modules, and replace the level constant using the "Level equivalent table" below:
import logging
import sys
logging.basicConfig(stream=sys.stdout, level=logging.WARNING)
| Level source parameter | Level target parameter |
|---|---|
| "ALL" | This has no equivalent |
| "DEBUG" | logging.DEBUG |
| "ERROR" | logging.ERROR |
| "FATAL" | logging.CRITICAL |
| "INFO" | logging.INFO |
| "OFF" | logging.NOTSET |
| "TRACE" | This has no equivalent |
| "WARN" | logging.WARNING |
- For more support, you can email us at [email protected]. If you have a contract for support with Snowflake, reach out to your sales engineer and they can direct your support needs.