Technical Documentation
This page contains the internal technical documentation of the snowconvert-helpers project, generated automatically by pydoc
Last updated
All the functions defined in the project.
Use the real uid/gid to test for access to a path.
dir_fd, effective_ids, and follow_symlinks may not be implemented on your platform. If they are unavailable, using them will raise a NotImplementedError.
Note that most operations will use the effective uid/gid, therefore this routine can be used in a suid/sgid environment to test if the invoking user has the specified access to the path.
path,
Path to be tested; can be string, bytes, or a path-like
mode,
Operating-system mode bitfield. Can be F_OK to test existence, or the inclusive-OR of R_OK, W_OK, and X_OK
dir_fd,
If not None, it should be a file descriptor open to a directory, and path should be relative; path will then be relative to that directory
effective_ids,
If True, access will use the effective uid/gid instead of the real uid/gid
follow_symlinks,
If False, and the last element of the path is a symbolic link, access will examine the symbolic link itself instead of the file the link points to
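The semantics above match the standard-library `os.access` call; a minimal, self-contained illustration (the temporary file below is created on the fly and is not part of the helpers API):

```python
import os
import tempfile

# Create a throwaway file to probe.
fd, path = tempfile.mkstemp()
os.close(fd)

# F_OK tests existence; R_OK | W_OK is the inclusive-OR of read and write permission.
exists = os.access(path, os.F_OK)
readable_writable = os.access(path, os.R_OK | os.W_OK)

print(exists, readable_writable)
os.remove(path)
```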
Runs when the script's execution ends.
Prints the text in the specified color.
text,
The text to be printed
color="blue",
The color in which to print the text
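The helper's exact implementation is not shown here; a plausible sketch using ANSI escape codes (the color table and function name below are assumptions, not the library's code):

```python
# Hypothetical re-implementation for illustration only.
ANSI_COLORS = {
    "blue": "\033[94m",
    "green": "\033[92m",
    "red": "\033[91m",
}
ANSI_RESET = "\033[0m"

def colored_print_sketch(text, color="blue"):
    """Print `text` wrapped in the ANSI escape code for `color`."""
    code = ANSI_COLORS.get(color, "")
    rendered = f"{code}{text}{ANSI_RESET}"
    print(rendered)
    return rendered  # returned so the result is easy to inspect
```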
Configures the logging that will be performed for any data-related execution on the Snowflake connection. The log file is named 'snowflake_python_connector.log' by default.
Parameters:
configuration_path,
The path of the configuration file that contains all the settings desired for the logging
Drops the transient table with the specified name.
Parameters:
tempTableName,
The name of the temporary table
con=None,
The connection to be used; if None is passed, it will use the last connection performed
Parameters:
exctype
value
tback
Executes a SQL string using the last connection; optionally, it can use arguments or a specific connection. Examples:
exec("SELECT * FROM USER")
exec("SELECT * FROM USER", con)
exec("SELECT * FROM CUSTOMER WHERE CUSTOMERID = %s", customer)
Parameters:
sql_string,
The SQL string to be executed
using=None,
The optional parameters that can be used in the SQL passed
con=None,
The connection to be used; if None is passed, it will use the last connection performed
Reads the content of a file and executes the sql statements contained with the specified connection.
Parameters:
filename,
The name of the file to be read and executed
con=None,
The connection to be used; if None is passed, it will use the last connection performed
Executes a command in the operating system.
Executes a SQL statement in the passed connection, with the optional arguments.
Parameters:
sql_string,
The SQL string to be executed
con,
The connection to be used
using,
The optional parameters to be used in the SQL execution
Expands the statement passed with the parameters.
Parameters:
statement,
The SQL statement to be executed
params,
The parameters of the SQL statement
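The exact substitution syntax depends on the connector; a minimal sketch assuming printf-style `%s` placeholders (the function name and quoting rule are illustrative assumptions, not the shipped code):

```python
def expand_statement_sketch(statement, params):
    """Substitute positional %s placeholders with rendered parameter values."""
    if params is None:
        return statement
    if not isinstance(params, (list, tuple)):
        params = (params,)
    # Quote strings, leave numbers bare -- a simplification of real bind handling.
    rendered = tuple(f"'{p}'" if isinstance(p, str) else str(p) for p in params)
    return statement % rendered
```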
Expands the variables in the passed string.
Parameters:
str,
The string to be expanded with the variables
Expand environment variables of form $var and ${var}. If parameter 'skip_escaped' is True, all escaped variable references (i.e. preceded by backslashes) are skipped. Unknown variables are set to 'default'. If 'default' is None, they are left unchanged.
Parameters:
path
,
params
,
skip_escaped=False
,
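The behaviour described above can be sketched with a regular expression; this is an illustrative re-implementation, not the library's code:

```python
import re

def expandvars_sketch(text, params, skip_escaped=False, default=None):
    """Expand $var and ${var} references using the `params` mapping.

    Escaped references (preceded by a backslash) are skipped when
    `skip_escaped` is True. Unknown variables become `default`, or are
    left unchanged when `default` is None.
    """
    pattern = re.compile(r"(\\)?\$(\w+|\{[^}]*\})")

    def replace(match):
        escaped, var = match.group(1), match.group(2)
        if escaped and skip_escaped:
            return match.group(0)          # leave \$var untouched
        name = var.strip("{}")
        if name in params:
            return str(params[name])
        return match.group(0) if default is None else default

    return pattern.sub(replace, text)
```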
Executes the fast load with the passed parameters target_schema, filepath, stagename and target_table_name.
Parameters:
target_schema,
The name of the schema to be used in the fast load
filepath,
The path of the file to be loaded into the table
target_table_name,
The name of the table that will have the data loaded
con=None,
The connection to be used; if None is passed, it will use the last connection performed
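A Snowflake fast load typically stages the file with PUT and then runs COPY INTO; the sketch below only assembles statements of that shape with hypothetical names, and is not the helper's exact SQL:

```python
def fast_load_statements_sketch(target_schema, filepath, stagename, target_table_name):
    """Build the PUT/COPY statement pair a fast load would typically issue."""
    put_sql = f"PUT file://{filepath} @{stagename} AUTO_COMPRESS=TRUE"
    copy_sql = (
        f"COPY INTO {target_schema}.{target_table_name} "
        f"FROM @{stagename}"
    )
    return put_sql, copy_sql
```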
Parameters:
filename,
Gets the argument key value from the passed string. It must start with the string '--param-'
Parameters:
astr,
The argument string to be used. The string should have a value similar to --param-column=32, and the returned string will be '32'
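Based on the description, a sketch of the parsing (illustrative, not the shipped code):

```python
def get_argkey_sketch(astr):
    """Return the value portion of an argument like '--param-column=32'."""
    prefix = "--param-"
    if astr.startswith(prefix) and "=" in astr:
        # Split on the first '=' only, so values may themselves contain '='.
        return astr.split("=", 1)[1]
    return None
```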
Gets the error position from the file using the information of the stack of the produced error.
Gets the argument from the position specified or gets the value from the table vars or gets the environment variable name passed.
Parameters:
arg_pos,
The argument position to be used from the arguments parameter
variable_name,
The name of the variable to be obtained
vars,
The hash with the variable names and values
args,
The arguments array parameter
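The lookup order described above (positional argument, then the vars hash, then the environment) can be sketched as follows; the function name and exact fallback rules are assumptions:

```python
import os

def get_from_vars_args_or_env_sketch(arg_pos, variable_name, vars, args):
    """Return args[arg_pos] if present, else vars[variable_name],
    else the environment variable of the same name (or None)."""
    if args is not None and 0 <= arg_pos < len(args):
        return args[arg_pos]
    if vars is not None and variable_name in vars:
        return vars[variable_name]
    return os.environ.get(variable_name)
```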
Imports data to a temporary table using an input data placeholder.
Parameters:
tempTableName,
The temporary table name.
inputDataPlaceholder,
The input placeholder used, which is a stage in the Snowflake database
con,
The connection to be used
Imports the passed filename with the optional separator.
Parameters:
filename,
The filename path to be imported
separator=' ',
The optional separator
Imports the file passed to a temporary table. It will use a public stage named as the temporary table with the prefix Stage_. At the end of the loading to the temporary table, it will delete the stage that was used in the process.
Parameters:
filename,
The name of the file to be read
tempTableName,
The name of the temporary table
columnDefinition,
The definition of all the fields that will have the temporary table
Prints a message to the console (standard output) or to the log file, depending on whether logging is enabled
Parameters:
*msg,
The message to print or log
level=20,
writter=None,
Logs on to the Snowflake database with the passed credentials, database, warehouse, role, login_timeout, and authenticator parameters.
Parameters:
user,
The user of the database
password,
The password of the user of the database
database,
The database to be connected
warehouse,
The warehouse of the database to be connected
role,
The role to be connected
login_timeout,
The maximum timeout before returning an error if the connection takes too long to connect
authenticator,
The supported authenticator value to use, such as SNOWFLAKE, EXTERNALBROWSER, SNOWFLAKE_JWT, or OAUTH
token,
The OAUTH or JWT token
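Actually connecting requires the Snowflake Python connector; the sketch below only assembles the keyword arguments such a helper would pass to `snowflake.connector.connect`, without opening a real connection (the parameter handling shown is an assumption):

```python
def build_logon_kwargs_sketch(user, password, database, warehouse, role,
                              login_timeout=10, authenticator=None, token=None):
    """Assemble connection kwargs; optional settings left as None are dropped."""
    kwargs = {
        "user": user,
        "password": password,
        "database": database,
        "warehouse": warehouse,
        "role": role,
        "login_timeout": login_timeout,
    }
    if authenticator is not None:
        kwargs["authenticator"] = authenticator
    if token is not None:
        kwargs["token"] = token
    return kwargs
```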
Parameters:
args,
Prints the dictionary without exposing user and password values.
Parameters:
dictionary,
Quits the application and optionally returns the passed code.
Parameters:
code=None,
The code to be returned after it quits
Reads the parameter arguments from the passed array.
Parameters:
args,
The arguments to be used
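Combining this with the '--param-' convention described above, a sketch of collecting all parameter arguments into a dictionary (illustrative only):

```python
def read_param_args_sketch(args):
    """Collect '--param-NAME=VALUE' entries from `args` into a dict."""
    prefix = "--param-"
    result = {}
    for arg in args:
        if arg.startswith(prefix) and "=" in arg:
            name, _, value = arg[len(prefix):].partition("=")
            result[name] = value
    return result
```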
Reads the given filename lines and optionally skips some lines at the beginning of the file.
Parameters:
line,
The filename to be read
skip=0,
The lines to be skipped
Prints the argument.
Parameters:
arg,
The argument to be printed
Repeats the previously executed SQL statement(s).
Parameters:
con=None,
The connection to use, if specified; otherwise it will use the last connection performed
n=1,
The number of previous statements to be executed again
Parameters:
severity_value,
Parameters:
arg,
severity_value,
Executes a simple fast load in the connection with the passed parameters target_schema, filepath, stagename, and target table name.
Parameters:
arg,
The connection to be used
target_schema,
The name of the schema to be used in the fast load
filepath,
The filename path to be loaded in the table
target_table_name,
The name of the table that will have the data loaded
Perform a stat system call on the given path. dir_fd and follow_symlinks may not be implemented on your platform. If they are unavailable, using them will raise a NotImplementedError. It's an error to use dir_fd or follow_symlinks when specifying path as an open file descriptor.
Parameters:
dir_fd,
If not None, it should be a file descriptor open to a directory, and path should be a relative string; path will then be relative to that directory
follow_symlinks,
If False, and the last element of the path is a symbolic link, stat will examine the symbolic link itself instead of the file the link points to
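As with `access`, this matches the standard-library `os.stat`; a short illustration on a throwaway file:

```python
import os
import tempfile

fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)

info = os.stat(path)      # follow_symlinks defaults to True
size = info.st_size       # 5 bytes were written above
os.remove(path)
print(size)
```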
Execute the command in a subshell.
Parameters:
command,
Parameters:
*argv,
All the classes defined in the project.
This class contains the import_file_to_tab static function, which provides support for the BEGIN LOADING and associated commands in FastLoad.
import_file_to_tab()
Parameters:
target_schema_table
the target schema (optional) and table name
define_file
The name of the file to be read
define_columns
The definition of all the columns for the temporary table
begin_loading_columns
The column names to insert. Dictates the order in which values are inserted
begin_loading_values
The list of raw insert values to convert
field_delimiter
The field delimiter
(optional) skip_header
The number of rows to skip
(optional) input_data_place_holder
The location of the file in a supported cloud provider. Set parameter when the file is not stored locally
(optional) con
The connection to be used
Static methods in the class
defaults()
null(value=None)
record_mode(value=None)
report(file, separator=' ')
reset()
separator_string(value=None)
separator_width(value=None)
side_titles(value=None)
title_dashes(value=None, withValue=None)
title_dashes_with(value=None)
width(value=None)
Data and other attributes defined here
expandedfilename = None
separator = ''
Methods in the class
reset()
Static methods in the class
file(file, separator=' ')
using(globals, *argv)
Data and other attributes defined in the class
expandedfilename = None
no_more_rows = False
read_obj = None
reader = None
separator = ' '
Parameters class
Data and other attributes defined in the class
passed_variables = {}
path,
Path to be examined; can be string, bytes, a path-like or an open file descriptor int