Spark SQL - Databricks SQL
Welcome to Snowflake SnowConvert for Spark SQL - Databricks SQL. Let us be your guide on the road to a successful migration.
What is SnowConvert for Spark SQL - Databricks SQL?
SnowConvert is software that understands SQL scripts and converts the source code into functionally equivalent Snowflake code.
Conversion Types
Specifically, SnowConvert performs the following conversions:
Spark SQL - Databricks SQL to Snowflake SQL
SnowConvert understands the Spark SQL - Databricks SQL source code and converts the Data Definition Language (DDL), Data Manipulation Language (DML), and functions in the source code to the corresponding SQL in the target: Snowflake.
Sample code
Spark SQL - Databricks SQL basic input code:
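For illustration, here is a minimal sketch of a Spark SQL - Databricks SQL table definition (the table and column names are hypothetical):

```sql
-- Hypothetical Spark SQL / Databricks SQL table definition
CREATE TABLE employees (
    id     INT,
    name   STRING,
    salary DOUBLE
) USING DELTA;
```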
Snowflake SQL output code:
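A possible Snowflake equivalent of the sketch above (actual SnowConvert output may differ in details):

```sql
-- Corresponding Snowflake SQL: STRING is mapped to VARCHAR,
-- and USING DELTA has no direct Snowflake counterpart, so it is dropped.
CREATE TABLE employees (
    id     INT,
    name   VARCHAR,
    salary DOUBLE
);
```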
As you can see, most of the structure remains the same; however, in some cases the data types need to be transformed.
SnowConvert Terminology
Before we get lost in the magic of these code conversions, here are a few terms/definitions so you know what we mean when we start dropping them all over the documentation:
SQL (Structured Query Language): the standard language for storing, manipulating, and retrieving data in most modern database architectures.
SnowConvert: the software that securely and automatically converts your Spark SQL - Databricks SQL files to the Snowflake cloud data platform.
Conversion rule or transformation rule: rules that allow SnowConvert to convert from a portion of source code to the expected target code.
Parse: the initial process SnowConvert performs to understand the source code and build the internal data structure required for executing the conversion rules.