Using the SMA CLI

Programmatically assess and convert with the SMA CLI

Description

The Snowpark Migration Accelerator (SMA) has a Command Line Interface (CLI). The CLI accepts a series of commands that execute the code processor, install or show an access code, and perform any other operation you can do from the SMA application.

Note that the SMA has a single code processor for the different supported source platforms, so you do not need to specify any arguments to select one.

Installation

Before installing the CLI, you will need to download it to an accessible location. Depending on your operating system, you can review the corresponding installation guide: Windows, MacOS, or Linux.

Commands

To run the tool, provide a sequence of commands according to your needs. To do that, use the following syntax with either the long-command or short-command options:

sma [command] [argument] [command] [argument] ...

The available commands are listed below. Click on a command to see a detailed explanation.

Long-command           Short-Command   Description
--help                 -h              Show help information.
--version              -v              Show the version of the tool.
install-access-code    install-ac      Install an access code.
show-access-code       show-ac         Show the installed access code(s).
--input                -i              The input folder path.
--output               -o              The output folder path.
--assessment           -a              Execute the tool in assessment mode.
--mapDirectory         -m              Folder path where the custom mapping files are stored.
--enableJupyter        -j              Enable conversion of Databricks notebooks to Jupyter notebooks.
--sql                  -f              Database engine syntax to use when a SQL command is detected.
--customerEmail        -e              Configure the customer email.
--customerCompany      -c              Configure the customer company.
--projectName          -p              Configure the customer project.
--yes                  -y              Skip the confirmation prompt when running.
--disableCheckpoints   -d              Disable the checkpoints feature.
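
For instance, the two invocations below are intended to be equivalent, using the long-form and the short-form options respectively (the placeholder values are illustrative):

sma --input <input-path> --output <output-path> --customerEmail <client email> --customerCompany <client company> --projectName <project name> --yes
sma -i <input-path> -o <output-path> -e <client email> -c <client company> -p <project name> -y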

Installing an access code

Before converting your code, you need to install an access code. You can do this by specifying the access code, or by specifying the path to the file that contains the access code information (this is useful when installing the access code without an Internet connection or under restrictive firewall settings).

You can use the following command to install an access code by passing the code directly:

sma install-access-code <access-code>

The shorter form of this command is equivalent:

sma install-ac <access-code>

If you want to install an access code using a file, you can use the --file / -f option, as shown in the following commands:

sma install-access-code --file <path-to-file>
or
sma install-access-code -f <path-to-file>

If there is an error during the access code installation, an error message is displayed.

Checking which access codes are installed

If you want to know which access codes are installed on your computer, you can use the following command:

sma show-access-code

This command shows the information for each access code that is installed on your computer.

Converting

Once you have installed a valid access code, you can execute the code processor to perform a conversion. To do so, you must provide the required arguments:

  • Input path: The folder path that contains the source code.

  • Output path: The folder path where the converted code is placed.

Project Information

The first time you run the code processor, you must also provide the project information (these values are saved for later executions). This is the same information that is required to create a new project in the application. The required fields are:

  • Customer Email: The customer email (must be a valid email address).

  • Customer Company: The customer company name.

  • Project Name: The name of the project.

This example shows how to run the code processor with the minimum required arguments:

sma -i <input-path> -o <output-path> -e <client email> -c <client company> -p <project name> <additional-parameters>

Once you set the sequence of commands and press Enter, the tool shows the current configuration and prompts for confirmation before starting the process.

If you need to add/modify any argument, you can type "n" to abort the process or "y" to start it.

Skipping the Project Confirmation

To skip the confirmation prompt where the tool asks you to continue (as shown above), you can add the argument --yes or -y. Note that if you are going to run the tool programmatically, this argument is required. Without this parameter, the tool will prompt you to confirm that your information is correct on each execution.
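
For example, a non-interactive run that skips the prompt could look like the following (placeholder values are illustrative):

sma -i <input-path> -o <output-path> -e <client email> -c <client company> -p <project name> --yes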

Performing an Assessment

When performing an assessment, you can use the same commands that you would use for a conversion, but you need to add the option --assessment or -a. The command looks like this:

sma --input <input-path> --output <output-path> --assessment <additional-parameters>

Each of these commands might also receive additional parameters (check the Converting section).
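
For reference, the same assessment written with the short-form options should look like this:

sma -i <input-path> -o <output-path> -a <additional-parameters>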

Checking the tool version

You can use either of the following commands to check the version of the tool and its code-processing engine:

sma --version
sma -v

Enabling conversion of Databricks notebooks to Jupyter Notebooks

This option takes the Python and/or Scala input files in the source directory and, instead of converting them to Python and/or Scala output files, converts them to .ipynb Jupyter Notebook files. Note that this does not take into account whether the source files were exported from a notebook or are standard code files.

To enable this conversion, add the --enableJupyter flag or its shortcut -j:

sma -i <input-path> -o <output-path> --enableJupyter
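
The equivalent invocation using the -j shortcut should be:

sma -i <input-path> -o <output-path> -j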

Setting the SQL Flavor of the source code

You can set the SQL syntax to use when a SQL command is detected with the --sql option or its shortcut -f. The supported syntaxes are SparkSql (the default) and HiveSql:

sma --input <input-path> --output <output-path> --sql SparkSql
sma --input <input-path> --output <output-path> --sql HiveSql
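
Using the -f shortcut, the same option can be passed as, for example:

sma -i <input-path> -o <output-path> -f HiveSql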

Disable checkpoints feature

If you do not want to use the checkpoints feature based on the checkpoints file that the SMA generates, you can disable it with either of the following commands:

sma -i <input-path> -o <output-path> --disableCheckpoints
sma -i <input-path> -o <output-path> -d

Need more help?

If you want to see general help for the CLI, you can use the following commands:

sma --help
sma -h

You can get more information about a specific command by executing:

sma <command> --help

For example, you can execute sma install-access-code --help to get more information about how to install an access code.

If you need an access code, you can reach out to sma-support@snowflake.com.

For more information about all the additional parameters, please refer to the Additional Parameters page.
