Zero-to-Snowflake

Getting-started scripts for Snowflake - Built for the Cloud Data Warehouse


The scripts you need to get started with Snowflake, the enterprise data warehouse built for the cloud.

The purpose of this repository is to help companies get started with Snowflake. It covers some of the most common tasks you will encounter when developing your Data Lake or Analytics Platform with Snowflake.

The exercises/tasks can be done using the SNOWFLAKE_SAMPLE_DATA database (you'll see this sample database in your Snowflake account). This means there are no prerequisites besides an S3 bucket with write permissions (Access Key & Secret Key).
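As a minimal sketch of that prerequisite (the bucket URL and placeholder credentials below are illustrative, not part of this repository), the bucket can be registered as an external stage so the later steps can read from and write to it:

```sql
-- Hypothetical example: register your S3 bucket as an external stage.
-- Replace the bucket URL and the placeholder Access Key / Secret Key with your own.
CREATE OR REPLACE STAGE my_demo_stage
  URL = 's3://my-demo-bucket/zero-to-snowflake/'
  CREDENTIALS = (AWS_KEY_ID = '<access_key>' AWS_SECRET_KEY = '<secret_key>');
```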

This repository of tips and tools is maintained by Vision.bi, Snowflake's consulting and technology partner. For high-level consulting queries, you can contact us at snowflake@vision.bi. Feel free to ask for a specific example or ask questions in the issues section.

If you don’t have an account, you can start your trial here.

*NOTE: The demo will load data from S3 into Snowflake using scripts, which can be scheduled and executed with Python, Airflow, or other orchestration tools. We highly recommend using Rivery - Data Pipeline to Snowflake to schedule SQL tasks, run insert scripts, or load data from external sources (e.g. Facebook Social, Facebook Ads, or Google AdWords).

Step 1 - Prepare Data For The Demo

First, we will prepare some data. Snowflake includes a sample database (shared via Snowflake data sharing), which we will use for the demo; you only need to prepare a bucket with secret & access keys.

See the documentation here.
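For instance, a minimal sketch of this step could unload a slice of the sample data to the stage defined above, so there are files in the bucket to load later (the table and stage names are assumptions for illustration):

```sql
-- Unload one sample table to the S3 stage as gzipped CSV files.
COPY INTO @my_demo_stage/customer/
FROM (SELECT * FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER)
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
OVERWRITE = TRUE;
```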

Step 2 - Load Data

Use COPY to load data from different source formats such as Parquet, CSV & JSON.

See the documentation here.
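As a hedged example of this step (the target table, stage path, and file format are assumptions carried over from the sketch in Step 1), the unloaded files can be loaded back with COPY INTO:

```sql
-- Create an empty target table with the same structure as the sample table.
CREATE OR REPLACE TABLE demo_customer AS
SELECT * FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER LIMIT 0;

-- Load the gzipped CSV files from the stage; for Parquet or JSON,
-- adjust the FILE_FORMAT (e.g. TYPE = PARQUET) and load into a VARIANT column.
COPY INTO demo_customer
FROM @my_demo_stage/customer/
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
ON_ERROR = 'ABORT_STATEMENT';
```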

Step 3 - Data Extract & Load

Resize a warehouse, extract data from semi-structured columns, merge data, and more… See the documentation here.
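The following sketch illustrates the kind of statements involved; the warehouse, table, and column names are assumptions, not objects shipped with this repository:

```sql
-- Scale the warehouse up before a heavy step, and back down afterwards.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Extract fields from a semi-structured (VARIANT) column with path notation and casts.
SELECT raw:customer.id::NUMBER   AS customer_id,
       raw:customer.name::STRING AS customer_name
FROM demo_raw_events;

-- Merge incremental changes into a target table.
MERGE INTO demo_customer t
USING demo_customer_updates s
  ON t.c_custkey = s.c_custkey
WHEN MATCHED THEN UPDATE SET t.c_acctbal = s.c_acctbal
WHEN NOT MATCHED THEN INSERT (c_custkey, c_acctbal) VALUES (s.c_custkey, s.c_acctbal);

ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'XSMALL';
```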

And More…

Explore the repository for more examples, and please feel free to ask for samples in the issues section.

### Enjoy Snowflaking…