
This may be confusing, but I'm looking to see if anyone has insight on how to do this in a simpler fashion.

Our company uses Fivetran for data transport, Snowflake for warehousing, and Tableau for BI. Unfortunately, we weren't able to carry table or column descriptions from our on-prem system into Snowflake, so all of our columns have system names (FFDITM rather than Item_Number, for instance).

This obviously was not going to work, so I created a new "Transformed" DB and manually recreated every column with an updated name. I am now in the process of creating streams to update the transformed DB on a daily basis (roughly the sketch below), but I am running into many problems.
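What I'm aiming for looks something like this. It's only a sketch: PROD_DB, ERP, FFD, FFDESC, ITEMS, and LOAD_WH are placeholder names, and it only handles inserts (updates and deletes would need a MERGE keyed on the stream's METADATA$ columns).

-- Track changes on the raw source table that Fivetran loads.
CREATE OR REPLACE STREAM PROD_DB.ERP.FFD_STREAM ON TABLE PROD_DB.ERP.FFD;

-- Daily task that copies new rows into the renamed table,
-- running only when the stream actually has data.
CREATE OR REPLACE TASK TRANSFORMED_DB.ERP.LOAD_ITEMS
  WAREHOUSE = LOAD_WH
  SCHEDULE  = 'USING CRON 0 2 * * * UTC'
  WHEN SYSTEM$STREAM_HAS_DATA('PROD_DB.ERP.FFD_STREAM')
AS
  INSERT INTO TRANSFORMED_DB.ERP.ITEMS (Item_Number, Item_Description)
  SELECT FFDITM, FFDESC
  FROM PROD_DB.ERP.FFD_STREAM
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended, so enable it.
ALTER TASK TRANSFORMED_DB.ERP.LOAD_ITEMS RESUME;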

I've done a bit of research, and I see that Snowflake does have a cloning option for cloning our schemas and tables, or even our full DB.

If this is the case, can I clone our production DB once, then update the schema, table, and column names, and then finally set up streams to keep it updated, so that it's not so cumbersome?
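For illustration, the clone-then-rename part would look something like this (PROD_DB, FFDLIB, FFD, and FFDITM are placeholder names for our objects):

-- Zero-copy clone of the whole production database.
CREATE DATABASE TRANSFORMED_DB CLONE PROD_DB;

-- Rename objects and columns inside the clone; the source is untouched.
ALTER SCHEMA TRANSFORMED_DB.FFDLIB RENAME TO ERP;
ALTER TABLE TRANSFORMED_DB.ERP.FFD RENAME TO ITEMS;
ALTER TABLE TRANSFORMED_DB.ERP.ITEMS RENAME COLUMN FFDITM TO Item_Number;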

I attempted to do all of this manually, but it seems there are automated workarounds which also won't eat at our storage.

AKBirite

1 Answer


You can make a clone using

CREATE DATABASE new_clone CLONE current_production;

The clone will contain all tables and data. Programmable objects like stored procedures and streams are copied across too. Code that specifically references the database name may have to be adjusted.
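For example (SALES, ORDERS, and V_ORDERS are made-up names): a view whose definition fully qualifies the source database will still read from the original after cloning, so you would recreate it against the clone:

-- The cloned view's text still says CURRENT_PRODUCTION.SALES.ORDERS,
-- so point it at the clone instead.
CREATE OR REPLACE VIEW NEW_CLONE.SALES.V_ORDERS AS
SELECT * FROM NEW_CLONE.SALES.ORDERS;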

From the docs

A clone is writable and is independent of its source. Changes made to the source or clone aren’t reflected in the other object.

You can make changes to new_clone without affecting current_production in any way. Of course the opposite is also true - new data in production will not automatically appear in the clone.

Michael Green