Import from Snowflake
Create a Snowflake User for DataSyncs
Run the following SQL in Snowflake:
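The SQL block itself was not captured in this copy of the page; a minimal sketch of the user-creation statement, assuming the `hockeystack_datasyncs_user` name and `DATASYNCS_ROLE` role used later in this guide (replace the placeholder password with your own secret):

```sql
-- Create a dedicated user for DataSyncs
-- '<strong-password>' is a placeholder: substitute a strong secret of your own
CREATE USER IF NOT EXISTS hockeystack_datasyncs_user
  PASSWORD = '<strong-password>'
  DEFAULT_ROLE = DATASYNCS_ROLE
  MUST_CHANGE_PASSWORD = FALSE;
```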
Assign a Role and Permissions
Create a role for DataSyncs:
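A sketch of the role creation, using the `DATASYNCS_ROLE` name referenced in the connection details below:

```sql
-- Create a dedicated role that DataSyncs will assume
CREATE ROLE IF NOT EXISTS DATASYNCS_ROLE;
```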
Assign the role to the user:
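For example:

```sql
-- Grant the DataSyncs role to the DataSyncs user
GRANT ROLE DATASYNCS_ROLE TO USER hockeystack_datasyncs_user;
```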
Enable Query Execution
Grant the ability to create and run queries:
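A sketch of the grants involved, assuming the placeholder warehouse, database, schema, and table names from the connection details below (substitute your own object names):

```sql
-- Allow the role to run queries on your warehouse
GRANT USAGE ON WAREHOUSE your_warehouse TO ROLE DATASYNCS_ROLE;

-- Allow the role to see and read the relevant objects
GRANT USAGE ON DATABASE your_database TO ROLE DATASYNCS_ROLE;
GRANT USAGE ON SCHEMA your_database.your_schema TO ROLE DATASYNCS_ROLE;
GRANT SELECT ON TABLE your_database.your_schema.your_table TO ROLE DATASYNCS_ROLE;
```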
Gather Connection Information for DataSyncs
Account Identifier (e.g., account-identifier; does not include the .snowflakecomputing.com part)
User: hockeystack_datasyncs_user
Password / Private Key & Private Key Passphrase (if applicable). We can authenticate using either a User & Password or a User & Private Key, depending on your preference. Private Keys are recommended for better security.
Role:
DATASYNCS_ROLE
Warehouse:
your_warehouse
Database:
your_database
Schema:
your_schema
Table: the specific table that you are looking to import data from
Configure an account limit with Resource Monitors (optional)
Larger imports (>100k rows) can increase compute costs. Setting a quota helps prevent unexpected charges:
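One possible setup, assuming an illustrative monitor name (`datasyncs_monitor`) and credit quota; adjust the quota to your budget. Note that creating resource monitors typically requires the ACCOUNTADMIN role:

```sql
-- Cap the credits the warehouse can consume each month
CREATE RESOURCE MONITOR IF NOT EXISTS datasyncs_monitor
  WITH CREDIT_QUOTA = 10          -- example quota; adjust to your budget
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to the warehouse DataSyncs uses
ALTER WAREHOUSE your_warehouse SET RESOURCE_MONITOR = datasyncs_monitor;
```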
Column Schema Requirements
Please confirm that your column schema matches what is described here for your given import:
How to help us optimize data retrieval
Because Snowflake is a table-based warehouse, we require a column called added_at, which will be used to index your data. If necessary, this column can be generated as a copy of any other date column in your table.
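A sketch of copying an existing date column into added_at, assuming a hypothetical source column named `created_date` and the placeholder table name from above:

```sql
-- Add added_at and populate it from an existing date column
-- (created_date is illustrative: use whichever date column fits your data)
ALTER TABLE your_database.your_schema.your_table
  ADD COLUMN added_at TIMESTAMP_NTZ;

UPDATE your_database.your_schema.your_table
SET added_at = created_date;
```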
Generating a Unique ID column (Properties imports)
When importing custom properties (such as Outreach Calls, user-based app data, etc.), a unique_id column is needed to help us understand the data grain (the set of columns that define the uniqueness of a row).
For example, if column1, column3, and column5 combined define the uniqueness of a row, you can create a unique_id by doing this:
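The original example was not captured here; one way to build a deterministic unique_id from those grain columns, assuming the placeholder table name from above:

```sql
-- Add unique_id and derive it from the columns that define the row's grain
ALTER TABLE your_database.your_schema.your_table
  ADD COLUMN unique_id VARCHAR;

UPDATE your_database.your_schema.your_table
SET unique_id = CONCAT_WS('-', column1, column3, column5);
```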
Or, if you prefer a randomly generated ID, you can do:
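For a random ID, Snowflake's UUID_STRING() function can populate the column instead:

```sql
-- Assign a random UUID to each row that does not yet have one
UPDATE your_database.your_schema.your_table
SET unique_id = UUID_STRING()
WHERE unique_id IS NULL;
```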