Who can use this feature?
- Available with Data Direct.
- Requires an admin or architect role to configure.
Snowflake is a cloud-based data storage and analytics service that can be used as a data warehouse. Fullstory's Snowflake integration allows you to send your structured event data directly to Snowflake.
Enabling the integration (Snowflake)
Terraform Infrastructure as Code (IaC)
Fullstory provides a Terraform module to help simplify the setup of the necessary permissions. See the module docs for more information.
Manual Configuration
Note: To experience all of the benefits of this integration, a paid version of Snowflake is required.
Key-Pair Authentication
Fullstory recommends using key-pair authentication to connect Fullstory to your Snowflake account. If you are an existing Destinations customer and have set up the integration with user-based authentication, you can visit your Settings to update the authentication to key-pair without any gap in service.
To generate a key pair for authenticating to Snowflake, you can follow their documentation here.
Note: When saving the key in Fullstory, you must save the unencrypted version. Fullstory will encrypt the key at rest using Google's key management service.
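If you have the openssl CLI available, the key pair can be generated along the lines of Snowflake's key-pair documentation. The file names below (rsa_key.p8, rsa_key.pub) are illustrative, not required:

```shell
# Generate an unencrypted 2048-bit RSA private key in PKCS#8 format
# (Fullstory needs the unencrypted version, so -nocrypt is used here)
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt

# Derive the matching public key to register with the Snowflake user
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```

The private key (rsa_key.p8) is what you save in Fullstory; the public key (rsa_key.pub) is what the setup script below assigns to the Snowflake user. Note that Snowflake expects the public key value without the `-----BEGIN/END PUBLIC KEY-----` delimiter lines.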
User/Password Authentication
We recommend using the names provided in the code block below and, for simplicity, keeping them as unquoted identifiers.
Here is an example of how to create a secure, random password using the openssl CLI tool. This generates 32 pseudorandom bytes and base64-encodes them.
>> openssl rand -base64 32
xiuXQGOjgpByMyCmOvzkMUGo8sV7GrwF63QepqT9HdU=
Replace the placeholder user_password string in the script below with the output of this command.
Important Note on Data Security: Fullstory should only be granted access to read/write the data that we will be managing as part of this sync. To ensure all sensitive data remains secure, Fullstory strongly recommends creating a unique user, role and database within your data warehouse, specifically for Fullstory to access. This user and role should not be given permission to access any other data in any way.
Setup
Once you've set the parameters, run the script below in your Snowflake Worksheet to create all of the required objects and grant the necessary privileges to the role/user that Fullstory will use to perform the data sync.
set warehouse_name = 'compute_wh';
set database_name = 'fullstory';
set role_name = 'fullstory_loader';
set user_name = 'fullstory_user';
set user_password = 'SECURE_PASSWORD'; // leave blank if using public key
set user_public_key = 'PUBLIC KEY';
set storage_integration = 'fullstory_gcs';
use role useradmin;
create role if not exists identifier($role_name);
create user if not exists identifier($user_name)
password = $user_password
rsa_public_key = $user_public_key
must_change_password = false;
grant role identifier($role_name) to user identifier($user_name);
use role sysadmin;
create database if not exists identifier($database_name);
grant all on database identifier($database_name) to role identifier($role_name);
grant usage on warehouse identifier($warehouse_name) to role identifier($role_name);
use role accountadmin;
create storage integration identifier($storage_integration)
type = external_stage
storage_provider = 'GCS'
enabled = true
storage_allowed_locations = ('gcs://fullstoryapp-warehouse-sync-bundles/');
grant usage on integration identifier($storage_integration) to role identifier($role_name);
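As an optional sanity check before saving credentials in Fullstory, you can confirm the grants and user configuration. The names below assume the default values from the variables at the top of the script:

```sql
-- Verify the role received the expected privileges
show grants to role fullstory_loader;

-- Verify the user exists and has the role and public key attached
describe user fullstory_user;
show grants to user fullstory_user;
```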
Note: Fullstory takes advantage of Snowflake's storage integration feature to optimize how we load data into your warehouse. This does not require your Snowflake instance to be hosted in Google Cloud, nor does it require that you have a Google Cloud account. Click here to read more about storage integration. The values set for `storage_provider` and `storage_allowed_locations` in the script above should not be edited, with one exception: if your region is eu1, replace the `storage_allowed_locations` value in the script with `gcs://fullstoryapp-eu1-warehouse-sync-bundles/`.
Setting a network policy (optional)
The CIDR blocks for the IP addresses that Fullstory uses to connect to your Snowflake instance are:
- NA: 8.35.195.0/29
- EU: 34.89.210.80/29
To create the network policy and apply it to the Fullstory Snowflake user, run the following (using the CIDR block for your region):
CREATE NETWORK POLICY FULLSTORY_POLICY ALLOWED_IP_LIST = ('8.35.195.0/29');
ALTER USER FULLSTORY_USER SET NETWORK_POLICY = FULLSTORY_POLICY;
Fullstory does not currently support two-factor authentication.
Enabling the integration (Fullstory)
To start syncing, follow the steps below:
- Navigate to Settings > Integrations > Destinations.
- Find the Snowflake integration and click Install.
- In the menu that appears (shown in the screenshot below), enter your Snowflake Account ID, Warehouse, the Database you created, the Username and Password for the new Fullstory user, and the Storage Integration name. The Account ID should be the appropriate one for your region and provider; see the FAQ below if you are unsure what this value is, or are having issues saving the credentials.
Note: Please ensure all credentials are entered correctly. If any are incorrect, the integration sync will fail.
- When you are ready, click Save at the bottom.
- After saving, you will see data start flowing into your warehouse within an hour.
FAQ
How can I find the Snowflake Account value to configure with Fullstory?
Fullstory uses the gosnowflake driver to connect to your Snowflake account. This driver requires the account locator to connect. You can find this value by running the following command and cross-referencing the values with this table.
SELECT CURRENT_ACCOUNT(), CURRENT_REGION();
Can you set up more than one destination in your account?
Yes. Repeat setup steps for different destinations as needed.
Is a GCS Storage Integration mandatory?
Yes. This integration enables Snowflake to read staged files from Fullstory's cloud using a service account. We use the associated service account to set the Access Control List (ACL) for the files in GCS. You can choose the integration name by updating the storage_integration variable in the setup script shown above.