Who can use this feature?
- Available with Data Direct.
- Requires an admin or architect role to configure.
Introduction
Google BigQuery is a cloud-based data storage and analytics service that can be used as a data warehouse. Our Google BigQuery integration allows you to send your raw event data directly to Google BigQuery.
Enabling the integration (Google)
Fullstory provides a Terraform module to help simplify the setup of the necessary permissions. See the module docs for more:
https://registry.terraform.io/modules/fullstorydev/fullstory-bigquery-setup/google/latest
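If you manage your Google Cloud setup with Terraform, the module can be pulled in with a standard module block. The sketch below is illustrative only: the input variable names shown are hypothetical, so check the registry page above for the module's actual inputs and current version.

module "fullstory_bigquery_setup" {
  source = "fullstorydev/fullstory-bigquery-setup/google"

  # Hypothetical inputs for illustration; see the module docs for the real variable names.
  project_id = "random-project-12345"
  dataset_id = "fs_data_destination"
}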
Before activating the integration in Fullstory, you must ensure you are fully set up in Google BigQuery. Please review Google's BigQuery setup documentation before getting started in Fullstory.
Important Note on Data Security: Fullstory should only be granted access to read/write the data that we will be managing as part of this sync. To ensure all sensitive data remains secure, Fullstory strongly recommends creating a unique user, role and dataset within your data warehouse, specifically for Fullstory to access. This user and role should not be permitted to access any other customer data in any way.
Create service account
To integrate Data Warehouse Destinations with Google BigQuery, create a service account in Google Cloud.
Then, create a private JSON key for the service account. Make a note of it; you will need it later to configure the integration in Fullstory. The private JSON key should be in the following format:
{ "type": "service_account", "project_id": "random-project-12345", "private_key_id": "abcdefg", "private_key": "*****", "client_email": "name@project.iam.gserviceaccount.com", "client_id": "12345678", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "token_uri": "https://oauth2.googleapis.com/token", "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/name%40project.iam.gserviceaccount.com" }
Configure service account
Next, you must configure the service account you created previously. The minimum required permissions are bigquery.jobUser at the project level and bigquery.dataEditor at the dataset level.
Create a new dataset or use an existing dataset
If you want to use a new dataset, go to your project and click "Create dataset".
In the popup on the right, type in the ID for the new dataset. We will use fs_data_destination as an example.
If you want to use an existing dataset, go to the dataset and follow the next steps.
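As a Terraform alternative to the console steps above, creating the example dataset is a single resource. The location shown is an assumption; use the region where you want the data stored.

resource "google_bigquery_dataset" "fullstory" {
  dataset_id = "fs_data_destination"
  location   = "US" # assumption: choose your preferred storage region
}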
Share the dataset with the service account
Go to the new dataset or the existing dataset from the last step, and click "Share".
In the popup on the right, click "ADD PRINCIPAL".
Select the service account and the "BigQuery Data Editor" role for the dataset.
Now the service account has access to all the data in the dataset.
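Continuing the illustrative Terraform sketch, the same dataset-level grant is one google_bigquery_dataset_iam_member resource scoped to roles/bigquery.dataEditor:

# Grants the service account "BigQuery Data Editor" on this dataset only.
resource "google_bigquery_dataset_iam_member" "fullstory_editor" {
  dataset_id = google_bigquery_dataset.fullstory.dataset_id
  role       = "roles/bigquery.dataEditor"
  member     = "serviceAccount:${google_service_account.fullstory.email}"
}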
Add query permission to the service account
Go to the project's IAM Principals list and select the service account. Edit its permissions to add the "BigQuery Job User" role.
Now the service account can run query jobs in the project, which allows it to query the data in the dataset you shared.
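In the same hypothetical Terraform setup, the project-level grant looks like the sketch below. Note that bigquery.jobUser is scoped to the whole project, which is one reason Fullstory recommends a dedicated service account.

# Allows the service account to run jobs (including queries) in the project.
resource "google_project_iam_member" "fullstory_job_user" {
  project = "random-project-12345" # your project ID, matching the key file
  role    = "roles/bigquery.jobUser"
  member  = "serviceAccount:${google_service_account.fullstory.email}"
}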
Enabling the integration (Fullstory)
To start syncing, follow the steps below:
- Navigate to Settings > Integrations > Destinations
- Find the Google BigQuery integration and click Install.
- In the menu that appears, add the BigQuery Service Account Key you created above.
Note: Please ensure all credentials are correctly entered. If any are incorrectly entered, the integration sync will fail.
- When you are ready, click Save at the bottom.
- After saving, data will start flowing into your warehouse after 1 hour.
FAQ
Can you set up more than one Data Destination in your account?
Yes. Repeat the setup steps for each additional destination as needed.