

To enhance your DataSync integration for Snowflake, you can configure the Snowflake Bulk Load Meshlet with the additional features described below.



Compare tables from ServiceNow to Snowflake 

Table Compare allows you to compare tables from one ServiceNow instance to Snowflake. This is useful because you can see each table's record count, as well as a list of record discrepancies by sys_id between the two tables you're comparing—in other words, you can see which records exist in one table but not the other. To get started with comparing your ServiceNow tables to Snowflake, see Table Compare: ServiceNow to database table compare.  

(info) NOTE: This feature requires the Perspectium Core update set version Helium or newer. 





Load Schema Files 

This feature allows you to load your ServiceNow table schemas into the meshlet when a connection cannot be made to the originating ServiceNow instance. Schema files are used to create and update tables in Snowflake. Enabling this feature forces the meshlet to read schemas exclusively from files in local file storage and disables all calls to ServiceNow. Any newly shared tables will require their table schemas to be added to the local file storage.

Export the table schemas in your ServiceNow instance. See Download Table Schemas.

Unzip the folder that was created from the previous step. Then, put the folder in the static/config directory. 

The final location for each table should look like static/config/schemas/incident.xml.

In the application-dev.yml, set the following configurations:

perspectium:
  auth:
    useLocalSchema: true

(info) NOTE: By default, useLocalSchema is set to false.
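The schema-staging steps above can be sketched in shell. This is a minimal sketch: a stand-in schema file is created in place of the real unzipped export, since the actual archive name and contents come from your ServiceNow instance.

```shell
# Stand-in for the unzipped export folder from Download Table Schemas;
# in practice this folder comes from unzipping the downloaded archive.
mkdir -p schemas
printf '<incident/>' > schemas/incident.xml

# Move the folder into the meshlet's static/config directory.
mkdir -p static/config
mv schemas static/config/

# Each table schema should now resolve to static/config/schemas/<table>.xml
ls static/config/schemas/incident.xml
```

After the files are in place, set useLocalSchema: true so the meshlet reads these files instead of calling ServiceNow.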





Azure External Storage

To start using Microsoft Azure as an external stage for bulk loading, follow these steps:

Configure bulk loading from Microsoft Azure by following Snowflake's instructions for configuring an Azure container for loading data.

In the application-dev.yml, set the following configurations:

url

Azure storage URL. 

perspectium:
  azure:
    url: azure://pspsnowflaketest.blob.core.windows.net/snowflakedev

Where pspsnowflaketest is the storage account and snowflakedev is the container.

sasString

Shared Access Signatures (SAS) connection string for Azure Storage. See SAS connection string.

To access the connection string, go to Storage Account > Account Name > Shared Access Signature. Then enter the required fields and click Generate SAS and connection string. 

Allowed Resource Types:

    • Container
    • Object

Allowed Permissions: 

    • Read
    • Write
    • Delete
    • List
    • Add
    • Create

perspectium:
  azure:
    sasString: ?sv=2020-08-04&ss…..ejl%2BTE%3D

connectionUrl

Connection URL for your Azure storage account. To access the URL, go to Azure Portal > Storage Account > Access Keys > Show Keys > Connection String.

perspectium:
  azure:
    connectionUrl: DefaultEndpointsProtocol=.....EndpointSuffix=core.windows.net

destinationContainer

The Azure container your data will be shared to.

perspectium:
  azure:
    destinationContainer: snowflakedev

deleteFiles

Whether files are deleted from Azure after they are processed. Set to false to keep files in Azure if necessary.

perspectium:
  fileSubscriber:
    deleteFiles: true
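Taken together, the directives above form a single configuration sketch for application-dev.yml. The values shown are the illustrative ones from this page (with the SAS string and connection URL left truncated); substitute your own storage account, container, and credentials:

```yaml
perspectium:
  azure:
    url: azure://pspsnowflaketest.blob.core.windows.net/snowflakedev
    sasString: ?sv=2020-08-04&ss…..ejl%2BTE%3D
    connectionUrl: DefaultEndpointsProtocol=.....EndpointSuffix=core.windows.net
    destinationContainer: snowflakedev
  fileSubscriber:
    deleteFiles: true
```

Note that YAML indentation must use spaces, not tabs.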