
To enhance your DataSync integration for Snowflake, you can configure the Snowflake Bulk Load Meshlet with the additional features described below.


Anchor
top
top

Panel
titleWhat's on this page?

Table of Contents
maxLevel2
absoluteUrltrue





Compare tables from ServiceNow to Snowflake 

Table Compare allows you to compare tables from one ServiceNow instance to Snowflake. This is useful because you can see each table's record count, as well as a list of record discrepancies by sys_id between the two tables you're comparing. In other words, you can see which records exist in one table but not the other. To get started with comparing your ServiceNow tables to Snowflake, see Table Compare: ServiceNow to database table compare.

(info) NOTE: This feature requires the Perspectium Core update set version Helium or newer. 
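The comparison Table Compare performs can be pictured as a set difference over sys_id values. The following is an illustrative sketch only (the sys_id values are hypothetical), not the actual Table Compare implementation:

```python
# Illustrative sketch: find record discrepancies between a ServiceNow
# table and its Snowflake counterpart by comparing sys_id sets.
# The sys_id values below are hypothetical.

servicenow_ids = {"a1b2c3", "d4e5f6", "g7h8i9"}   # sys_ids in ServiceNow
snowflake_ids = {"a1b2c3", "g7h8i9", "j0k1l2"}    # sys_ids in Snowflake

# Each table's record count
print(len(servicenow_ids), len(snowflake_ids))    # 3 3

# Records that exist in one table but not the other
missing_in_snowflake = servicenow_ids - snowflake_ids
missing_in_servicenow = snowflake_ids - servicenow_ids

print(sorted(missing_in_snowflake))     # ['d4e5f6']
print(sorted(missing_in_servicenow))    # ['j0k1l2']
```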


↑ Go to top of page




Load Schema Files 

This feature allows you to load your ServiceNow table schemas to the meshlet when a connection cannot be made to the originating ServiceNow instance. Schema files are used to create and update tables in Snowflake. Enabling this feature forces the meshlet to read schemas exclusively from files in local file storage and disables all calls to ServiceNow. Any newly shared tables will require their schemas to be added to the local file storage.

UI Steps
sizesmall


UI Step

Export the table schemas in your ServiceNow instance. See Download Table Schemas.


UI Step

Unzip the folder that was created from the previous step. Then, put the folder in the static/config directory. 

The final location for each table's schema should look like static/config/schemas/incident.xml.


UI Step

In the application-dev.yml, set the following configurations:

Code Block
languageyml
perspectium:  
	auth:    
		useLocalSchema: true

(info) NOTE: By default, useLocalSchema is set to false.
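With useLocalSchema enabled, the meshlet resolves each table's schema from the local directory layout described above instead of calling ServiceNow. That lookup can be pictured as a simple path mapping (an illustrative sketch; resolve_schema_path is a hypothetical helper, not a meshlet API):

```python
from pathlib import Path

# Illustrative sketch: map a shared table name to its local schema file,
# mirroring the static/config/schemas/<table>.xml layout described above.
# resolve_schema_path is a hypothetical helper, not part of the meshlet.

def resolve_schema_path(table_name: str, base_dir: str = "static/config/schemas") -> str:
    return Path(base_dir, f"{table_name}.xml").as_posix()

print(resolve_schema_path("incident"))   # static/config/schemas/incident.xml
```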



↑ Go to top of page




Azure External Storage

To start using Microsoft Azure as an external stage for bulk loading, follow these steps:

UI Steps
sizesmall


UI Step

Configure bulk loading from Microsoft Azure by following the steps in Configuring an Azure container for loading data.


UI Step

In the application-dev.yml, set the following configurations:

Directive | Description
url

Azure storage URL. 

Code Block
perspectium:  
	azure:    
		url: azure://pspsnowflaketest.blob.core.windows.net/snowflakedev

Where pspsnowflaketest is the storage account and snowflakedev is the container.

sasString

Shared Access Signatures (SAS) connection string for Azure Storage. See SAS connection string.

To access the connection string, go to Storage Account > Account Name > Shared Access Signature. Then enter the required fields and generate the SAS and connection string. 

Allowed Resource Types:

    • Container
    • Object

Allowed Permissions: 

    • Read
    • Write
    • Delete
    • List
    • Add
    • Create

Code Block
perspectium:  
	azure:    
  		sasString: ?sv=2020-08-04&ss…..ejl%2BTE%3D


connectionUrl

Connection URL for your Azure storage account. To access the URL, go to Azure Portal > Storage Account > Access Keys > Show Keys > Connection String.

Code Block
perspectium:  
	azure:    
    	connectionUrl: DefaultEndpointsProtocol=.....EndpointSuffix=core.windows.net


destinationContainer

Container you want to share your data to.

Code Block
perspectium:  
	azure:    
		destinationContainer: snowflakedev


deleteFiles

Whether files are deleted from Azure after they are processed. Set this to false to retain files in Azure.

Code Block
perspectium:  
	fileSubscriber:    
 		deleteFiles: true
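The url directive above encodes both the storage account and the container in a single azure:// URL. As an illustration of that format only (parse_azure_url is a hypothetical helper, not part of the meshlet), the two parts can be split out like this:

```python
from urllib.parse import urlparse

# Illustrative sketch: split an azure:// stage URL into its storage
# account and container, matching the format shown above.
# parse_azure_url is a hypothetical helper, not part of the meshlet.

def parse_azure_url(url: str) -> tuple:
    parsed = urlparse(url)
    account = parsed.netloc.split(".")[0]   # pspsnowflaketest.blob.core.windows.net -> pspsnowflaketest
    container = parsed.path.lstrip("/")     # /snowflakedev -> snowflakedev
    return account, container

print(parse_azure_url("azure://pspsnowflaketest.blob.core.windows.net/snowflakedev"))
# ('pspsnowflaketest', 'snowflakedev')
```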





↑ Go to top of page




Soft Deletes

This feature allows you to add one or more columns, a boolean and/or a date time field, that get updated when a delete record is processed. Instead of deleting the record in Snowflake, the meshlet updates those columns accordingly. At least one of the following table columns is required for soft deletes; if neither column has been added to the configurations, no delete will occur. This prevents records from being deleted when soft deletes are intended.

UI Steps
sizesmall


UI Step

In the application-dev.yml, set the following configurations:

Code Block
perspectium:    
	snowflake:       
		softDelete: true



UI Step

Add a boolean field to mark that a record has been deleted. 

In static/config/databaseConfig.json, add the following:

Code Block
"delete": {
 	"columnName":"is_delete"
 }



UI Step

Add an IODateTime field to record the execution time for deleted records.

In static/config/databaseConfig.json, add the following:

Code Block
"IODateTime": {
	"insert":["psp_per_insert_dt"],
	"update":["psp_per_update_dt"],
	"delete" :["psp_per_delete_dt"]
}
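With the configuration above, processing a delete does not remove the row; it flips the boolean column and stamps the delete-time column. A minimal sketch of that behavior, assuming the is_delete and psp_per_delete_dt columns configured above (the record contents and soft_delete helper are hypothetical):

```python
from datetime import datetime, timezone

# Illustrative sketch of a soft delete: instead of removing the record,
# mark the configured boolean column and stamp the delete-time column.
# The record contents and soft_delete helper are hypothetical.

record = {"sys_id": "a1b2c3", "short_description": "Example incident"}

def soft_delete(rec: dict) -> dict:
    rec["is_delete"] = True                                   # boolean delete marker
    rec["psp_per_delete_dt"] = datetime.now(timezone.utc)     # delete execution time
    return rec

soft_delete(record)
print(record["is_delete"])      # True
print("sys_id" in record)       # True -- the row itself is kept
```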




↑ Go to top of page