
The procedures described in this documentation are best practices for setting up Perspectium DataSync for the typical use case of sharing data from your ServiceNow instance to target databases and files.

For other ways to get started with DataSync, contact Perspectium Support.


Prerequisites


(warning) First, request the download link for the Perspectium for ServiceNow application.

(warning) Make sure that the device you are installing the DataSync Agent on meets these basic requirements.

(warning) You will also need to request the Agent Installer from Perspectium Support.

(warning) You will need the following information from Perspectium Support:

  • Perspectium Integration Mesh (MBS) server URL
  • Integration Mesh (MBS) username
  • Integration Mesh (MBS) password




ServiceNow Install

To install the Perspectium Application for ServiceNow you need admin access to the ServiceNow instance, and you should have received the update set from Perspectium Support, which will be imported, validated, and committed during installation. Follow these steps to install in ServiceNow:

  1. Go to System Update Sets >> Retrieved Update Sets and import the update set XML file.
  2. Open the update set and click Preview. Errors or warnings are not expected, but if any appear they need to be reviewed on a case-by-case basis.
  3. Click the Commit Update Set button to commit the update set.
  4. In the ServiceNow Navigator go to the new Perspectium module and open the Finish Install page.
  5. Click on the “Confirm” button and wait for a successful confirmation message.

Set up MBS Credentials and Encryption Keys

Once installation is finished, you need to set up some initial configurations:

  1. Go to Perspectium >> Properties >> General Settings.
  2. Enter the Perspectium Message Bus credentials provided to you by Perspectium:
    1. Perspectium Integration Mesh username.
    2. Perspectium Integration Mesh password.
    3. Perspectium Integration Mesh server URL - this is usually in the form https://customername.perspectium.net.
  3. Leave the rest of the settings at their default values, scroll down to the bottom of the page, and click Save.
  4. Go back to Perspectium >> Properties and navigate to the DataSync settings.
  5. Populate the Encryption key field with an encryption string of your choice. We advise using an encryption key of at least 32 characters to be able to leverage AES256 payload encryption. This encryption key does not need to be shared with Perspectium. It is used to encrypt the shared data in transit while flowing through the Perspectium message bus so it is properly secured.
  6. In the Decryption Key field, specify a decryption string the ServiceNow instance can use to decrypt any inbound messages. In many cases you will use the same key as your encryption key so data is encrypted and decrypted the same way in both directions.
  7. Scroll down to the bottom of the page and save.

Create Queues

In order to share data to the Perspectium Integration Mesh (Message Broker Service, aka MBS) you need to configure at least one shared queue, since MBS is queue based. This queue is used by the ServiceNow instance to share records, while the DataSync Agent subscribes to the same queue to receive records, decrypt the shared content, and issue the SQL queries needed to populate those records into the target database.

  1. Go to Perspectium >> Shared Queues.
  2. Click the New button to create a new record.
  3. Set the status to Active.
  4. In the Queue type select DataSync from the dropdown.
  5. Enter a name for the queue without spaces, e.g. acmedefault.
  6. The queue user, password, and encryption key should auto-populate based on the details entered in the Perspectium Properties.
  7. Save the queue and note the full queue name in the record header, including the automatically added prefix. It should look like psp.out.replicator.acmedefault. You will need this queue name later when configuring the DataSync Agent to subscribe to it.
  8. Click the Get Queue Status related link to query the Perspectium MBS for the validity of the queue. If the credentials are valid, you should get a message indicating either that the queue exists and the credentials are valid, or that the queue does not exist yet because no data has been shared to it, but the credentials are valid and the queue will be auto-created once you start sharing records.
  9. Go to Perspectium >> Subscribed Queues. You should notice one queue already existing there with a name of psp.out.servicenow.instancename, where instancename is the name of your ServiceNow instance. Subscribed queues are used by ServiceNow to receive messages from MBS. For example, messages are sent to the instance when the DataSync Agent needs to send a response back to ServiceNow. Usually the default subscribed queue is sufficient and you don't need to add any more subscribed queues.
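For reference, the full queue name noted above is what you will later configure the DataSync Agent to subscribe to. A minimal sketch of the relevant agent.xml directive is below; the queue name acmedefault, the credentials, and the URL are all placeholders:

```xml
<!-- Hypothetical example: queue, user, password, and URL are placeholders -->
<message_connection password="your_mbs_password" user="your_mbs_user"
    queue="psp.out.replicator.acmedefault">https://customername.perspectium.net</message_connection>
```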




DataSync Agent Install

Install the Agent

The Agent offers a variety of options for the target endpoints it will save data into, with databases being the most common use case. Below are the most common ways to install the Agent when saving to different target endpoints.

Install Agent to Share to a Database

The Perspectium DataSync Agent can be installed on either Windows or Linux machines. Before installing, consult this article for minimum requirements.

The virtual machine where you will be installing the agent needs to have connectivity to:

  1. The Perspectium message bus endpoint, e.g. customername.perspectium.net, via HTTPS on port 443.

  2. The target database via the port this database listens on.

  3. The source ServiceNow instance. This connection is only needed for the Agent to be able to fetch specific ServiceNow table schemas when it needs to create or alter table schema in the target database to reflect its ServiceNow structure.

Make sure the necessary network connectivity exists and the needed firewall rules are set properly before you start the installation.

Once you start the installation package perform the following steps:

  1. Accept the Perspectium License Agreement.

  2. Specify the directory where the Agent should be installed.

  3. Specify the installed Java version which you want to use.

  4. In the DataSync type specify the type of database for which you are installing the Agent.

  5. Next, specify the Perspectium Integration Mesh details as provided to you by Perspectium:

    1. Server - this is your Perspectium Integration Mesh endpoint in the form of customername.perspectium.net.

    2. User - your Integration Mesh user.

    3. Password - your Message Bus password.

    4. Queue name - this is the shared queue you have created in ServiceNow, usually psp.out.replicator.mysharedqueuename.

    5. Encryption/Decryption key - this should match the encryption key specified in ServiceNow Under Perspectium >> Properties >> DataSync Properties >> Encryption Key.

    6. Max Reads - the maximum number of reads per connection to the Message Bus. You can leave the default number here.

    7. Polling interval - the interval in seconds between two agent polls to fetch Message Bus records - you can leave the default number here.

  6. Next, specify the information about the ServiceNow instance:

    1. Instance - the name of the instance in the form of customer.service-now.com

    2. User - this must be a ServiceNow system account with at least the “perspectium” role. This user is needed by the Agent to be able to fetch dynamically the table schema for any table being shared to the database so that the Agent can create a similar table schema in the target database.

    3. Password - this user's password.

  7. The next step is to enter the database details:

    1. Server - this is the address of the database.

    2. Port - the port the Agent uses to connect to the database.

    3. User - the username of the system account which is used to connect to the database.

    4. Password - this user's password.

    5. Database - the name of the database which is created, or should be created by the Agent, to store the replicated ServiceNow data. Note that if the Agent needs to create this database, it needs the specific CREATE DATABASE or similar permission on the DB server.

  8. On the next step review the specified settings and continue until the installation completes.
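For the database use case, the installer generates agent.xml from the answers above. A simplified sketch of what the resulting file may look like for a MySQL target is shown below; all names, passwords, hostnames, and the queue name are placeholders, and your generated file may differ:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<config>
    <agent>
        <share/>
        <subscribe>
            <task>
                <task_name>datasync_agent_subscribe</task_name>
                <!-- Shared queue created in ServiceNow, plus Integration Mesh credentials -->
                <message_connection password="your_mbs_password" user="your_mbs_user" queue="psp.out.replicator.acmedefault">https://customername.perspectium.net</message_connection>
                <!-- ServiceNow account with the perspectium role, used to fetch table schemas -->
                <instance_connection password="your_instance_password" user="your_instance_user">https://customername.service-now.com</instance_connection>
                <handler>com.perspectium.replicator.sql.SQLSubscriber</handler>
                <decryption_key>Your 32+ character encryption key</decryption_key>
                <!-- Target database connection details -->
                <database_type>mysql</database_type>
                <database_server>localhost</database_server>
                <database_port>3306</database_port>
                <database_user>db_user</database_user>
                <database_password>db_password</database_password>
                <database>psp_repl</database>
                <max_reads_per_connect>4000</max_reads_per_connect>
                <polling_interval>5</polling_interval>
            </task>
        </subscribe>
    </agent>
</config>
```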

Install Agent to Share to Files

You can install the DataSync Agent to share to files in a directory instead of a database. Currently the Agent supports sharing to JSON, XML, or CSV files.

For this type of setup you need to select “File DataSync (advanced)” as the DataSync Type when installing the Agent and then make some manual configurations in agent.xml. Specifically, you need to specify:

  1. A handler to specify the output file type as one of:

    1. <handler>com.perspectium.replicator.file.XMLFileSubscriber</handler>

    2. <handler>com.perspectium.replicator.file.JSONFileSubscriber</handler>

    3. <handler>com.perspectium.replicator.file.CSVFileSubscriber</handler>

  2. Files directory in the form of <files_directory>Users\Downloads\subscribefiles\</files_directory>.

  3. Usually you will want to create different files for the various tables you share out of ServiceNow by having the table name as part of the output file name, and add a timestamp to each file so you can track when it was generated. To achieve this, in agent.xml you can specify <file_prefix>$table_$d{yyyyMMdd}_$i</file_prefix>.

  4. If this is not needed and you want records from all tables to be stored in a single file, you can use the <file_name> directive instead, like <file_name>records.json</file_name>.

  5. If your files will be separate for each shared table, you should also specify a file suffix as follows, based on your type of output:

    1. <file_suffix>.json</file_suffix>

    2. <file_suffix>.xml</file_suffix>

    3. <file_suffix>.csv</file_suffix>

  6. Specify the maximum size of the file before the Agent moves to write to a new one in the form of <file_max_size>1MB</file_max_size>.

  7. If you are not constantly sharing much data out of ServiceNow, it may take a long time before a file reaches its maximum size and is closed off. You can set up the Agent to close a file that has not reached the maximum size when no new records have been received for its table for a specified interval, using the following directive: <close_file_interval>180</close_file_interval>, where the integer value is in seconds.

Your agent.xml should look like this:

<?xml version="1.0" encoding="ISO-8859-1" ?>
<config>
    <agent>
        <!-- the following subscribe fragment defines subscribing class -->
        <!-- and its arguments -->      
        <subscribe>
            <task>
                <task_name>test_file_subscriber</task_name>
                <message_connection password="password_here" user="admin" queue="psp.in.meshlet.example">https://<customer>.perspectium.net</message_connection>
                <instance_connection password="password_here" user="admin">https://<instance>.service-now.com</instance_connection>
                <max_reads_per_connect>1</max_reads_per_connect>
                <polling_interval>3</polling_interval>         	
                <decryption_key>Example_decryption_key_here</decryption_key>  
                <handler>com.perspectium.replicator.file.JSONFileSubscriber</handler>
 
                <buffered_writes>10</buffered_writes>          	
                <files_directory>/Users/You/Downloads/Example</files_directory>                         	
                <file_prefix>$table_$d{yyyyMMdd}_$i</file_prefix>
                <file_suffix>.json</file_suffix>
                <file_max_size>50KB</file_max_size>
                <translate_newline>%13</translate_newline>
                <separate_files>table</separate_files>
                <enable_audit_log/>
                <close_file_interval>180</close_file_interval>
            </task>
        </subscribe>
    </agent>
</config>


Install Agent to Share to Amazon S3 Bucket

To install the Agent to share to Amazon S3 buckets, select the “Manual Configuration (advanced)” option for DataSync Type during installation. Then follow these steps:

  1. Add the .jar files listed here to your Agent’s “extlib” folder.

  2. Open agent.xml and modify the following tags like this:

<task_name>s3_agent_subscribe</task_name>

<handler>com.perspectium.replicator.file.S3Subscriber</handler>

  3. Specify a decryption key:

<decryption_key>The cow jumped over the moon and back</decryption_key>

  4. Specify the following options based on how you want to configure the replication:

<access_key> - the access key associated with your AWS account, only when the DataSync Agent is not installed on an EC2 instance

<secret_access_key> - the secret access key associated with your AWS account, only when the DataSync Agent is not installed on an EC2 instance

<use_instance_credentials/> - if you are not using access key or secret access key, then you can use the instance permissions based on the IAM roles of the EC2 instance

<region> - this is the region your S3 bucket resides in

<s3_bucket> - Name of your AWS S3 bucket, including subdirectories if desired, to specify where the records will be uploaded e.g. bucketName/folder1/folder2

<file_format> - json or xml file format

<s3_bucket_encryption/> - an optional setting to use AWS server-side encryption when pushing files to the S3 bucket. Configuring this option will have the Agent request S3 to encrypt the data at rest using S3's built-in functionality

Your agent.xml should look like this:

<?xml version="1.0" encoding="UTF-8"?>
<config>
   <agent>
      <share />
      <subscribe>
         <task>
            <task_name>s3_agent_subscribe</task_name>
            <message_connection password="password" user="user">https://mesh.perspectium.net</message_connection>
            <instance_connection password="password" user="user">https://myinstance.service-now.com</instance_connection>  
            <handler>com.perspectium.replicator.file.S3Subscriber</handler>
            <decryption_key>The cow jumped over the moon</decryption_key>
            <access_key>AccessKey</access_key>
            <secret_access_key>SecretAccessKey</secret_access_key>
            <region>us-west-2</region>
            <s3_bucket>examples3bucket</s3_bucket>
            <file_format>json</file_format>
         </task>
      </subscribe>
      <polling_interval>40</polling_interval>
   </agent>
</config>


Install Agent to Share to Azure Blob Storage

You can also set up your Agent to share .json or .xml files to Azure Blob Storage.

During installation, select the “Manual Configuration (advanced)” option for DataSync Type. Then follow these steps:

  1. Open agent.xml and modify the following tags like this:
    1. Set the <task_name> directive to <task_name>azure_blob_storage_agent_subscribe</task_name>.
    2. Set the <file_format> directive to either <file_format>json</file_format> or <file_format>xml</file_format>.
    3. Set <abs_container> to specify the name of your Azure Blob container, including subdirectories if desired, to configure where the records will be uploaded, e.g. container/folder1/folder2.
    4. Set <connection_string> to specify the string used to connect to Azure and upload to the Azure Blob Storage container.

Your agent.xml should look like this:

<?xml version="1.0" encoding="UTF-8"?>
<config>
   <agent>
      <share />
      <subscribe>
         <task>
            <task_name>azure_blob_storage_agent_subscribe</task_name>
            <message_connection password="password" user="user">https://mesh.perspectium.net</message_connection>
            <instance_connection password="password" user="user">https://myinstance.service-now.com</instance_connection>  
            <handler>com.perspectium.replicator.file.AzureBlobStorage</handler>
            <decryption_key>Some decryption key here</decryption_key>            
            <abs_container>pspcontainer</abs_container>            
            <file_format>json</file_format>
            <connection_string></connection_string>
          </task>
      </subscribe>
      <polling_interval>40</polling_interval>
   </agent>
</config>


Agent Plugins Install

The Agent includes optional plugins you can configure to customize how your Agent saves data to the database. This is only applicable for the database use case. The one plugin that is used often is the IODatetime plugin.

IO Datetime Plugin

The IO Datetime plugin allows the DataSync Agent to add additional columns to the synchronized tables that record the timestamps of when those records were first inserted into, last updated in, or deleted from the database. As opposed to the usual ServiceNow sys_created_on and sys_updated_on fields, which keep the date and time when those transactions happened in ServiceNow, the IO Datetime plugin columns keep the timestamps of the database transactions triggered by the Agent. This plugin can be activated by adding the following lines to agent.xml within the <task> directive:

<!-- Create the columns -->
<dynamic_columns>    
    <dynamic_column column_type="93" column_size="32" deletes_only="true">psp_per_delete_dt</dynamic_column>
    <dynamic_column column_type="93" column_size="32" updates_only="true">psp_per_update_dt</dynamic_column>
    <dynamic_column column_type="93" column_size="32" inserts_only="true">psp_per_insert_dt</dynamic_column>
</dynamic_columns>
  
<!-- Update the columns -->
<plugins>
    <plugin insert_column="psp_per_insert_dt" update_column="psp_per_update_dt" delete_column="psp_per_delete_dt">com.perspectium.replicator.sql.plugin.SQLSubscriberIODateTimePlugin</plugin>
</plugins>

Validate Agent Configuration

After installation, you can use a dedicated executable to validate that your Agent is installed properly and get early information about specific issues. Go to the Agent's install directory, navigate to the bin folder, and start validateConfiguration.bat (Windows) or validateConfiguration (Unix). This validation prints out any errors the Agent finds when it tests its configuration. The Agent also performs this validation on startup.

Install the Agent to Run as a Service

Usually when in operation, the DataSync Agent is left running continuously and unattended on a virtual machine. This is best achieved by setting up the Agent to run as a Windows or Linux service. To install the Agent as a service, execute installService.bat or installService.sh from the Agent's bin directory. This creates a Perspectium DataSync Agent service that starts automatically when the machine starts.

If the service is not able to start, the best place to find troubleshooting information is the wrapper log, which you can enable as follows:

  1. Go to the Agent's conf folder.
  2. Open wrapper.conf in a notepad editor.
  3. Find the line which says #wrapper.logfile= and uncomment the line just below it to specify where the logs should be written, e.g. ..\logs\wrapper.log.
  4. Try to start the service again and check the log file for possible errors.
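Assuming a standard Java Service Wrapper style wrapper.conf, the change in step 3 would look roughly like this (the log path is an example):

```
# Before: log file disabled
#wrapper.logfile=

# After: uncommented line below, pointing at an example location
wrapper.logfile=..\logs\wrapper.log
```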




End to End Test

Once you have completed the installation for both ServiceNow and the DataSync Agent, it is always good to run a quick test by sharing records from ServiceNow and making sure they arrive in the database. Perform the following steps:

1. In ServiceNow go to Perspectium >> Shares >> Bulk Shares >> Create New Bulk Share.

2. Create a new bulk share with only some basic configuration:

    1. Name - specify a name for the bulk share.
    2. Table name - select a relatively small table like cmn_location.
    3. Sharing setup - Share the table records only.
    4. Encryption method - AES256.
    5. Target queue - the shared queue you created during installation.
    6. Save the bulk share and click the “Execute Now” button.
    7. Once you execute the Bulk Share you will be taken to the bulk share list.
    8. Open the Bulk Share once again and wait until it finishes. Once finished, its status should change to Completed and you should see a number of Records processed greater than 0.
    9. Go to Perspectium >> Outbound >> Outbound Messages to verify outbound payloads are generated and set to Sent status. You should see a number of messages where the topic is replicator and the name is tablename.bulk, such as cmn_location.bulk. This number should match the number of records processed in the Bulk Share. Check the status of those messages to make sure they are in Sent status, which means they were already sent to Perspectium MBS.

3. Go to the VM hosting your DataSync Agent. If the Agent is not yet running as a service, you can start the service or start the Agent in the foreground by running Agent.bat (Windows) or the agent script (Unix) in the bin folder.

4. Once the Agent is started it should process the records from the MBS queue. You can verify this in the Agent's logs by going to the logs folder and opening the most recent perspectium.log file. You should see lines in the log similar to the following:

2024-09-26 16:23:38.370 DEBUG - dev50002_subscribe - SQLSubscriber - dev50002_subscribe - processMessage active..., processed 123 messages

5. Check the logs for any errors or warnings indicating issues with message processing or connectivity to the target database. Errors and warnings are indicated as ERROR or WARNING in the logs.

6. If no errors are observed in the log, check your target database to make sure all of the shared records have arrived there successfully.



Start Sharing Data

Once you have verified end to end connectivity and that you are able to share ServiceNow data successfully into the target database, you are ready to configure your shares.

Bulk Shares Configuration

A bulk share is a job that can be triggered manually or based on a schedule to share all or a subset of records from a given table to the DataSync Agent. Bulk shares are usually triggered manually to run an initial share to synchronize all records from a table and then added to run on a schedule to keep the target database in sync by only sharing the delta i.e. the records in the table created or updated after the previous run.

To configure a bulk share, follow these steps:

  1. Go to Perspectium >> Shares >> Bulk Shares >> Create New Bulk Share.
  2. Specify a share name in the Name field.
  3. In Table Name select the table you want to share.
  4. In Sharing Setup select whether you want to share only the records from the specific table (Share the table records only), the records from the specific table and all records belonging to any tables extending the current one (Share all table records and child records) or share only the last descending child of a record from the tables extending the current one if there are any extensions (Share the last descending child of each table record). For more information, see Replicating ServiceNow's Parent Child Hierarchy.
  5. Encryption method - select the encryption algorithm which you want to use to encrypt the payload before sharing so that confidential information cannot be read while in transit through Perspectium MBS. It is recommended that you use AES128 or AES256 as a secured way of encrypting data. 
  6. Select the target queue which is the shared queue to which you want to share this data. This target queue should be configured in your agent.xml file for your agent to be able to subscribe and fetch data from it.
  7. From the Related Links list at the bottom, you can switch to Standard view to access some more advanced configurations of the bulk share:
  8. Share only selected fields - allows you to specify only certain fields from the source table to be shared. This is useful in case you are only interested in some of the table columns and you do not want to overload your target database with unnecessary data.
  9. Include attachments - to share only attachment records related to the records of the main table you are sharing. Note that you can share the sys_attachment and sys_attachment_doc tables separately via dedicated shares but this option here allows you to easily limit to only share attachments related to the actual records you are sharing and not orphan attachments.
  10. Include journal fields - to share related sys_journal_field records such as comments or work notes. Note that journals along with attachments will be shared to the target database as separate records residing in their respective tables such as sys_journal_field, sys_attachment and sys_attachment_doc and not to the main table you are sharing within this Bulk Share.
  11. Share updates since then - usually you want this checkbox enabled to enable the Bulk Share to run incrementally and only share delta after its first run and not to share the whole table dataset every time it executes. This field is tied to the Last share time field which holds a timestamp of when this Bulk Share has been run last time. Initially when you create a Bulk Share the Last share time field is empty which will allow the Bulk Share to share all records from the table. But on subsequent runs it will only share records inserted or updated in ServiceNow after its past execution as defined in its Last share time timestamp and given that the Share updates since then checkbox is enabled.
  12. Condition - you can specify a filter condition to only share a subset of the records from the table which will match this filter.
  13. Scripting - you can customize your Bulk Share to run specific ServiceNow JavaScript before it starts, for each record, or once it finishes, in case you want to modify or enrich the outgoing payload.
  14. Number of jobs - by default your Bulk Share will run with one job only. In case you are sharing a large table or a table which generates a large amount of delta records you can speed up sharing by selecting a higher number of jobs. This will speed up the generation of outbound messages which are placed in the Perspectium Outbound table. Note that each job here will consume one worker out of your ServiceNow nodes during execution time.
  15. Note that to save time, if you are creating Bulk Shares with similar settings, you can clone an existing Bulk Share using the Clone bulk configuration related link and make any changes needed on the clone before executing.