Overview


This guide provides a methodical approach to identifying causes for data to be out of sync. It requires a birds-eye understanding of the DataSync architecture to see how data flows out of ServiceNow and into the Perspectium Integration Mesh, where the data sits until the Perspectium DataSync Agent is able to consume the messages and perform the insert/update/delete into the target database.

(info) NOTE: It is recommended you use Table Compare to first check for discrepancies. This guide will then help you troubleshoot further to determine the cause of those discrepancies.

Here is a diagram that shows the DataSync workflow:

Listed below are the components this document details how to troubleshoot:


  1. ServiceNow (Source)
  2. Perspectium Integration Mesh
  3. DataSync Agent
  4. Database (Target)


Before deep diving from source to target, it is best to perform the following to identify problematic tables where discrepancies have been identified. Prioritizing the tables then allows us to focus on specific shares and possibly the time frame of an issue. If possible, it would also be helpful to find sys_id values that either exist in ServiceNow but not in the database, or are found in the database but not in ServiceNow.

UI Step

Identify tables where there is a discrepancy


Create a spreadsheet that includes the following columns and their values:

  • Table Name
  • Count in ServiceNow
  • Count in Database
  • Count difference
  • Date/time of when counts were recorded
  • Flag indicating whether there are more records in the database or in ServiceNow
  • Examples (if possible): sys_id values with a discrepancy 

Click here to download an example template. 
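Once you have a sys_id list exported from each side (for example via a ServiceNow list export and a database query), the one-side-only examples for the spreadsheet can be found with standard tools. This is an illustrative sketch; the two sample files below stand in for real exports, one sys_id per line:

```shell
# Illustrative sketch: the printf lines stand in for real sys_id exports
# from ServiceNow and from the target database (one sys_id per line).
printf 'sys_id_a\nsys_id_b\nsys_id_c\n' | sort > servicenow_ids.txt
printf 'sys_id_b\nsys_id_c\nsys_id_d\n' | sort > database_ids.txt

# sys_ids present in ServiceNow but missing from the database (here: sys_id_a):
comm -23 servicenow_ids.txt database_ids.txt

# sys_ids present in the database but missing from ServiceNow (here: sys_id_d):
comm -13 servicenow_ids.txt database_ids.txt
```

Note that comm requires both inputs to be sorted; the resulting one-side-only lists map directly onto the spreadsheet's examples column.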


Progress through the upcoming steps as we investigate from source to target to identify the root cause for each discrepancy, or at least isolate where the problem may be.


ServiceNow


The Perspectium DataSync Application in the ServiceNow platform simply creates PSP Outbound Message (psp_out_message) records and publishes them to the Perspectium Integration Mesh (MBS). 

(info) NOTE: If you are using MBS 3.0, you can use the mesh_entry_id field to verify that the outbound message is saved to a file in a queue in the Perspectium Integration Mesh.

The majority of discrepancy issues from a ServiceNow perspective are due to:

  • The dynamic/bulk share not creating the psp_out_message record
  • The PSP Outbound Message not being properly created (e.g. empty value); see share does not capture final version of source record
  • The Perspectium MultiOutput Processing job(s) being inactive, or the platform blocking them from running
  • The Perspectium MultiOutput Processing job(s) being unable to communicate with the Perspectium Integration Mesh


UI Step

Test the dynamic/bulk share for the table(s) identified in the discrepancies

We want to test the dynamic share or bulk share for the problematic table to see if it properly creates the PSP Outbound Message and that the message is published to your Perspectium Integration Mesh. If this is successful, then we can quickly eliminate the share configuration in ServiceNow and the connection to the Integration Mesh as the issue.

Dynamic Share


Use the Test Record feature or create/update a test record.

Bulk Share




    UI Steps

    UI Step

    Clone bulk share.


    UI Step

    Rename bulk share.


    UI Step

    Populate Limit number of records shared to a small value like 5 or set a filter to share a specific set of records.


    UI Step

    Confirm the expected number of psp_out_message records are created with topic=replicator and are eventually Sent.



    (info) NOTE: The Attributes field of psp_out_message records contains useful information to help determine whether the expected PSP Outbound Messages are created:

    • set_id: sys_id of the dynamic share or bulk share that created the PSP Outbound Message
    • RecordId: sys_id of the source record being replicated
    • cipher: encryption method used by the dynamic share or bulk share
    • sharedQueue: sys_id of the shared queue record


    UI Step

    Review the dynamic share, bulk share, and/or scheduled bulk share configuration for the tables found in the discrepancies

    Review the configuration to ensure it is set up to capture everything expected. Here are a couple of things to keep in mind when reviewing the share options:

    Dynamic Share


    Parent/Child Hierarchy > Child Table Only (Default)

    This DEFAULT setting is enabled if the 'Share base table only' and 'Include all child tables' options are NOT selected. It will share all child tables one level below the specified table. This may be misleading, as most people expect dynamic shares to be auto-configured to share data from the configured table itself.

    Trigger Conditions Tab > Interactive Only

    This option will only replicate data if a user executed an insert, update, or delete. It will NOT share data if a background job or script performed an operation against the table. Background jobs or scripts acting on the tables users are attempting to sync to a database may not always be visible, so customers are not always aware of them.

    Filter and Enrichment Tab

    Review the conditions and all options in this tab to ensure nothing prevents replication from occurring for an expected record.

    Business Rule order and Business Rule when

    There is a possibility that the 'Perspectium Replicate' business rule does not capture the final version of a record. Ensure that the dynamic share is configured to share the record after all business rules and workflows are complete.

    Duplicate Perspectium Replicate Business Rules

    Sometimes, a single Dynamic Share single dynamic share can have two or more business rules associated to it. This will lead to sharing the source record multiple times. The Dynamic Share Business Rules Dashboard allows you to quickly identify any duplicates and you can Reset the Dynamic Share Business Rules to re-create all business rules for your active Dynamic Shares active dynamic shares such that only a single business rule is associated with each active Dynamic Sharedynamic share.  

    Bulk Share


    Conditions

    Review the conditions and all options in this tab to ensure nothing prevents replication from occurring for an expected record.

    Share updates since then

    This will only share records that have been inserted or updated since the date/time shown in the 'Last share time' field. It simply modifies the query against the base table to capture records updated since the last execution time. This option is typically used with scheduled bulk shares.

    (info) NOTE: There is a known issue with this bulk share option that is fixed in Fluorine Plus Patch 1.1 and above.

    Scheduled Bulk Shares


    Check to see if the scheduled bulk share(s) are configured to execute as expected and that the associated bulk shares are running at those times.

    Related bulk shares with filters using date/time stamps, such as sys_updated_on, can lead to data gaps if not properly configured based on the Repeat Interval of the scheduled bulk share. Ensure that the bulk share condition is configured to capture records updated since the start time of the scheduled bulk share.


    (info) NOTE: Please review the Known Issues section of this document, as it contains additional information that may prevent the Perspectium shares from functioning properly.


    UI Step

    Perspectium logs and debug logging

    To access Perspectium logs, go to Perspectium > Control and Configuration > Logs.

    If there is no explanation as to why a PSP Outbound Message is not being generated for a share, then you can review the Perspectium logs. Sometimes they will capture errors during the processing of the share, such as problems with finding or encrypting the source record. They should also show if there have been any communication problems from ServiceNow to the Perspectium Integration Mesh.

    If you can reproduce an issue on demand where a dynamic share or bulk share is not sharing expected records, then enabling debug logging can be used to gather additional info. 

    Depending on the scenario, it can be enabled at the global level or on a specific share:


    Global level

    Go to Perspectium > Control and Configuration > Properties, then check the Generate debugging logs option.

    (info) NOTE: Do not enable in high-volume instances such as production unless advised by Perspectium Support.

    Dynamic Share 

    Go to Perspectium > Replicator > Dynamic Shares, then check the Advanced checkbox. An Advanced tab will appear. In the Advanced tab, check the Enable debug logging option. 

    Bulk Share 

    Go to Perspectium > Replicator > Bulk Shares, then check the Advanced checkbox. An Advanced tab will appear. In the Advanced tab, check the Enable debug logging option. 


    Global-level debugging should not be used in production environments unless coordinated with Perspectium Support. Ideally, it should be used in sub-prod environments where you are able to reproduce a broad issue but are still not able to isolate specific dynamic shares or bulk shares. However, you can temporarily use share-level debug logging in a production instance if you are able to single out particular shares with an issue. 




    DataSync Agent 


    The Perspectium DataSync Agent typically sits within your firewall or network to ensure control and security of your data.

    Initial struggles with the installation of the DataSync Agent are often due to communication issues between the Agent and the Perspectium Integration Mesh, the source ServiceNow instance, or the database.

     

    Once the Agent is able to establish a connection with each component, it goes through the following workflow to sync data:

    1. Consume messages from the target queue in the Perspectium Integration Mesh
    2. Decrypt and deserialize the message
    3. If configured, fetch the table schema from ServiceNow and alter tables in the database
    4. Query the database to determine if it needs to perform an insert or update/delete (for .bulk messages)
    5. Build the SQL statement
    6. Execute the SQL statement

    Typically, the longest delays occur during interactions with the database (steps 4 and 6). 




    UI Step

    Check java path for Agent startup issues

    If the Agent is having issues starting up, check that the java system command is working. Normally, when Java is installed on a machine, the java system command is available to execute from any system command prompt in any directory. If you check for Java on your machine by executing java -version and it doesn't work, then the java system command may not be set up properly.

    First, verify the user you are currently logged in on the machine has access to the Java install (it may have been installed by another user and your user wasn't given access).

    If the user does have access to Java, you may need to explicitly point to where the java executable is located for the Agent to run it. The executable is generally located in the bin folder where Java is installed, e.g.:

    /usr/local/java/jre1.8.0_181/bin/java (Linux)

    C:\Program Files\Java\jre1.8.0_181\bin\java.exe (Windows)
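    A quick way to check whether (and where) the java command resolves from a shell is a sketch like the following; if it is not on the PATH, locate the JRE's bin/java manually as shown above:

```shell
# Check whether the java command resolves from the current shell, and where.
# If it does not, wrapper.java.command must point at the full path to bin/java.
if command -v java >/dev/null 2>&1; then
    echo "java found at: $(command -v java)"
else
    echo "java is not on the PATH; point wrapper.java.command at bin/java directly"
fi
```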

    Once you have the path of the Java executable, open up the wrapper.conf configuration file located in the conf directory of the Agent's installation directory and update the wrapper.java.command configuration near the top of the file to point to the path.

    That is, change it from:

    Code Block
    languagebash
    wrapper.java.command=java

    To this (using the example above):

    Linux

    Code Block
    languagebash
    wrapper.java.command=/usr/local/java/jre1.8.0_181/bin/java

    Windows

    (info) NOTE: Use \\ for each folder in order for the path to be read correctly

    Code Block
    languagebash
    wrapper.java.command=C:\\Program Files\\Java\\jre1.8.0_181\\bin\\java.exe



    UI Step


    Test connectivity 

    If you are seeing a backlog of messages in the queue, then you’ll want to confirm whether the Agent is able to communicate with the Perspectium Integration Mesh to retrieve the data. The Validate Configuration tool can be used to quickly test the connection between the Agent and the following components as defined in the agent.xml file.

    • Perspectium Integration Mesh 
    • Your ServiceNow instance
    • Your target database

    The agent will not be able to start up successfully if the connection to one of these endpoints is failing. This tool can be executed with the following commands from the Agent’s base install directory:

    Windows:  

    Code Block
    languagebash
    bin\validateConfiguration.bat

    (info) NOTE: Double-clicking on the .bat file will execute it as well.

    Linux:

    Code Block
    languagebash
    bash bin/validateConfiguration


    The resulting output will be printed into the ConfigurationValidationReport.log file found in the bin directory/folder. If the ConfigurationValidationReport.log shows a successful connection to all components, then there are no network related issues. 
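    One way to scan the saved report for any connection attempt that did not succeed is a grep sketch like the one below. The heredoc here only stands in for a real report so the example is self-contained; against a real Agent you would skip it and point grep at the ConfigurationValidationReport.log in the bin directory:

```shell
# Illustrative sketch: the heredoc stands in for a real report file.
# A real run would instead use: grep 'Attempting' bin/ConfigurationValidationReport.log
cat > ConfigurationValidationReport.log <<'EOF'
Attempting to connect to the Message Broker Service at: https://example.perspectium.net - SUCCESS
Attempting to connect to the database: psp_repl at host: localhost - FAILURE
EOF

# Count connection attempts that did not end in SUCCESS (here: 1):
grep 'Attempting' ConfigurationValidationReport.log | grep -cv 'SUCCESS'
```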

    Example output from executing validateConfiguration:

    Code Block
    languagebash
    Agent Information:
    Agent version: Fluorine_4.6.6
    Agent name: 'psp_replicator_fluorine_466'
    Configured Tasks:
      Task: psp_replicator_fluorine_466_subscribe Type: subscribe
            psp_replicator_fluorine_466_subscribe instances: 4
    Attempting to connect to the Message Broker Service at: https://training.perspectium.net as user: training/training - SUCCESS
    Attempting to connect to the database: psp_repl_466 at host: localhost port: 3306 as user: root - SUCCESS
    Attempting to fetch schema from: https://dev63629.service-now.com - SUCCESS
    Validation Completed, results: SUCCESS
    Report has been saved in file: ConfigurationValidationReport.log
    

    Typically, these connectivity issues are due to network, proxy, or firewall configuration. Here is a list of ports that need to be open depending on the protocol used in the value of the <message_connection> directive in the agent.xml file:

    Protocol 

    Port

    Description

    AMQPS

    TCP/5671

    Outbound to your Perspectium Integration Mesh (amqps://<mesh_name>.perspectium.net)

    AMQP

    TCP/5672

    Outbound to your Perspectium Integration Mesh (amqp://<mesh_name>.perspectium.net)

    HTTPS

    TCP/443

    • Outbound to your Perspectium Integration Mesh (https://<mesh_name>.perspectium.net)
    • Outbound to your ServiceNow instance (https://<instance_name>.service-now.com)

    Ensure that your network, firewall, or proxy whitelists the following to allow your server to communicate externally:

    • Perspectium Integration Mesh IP addresses 
    • perspectium.net domain

    (info) NOTE: Contact Perspectium Support for your IP addresses



    Warning
    Also verify the ServiceNow user entered for the <instance_connection> directive has the proper role to access the ServiceNow instance and get table schemas.



    UI Step


     Logging level 

    The Perspectium DataSync Agent has the ability to output more information to its logs, useful in the event you are able to reproduce an issue consistently or field values are not matching up between the record in ServiceNow and the row in the database.

    The logging level can be set manually in the log4j2.xml file found in the conf directory. 

    By default, the logging level is set to info (this is located on line 33 of the log4j2.xml file):

    Code Block
     <root level="info"> 


    The parameter/tag can have the following values:

    Value

    Description

    info

    Default value and provides the least amount of info

    debug

    Mid-level debugging output

    finest

    Highest logging level which prints most details.  Do NOT use in production or high-volume environments unless in coordination with Perspectium Support
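    For example, to raise the level to debug for a troubleshooting session, the same line in conf/log4j2.xml would become (a config fragment; the rest of the file is unchanged):

    Code Block
     <root level="debug"> 

    Remember to restore the level to info afterwards, since higher levels increase log volume.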


    The logging level can also be set dynamically with a duration using the setLogging command: 

    Code Block
    setLogging <logging_level> <duration>

    Example

    Code Block
    # Setting to finest logging for an hour
    bin/setLogging finest
      
    # Setting to finest logging for 3 hours (10800 seconds)
    bin/setLogging finest 10800
      
    # Setting to info logging indefinitely
    bin/setLogging info -1
    
    

    (info) NOTE: This logging level will be active until the duration expires or the Agent is restarted.


    UI Step


    Quick analysis of the logs 

    Sorting through the Agent logs will help you pinpoint issues such as:

    • Connection issues to Perspectium Integration Mesh, ServiceNow or database
    • SQL Response errors after failed insert/update/delete
    • Failure to successfully decrypt messages
    • Issues with building the SQL Statements
    • Malformed JSON message published to the queue

    All of these are possible problems that the DataSync Agent can encounter and provide an explanation for any discrepancy of data. 

    Here are a few search terms or phrases that immediately indicate issues with the Agent:


    Skipping 

    This indicates 9 (3x3) failed attempts were made to insert, update, or delete the message in the database

    Exception 

    Generic search term that indicates error occurred during the:

    • Processing of the message 
    • Requests to the database or Perspectium Integration Mesh

    ERROR

    Generic error occurred during the processing of the message or requests to the database. You may also find the Agent may have unexpectedly shut down.  

    javax.crypto.IllegalBlockSizeException: Input length must be multiple of 8 when decrypting with padded cipher

    The data from the value attribute of the JSON message was not properly encrypted by ServiceNow.

    A JSONObject text must begin with '{'

    Invalid JSON message published to the queue.

    SubscribeException: Failed to decrypt message, make sure the shared secret keys match!

    The encryption key in the Perspectium properties in ServiceNow does not match the decryption key defined in the agent.xml file.

    org.json.JSONException: Null key

    Generic error, but often seen when the value (encrypted data containing the source record being shared) of the message is empty.

    failed: Connection timed out

    Connection failure to queue in Perspectium Integration Mesh.
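    These indicator terms can be scanned for in bulk with grep. A minimal sketch follows; the heredoc only stands in for a real log so the example is self-contained, and a real run would point grep at the files under the Agent's logs directory instead:

```shell
# Illustrative sketch: the heredoc stands in for a real Agent log file.
# A real run would instead use something like: grep -cE 'Skipping|Exception|ERROR' logs/*.log
cat > sample_agent.log <<'EOF'
2024-01-01 12:00:00 INFO  processed message 123
2024-01-01 12:00:05 ERROR Skipping message 124
2024-01-01 12:00:06 ERROR SubscribeException: Failed to decrypt message, make sure the shared secret keys match!
EOF

# Count lines matching any of the indicator terms (here: 2):
grep -cE 'Skipping|Exception|ERROR' sample_agent.log
```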


    For additional errors found in the logs, see Troubleshooting Agent Installation and Configuration Errors.




    Contact Perspectium Support

    If you’ve gone through this Troubleshooting Guide and are still unable to figure out what may be causing your data to be out of sync, then please don’t hesitate to:

    • Visit the Perspectium Support Portal to open a Support Case
    • Give us a call at (888) 620 - 8880 to open a Support Case 
    • Reach out to your Perspectium Customer Success Manager 

    Include the items below when initially opening a Perspectium Support Case:


    UI Steps


    UI Step

    Troubleshooting Report or Download Share Configuration

    The Troubleshooting Report module creates psp_out_message records that will be sent to Perspectium Support to help with troubleshooting. To access this feature, go to Perspectium > Control and Configuration > Troubleshooting Report.

    This report will contain the following information:

    • Perspectium warning and error logs
    • Sample outbound (psp_out_message) and inbound (psp_inbound_message) data 
    • Perspectium configuration such as:
      • Dynamic shares
      • Bulk shares
      • Perspectium Properties
      • Subscribe configurations
      • Shared and subscribe queues

    (info) NOTE: The source record being shared remains encrypted.

    Download Share Configurations is another option that will allow you to provide Perspectium Support your Perspectium configuration for review. It can be executed with the following steps:

    UI Steps


    UI Step

    Go to Perspectium > Replicator > Tools > Download Share Configurations


    UI Step

    Check ALL the following options:

    • Bulk shares
    • Scheduled bulk shares
    • Dynamic shares
    • Table maps


    UI Step

    Click Download


    UI Step

    Send Perspectium Support the share_configurations.zip file





    UI Step

    DataSync Agent Report

    The Create Report tool found in the Agent’s bin directory will gather configuration and log files. The resulting zip file will contain files such as:

    • /conf/agent.xml
    • /conf/config.xml
    • /conf/log4j2.xml
    • /conf/wrapper.conf
    • /logs/* (ALL logs)
    • /bin/<instances schemas>

    (info) NOTE: The resulting zip file from the report may be large due to the number of log files. It would be a good idea to archive log files older than one week.

    It can be executed with the following commands from the Agent’s base install directory:

    Windows

    Code Block
    languagebash
     bin\createReport.bat

    (info) NOTE: Double-clicking will work as well.

    Linux

    Code Block
    languagebash
    bash bin/createReport






    Known Issues

    This is a list of known issues and ServiceNow configurations that will prevent Perspectium from capturing an insert/update/delete on your ServiceNow instance:


    Expand
    titlePSP Outbound message created with an empty value. 

    Fix is in Fluorine Plus Patch 1.1 and up.

    (info) NOTE: This is not a complete fix.  It is a workaround that skips encrypting the problematic field so our code can proceed to populate the value field of the psp_out_message record.  Investigation is ongoing.


    Expand
    titleServiceNow behavioral flags that may prevent a Dynamic or Bulk Shares from capturing messages.

    current.setWorkflow(false) 

    Impacts Perspectium Replicate business rules created by dynamic shares. This flag prevents any other business rule from running on the current record.

    current.autoSysFields(false)

    Impacts scheduled sync ups or scheduled bulk shares. This flag prevents the updated and updated by fields from changing when the record itself is updated.


    Regarding the .setWorkflow(false) behavioral flag, Perspectium’s GOLD release has the option to create a Flow Designer dynamic share to get around this issue. 

    (info) NOTE: Deletes can only be captured by a business rule. 


    Expand
    titleInserts/Update/Deletes may not be captured on PPM Tables.

    Inserts, updates, or deletes may not be captured on PPM Tables such as the following:

    • task > planned_task
    • task > planned_task > pm_project
    • task > planned_task > pm_project_task

    Likely due to a ServiceNow setting which prevents any business rules from executing: KB0793430: Why is my Project Task not listed in the Deleted Records?


    Expand
    titleQuery Business Rule manipulating the query issued by a Bulk Share or Scheduled Sync Up so it’s not sharing all expected data.

    The workaround is to configure the Query business rule such that it doesn’t run when executed by the bulk share:


    UI Steps


    UI Step

    Create a new sys_user record with the perspectium role only


    UI Step

    Confirm Query business rule does NOT have perspectium in the Role Conditions field/list


    UI Step

    Configure the bulk share (Security tab > Run As) to run as the user created above





    Expand
    titleShare updates since then option in a Bulk Share is not honoring the Last Share Time. 

    Fix is in Fluorine Plus Patch 1.1 and up.