The Tools module does just what it sounds like: it provides several tools for your DataSync for ServiceNow integrations.
The features available in the Tools module help you maintain, support, and enhance your integration.
To use these tools, you must first install and configure DataSync for ServiceNow.
To access the Tools module, go to Perspectium > Perspectium Core > Tools.
Explore this page for all of the tools available in the module.
This feature allows you to compare tables between two ServiceNow instances (or one ServiceNow instance and another integrated database). This is useful because you can see each table's record count, as well as a list of record discrepancies by sys_id between the two tables you're comparing—in other words, you can see which records exist in one table but not the other.
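Conceptually, a table compare boils down to counting records on each side and diffing the two sets of sys_ids. The sketch below is illustrative only (it is not the tool's implementation, and the function and field names are assumptions):

```javascript
// Conceptual sketch of a table compare: given the sys_id lists from two
// instances (or an instance and a database), report each side's record
// count and the sys_ids present on one side but not the other.
function compareTables(sourceSysIds, targetSysIds) {
  const source = new Set(sourceSysIds);
  const target = new Set(targetSysIds);
  return {
    sourceCount: source.size,
    targetCount: target.size,
    missingInTarget: [...source].filter((id) => !target.has(id)),
    missingInSource: [...target].filter((id) => !source.has(id)),
  };
}
```

For example, comparing `['a', 'b', 'c']` against `['b', 'c', 'd']` reports `'a'` as missing in the target and `'d'` as missing in the source.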
Find information on table compare, including different contexts and instructions, here.
This feature allows you to stop or restart your dynamic shares and scheduled bulk shares.
To do so, go to Perspectium > Perspectium Core > Tools > Start/Stop All Shares.
The page lists exactly which actions will be taken when you choose either Stop All Shares or Start All Shares.
NOTE: Starting with Iodine 7.0.9, any dynamic shares or scheduled bulk shares whose names start with "X" or "x" will not be changed from their current state (e.g., an inactive dynamic share will not be set to active if you choose Start).
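The exclusion rule above can be sketched in plain JavaScript. This is a conceptual illustration, not the application's actual code; the share objects and function name are assumptions:

```javascript
// Sketch of the Iodine 7.0.9 rule: when Start/Stop All Shares runs,
// shares whose name starts with "X" or "x" keep their current state,
// while all other shares are set to the requested state.
function applyToAllShares(shares, targetActive) {
  return shares.map((share) =>
    /^x/i.test(share.name) ? share : { ...share, active: targetActive }
  );
}
```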
The core concept behind Multiple MultiOutput Jobs is having multiple jobs, each responsible for its own unique subset of outbound messages. This is useful if you are sending a high volume of messages to a single queue or spreading your messages across a high volume of queues.
Typically, outbound messages are sent out by a single job—Perspectium MultiOutput Processing—which goes to your outbound messages table and sends out the messages per queue. This process is sufficient to cover most cases for sending outbound messages. For high volumes, however, multiple MultiOutput jobs work by passing an encoded query to each job to limit its scope.
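One simple way to picture scoping jobs by sys_id is partitioning on a character of the (hexadecimal) sys_id, so every outbound message belongs to exactly one job. This is only a conceptual sketch of the idea behind the encoded queries, not how the application builds them:

```javascript
// Sketch of splitting outbound messages across N MultiOutput jobs by
// sys_id. ServiceNow sys_ids are 32-character hex strings, so partitioning
// on the first hex character assigns each message to exactly one job.
function jobIndexForSysId(sysId, jobCount) {
  return parseInt(sysId.charAt(0), 16) % jobCount;
}
```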
Before getting started with this, we recommend that you take a quick look at the Perspectium MultiOutput Processing job (at Perspectium > Control and Configuration > All Scheduled Jobs) to familiarize yourself with it and contact support@perspectium.com to validate your work if necessary.
You will also need to first create a ServiceNow dynamic share or bulk share.
Click the Job Type dropdown and select which type you want:
| Type | Description |
|---|---|
| Multioutput processing by sys_id | Jobs will be created through a query on the sys_id of outbound messages. |
| Multioutput processing by outbound queue | Jobs will be created based on the outbound (shared) queues, so no queue will be processed by multiple jobs. For example, if you have three queues and choose 2 jobs, one job will process one queue and the other job will process the other two queues. This approach honors the sequencing of messages within a queue. |
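The queue-based option can be pictured as dealing queues out to jobs so that each queue is owned by exactly one job. The sketch below (illustrative only; the round-robin assignment is an assumption about how queues might be distributed) shows why per-queue message ordering is preserved:

```javascript
// Sketch of "Multioutput processing by outbound queue": queues are dealt
// across jobs so each queue is processed by exactly one job, which keeps
// message sequencing intact within every queue.
function assignQueuesToJobs(queues, jobCount) {
  const jobs = Array.from({ length: jobCount }, () => []);
  queues.forEach((queue, i) => jobs[i % jobCount].push(queue));
  return jobs;
}
```

With three queues and 2 jobs, one job ends up with two queues and the other with one, matching the example in the table above.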
This feature allows for concurrently running scheduled jobs to handle a high volume of messages. By default, the Perspectium application comes with the Perspectium Replicator Subscriber scheduled job, a single job responsible for initiating the reading, consuming, and processing of messages from queues in the Integration Mesh. This process should cover most cases for receiving inbound messages, but you can enable multiple inbound processors if you are receiving a high volume of messages from a single queue or want an overall improvement in throughput.
Here's how:
This tool is useful if you want to use your table schemas to build the tables in a database for a DataSync for ServiceNow integration with the DataSync Agent. This is only necessary if your Agent cannot establish a connection to the ServiceNow instance—otherwise, the Agent handles this automatically.
To download the table schemas for your ServiceNow instance:
That's it! This will output a downloadable .zip file of all the table schemas as XML files, which you can then provide to the Agent. It does this for all the tables defined in your Bulk Shares and Dynamic Shares, including any child tables as necessary.
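To illustrate what the Agent does with those schemas, the sketch below maps field definitions to SQL column types and emits a CREATE TABLE statement. The type mapping, function name, and field objects here are all assumptions for illustration, not the Agent's actual behavior:

```javascript
// Hypothetical sketch of turning a table schema into database DDL.
// A real schema XML would be parsed first; here the fields are already
// plain objects, and the type mapping is illustrative only.
const TYPE_MAP = {
  string: 'VARCHAR(255)',
  integer: 'INT',
  glide_date_time: 'DATETIME',
};

function createTableSql(tableName, fields) {
  const columns = fields.map(
    (f) => `${f.name} ${TYPE_MAP[f.type] || 'VARCHAR(255)'}`
  );
  return `CREATE TABLE ${tableName} (${columns.join(', ')});`;
}
```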
This tool is useful if you are planning to import your share configurations to another ServiceNow instance. It allows you to download the configurations that you have set for your ServiceNow dynamic shares, bulk shares, scheduled bulk shares, group shares, table maps, and transform maps.
After downloading your share configurations and/or table maps from the Download Share Configurations tool, you can import the downloaded zip folder to another ServiceNow instance with the Import Share Configurations tool.
NOTE: Only a zip file downloaded from the Download Share Configurations tool or a single XML file of a share or table map will be accepted.
This tool allows you to customize the table names and column names in your target database by specifying custom prefixes or suffixes.
Here's how:
In your sharing ServiceNow instance, go to Perspectium > Perspectium Core > Shares > Dynamic Share or Bulk Share.
If you send records with a custom name, the database will create new columns for the table. If you change this custom name property after the new columns have been created, the database will create a new column and show NULL in the previously created custom column for any records sent after the change.
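The naming behavior is simple string composition; the sketch below illustrates it. The function and option names are assumptions (the tool exposes its own settings per share):

```javascript
// Sketch of custom naming for the target database: a prefix and/or suffix
// is attached to the base table or column name.
function targetName(baseName, { prefix = '', suffix = '' } = {}) {
  return `${prefix}${baseName}${suffix}`;
}
```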
Receipts are generated for outbound messages when you enable the data guarantee feature. Message receipts indicate the delivery status for records that you have shared out, allowing you to quickly identify successful shares, pending shares, and errors.
Learn all about receipts here.
This tool allows you to manually reshare data using receipts from your ServiceNow instance for a time period of three days or less. This is useful for situations where not all records were received at the target (such as an error at the database).
Here's how:
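As a conceptual sketch of which receipts would qualify for a manual reshare, the code below filters errored receipts to the three-day window. The receipt fields and status values are illustrative assumptions, not the actual receipt table schema:

```javascript
// Sketch of selecting receipts eligible for a manual reshare: errored
// receipts no older than three days. Field names are illustrative only.
const THREE_DAYS_MS = 3 * 24 * 60 * 60 * 1000;

function reshareCandidates(receipts, now = Date.now()) {
  return receipts.filter(
    (r) => r.status === 'error' && now - r.createdAt <= THREE_DAYS_MS
  );
}
```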