

Outbound messages are sent out by a single job, Perspectium MultiOutput Processing, which reads your outbound messages table and sends the messages out queue by queue. This process covers most cases for sending outbound messages. However, if you are sending a high volume of messages to a single queue, or spreading your messages across a large number of queues, you can use Multiple MultiOutput Jobs. The core concept behind Multiple MultiOutput Jobs is passing an encoded query to the MultiOutput job to limit its scope. In other words, you can have multiple jobs, each responsible for its own unique subset of outbound messages.

(info) NOTE: We DO NOT recommend cloning the default Perspectium MultiOutput Processing job without making the following changes. Doing so can cause the same set of messages to be sent out multiple times.


Prerequisites


(warning) First, you will need to create a ServiceNow dynamic share, create a ServiceNow bulk share, or set up a ServiceBond integration.

(warning) We also recommend that you take a quick look at the Perspectium MultiOutput Processing job to familiarize yourself with it and contact support@perspectium.com to validate your work if necessary.


Strategies


There are two main strategies behind this process. The strategy you will use will depend on your use case.

The details for the implementation for each are covered in the following section.

Strategy 1: Bulk Processing on a Queue

This method is not recommended for dynamic shares, as messages will not arrive in order.

This strategy applies when you want to process a high volume of messages on a specific queue. If you are bulk sharing a large number of messages to a single queue, this is the path to lean towards.

This strategy divides the work for a queue into small, distinct chunks and has multiple jobs each process a chunk. The primary way to do this is to query on the sys_id of the outbound message.

(info) NOTE: You are querying on the sys_id of the outbound message itself, not the record that the outbound message represents. Additionally, while records are normally shared out in a way that preserves sequencing on a single queue, this method does not honor that sequencing. We therefore recommend this strategy only if you are bulk sharing a large set of data and are not concerned about the order in which records arrive.

For an easy way to set up this strategy, go to Perspectium > Replicator Tools then click on Create Multiple MultiOutput Jobs. This will direct you to a UI where you can choose the number of output jobs you want running.

When the jobs are created, the default job Perspectium MultiOutput Processing is renamed XPerspectium MultiOutput Processing and set inactive, and new jobs are created in the format of Perspectium MultiOutput Processing [*] where * is a list of characters between 0-9 and a-f. The list of characters represents the first characters of sys_ids of outbound records that are processed by the job. For example, [8, 9, a, b] would mean any outbound records with sys_ids starting with 8, 9, a, or b will be processed by this output job. To restore default settings, first delete the new jobs then reactivate and rename the default job.
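The chunking described above can be sketched in plain JavaScript. This is not the Perspectium implementation, just an illustration of how the 16 possible leading sys_id characters could be divided among N output jobs, matching the bracketed character lists in the job names:

```javascript
// Sketch (assumed helper, not the Perspectium app's code): split the 16
// possible leading sys_id characters into contiguous groups, one per job.
function partitionHexChars(jobCount) {
  var chars = "0123456789abcdef".split("");
  var groups = [];
  var size = Math.ceil(chars.length / jobCount);
  for (var i = 0; i < chars.length; i += size) {
    groups.push(chars.slice(i, i + size));
  }
  return groups;
}

// With 4 jobs, each job owns 4 leading characters:
// partitionHexChars(4) → [["0","1","2","3"], ["4","5","6","7"],
//                         ["8","9","a","b"], ["c","d","e","f"]]
```

The third group here corresponds to the `Perspectium MultiOutput Processing [8, 9, a, b]` job from the example above.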

Strategy 2: Segregated Processing for a Group of Queues

This strategy involves creating multiple jobs, with each job handling a specific set of queues. If you are sharing data to a large number of queues, this is the path to lean towards.

This strategy sets up sharing to divide the work of your outbound table into groupings based on the queue they are writing to. Since the queues are processed iteratively, this changes the workflow from 1 job processing all queues to X jobs processing their own subset of queues.

This will retain the sequencing of the data.
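As a sketch of this grouping idea (a hypothetical helper, not part of the Perspectium app), the set of target queues can be dealt out among X jobs so each queue is processed by exactly one job, which is why per-queue sequencing is preserved:

```javascript
// Sketch: assign each target queue to exactly one of jobCount jobs
// (round-robin). Each job then iterates only its own subset of queues.
function assignQueuesToJobs(queueSysIds, jobCount) {
  var jobs = [];
  for (var j = 0; j < jobCount; j++) {
    jobs.push([]);
  }
  for (var i = 0; i < queueSysIds.length; i++) {
    jobs[i % jobCount].push(queueSysIds[i]);
  }
  return jobs;
}

// assignQueuesToJobs(["q1", "q2", "q3", "q4", "q5"], 2)
// → [["q1", "q3", "q5"], ["q2", "q4"]]
```

Because no queue appears in more than one job's subset, the jobs never contend for the same messages.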


Procedure

To create multiple MultiOutput jobs, follow these steps:


Make a copy of the Perspectium MultiOutput Processing job

Navigate to Perspectium > Replicator > Dynamic Share or Perspectium > Replicator > Scheduled Jobs. Then, click into the Perspectium MultiOutput Processing job, make a copy of it, and rename it appropriately.

Pass in an encodedQuery

Notice in the example script below that we create a variable named encodedQuery. This variable gets passed into psp.processMultiOutput(). You can pass any encoded query you want. A quick way to get an encoded query is to go to your outbound table, create a filter, and choose Copy query.


Bulk Processing Steps


Create a filter on your current outbound messages using a sys_id starts with condition. The sys_ids for these records start with one of 16 values (0, 1, 2, 3, 4, 5, 6, 7, 8, 9, a, b, c, d, e, f). We want to make sure each of these values is captured by exactly one job.

The following is an example where we break these into groups of 4:

Create the other 3 jobs similarly to the step above. You may also want to limit each job to a distinct queue by adding a target queue to the encoded query.

The following is an example of the script: 

try {
    // Limit this job to outbound messages whose sys_id starts with 0, 1, 2, or 3
    var encodedQuery = "sys_idSTARTSWITH0^ORsys_idSTARTSWITH1^ORsys_idSTARTSWITH2^ORsys_idSTARTSWITH3";

    // Process only the outbound messages matching the encoded query
    var psp = new Perspectium();
    psp.processMultiOutput(encodedQuery);
}
catch(e) {
    // Log any failure under the name of this processing job
    var logger = new PerspectiumLogger();
    logger.logError("error = " + e, "Perspectium MultiOutput Processing");
}
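The encodedQuery string used above can also be generated rather than typed by hand. The following is an illustrative helper (not part of the Perspectium app) that builds the `sys_idSTARTSWITH` query for one chunk of leading characters:

```javascript
// Sketch (hypothetical helper): build the encoded query for one chunk of
// leading sys_id characters, OR-ing the conditions together with "^OR".
function sysIdChunkQuery(chars) {
  return chars
    .map(function (c) { return "sys_idSTARTSWITH" + c; })
    .join("^OR");
}

// sysIdChunkQuery(["0", "1", "2", "3"]) reproduces the query in the
// script above:
// "sys_idSTARTSWITH0^ORsys_idSTARTSWITH1^ORsys_idSTARTSWITH2^ORsys_idSTARTSWITH3"
```

To additionally restrict a chunk to a single queue, you would append a condition such as `"^u_target_queue=" + queueSysId` to the returned string.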


Queue Grouping Steps


Create a filter on your current outbound messages with the following: Target Queue is queue 1 OR Target Queue is queue 2 OR Target Queue is queue 3

Then click Copy query.

Pass the encoded query into the job. It should resemble the query below containing the sys_id of the target queues selected: 

u_target_queue=XXXX^ORu_target_queue=YYYY^ORu_target_queue=ZZZZ
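The same query can be built programmatically from the sys_ids of the selected queues. This is an illustrative helper (not part of the Perspectium app); the XXXX/YYYY/ZZZZ placeholders stand in for real queue sys_ids, as in the example above:

```javascript
// Sketch (hypothetical helper): build the Target Queue encoded query from a
// list of queue sys_ids, OR-ing the conditions together with "^OR".
function queueGroupQuery(queueSysIds) {
  return queueSysIds
    .map(function (id) { return "u_target_queue=" + id; })
    .join("^OR");
}

// queueGroupQuery(["XXXX", "YYYY", "ZZZZ"])
// → "u_target_queue=XXXX^ORu_target_queue=YYYY^ORu_target_queue=ZZZZ"
```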



Warnings


This is an advanced capability of the Replicator, so we recommend running it through your test environment first.

(info) NOTE: The purpose of these strategies is to send outbound messages with multiple jobs without any overlap of data in transit.

Original Job

The original Perspectium MultiOutput Processing job goes through each queue without any encoded query. If you go down this path, you should either modify or deactivate this job to make sure your jobs each process their own subset of data.

You may place an X at the start of the name (XPerspectium MultiOutput Processing) to prevent the job from being automatically restarted by Start All Jobs. Follow Perspectium Update Set Releases to maintain these jobs.

Dot Walking

From an optimization standpoint, we do not recommend dot-walking in these queries, i.e. do not pass in an encoded query like:

var encodedQuery = "u_target_queue.u_nameLIKEdev18450";
var psp = new Perspectium();
psp.processMultiOutput(encodedQuery);
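Instead, look up the queue's sys_id once (in a ServiceNow script this would typically be a GlideRecord query on the queue table) and filter on the reference field directly. A minimal sketch, where the sys_id value is a hypothetical stand-in for the dev18450 queue:

```javascript
// Dot-walked (not recommended): "u_target_queue.u_nameLIKEdev18450"
// Direct (recommended): filter on the reference field with the queue's sys_id,
// which avoids the join implied by dot-walking.
function directQueueQuery(queueSysId) {
  return "u_target_queue=" + queueSysId;
}

// directQueueQuery("a1b2c3d4") → "u_target_queue=a1b2c3d4"
```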

Overloaded Scheduler

A ServiceNow production instance generally has 4 nodes that can each execute 8 jobs, for a total of 32 available workers, so in principle you could create a job per queue. However, these workers are shared by all scheduled work, whether bulk shares or individual MultiOutput processing jobs.

(warning) WARNING: Be aware of the total available workers on your instance. For example, you should not create 16 individual MultiOutput processing jobs on a 4-node instance; doing so may take 16 of the 32 available workers and starve other scheduled jobs. Be aware of your instance's environment to avoid affecting overall job processing.