After following the steps to configure your
Perspectium DataSync application for sharing, there are additional
options available on the Tools tab to help make your integration more robust and tailored to your organization's needs.
title | View receipts of your integration |
---|
Receipts are automatically generated in your Salesforce org for your integration. Receipts indicate the delivery status for records that have been shared from your ServiceNow instance to your Salesforce org, allowing you to quickly identify successfully synced records as well as any records that have not yet been synced. Receipt delivery statuses include:
- Success - Your records were synced successfully
- Pending - Your Salesforce org is still processing the records shared out from your ServiceNow instance
- Error - Your records were not synced successfully
To view the receipts generated in your Salesforce org for data being shared out by ServiceNow, follow these steps:
Log into your Salesforce organization and click the icon in the upper left-hand corner of the screen. Then, click the Perspectium Replicator app. In the navigation bar near the top of the screen, click Receipts. Then, on the resulting page, click into the receipt that you want to view. Note that the receipt's delivery status will be listed as the value in the Batch Type field for the receipt.
NOTE: Keep in mind that receipts generated within Salesforce are receipts for data coming into Salesforce (inbound data). To view information for receipts for data coming out of ServiceNow (outbound data), see ServiceNow messages & receipts.
title | Change Salesforce receipt default size |
---|
By default, Salesforce receipts will be generated for every 1,000 ServiceNow messages. However, you can change this default value within the Replicator Settings for your Salesforce org.
Log into your sandbox Salesforce organization and click the icon in the top right-hand corner of the screen. Then, click Setup. In the Quick Find window on the left side of the screen, type Custom Settings and then click Custom Settings (under Custom Code).
Locate the ReplicatorSettings label from the list of Custom Settings. Then, under the Action column, click Manage for ReplicatorSettings.
On the resulting ReplicatorSettings page, click Edit. Then, modify the values for the Batch Ack Size and Batch Error Size fields to be the number of messages (as a batch) that will trigger the generation of a receipt within Salesforce for inbound data.
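The effect of these batch sizes can be sketched as follows. This is a conceptual illustration only; receiptCount is a hypothetical helper, not part of the Perspectium package:

```javascript
// Illustrative sketch: estimate how many receipts a sync will generate,
// assuming one receipt is created per batch of inbound messages.
// receiptCount() is a hypothetical helper for illustration only.
function receiptCount(messageCount, batchAckSize) {
  return Math.ceil(messageCount / batchAckSize);
}

// With the default batch size of 1000, sharing 2500 records
// would produce roughly 3 receipts.
console.log(receiptCount(2500, 1000)); // 3
```

Lowering the Batch Ack Size produces more (and more granular) receipts; raising it produces fewer.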
Dynamic Share
title | Modify Salesforce Apex triggers |
---|
Salesforce Apex triggers for your Salesforce service integration indicate the conditions that will trigger your Salesforce dynamic share. Update the Apex triggers for your Salesforce dynamic shares to filter out data that is sent from Salesforce to another app. The conditions for the SOQL query that runs as part of a dynamic share's Apex trigger can be manually configured on the Apex Triggers page. To modify the default Apex triggers created for your Salesforce service integration, you will need to create a Base64 encoded condition and specify the fields you want to include in the dynamic share's outbound messages.
First, make sure you have successfully installed DataSync. Also, for your Salesforce dynamic shares, make sure that the Active, Create, Update, and Include Attachment boxes are checked.
WARNING! If you plan to make additional changes to your Apex triggers after manually modifying them, follow the procedure via SOQL query (rather than modifying your Apex triggers in the dynamic share form) to ensure that the manual modifications to your triggers are not overridden.
title | Modify Apex triggers via SOQL queries |
---|
In the navigation bar near the top of the screen, click Dynamic Shares. Then, click into the dynamic share that you want to modify Apex triggers for.
To modify the SOQL query in your parent SObject's Apex trigger, create a query using the dropdowns next to Filter Field (choose a field to filter on, an operator, and type a value in the textbox). Then, click Add Filter to see how the triggerWhere clause in your Apex trigger will be updated. To save your parent SObject Apex trigger, scroll down to the bottom of the form and click Save. The Parent Apex Trigger Preview window will then update its triggerWhere clause to include the filter you just created. Finally, click Save Trigger to save the changes you made to your dynamic share.
title | Modify Apex triggers manually |
---|
style | background: white |
---|
In the Quick Find window on the left side of the screen, type Apex Triggers and then click Apex Triggers (under Custom Code).
Locate the SObject's Apex trigger that you want to modify. Apex triggers will be named in the following convention: PSP_<dynamic share name>_<child SObject name>_<parent SObject name>. Then, under the Action column, click Edit for the Apex trigger you want to modify.
Replace the triggerWhere value with a Base64 encoded Apex trigger condition. For information on Apex trigger condition syntax, see Apex triggers. To Base64 encode your Apex trigger condition, you can use Base64encode.org.
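Any Base64 encoder produces the same result, so a condition can also be encoded from a script. A minimal sketch, where the condition string is a hypothetical example:

```javascript
// Base64 encode a hypothetical Apex trigger condition before pasting it
// into the triggerWhere value.
const condition = "Status != 'Draft'";
const encoded = Buffer.from(condition, 'utf8').toString('base64');
console.log(encoded); // U3RhdHVzICE9ICdEcmFmdCc=

// Decoding recovers the original condition:
console.log(Buffer.from(encoded, 'base64').toString('utf8'));
```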
Optionally, you can also modify the fields to be included in the Salesforce dynamic share's outbound message. To do this, locate the persp.PSPUtil.createPspOutMessage function and modify the third parameter for this function by including any table fields that you want to include in the outbound message when your Salesforce dynamic share triggers. Then, click Save near the top of the form to save your changes.
The following options are available:
Jobs
View the scheduled jobs that process inbound and outbound data to the Perspectium Integration Mesh. There are two job types:
MultiOutput Processing - The job that will run to process messages with a status of Ready in the Outbound Messages table and send them to the specified shared queues in the Integration Mesh.
Replicator Subscriber - The job that will consume messages from the specified subscribed queues in the Integration Mesh. The messages will be consumed and processed by the Perspectium application and saved into the Inbound Messages table in the app so you have the messages for reference.
When creating a new job, you have the following options available:
Description - A description to give the job for your reference (it is not used functionally). If no description is entered, the job type is used along with the datetime it will run or be aborted, e.g. MultiOutput Processing Scheduled at 2024-02-08 14:24:15.036.
Job Type - The type of job as mentioned above. NOTE: It is recommended to have only one job for each job type to avoid race conditions where the same messages are sent out twice.
Job Interval - How often the job should run.
Job Delay Interval - An interval to delay the job and retry if the maximum number of running Salesforce jobs has been exceeded and the job cannot be executed. For example, if the job cannot execute and the delay is set to 5 minutes, it will try again in 5 minutes. If not set, the job will retry at its normal job interval, which may fail again if the maximum number is still exceeded. This option gives Salesforce time to finish executing other jobs before the job tries to run again.
Email User - A user to email if the job has connectivity issues connecting to the Perspectium Integration Mesh.
Email Interval - How long to send emails about connectivity issues. If the issue is ongoing until you fix it (such as bad Integration Mesh credentials), you will only be notified for the time specified here, after which no further emails are sent even if the issue is still occurring.
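The Job Delay Interval behavior described above can be sketched conceptually. The function and parameter names below are illustrative, not the package's actual code:

```javascript
// Conceptual sketch of when a job next attempts to run.
// If Salesforce's concurrent job limit is exceeded and a Job Delay Interval
// is configured, the job retries after that delay; otherwise it waits
// for its normal Job Interval.
function nextRetryMinutes(limitExceeded, jobIntervalMin, delayIntervalMin) {
  if (limitExceeded && delayIntervalMin) {
    return delayIntervalMin; // back off and give other jobs time to finish
  }
  return jobIntervalMin; // normal schedule
}

console.log(nextRetryMinutes(true, 15, 5));  // 5
console.log(nextRetryMinutes(false, 15, 5)); // 15
```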
Table Maps
Using table maps allows you to map a record and its fields to a JSON message with your preferred field names in the outbound message that is shared out of Salesforce. This is useful when you want to map the record to specific fields for use by a target database/system that is already set up to get data from those fields.
To create a table map for use in your dynamic or bulk share:
- Create a new table map by going to the Tools tab and choosing the Table Maps option. Select the New Table Map option and the form will open to create the table map.
- Enter the following information for the table map:
- Table Map Name - A name for this table map to reference when selecting it in a dynamic or bulk share. This name is only used for reference and not as part of the outbound message.
- Source Table - The source table (SObject) to map from. This should match the table selected in the dynamic or bulk share where this table map is used.
- Target Table Name - The name of the table that will be used in the name field of outbound messages. Outbound messages have the format <table_name>.<action> (e.g. Case.bulk) so the DataSync Agent that consumes a message knows what table the record is intended for and what action to take. You can use this field to change the table name used in the outbound message. For example, if you enter the value incident, a bulk share using this table map will create outbound messages with the name incident.bulk.
- Click Save to save the table map.
NOTE: You will need to save the table map first before you can select the fields you want to map.
- Select the fields you want to include in the table map. Use the Add New Field Mapping section to add each field, choosing the field from the source table to map from (Source Field) and entering what the name of this field should be in the table map (Target Field). Source fields from the source table will be listed using their field labels.
For example, if you will be mapping fields from the Case table to incident and you want the Description field in the Case table to map to the short_description field in your generated table map, you would do the following:
Click Add Table Field Map to add this field mapping to the table map. Repeat this step for all the fields you want to map. - To delete a field mapping from the table map, choose the Delete option next to each field map (under the Action column). You can use Delete All Table Field Maps to quickly remove all field mappings if you want to redo this table map and add new fields.
- Next, create a dynamic or bulk share where you want to use this table map. On the dynamic or bulk share form, there is a Table Map field where you can look up the table map you created. Select this table map and save the dynamic or bulk share after finishing any other configurations.
NOTE: You will need to use a table map that has the same source table as the table selected in your dynamic or bulk share.
For example, if you want to share Case records and want to share them to the incident table in your database that has the fields short_description, sys_id and state, you can create a table map to share out Case records as follows:
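For instance, the outbound message produced by a bulk share using such a table map might be shaped as follows. This is an illustrative sketch only: the field values and message envelope are assumptions, but the name follows the <table_name>.<action> format described above.

```javascript
// Illustrative shape of an outbound message created by a bulk share that
// uses a Case -> incident table map. Field values are hypothetical.
const outboundMessage = {
    name: 'incident.bulk', // target table name + action
    value: {
        short_description: 'Customer cannot log in', // mapped from Description
        sys_id: '500000000000001AAA',                // mapped from the Case Id
        state: 'New'                                 // mapped from Status
    }
};
console.log(outboundMessage.name); // incident.bulk
```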
Logs
View the logs generated by the Perspectium application. The configuration for how the application logs can be found in Properties.
title | Salesforce Apex Trigger Test Class |
---|
Before deploying Apex triggers from sandbox to production, each trigger will require a test class to cover at least 75% of the trigger code.
A test class will need to be written to test the trigger code as follows:
- Trigger Actions - If the dynamic share has insert, update and/or delete selected, the test class will need to perform the respective action(s) selected (i.e. insert the object).
- Trigger Conditions - An object matching the dynamic share conditions will need to have matching action(s) (such as an insert).
- Apex Object Requirements - Mandatory fields required to perform the action(s) above. For example, if the dynamic share is on the Contract object, the AccountId field must be populated in order to insert a new Contract object. The AccountId field is a reference field to the Account object and requires an Account be created first.
- Extra Trigger Options - Any other additional related triggers created with the dynamic share (such as the Attachment and ContentDocumentLink triggers created when selecting the Include Attachments option) to share out those related records. These triggers will also need to be tested for trigger code coverage.
See below test class for an example of testing a dynamic share Apex trigger created on the Contract object:
- Trigger Actions - The insert and update actions are selected on the dynamic share so the test class inserts and updates Contract objects.
- Trigger Conditions - The dynamic share is set to only trigger when the Contract object's Status != 'Draft'. Contract objects are inserted with Status = 'Draft' and then updated to 'In Progress'.
- Apex Object Requirements - The Contract object requires the AccountId, Contract_Type__c, StartDate, EndDate, and Preferred_Completion_Date__c fields to be populated. These fields are populated when creating the test Contract objects. Since AccountId is a reference field to the Account object, test Account objects are created first to use for this field.
- Extra Trigger Options - The dynamic share has the Include Attachments option selected to trigger and share out attachments when they are added. Test Attachment and ContentDocumentLink objects are created to test these triggers and share out those objects.
Code Block |
---|
@isTest(SeeAllData=true)
global class PSPContractTriggerTest {
private static testmethod void TestPSPContractTrigger() {
List<persp__psp_replicate_conf__c> shares = [select Id,persp__create__c,persp__update__c,persp__delete__c,persp__table__c from persp__psp_replicate_conf__c where persp__table__c='Contract'];
System.debug('Found Share: ' + shares.size());
for(persp__psp_replicate_conf__c share : shares){
System.debug('processing table: ' + share.persp__table__c);
SObject shareObj = Schema.getGlobalDescribe().get(share.persp__table__c).newSObject() ;
//Account object is required to create a Contract
Account testAccount = new Account(name='PSP_Test Account');
insert testAccount;
String testName = 'PSP_Test';
List<Contract> contractList = new List<Contract>();
for(Integer i = 0; i <= 200; i++){
Contract testContract = new Contract(
AccountId = testAccount.id,
Name = testName + i,
Short_Description__c = testName,
StartDate = System.today() + 1,
EndDate = System.today() + 3,
Preferred_Completion_Date__c = System.today() + 2,
Status = 'Draft',
Contract_Type__c = 'Other'
);
contractList.add(testContract);
}
insert contractList;
if(share.persp__update__c){
String q = 'select Id, Status from ' + share.persp__table__c + ' WHERE Short_Description__c =: testName LIMIT 1';
List<Contract> sobjList = Database.query(q);
if (sobjList.size() == 0)
continue;
System.debug('returned: ' + sobjList.size());
System.assertEquals(sobjList.size(), 1);
for(Contract o : sobjList){
o.Status = 'In Progress';
update o;
String body1='x';
Blob b = Blob.valueOf(body1);
Attachment attach1= new Attachment();
attach1.Name = 'Test Attachment';
attach1.ParentId = o.Id;
attach1.Body = b;
insert attach1;
attach1.Name = 'Updates Test Attachment';
update attach1;
System.assertEquals(attach1.Name, 'Updates Test Attachment');
delete attach1;
//ContentDocumentLink Test
ContentVersion contentVersion = new ContentVersion(
Title = 'Test',
PathOnClient = 'Test',
VersionData = Blob.valueOf('Test Content'),
IsMajorVersion = true
);
insert contentVersion;
List<ContentDocument> documents = [SELECT Id, Title, LatestPublishedVersionId FROM ContentDocument];
//create ContentDocumentLink record
ContentDocumentLink cdl = New ContentDocumentLink();
cdl.LinkedEntityId = o.id;
cdl.ContentDocumentId = documents[0].Id;
cdl.shareType = 'V';
insert cdl;
cdl.shareType = 'I';
update cdl;
try {
Delete cdl;
} catch (Exception e) {
System.debug('Error deleting ContentDocumentLink');
}
} // end for(Contract o : sobjList){
} // end if(share.persp__update__c){
} // end for(persp__psp_replicate_conf__c share : shares)
} // end testmethod()
}
|
title | Bypass users |
---|
This feature allows you to prevent specified users from triggering a dynamic share. This is especially useful for Salesforce users that are loading large records via Data Loader and subsequently do not want the records to be dynamically shared. This feature is located within the Dynamic Share Trigger Builder Properties page. To use the trigger bypass, enter the email of the user you want to exclude in the Bypass Users field. Once the properties page is saved, it will take effect immediately and all Dynamic Shares will be bypassed for the users specified. |
title | Include child records |
---|
The Include Child Records option provides a way to share out child records of a parent record. This is done by creating Apex triggers to capture events on those child tables, filtering to only share out child records related to the parent record. After selecting the parent table on a dynamic share form, the Include Child Records option will populate with child tables that support Apex triggers. Select the tables you want to share child records from. For example, if you select the Case table for the dynamic share and want to share out comments on Case records, select the CaseComment table in the Include Child Records option. Like attachments, this option currently only supports sharing out all fields of the child tables. As is the case with the Apex trigger created for the parent table specified in the share, any triggers created for the child tables will be set to blank when the dynamic share is set as inactive and/or you remove a child table from the selected list (i.e. if you first select the CaseComment table in the Include Child Records option and create a trigger for it, and then later decide to remove the CaseComment table from the selected list). This is done to be consistent with the approach of not deleting Apex triggers, since Apex triggers can only be created in sub-prod orgs and moved over to prod orgs (i.e. once you delete an Apex trigger in production, you can't recreate it without going through the process of doing it in a sub-prod org). Apex triggers on the child tables can be deleted, but only when the Apex trigger on the parent table is deleted, notably when the dynamic share itself is deleted. |
Bulk Share
title | Create a scheduled Salesforce bulk share |
---|
You can use this feature to schedule one or more bulk shares to occur at specific times or at a repeated time interval.
Log into your Salesforce organization and click the icon in the upper left-hand corner of the screen. Then, click the Perspectium Replicator app. In the navigation bar near the top of the screen, click Scheduled Bulk Shares. In the upper left-hand corner of the resulting page, click New Scheduled Bulk Share.
Select a date and time in the Scheduled Date Time field for when to start bulk sharing records, and the Days and Hours to repeat the process in the Repeat Interval field.
In the Available Bulk Shares list, select which bulk shares you want to schedule and add them to the scheduled bulk share. Check the Active box to activate the scheduled bulk share.
title | Preview a Salesforce bulk share |
---|
Before executing a Salesforce bulk share, you can preview the bulk share to see how many records will be shared out.
Log into your Salesforce organization and click the icon in the upper left-hand corner of the screen. Then, click the Perspectium Replicator app. In the navigation bar near the top of the screen, click Bulk Shares. Then, on the resulting page, click into the bulk share that you want to preview.
title | Clone a Salesforce bulk share |
---|
After executing a Salesforce bulk share, you can clone the bulk share with the following steps:
Log into your Salesforce organization and click the icon in the upper left-hand corner of the screen. Then, click the Perspectium Replicator app. In the navigation bar near the top of the screen, click Bulk Shares. Then, on the resulting page, click into the bulk share that you want to clone.
Migrating a Salesforce Sandbox to Production
title | Salesforce sandbox to production |
---|
Salesforce sandboxes are used to develop and test changes for your organization in an environment that doesn't affect any real data or applications. When development and testing is done, you can migrate these changes from the sandbox to the production org using change sets.
This is especially useful for deploying Apex triggers in production orgs. Dynamic shares in the Perspectium Salesforce Package use Apex triggers, similar to how business rules capture changes in ServiceNow. However, unlike ServiceNow where you can create business rules in a production ServiceNow instance, Salesforce requires you deploy Apex triggers by creating them in a sandbox org and then moving them over to your production org using change sets.
Apex triggers use Apex code to execute the Perspectium application's code to share records out of Salesforce. To deploy anything containing Apex code (such as Apex triggers) in production, you must have 75% test coverage of all Apex code. By default, the Perspectium Package comes with a test class that covers the basic components of Perspectium's Apex code including the dynamic share Apex triggers. However you may need to add an additional test class to your change set that tests your trigger's Apex code (especially if there are any customizations) to ensure this 75% coverage of all Apex code. An example of a test class that tests the dynamic share Apex trigger code can be found here.
NOTE: Your Apex trigger should reference a queue that has an alias. When migrating your Apex trigger from sandbox to production using change sets, the Apex trigger code cannot be changed. Using the same alias name when creating a queue in sandbox and creating a queue in production allows you to specify different queues in your sandbox and production orgs that can use the same Apex trigger.
Prerequisites:
You will need to have two Salesforce orgs:
- A sandbox org sending an outbound change set
- A production org receiving an inbound change set
- The Perspectium Package is installed and configured in both orgs before the migration process begins
title | Sandbox Salesforce Org Sending Outbound Change Set |
---|
Create Apex test classes for your dynamic share Apex triggers to ensure 75% test coverage of all Apex code. An example of a test class that tests the dynamic share Apex trigger code can be found here.
Log into your Salesforce sandbox org and click the icon in the top right-hand corner of the screen. Then, click Setup.
In the Quick Find window on the left side of the screen, type Apex Test Execution and then click the Apex Test Execution option (under Custom Code).
In the upper left-hand corner of the Apex Test Execution form, click Select Tests...
Check the box next to any custom Apex trigger test classes you created and then click the Run button. Verify all tests selected are able to complete without failing.
In the Quick Find window on the left side of the screen, type change sets and then click Outbound Change Sets (under Environments > Change Sets).
At the top of the Change Sets list, click New.
Type in a name for the change set. Then, click Save.
Click on Add under Change Set Components.
In the Component Type dropdown, select Apex Trigger and checkmark the dynamic share Apex triggers you want to deploy in production. Select any test Apex class you created if necessary.
Click Add To Change Set when you're done.
After selecting all the components, the browser will lead you back to the Change Set page. Click Upload.
Now that you've created a change set from your sandbox org and have sent it outbound to your production org, you can go to your production org to receive the change set inbound and deploy your dynamic share Apex triggers in production. See the next section for how to receive this change set.
title | Production Salesforce Org Receiving Inbound Change Set |
---|
In the Quick Find window on the left side of the screen, type change sets and then click Inbound Change Sets (under Environments > Change Sets).
Under Change Set Awaiting Deployment, find the change set you previously sent outbound from your sandbox org and click Validate.
Select one of the listed methods to test. Then, click Validate.
Back on the Inbound Change Sets page, under Change Set Awaiting Deployment, find the change set you previously validated and click Deploy.
Once you are in Deploy Change Set, choose Run local tests or Run all tests as the Test Option and click Deploy. If the deployment succeeds, the change set will be deployed. To see all migrated Apex triggers, go to Setup.
If the deployment fails, update the selected tests or make appropriate changes to the change set.
The following is a successful deployment:
title | Successful Deployment |
---|
NOTE: To allow the dynamic share Apex trigger to share records out to a queue, create a shared queue in your production org using the same queue alias you created and used for this dynamic share in your sandbox org.
Other
title | Salesforce attachments into ServiceNow |
---|
To read Salesforce attachments into ServiceNow, the u_sfdc_attachment_import import set table is provided as part of the Perspectium Salesforce Update Set for ServiceNow.
The update set comes with a subscribe configuration for u_sfdc_attachment_import so records can be read into the import set table. A script action will then run on records being inserted to properly add and delete attachments.
The Perspectium message that is shared out of Salesforce for an attachment will come with a ParentTable field that has the name of the Salesforce table that you can then use to determine which table this should map to in ServiceNow.
The Set parent table to attach to business rule on the u_sfdc_attachment_import table is provided so you can modify the corresponding field in the import set table (u_parenttable) to the appropriate ServiceNow table based on how you are subscribing Salesforce records into ServiceNow.
For example, if you are subscribing Salesforce Case records into ServiceNow Incident records, you can use the following business rule to update the field:
Code Block |
---|
if (current.u_parenttable == 'Case')
current.u_parenttable = 'Incident'; |
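If you subscribe records from multiple Salesforce tables, a lookup object keeps the business rule readable. The following is a minimal sketch of that approach; only the Case to Incident pair comes from the example above, and the other pairing is a hypothetical placeholder:

```javascript
// Map each Salesforce parent table to its target ServiceNow table.
// 'Case' -> 'Incident' is from the example above; 'Account' -> 'core_company'
// is a hypothetical placeholder for your own subscribe configuration.
var parentTableMap = {
    'Case': 'Incident',
    'Account': 'core_company'
};

function mapParentTable(sfTable) {
    // Fall back to the original table name when no mapping is defined
    return parentTableMap[sfTable] || sfTable;
}

// In the business rule, this would be used as:
// current.u_parenttable = mapParentTable(String(current.u_parenttable));
```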
title | ServiceNow attachments into Salesforce |
---|
Table maps
The SFDC Attachment table is included by default, therefore the table should already be set. Below is what it should look like:
Source scripts
Below are the source scripts for the required fields above.
attributes:
To configure the table map to receive File sObjects from ServiceNow, add the following instead:
body:
@ExternalIdValue:
table_sys_id to ParentId:
Dynamic share
After configuring the dynamic share from the ServiceNow and Salesforce Configuration page, the related list is necessary when sharing attachments. It may be added in the same dynamic share. Follow the images below to properly configure the related list for sharing attachments if it is not shown on the form. After configuring the PSP Share Table Map table to be visible on the dynamic share, proceed to the bottom and select the respective tab. Click New and add the following table and table map as shown below. |
title | Coalesce on an External ID field |
---|
By default, records subscribed into Salesforce will coalesce on the Id field to determine whether to insert a new record or update a current record. So in most cases, you will want to create JSON messages to be consumed into Salesforce with the record's Id as found in Salesforce.
For example, if you are replicating records between ServiceNow and Salesforce, you can use the table map feature of the Perspectium application to map out a field that holds the record's Salesforce Id. This Id field will generally be saved as the correlation_id field in ServiceNow tables. That way when the record is subscribed into Salesforce, it can use this Id field to find the record and update or insert as appropriate.
However, there may be cases where you want records in Salesforce to coalesce on an External ID field (such as when you want Salesforce to coalesce on the replicated record's ServiceNow sys_id), since records may generally be created on another platform like ServiceNow rather than starting in Salesforce. You may also prefer to have more control in the ServiceNow app and have it be your central location for all data transformation, and thus want to control coalescing there as well.
Here's how:
Create a new custom field in Salesforce on the table where you want to coalesce on an External ID field. Make sure to select the Text field option and check the box next to External ID labelled Set this field as the unique record identifier from an external system. In the Perspectium application, navigate to your outbound messages to Salesforce. Add ExternalIDField and ExternalIDValue attributes, so that when the message is received in Salesforce, it knows the name of the external ID field and what value to query for.
Here's an example to illustrate:
Using the above example, when the message is subscribed into Salesforce, the app will query the Case table and look for a record with ExternalID__c = 'f4e2e6104f120300b6a444b18110c726' and if it finds a record as such, use that record to update.
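Using the values from this example, the relevant portion of the message could be modeled as follows. This is a sketch only: the exact message envelope is an assumption, but the ExternalID__c field name and sys_id value match the example above.

```javascript
// Sketch of the coalescing attributes on an inbound Salesforce message.
// ExternalIDField names the custom External ID field on the Case table;
// ExternalIDValue holds the ServiceNow sys_id to query for.
const messageAttributes = {
    ExternalIDField: 'ExternalID__c',
    ExternalIDValue: 'f4e2e6104f120300b6a444b18110c726'
};

// The app would effectively query for a record like this:
const soql = "SELECT Id FROM Case WHERE " + messageAttributes.ExternalIDField +
             " = '" + messageAttributes.ExternalIDValue + "'";
console.log(soql);
```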
Here's how the table field map would look for the ExternalIDField attribute:
In this example, the ExternalIdField is scripted to always use the same value since the custom External ID field in Salesforce will always be the same field name while the record's sys_id is used for the External ID field's value.
Info |
---|
The Salesforce app will automatically query for the specified External ID field and value if both attributes exist - otherwise, it will coalesce by the Id field with its normal default behavior. |
title | Change Salesforce Job Intervals |
---|
You can schedule the MultiOutput Processing (share) and Replicator Subscriber (subscribe) jobs to run at predefined intervals, so that Salesforce data is effectively shared out. When you create a new Perspectium job for Salesforce (or edit an existing one), you can choose from a defined set of Job Intervals - the default settings include 30 seconds, 1 minute, 5 minutes, 15 minutes, 30 minutes, and 60 minutes. You may want to set your intervals to a custom time duration that isn't included in the default settings. Here's how:
In your Salesforce organization, go to Settings (cog icon) > Setup. Using the Quick Find window at the left, type Custom Settings and select Custom Settings (under Custom Code).
From the list of Custom Settings, click the one labelled ReplicatorJobSettings.
Under Custom Setting Definition Detail, click Manage.
Click New (located towards the bottom of the page).
Complete the fields on the ReplicatorJobSettings Edit form:
title | Subscribe to case comments |
---|
Salesforce stores comments in a separate table, so when you share out a Case's comments, you would share out the CaseComment table and then subscribe these records into an import set table that targets the sys_journal_field table.
However, to get these comments to refresh properly in the incident's activity log and to fire notifications, the best way to save them is to use an onBefore transform map script in the import set table you create that targets the sys_journal_field table. Doing this ensures that the activity log refreshes properly and that any notifications set for incident comments fire as expected.
This script finds the incident record through which to add the comment, based on (a) the incoming CaseComment JSON containing a reference to the Case record in its ParentId field (mapped to u_parentid when read into the import set table) and (b) this Case record ID being stored in the correlation_id field of the incident table in ServiceNow:
Code Block |
---|
var incsysid = '';
// Find the incident whose correlation_id holds the Salesforce Case ID (ParentId)
var inc = new GlideRecord('incident');
inc.addQuery('correlation_id', source.u_parentid);
inc.query();
if (inc.next()) {
    incsysid = inc.sys_id.toString();
}
// Skip the insert if this comment already exists in the incident's journal
var gr = new GlideRecord('sys_journal_field');
gr.addQuery('name', 'incident');
gr.addQuery('element_id', incsysid);
gr.addQuery('value', source.u_commentbody);
gr.query();
if (gr.getRowCount() > 0) {
    ignore = true; // comment was already added previously; don't re-add
}
else {
    // Add the comment through the incident so the activity log refreshes
    // and comment notifications fire as expected
    var igr = new GlideRecord('incident');
    if (igr.get(incsysid)) {
        igr.comments.setJournalEntry(source.u_commentbody);
        igr.update();
        ignore = true;
    }
} |
title | Salesforce messages |
---|
style | background: white |
---|
Within the Perspectium application in your Salesforce instance, the top navigation menu includes tabs for Inbound Messages and Outbound Messages.
The Outbound Messages tab contains records that are queued to be sent to the Perspectium server. On this tab, you can view all messages, edit or delete an existing message, and create a new message.
NOTE: If the Status is "Ready", the Time Processed field will be blank. Once the status changes to "Sent", you can see how long the outbound message took to send.
The Inbound Messages tab contains records that were shared to this Salesforce instance by other sources, such as ServiceNow. As these messages are processed, the data they carry inserts or updates records in their respective target tables. On this tab, you can view all messages, edit or delete an existing message, and create a new message.
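Processing an inbound message amounts to an upsert into the target table. The following plain JavaScript sketches that idea against a mocked table (illustrative only; the function, message shape, and key field are assumptions, not the app's actual code):

```javascript
// Sketch: apply an inbound message to a mocked target table by key,
// inserting when no match exists and updating the matched record otherwise.
function applyInbound(table, message) {
  var existing = table.find(function (r) { return r.key === message.key; });
  if (existing) {
    Object.assign(existing, message.fields); // update the matched record
    return 'updated';
  }
  table.push(Object.assign({ key: message.key }, message.fields)); // insert
  return 'inserted';
}
```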
title | ServiceNow to Salesforce comment configuration |
---|
style | background: white |
---|
Table maps
Sending to the CaseComment table is also supported. Unlike Incident to Case, the target table here is CaseComment and the source table is Journal Entry [sys_journal_field]. See the attached images below for examples:
Source scripts
Below are the source scripts for the required fields above:
attributes:
Code Block |
---|
answer = {"type":"CaseComment"}; |
isPublished:
Code Block |
---|
if (current.element.contains("comment"))
answer = "true"; |
ParentId:
Code Block |
---|
var gr = new GlideRecord('incident');
gr.addQuery("sys_id", current.element_id.toString());
gr.query();
if (gr.next())
answer = gr.correlation_id.toString(); |
@RecordId:
Code Block |
---|
var gr = new GlideRecord('incident');
// Only set an answer if the incident actually exists
if (gr.get(current.element_id.toString()))
    answer = gr.sys_id.toString(); |
CommentBody:
Code Block |
---|
var pspUtil = new PerspectiumUtil();
// Find the incident that this journal entry belongs to
var gr = new GlideRecord('incident');
gr.addQuery("sys_id", current.element_id.toString());
gr.query();
if (gr.next()) {
    var elementId = gr.sys_id;
    // Tag the most recent journal entry so inbound copies of this comment
    // can be recognized and ignored on subscribe
    var tgr = new GlideRecord("sys_journal_field");
    tgr.addQuery("element_id", elementId);
    tgr.orderByDesc("sys_created_on");
    tgr.query();
    if (tgr.next())
        pspUtil.addTag(tgr, "msp_client_incident_sent");
}
answer = current.value; |
Transform maps
Receiving into the CaseComment table is also supported. The source for this table is CaseComment and the target table is Journal Entry [sys_journal_field]. See the attached images below for examples:
Source scripts:
Below are the source scripts for the required fields above:
element_id:
Code Block |
---|
var gr = new GlideRecord("incident");
gr.addQuery("correlation_id", source.u_parentid);
gr.query();
if (gr.next())
    return gr.sys_id; // return the value to be put into the target field |
element:
Code Block |
---|
return "comments"; // return the value to be put into the target field |
name:
Code Block |
---|
return "incident"; // return the value to be put into the target field |
Once the field mappings are created, an onBefore transform script is necessary for insertion. Refer to the image and script below for an example.
Code Block |
---|
var pspUtil = new PerspectiumUtil();
// Find the incident mapped to this Case via its correlation_id
var gr = new GlideRecord("incident");
gr.addQuery("correlation_id", source.u_parentid);
gr.query();
if (!gr.next()) {
    ignore = true; // no matching incident yet; skip this comment
    return;
}
var elementId = gr.sys_id;
// Check whether this comment originated here (tagged when it was sent out)
var tgr = new GlideRecord("sys_journal_field");
tgr.addQuery("element_id", elementId);
tgr.addQuery("value", source.u_commentbody);
tgr.orderByDesc("sys_created_on");
tgr.query();
if (tgr.next() && pspUtil.recordHasTag(tgr, "msp_client_incident_sent")) {
    ignore = true;
    return;
}
pspUtil.addTag(tgr, "msp_client_incident_sent");
// Add the comment through the incident so the activity log and
// comment notifications behave as expected
gr.comments.setJournalEntry(source.u_commentbody);
gr.setForceUpdate(true);
gr.update(); |
Dynamic share
After setting up the table maps, dynamic share will be able to reference them properly. The following images show an example of configuring the ServiceNow Journal Entry (sys_journal_field) table to share out to the Salesforce CaseComment object. Here is also a guide for how to set up a dynamic share.
Info |
---|
Keep in mind that comments made on records in the Incident table are created in the Journal Entry table. From there, the comments are sent to the CaseComment object in Salesforce. |
Be sure to include the “before share script” provided below. This script defers comments whose parent incident does not yet have a correlation ID, placing them on standby.
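The core check that such a script makes can be sketched as follows in plain JavaScript (illustrative only; the actual before share script runs server-side in ServiceNow against GlideRecord objects and defers via the dynamic share's own mechanism):

```javascript
// Sketch of the defer decision: a journal entry should not be shared out
// until its parent incident has a correlation ID (the Salesforce Case ID).
function shouldDefer(incident) {
  return !incident || !incident.correlation_id;
}
```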