After following the steps to configure your Salesforce integration, you can then configure the Perspectium application to share data out of your Salesforce Org.
Shares
The Shares tab contains the different options to share data out of your Salesforce org:
Dynamic Share
A dynamic share allows for real-time sharing of Salesforce records as they are created, updated, and/or deleted. In other words, dynamic shares are caused by some triggering event (such as when cases are created, updated, or deleted). The data is shared to a subscriber, which can be the DataSync Agent, or any number of the other applications that Perspectium can integrate with.
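Conceptually, a dynamic share turns each triggering event into an outbound message for the subscriber. The sketch below is only an illustration of that flow, not the Perspectium application's actual code; the function and message field names are assumptions.

```javascript
// Illustrative sketch: a dynamic share reacts to a DML event (create/update/
// delete) and builds an outbound message for the subscriber.
function buildOutboundMessage(event, record, tableName) {
  // Only the configured triggering events produce a message.
  var handled = ['create', 'update', 'delete'];
  if (handled.indexOf(event) === -1) return null;
  return {
    topic: 'replicator',
    type: tableName,             // the shared SObject, e.g. 'Case'
    key: record.Id,              // record identifier for the subscriber
    attributes: { event: event },
    value: JSON.stringify(record)
  };
}

var msg = buildOutboundMessage('update', { Id: '500xx0000000001', Status: 'Closed' }, 'Case');
console.log(msg.type, msg.attributes.event); // Case update
```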
When selecting the Dynamic Share option you are presented with the following options:
Create New Dynamic Share to create a new dynamic share that shares out records in real time for a selected table (SObject).
View Dynamic Shares to see the dynamic shares you previously created and to update or delete them.
When creating or editing a dynamic share, do the following to ensure the dynamic share is set up correctly:
- Click Save to first save the dynamic share configuration. After you save, the dynamic share shows a preview of the Apex trigger that will be created to capture real-time changes.
- Click Save Trigger to save the Apex trigger. Repeat these steps any time you make changes to the dynamic share.
If you prefer, you can also modify the Apex trigger code directly.
Since Apex triggers are considered code that requires test coverage, you will need a test class for your Apex trigger in order to deploy it into production.
Bulk Share
A bulk share is a job that sends out records in bulk from your Salesforce org as a pre-filtered range of data at a single point in time. The data is shared to a subscriber, which can be the DataSync Agent or any of the other applications that Perspectium can integrate with.
When selecting the Bulk Share option you are presented with the following options:
Create New Bulk Share to create a new bulk share that shares out records at a point in time for a selected table (SObject).
View Bulk Shares to see the bulk shares you previously created and to re-execute them.
When creating a bulk share, you can click Save to save the bulk share or Execute Now to save and execute it. Once a bulk share has been executed, its configuration cannot be changed, but it can be executed again by clicking the Execute Now option again.
Some notable features for bulk shares:
Preview - The Preview button allows you to get a preview of which records the bulk share will share out when executed.
Clone - Creates a copy of this bulk share. This is useful if you have a bulk share that has already been executed and you want to reuse it with changes to some of its configurations, without having to create a brand new bulk share and reconfigure from scratch.
Enable Confirmation - Has the DataSync Agent validate that it has received all the records shared out as part of this bulk share execution. The bulk share sends out compare messages that the Agent uses to validate that records were received. If the Agent finds that a record is missing (by verifying whether the record exists when querying for it by the record's Id), it sends back a reshare message telling the Perspectium application in Salesforce to reshare those records. The application then creates a new reshare bulk share (named Reshare <Bulk Share Name> Datetime, i.e. Reshare AccountShare 2024-02-07 19:29:03) to share out the missing records, along with another compare message so the Agent can confirm it has received these records. This repeats until the Agent has confirmed that all records are received.
Share Updates Since Then - Shares records only since the last time this bulk share was executed. After the first run (in which all records are shared based on any filter conditions you enter), the bulk share only shares records that were updated since the last time it executed (using the bulk share's Started time as the last execution time). This is useful if you add the bulk share to a scheduled bulk share to share out only changes on a recurring basis.
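The Enable Confirmation compare step described above can be sketched as a simple set difference: the bulk share sends the Ids it shared, and the Agent reports back which of those it cannot find so they can be reshared. This is only an illustration of the mechanism; the function name and message shapes are assumptions, not Perspectium's actual implementation.

```javascript
// Sketch of the compare/reshare check: any Id in the compare message that the
// Agent cannot find among its received records must be reshared.
function findMissingRecords(compareIds, receivedIds) {
  var received = {};
  receivedIds.forEach(function (id) { received[id] = true; });
  return compareIds.filter(function (id) { return !received[id]; });
}

var shared = ['001A', '001B', '001C']; // Ids listed in the compare message
var gotten = ['001A', '001C'];         // Ids the Agent actually received
console.log(findMissingRecords(shared, gotten)); // [ '001B' ] -> reshared by a new reshare bulk share
```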
Scheduled Bulk Share
Schedule one or more bulk shares to run at a scheduled interval you choose. After creating a bulk share, you can add the created bulk configuration as a child record to be run under the schedule specified within your scheduled bulk configuration.
When selecting the Scheduled Bulk Share option you are presented with the following options:
Create New Scheduled Bulk Share to create a new scheduled bulk share and select which bulk shares to run and at what intervals.
View Scheduled Bulk Shares to see the scheduled bulk shares you previously created and modify their configurations.
When creating a new scheduled bulk share (or modifying a current one), you can do the following:
- Select a date and time in the Scheduled Date Time field for when to start bulk sharing records, and the Days and Hours to repeat the process in the Repeat Interval field.
- In the Available Bulk Shares, select which bulk shares you want to schedule.
- Check the Active box to activate the scheduled bulk share.
- Click Save to save your changes.
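The steps above combine a Scheduled Date Time with a Days/Hours Repeat Interval. The sketch below shows one way the next run time could be derived from those two fields; it is a hypothetical illustration, not Perspectium's scheduling code.

```javascript
// Hypothetical next-run calculation from a start time and a repeat interval
// expressed as days + hours (mirroring the form fields described above).
function nextRunTime(scheduledStart, repeatDays, repeatHours, now) {
  var intervalMs = (repeatDays * 24 + repeatHours) * 60 * 60 * 1000;
  // Before the scheduled start (or with no repeat), the next run is the start itself.
  if (now <= scheduledStart || intervalMs === 0) return scheduledStart;
  // Round up to the next interval boundary after "now".
  var elapsed = Math.ceil((now - scheduledStart) / intervalMs);
  return new Date(scheduledStart.getTime() + elapsed * intervalMs);
}

var start = new Date('2024-02-07T00:00:00Z');
var now = new Date('2024-02-07T05:30:00Z');
// Repeats every 2 hours: the next run after 05:30 is 06:00.
console.log(nextRunTime(start, 0, 2, now).toISOString()); // 2024-02-07T06:00:00.000Z
```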
Queues
Messages
Monitoring
Migrating a Salesforce Sandbox to Production
Salesforce sandboxes are used to develop and test changes for your organization in an environment that doesn't affect any real data or applications. When development and testing is done, you can migrate these changes from the sandbox to the production org using change sets.
This is especially useful for deploying Apex triggers in production orgs. Dynamic shares in the Perspectium Salesforce Package use Apex triggers, similar to how business rules capture changes in ServiceNow. However, unlike ServiceNow, where you can create business rules directly in a production instance, Salesforce requires you to deploy Apex triggers by creating them in a sandbox org and then moving them over to your production org using change sets.
Apex triggers use Apex code to execute the Perspectium application's code to share records out of Salesforce. To deploy anything containing Apex code (such as Apex triggers) in production, you must have 75% test coverage of all Apex code. By default, the Perspectium Package comes with a test class that covers the basic components of Perspectium's Apex code, including the dynamic share Apex triggers. However, you may need to add an additional test class to your change set that tests your trigger's Apex code (especially if there are any customizations) to ensure this 75% coverage of all Apex code. An example of a test class that tests the dynamic share Apex trigger code can be found here.
NOTE: Your Apex trigger should reference a queue that has an alias. When migrating your Apex trigger from sandbox to production using change sets, the Apex trigger code cannot be changed. Using the same alias name when creating a queue in sandbox and when creating a queue in production therefore allows you to specify different queues in your sandbox and production orgs that use the same Apex trigger.
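The alias indirection described in the note above can be sketched as a lookup: the trigger references a fixed alias, and each org maps that alias to its own queue. The lookup function and queue names below are illustrative assumptions, not the package's actual API.

```javascript
// Sketch: resolve the queue a trigger should use from its alias, per org.
function resolveQueue(alias, orgQueues) {
  var match = orgQueues.filter(function (q) { return q.alias === alias; })[0];
  if (!match) throw new Error('No queue configured for alias: ' + alias);
  return match.name;
}

// Hypothetical per-org queue configurations sharing one alias.
var sandboxQueues = [{ alias: 'psp.out.case', name: 'psp.out.replicator.sandbox_org' }];
var prodQueues = [{ alias: 'psp.out.case', name: 'psp.out.replicator.prod_org' }];

// The same trigger-referenced alias resolves to a different queue in each org,
// so the trigger code itself never has to change during migration.
console.log(resolveQueue('psp.out.case', sandboxQueues)); // psp.out.replicator.sandbox_org
console.log(resolveQueue('psp.out.case', prodQueues));    // psp.out.replicator.prod_org
```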
Prerequisites:
You will need two Salesforce orgs:
- A sandbox org sending an outbound change set
- A production org receiving an inbound change set
The Perspectium Package must be installed and configured in both orgs before the migration process begins.
Sandbox Salesforce Org Sending Outbound Change Set
Create Apex test classes for your dynamic share Apex triggers to ensure 75% test coverage of all Apex code. An example of a test class that tests the dynamic share Apex trigger code can be found here.
Log into your Salesforce sandbox org and click the gear icon in the top right-hand corner of the screen. Then, click Setup.
In the Quick Find window on the left side of the screen, type Apex Test Execution and then choose the Apex Test Execution option under Custom Code.
Check the box next to any custom Apex trigger test classes you created and then click the Run button. Verify all tests selected are able to complete without failing.
In the Quick Find window on the left side of the screen, type change sets and then click Outbound Change Sets (under Environments > Change Sets).
At the top of the Change Sets list, click New.
Type in a name for the change set. Then, click Save.
Click on Add under Change Set Components.
In the Component Type field, select Apex Trigger from the dropdown, then check the dynamic share Apex triggers you want to deploy in production. Also select any Apex test class you created, if necessary.
Click Add To Change Set when you're done.
After selecting all the components, you will be returned to the Change Set page. Click Upload.
Now that you've created a change set from your sandbox org and have sent it outbound to your production org, you can go to your production org to receive the change set inbound and deploy your dynamic share Apex triggers in production. See the next section for how to receive this change set.
Production Salesforce Org Receiving Inbound Change Set
In the Quick Find window on the left side of the screen, type change sets and then click Inbound Change Sets (under Environments > Change Sets).
Under Change Sets Awaiting Deployment, find the change set you previously sent outbound from your sandbox org and click Validate.
Select one of the listed methods to test. Then, click Validate.
Back in Inbound Change Sets, under Change Sets Awaiting Deployment, find the change set you previously validated and click Deploy.
Once you are in Deploy Change Set, choose Run local tests or Run all tests as the Test Option and click Deploy. If validation succeeds, the change set will be deployed. To see all migrated Apex triggers, go to Setup.
If the deployment fails, update the selected tests or make the appropriate changes to the change set.
NOTE: To allow the dynamic share Apex trigger to share records out to a queue, create a shared queue in your production org using the same queue alias you created and used for this dynamic share in your sandbox org.
Other
Salesforce attachments into ServiceNow
To read Salesforce attachments into ServiceNow, the u_sfdc_attachment_import import set table is provided as part of the Perspectium Salesforce Update Set for ServiceNow.
The update set comes with a subscribe configuration for u_sfdc_attachment_import so records can be read into the import set table. A script action will then run on records being inserted to properly add and delete attachments.
The Perspectium message that is shared out of Salesforce for an attachment will come with a ParentTable field that has the name of the Salesforce table that you can then use to determine which table this should map to in ServiceNow.
The Set parent table to attach to business rule on the u_sfdc_attachment_import table is provided so you can modify the corresponding field in the import set table (u_parenttable) to the appropriate ServiceNow table based on how you are subscribing Salesforce records into ServiceNow.
For example, if you are subscribing Salesforce Case records into ServiceNow Incident records, you can use the following business rule to update the field:
if (current.u_parenttable == 'Case')
    current.u_parenttable = 'Incident';
ServiceNow attachments into Salesforce
Table maps
The SFDC Attachment table map is included by default, so the table should already be set. Below is what it should look like:
Source scripts
Below are the source scripts for the required fields above.
attributes:
To configure table map to receive File sObjects from ServiceNow, add the following instead:
body:
@ExternalIdValue:
table_sys_id to ParentId:
Dynamic share
After configuring the dynamic share from the ServiceNow and Salesforce Configuration page, the related list is necessary when sharing attachments. It may be added in the same dynamic share. Follow the images below to properly configure the related list for sharing attachments if it is not shown on the form. After configuring the PSP Share Table Map table to be visible on the dynamic share, proceed to the bottom and select the respective tab. Click New and add the table and table map as shown below.
Coalesce on an External ID field
By default, records subscribed into Salesforce will coalesce on the ID field to determine if we should insert a new record or update a current record. So in most cases, you will want to create JSON messages to be consumed into Salesforce with the record's ID as found in Salesforce.
For example, if you are replicating records between ServiceNow and Salesforce, you can use the table map feature of the Perspectium application to map out a field that holds the record's Salesforce Id. This Id field will generally be saved as the correlation_id field in ServiceNow tables. That way when the record is subscribed into Salesforce, it can use this Id field to find the record and update or insert as appropriate.
However, there may be cases where you want records in Salesforce to coalesce on an External ID field (such as when you want Salesforce to coalesce on the replicated record's ServiceNow sys_id), since records may generally be created on another platform like ServiceNow rather than starting in Salesforce. Another reason is that you may prefer to have more control in the ServiceNow app, using it as your central location for all data transformation, and thus want to control coalescing there as well.
Here's how:
Create a new custom field in Salesforce on the table where you want to coalesce on an External ID field. Make sure to select the Text field option and check the box next to External ID labelled Set this field as the unique record identifier from an external system.
In the Perspectium application, navigate to your outbound messages to Salesforce. Add ExternalIDField and ExternalIDValue attributes so that when the message is received in Salesforce, it knows the name of the external ID field and what value to query for.
Here's an example to illustrate:
Using the above example, when the message is subscribed into Salesforce, the app will query the Case table and look for a record with ExternalID__c = 'f4e2e6104f120300b6a444b18110c726' and if it finds a record as such, use that record to update.
Here's how the table field map would look for the ExternalIDField attribute:
In this example, the ExternalIdField is scripted to always use the same value since the custom External ID field in Salesforce will always be the same field name while the record's sys_id is used for the External ID field's value.
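The decision described above can be reduced to a simple rule: coalesce on the external ID when both attributes are present, otherwise fall back to the default Id-based behavior. The sketch below illustrates that rule with an assumed message shape; it is not the Salesforce app's actual code.

```javascript
// Sketch of the coalescing decision: prefer the external ID field/value pair
// from the message attributes; otherwise coalesce on the record's Id.
function coalesceCriteria(message) {
  var attrs = message.attributes || {};
  if (attrs.ExternalIDField && attrs.ExternalIDValue) {
    return { field: attrs.ExternalIDField, value: attrs.ExternalIDValue };
  }
  return { field: 'Id', value: message.key };
}

var msg = {
  key: '500xx0000000001', // Salesforce record Id (default coalesce key)
  attributes: {
    ExternalIDField: 'ExternalID__c',
    ExternalIDValue: 'f4e2e6104f120300b6a444b18110c726' // ServiceNow sys_id
  }
};
console.log(coalesceCriteria(msg));
// { field: 'ExternalID__c', value: 'f4e2e6104f120300b6a444b18110c726' }
```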
NOTE: The Salesforce app will automatically query for the specified External ID field and value if both attributes exist; otherwise, it will coalesce by the Id field with its normal default behavior.
Change Salesforce Job Intervals
You can schedule the MultiOutput Processing (share) and Replicator Subscriber (subscribe) jobs to run at predefined intervals, so that Salesforce data is effectively shared out. When you create a new Perspectium job for Salesforce (or edit an existing one), you can choose from a defined set of Job Intervals - the default settings include 30 seconds, 1 minute, 5 minutes, 15 minutes, 30 minutes, and 60 minutes. You may want to set your intervals to a custom time duration that isn't included in the default settings. Here's how:
In your Salesforce organization, go to Settings (cog icon) > Setup. Using the Quick Find window at the left, type custom settings and select Custom Settings (under Custom Code).
From the list of Custom Settings, click the one labelled ReplicatorJobSettings.
Under Custom Setting Definition Detail, click Manage.
Click New (located towards the bottom of the page).
Complete the fields on the Replicator JobSettings Edit form:
Subscribe to case comments
Salesforce stores comments in a separate table, so when you share out a Case's comments, you would share out the CaseComment table and then subscribe these records into an import set table that targets the sys_journal_field table.
However, to get these comments to refresh properly in the incident's activity log as well as fire notifications, the best way to save these comments is using an onBefore transform map script in the import set table you create that targets the sys_journal_field table. Doing this will ensure that the activity log refreshes properly and any notifications set for incident comments also fire as expected.
Based on a) the incoming CaseComment JSON containing a reference to the Case record as stored in the ParentId field (which is mapped to u_parentid when read into the import set table) and b) this Case record ID stored in the correlation_id field in the incident table in ServiceNow, this script will find the incident record to add the comment through the incident:
var incsysid = '';
var inc = new GlideRecord('incident');
inc.addQuery('correlation_id', source.u_parentid);
inc.query();
if (inc.next()) {
    incsysid = inc.sys_id; // find the sys_id of the incident from the ParentId in the JSON
}

var gr = new GlideRecord('sys_journal_field');
gr.addQuery('name', 'incident');
gr.addQuery('element_id', incsysid);
gr.addQuery('value', source.u_commentbody);
gr.query();
if (gr.getRowCount() > 0) {
    ignore = true; // comment was already added previously; don't re-add
}
else {
    var igr = new GlideRecord('incident');
    if (igr.get(incsysid)) {
        igr.comments.setJournalEntry(source.u_commentbody);
        igr.update();
        ignore = true;
    }
}
Salesforce messages
Within the Perspectium application in your Salesforce instance, the top navigation menu includes tabs for Inbound Messages and Outbound Messages.
The Outbound Messages tab contains records that are queued to be sent to the Perspectium server. On this tab, you can view all messages, edit or delete an existing message, and create a new message.
NOTE: If the Status is "Ready", the Time Processed field will be blank. Once the status is "Sent", you will be able to see how long it took for the outbound message to be sent.
The Inbound Messages tab contains records that were shared to this Salesforce instance by other sources, such as ServiceNow. The data flowing into this table will insert or update the respective tables and records once the messages are processed. On this tab, you can view all messages, edit or delete an existing message, and create a new message.
ServiceNow to Salesforce comment configuration
Table maps
Sending to the CaseComment table is also supported. Unlike Incident to Case, the target for this table is CaseComment and the source table is Journal Entry [sys_journal_field]. Below are attached images for examples:
Source scripts
Below are the source scripts for the required fields above:
attributes:
answer = {"type":"CaseComment"};
isPublished:
if (current.element.contains("comment"))
    answer = "true";
ParentId:
var gr = new GlideRecord('incident');
gr.addQuery("sys_id", current.element_id.toString());
gr.query();
if (gr.next())
    answer = gr.correlation_id.toString();
@RecordId:
var gr = new GlideRecord('incident');
gr.get(current.element_id.toString());
answer = gr.sys_id.toString();
CommentBody:
var pspUtil = new PerspectiumUtil();
var gr = new GlideRecord('incident');
gr.addQuery("sys_id", current.element_id.toString());
gr.query();
gr.next();
var elementId = gr.sys_id;
var tgr = new GlideRecord("sys_journal_field");
tgr.addQuery("element_id", elementId);
tgr.orderByDesc("sys_created_on");
tgr.query();
if (tgr.next())
    pspUtil.addTag(tgr, "msp_client_incident_sent");
answer = current.value;
Transform maps
Receiving into the CaseComment table is also supported. The source for this table is CaseComment and the target table is Journal Entry [sys_journal_field]. Below are attached images for examples:
Source scripts:
Below are the source scripts for the required fields above:
element_id:
var gr = new GlideRecord("incident");
gr.addQuery("correlation_id", source.u_parentid);
gr.query();
gr.next();
return gr.sys_id; // return the value to be put into the target field
element:
return "comments"; // return the value to be put into the target field
name:
return "incident"; // return the value to be put into the target field
Once the field mappings are created, an onBefore transform script is necessary for insertion. Refer to the image and script below for an example.
var pspUtil = new PerspectiumUtil(); // required for the tag checks below
var gr = new GlideRecord("incident");
gr.addQuery("correlation_id", source.u_parentid);
gr.query();
gr.next();
var elementId = gr.sys_id;
var tgr = new GlideRecord("sys_journal_field");
tgr.addQuery("element_id", elementId);
tgr.addQuery("value", source.u_commentbody);
tgr.orderByDesc("sys_created_on");
tgr.query();
if (tgr.next() && pspUtil.recordHasTag(tgr, "msp_client_incident_sent")) {
    ignore = true; // comment already exists and was tagged as sent; skip it
    return;
}
pspUtil.addTag(tgr, "msp_client_incident_sent");
gr.comments.setJournalEntry(source.u_commentbody);
gr.setForceUpdate(true);
gr.update();
Dynamic share
After setting the table maps, the dynamic share will be able to reference them properly. The following images are an example of configuring the ServiceNow Journal Entry (sys_journal_field) table to share out to the Salesforce CaseComment object. Here is also a guide for how to set up a dynamic share.
NOTE: Keep in mind that comments made in the Incident table are created in the Journal Entry table. From there, the comments are then sent to the CaseComment table in Salesforce.
Be sure to include a "before share script" that defers comments that do not have a correlation ID, putting them on standby until the parent record exists in Salesforce.
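The deferral check described above amounts to one condition: a comment whose parent incident has no correlation_id (no Salesforce Case Id yet) should not be shared. In an actual ServiceNow before-share script this would be done with GlideRecord and the share's ignore/defer flags; the plain-JavaScript sketch below only illustrates the decision logic, and the function name is an assumption.

```javascript
// Sketch of the before-share deferral rule: defer any comment whose parent
// incident has not yet been assigned a Salesforce correlation_id.
function shouldDeferComment(parentIncident) {
  // An empty correlation_id means the Case does not exist in Salesforce yet.
  return !parentIncident.correlation_id;
}

console.log(shouldDeferComment({ correlation_id: '' }));                 // true  -> defer (stand by)
console.log(shouldDeferComment({ correlation_id: '500xx0000000001' })); // false -> share now
```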