After following the steps to configure your DataSync application for Salesforce, you can then configure the application to share data out of your Salesforce org. The DataSync application has the following tabs to give you access to its features. Contact support@perspectium.com with any questions.
Shares
The Shares tab contains the different options to share data out of your Salesforce org:
Dynamic Share
A dynamic share allows for real-time sharing of Salesforce records as they are created, updated, and/or deleted. In other words, dynamic shares are caused by some triggering event (such as when cases are created, updated, or deleted). The data is shared to a subscriber, which can be the DataSync Agent, or any number of the other applications that Perspectium can integrate with.
When selecting the Dynamic Share option you are presented with the following options:
Create New Dynamic Share lets you create a new dynamic share to share out records in real time for a selected table (SObject).
View Dynamic Shares lets you see the dynamic shares you previously created so you can update and delete them.
When creating or editing a dynamic share, do the following to ensure the dynamic share is set up correctly:
- Click Save to first save the dynamic share configuration. After you save, the dynamic share shows a preview of the Apex trigger that will be created to capture real-time changes.
- Click Save Trigger to save the Apex trigger. Repeat these steps anytime you make changes to the dynamic share.
If you prefer, you can also modify the Apex trigger directly.
Since Apex triggers are considered code that requires test coverage, you will need to have a test class for your Apex trigger in order to deploy it into production.
Monitoring
The Include Child Records option provides a way to share out child records of a parent record. This is done by creating Apex triggers to capture events on those child tables, filtering so that only child records related to the parent record are shared out. After selecting the parent table on a dynamic share form, the Include Child Records option will populate with child tables that support Apex triggers. Select the tables you want to share child records from. For example, if you select the Case table for the dynamic share and want to share out comments on Case records, select the CaseComment table in the Include Child Records option. Like attachments, this option currently only supports sharing out all fields of the child tables.

As is the case with the Apex trigger created for the parent table specified in the share, any triggers created for the child tables will be set to blank when the dynamic share is set as inactive and/or you remove a child table from the selected list (e.g. if you first select the CaseComment table in the Include Child Records option and create a trigger for it, and later decide to remove the CaseComment table from the selected list). This is done to be consistent with the approach of not deleting Apex triggers, since Apex triggers can only be created in sub-prod orgs and moved over to prod orgs (once you delete an Apex trigger in production, you can't recreate it without going through the process of doing it in a sub-prod org). Apex triggers on the child tables are only deleted when the Apex trigger on the parent table is deleted, notably when the dynamic share itself is deleted.
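The filtering behavior described above can be pictured with a small sketch (not the generated Apex trigger itself; all names here are illustrative): only child records whose parent id matches a record covered by the dynamic share are shared out.

```javascript
// Keep only the child records whose parentId points at a parent record
// covered by the dynamic share.
function childRecordsToShare(childRecords, sharedParentIds) {
  var parents = new Set(sharedParentIds);
  return childRecords.filter(function (c) { return parents.has(c.parentId); });
}

var comments = [
  { id: 'cc1', parentId: 'case1' },
  { id: 'cc2', parentId: 'case2' }
];
// Only 'cc1' belongs to a shared parent (case1).
console.log(childRecordsToShare(comments, ['case1']).length); // 1
```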
Create a scheduled Salesforce bulk share
You can use this feature to schedule one or more bulk shares to occur at specific times or in a repeated time interval.
Bulk Share
A bulk share is a job that sends out bulk records from your Salesforce org as a pre-filtered range of data, all at once at a point in time. The data is shared to a subscriber, which can be the DataSync Agent, or any number of the other applications that Perspectium can integrate with.
When selecting the Bulk Share option you are presented with the following options:
Create New Bulk Share to create a new bulk share to share out records at a point in time for a selected table (SObject).
View Bulk Shares lets you see the bulk shares you previously created so you can re-execute them.
When creating a bulk share, you can click Save to save the bulk share or click Execute Now to save and execute it. Once a bulk share has been executed, its configurations cannot be changed, but it can be executed again by clicking the Execute Now option again.
Some notable features for bulk shares:
Preview - The Preview button will allow you to get a preview of how many records the bulk share will share out when executed.
Clone - To create a copy of this bulk share. This is useful if you have a current bulk share that's been executed and you want to use it with changes to some of the configurations without having to create a brand new bulk share and reconfigure from scratch.
Enable Confirmation - To have the DataSync Agent validate that it has received all the records shared out as part of this bulk share execution. The bulk share will send out compare messages that the Agent will use to validate that records were received. If the Agent finds that a record is missing (by querying for the record by its Id and verifying it exists), it will send back a reshare message telling the Perspectium application in Salesforce to reshare those records. The application will create a new reshare bulk share (named Reshare <Bulk Share Name> <Datetime>, i.e. Reshare AccountShare 2024-02-07 19:29:03) to share out the missing records, along with another compare message so the Agent can confirm it received these records. This will repeat until the Agent has confirmed that all records are received.
Share Updates Since Then - To share records only since the last time this bulk share was executed. After the first run (in which all records will be shared based on any filter conditions you enter), the bulk share will then only share any records that were updated since the last time the bulk share executed (using the bulk share's Started time as the last time it executed). This is useful if you set up the bulk share in a scheduled bulk share to share out only changes on a scheduled basis.
Table Map - To specify a table map for transforming how records will be shared out.
NOTE: The Table Map and Fields to Share options are not compatible with each other.
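Two of the options above can be sketched in plain JavaScript. The first models the core check behind Enable Confirmation (finding which shared records were never received, plus the documented Reshare <Bulk Share Name> <datetime> naming); the second models the Share Updates Since Then filter. All names here are illustrative, not the application's actual code:

```javascript
// Which shared record ids did the subscriber not receive? These are the
// records a reshare bulk share would send out again.
function findMissingIds(sharedIds, receivedIds) {
  var received = new Set(receivedIds);
  return sharedIds.filter(function (id) { return !received.has(id); });
}

// Name for the reshare bulk share, following the documented
// "Reshare <Bulk Share Name> <datetime>" pattern.
function reshareName(bulkShareName, datetime) {
  return 'Reshare ' + bulkShareName + ' ' + datetime;
}

// Share Updates Since Then: on subsequent runs, only records modified after
// the bulk share's last Started time are shared.
function recordsUpdatedSince(records, lastStarted) {
  return records.filter(function (r) { return r.lastModified > lastStarted; });
}

console.log(findMissingIds(['001A', '001B', '001C'], ['001A', '001C'])); // only '001B' is missing
console.log(reshareName('AccountShare', '2024-02-07 19:29:03')); // Reshare AccountShare 2024-02-07 19:29:03
```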
Scheduled Bulk Share
Schedule one or more bulk shares to run at a scheduled interval you choose. After creating a bulk share, you can add the created bulk configuration as a child record to be run under the schedule specified within your scheduled bulk configuration.
When selecting the Scheduled Bulk Share option you are presented with the following options:
Create New Scheduled Bulk Share to create a new scheduled bulk share and select which bulk shares to be run and at what intervals.
View Scheduled Bulk Shares allows you to see the scheduled bulk shares you previously created and modify their configurations.
When creating a new scheduled bulk share (or modifying a current one), you can do the following:
- Select a date and time in the Scheduled Date Time field for when to start bulk sharing records, and the Days and Hours to repeat the process in the Repeat Interval field.
- In the Available Bulk Shares, select which bulk shares you want to schedule.
- Check the Active box to activate the scheduled bulk share.
- Click Save to save your changes.
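The repeat interval arithmetic from the steps above can be pictured as follows. This is an illustrative sketch of how a Days/Hours interval advances the next run time, not the scheduler's actual implementation:

```javascript
// Advance the next run time by the Repeat Interval (days + hours).
function nextRunTime(scheduledDateTime, repeatDays, repeatHours) {
  var intervalMs = ((repeatDays * 24) + repeatHours) * 60 * 60 * 1000;
  return new Date(scheduledDateTime.getTime() + intervalMs);
}

// A schedule starting 2024-02-07 19:00 UTC with a repeat interval of
// 1 day and 6 hours next runs at 2024-02-09 01:00 UTC.
var next = nextRunTime(new Date('2024-02-07T19:00:00Z'), 1, 6);
console.log(next.toISOString()); // 2024-02-09T01:00:00.000Z
```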
Queues
The Queues tab contains the queues in the Perspectium Integration Mesh that you use to share data out of your Salesforce org and to subscribe to data into your org.
For sharing data out of your Salesforce org, you specify a Shared Queue in the Perspectium Integration Mesh where data will be saved to be consumed by a target (such as the DataSync Agent consuming messages to save records into a target database).
For subscribing to data into your org (i.e. you want to consume messages from the Integration Mesh that were shared by a different source system), you specify a Subscribed Queue in the Integration Mesh you will be consuming messages from.
NOTE: To work with the Enable Confirmation bulk share feature, you will need to create a subscribed queue in the format of psp.out.salesforce.<instance>.<organization_name> in lower case and replacing spaces with _ so the Agent can send back messages for any missing records. The Instance and Organization Name can be found in Company Information:
For example, in the above screenshot where the Organization Name is PSP DS and the Instance is NA244, you would create a subscribed queue named psp.out.salesforce.na244.psp_ds.
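The naming convention above can be expressed as a small helper. This is an illustrative sketch (the function name is ours, not part of the Perspectium application):

```javascript
// Build a subscribed queue name following the
// psp.out.salesforce.<instance>.<organization_name> convention:
// everything lowercase, with spaces replaced by underscores.
function buildSubscribedQueueName(instance, organizationName) {
  var normalize = function (s) {
    return s.toLowerCase().replace(/ /g, '_');
  };
  return 'psp.out.salesforce.' + normalize(instance) + '.' + normalize(organizationName);
}

// Example from this section: Instance NA244, Organization Name "PSP DS".
console.log(buildSubscribedQueueName('NA244', 'PSP DS')); // psp.out.salesforce.na244.psp_ds
```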
Some notable features when creating a new queue (or editing a current queue):
Direction - The direction of the queue: whether it's for sharing data out of your org (Share) or subscribing to data into your org (Subscribe).
Alias - A name used as a reference to the queue. Aliases give a "name" to a queue so that the dynamic share Apex trigger created in your sandbox org can be deployed in your production org.
- In the navigation bar near the top of the screen, click Scheduled Bulk Shares. In the upper left-hand corner of the resulting page, click New Scheduled Bulk Share.
- Select a date and time in the Scheduled Date Time field for when to start bulk sharing records, and the Days and Hours to repeat the process in the Repeat Interval field.
- In the Available Bulk Shares, select which bulk shares you want to schedule. Check the Active box to activate the scheduled bulk share.
Preview a Salesforce bulk share
Before executing a Salesforce bulk share, you can preview the bulk share to see how many records will be shared out.
Log into your Salesforce organization and click the icon in the upper left-hand corner of the screen. Then, click the Perspectium Replicator app. In the navigation bar near the top of the screen, click Bulk Shares. In the resulting page, click Preview next to the bulk share you want to preview.
Clone a Salesforce bulk share
After executing a Salesforce bulk share, you can clone the bulk share with the following steps:
Log into your Salesforce organization and click the icon in the upper left-hand corner of the screen. Then, click the Perspectium Replicator app. In the navigation bar near the top of the screen, click Bulk Shares. In the resulting page, click Clone next to the bulk share you want to clone.
Migrating a Salesforce Sandbox to Production
Salesforce sandboxes are used to develop and test changes for your organization in an environment that doesn't affect any real data or applications. When development and testing is done, you can migrate these changes from the sandbox to the production org using change sets.
This is especially useful for deploying Apex triggers in production orgs. Dynamic shares in the Perspectium Salesforce Package use Apex triggers, similar to how business rules capture changes in ServiceNow. However, unlike ServiceNow where you can create business rules in a production ServiceNow instance, Salesforce requires you deploy Apex triggers by creating them in a sandbox org and then moving them over to your production org using change sets.
Apex triggers use Apex code to execute the Perspectium application's code to share records out of Salesforce. To deploy anything containing Apex code (such as Apex triggers) in production, you must have 75% test coverage of all Apex code. By default, the Perspectium Package comes with a test class that covers the basic components of Perspectium's Apex code including the dynamic share Apex triggers. However you may need to add an additional test class to your change set that tests your trigger's Apex code (especially if there are any customizations) to ensure this 75% coverage of all Apex code. An example of a test class that tests the dynamic share Apex trigger code can be found here.
When migrating your Apex trigger from sandbox to production using change sets, the Apex trigger code cannot be changed. So using the same alias name when creating a queue in sandbox and when creating a queue in production allows you to specify different queues in your sandbox and production orgs that can use the same Apex trigger.
Prerequisites:
You will need to have two Salesforce orgs:
- A sandbox org sending an outbound change set
- A production org receiving an inbound change set
- The Perspectium Package is installed and configured in both orgs before the migration process begins
Sandbox Salesforce Org Sending Outbound Change Set
Create Apex test classes for your dynamic share Apex triggers to ensure 75% test coverage of all Apex code. An example of a test class that tests the dynamic share Apex trigger code can be found here.
Log into your Salesforce sandbox org and click the icon in the top right-hand corner of the screen. Then, click Setup.
In the Quick Find window on the left side of the screen, type Apex Test Execution and then choose the Apex Test Execution option under Custom Code.
In the upper left-hand corner of the Apex Test Execution form, click Select Tests...
Check the box next to any custom Apex trigger test classes you created and then click the Run button. Verify all tests selected are able to complete without failing.
In the Quick Find window on the left side of the screen, type change sets and then click Outbound Change Sets (under Environments > Change Sets).
At the top of the Change Sets list, click New.
Type in a name for the change set. Then, click Save.
Click on Add under Change Set Components.
In the Component Type dropdown, select Apex Trigger and checkmark the dynamic share Apex triggers you want to deploy in production. Select any test Apex class you created if necessary.
Click Add To Change Set when you're done.
After selecting all the components, the browser will lead you back to the Change Set page. Click Upload.
Now that you've created a change set from your sandbox org and have sent it outbound to your production org, you can go to your production org to receive the change set inbound and deploy your dynamic share Apex triggers in production. See the next section for how to receive this change set.
Production Salesforce Org Receiving Inbound Change Set
In the Quick Find window on the left side of the screen, type change sets and then click Inbound Change Sets (under Environments > Change Sets).
Under Change Set Awaiting Deployment, find the change set you previously sent outbound from your sandbox org and click Validate.
Select one of the listed methods to test. Then, click Validate.
Back in Inbound Change Sets, under Change Set Awaiting Deployment, find the change set you previously validated and click Deploy.
Once you are in Deploy Change Set, choose Run local tests or Run all tests as the Test Option and click Deploy. If the change set succeeds, it will be deployed; to see all migrated Apex triggers, go to Setup.
If the deployment fails, update the selected tests or make appropriate changes to the change set.
The following is an example of a successful deployment:
NOTE: So as to allow the dynamic share Apex trigger to share records out to a queue, create a shared queue in your production org using the same queue alias as you created and used for this dynamic share in your sandbox org.
Other
Queue Encryption Key - (Optional) Enter an encryption key for any shares/subscribes that use this queue. If not entered, the default key entered in Properties will be used. To ensure compatibility with all encryption methods, you will have to enter an encryption key of at least 32 characters.
NOTE: Because of how Salesforce encrypts sensitive information fields, the Queue Password and Queue Encryption Key will not be populated when you edit an existing queue. You do not have to re-enter these values when editing a queue unless you want to change their values. Leaving them blank when editing a queue and clicking Save will retain the current values.
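The 32-character minimum mentioned above can be checked with a trivial helper (an illustrative sketch; the function name is ours). 32 ASCII characters provide 256 bits of key material, which is why the minimum ensures compatibility with all encryption methods:

```javascript
// A queue encryption key must be at least 32 characters long to be
// compatible with all encryption methods.
function isValidQueueEncryptionKey(key) {
  return typeof key === 'string' && key.length >= 32;
}

console.log(isValidQueueEncryptionKey('short_key')); // false
console.log(isValidQueueEncryptionKey('this_is_a_32_character_key_12345')); // true
```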
Once you've created a queue and saved it for the first time, you can come back to the Queue's form and the following options will be available:
Get Queue Status - To check how many messages are currently in the queue in the Perspectium Integration Mesh. This will allow you to verify if your queue credentials are valid.
Show Encryption Key - To see the encryption key set for this queue. If an encryption key was not entered, the message No encryption key set for this queue will be shown (in which case the default encryption key will be used).
Tools
Click here to learn more about the different tools available for the Perspectium application.
Messages
The Messages tab contains the messages being sent to and received from the Perspectium Integration Mesh.
Outbound Messages contains records that are queued to be sent to the Perspectium Integration Mesh. Once the messages have been sent, their status will update to Sent.
NOTE: If the Status is at "Ready", the Time Processed field will be blank. Once the status is "Sent", you will be able to see how long it took for the outbound message to be sent.
In the list view of Outbound Messages you will see the following options:
Delete All - To delete all messages in Outbound Messages.
Delete replicator messages - To delete all messages in Outbound Messages with a topic of replicator.
Delete non-replicator messages - To delete all messages in Outbound Messages that do not have a topic of replicator.
NOTE: The above delete options only delete messages in the Outbound Messages table in the Perspectium application in Salesforce to help quickly clean up data in this table. This does not affect the messages in the queues in the Integration Mesh.
Inbound Messages contains records that were shared to this Salesforce instance by other sources. The data flowing into this table will insert or update the respective tables and records once the messages are processed.
When clicking on a message (both inbound and outbound), you will see a Decrypt Value option. This option will decrypt the message's value field (using the encryption key from the queue or the default key if a key wasn't set in the queue) so you can see the encrypted content that was sent/received from the Integration Mesh.
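The key-selection rule behind Decrypt Value can be sketched as follows (illustrative names, not the application's actual code): use the queue's encryption key if one was set, otherwise fall back to the default key from Properties.

```javascript
// Pick the key used to decrypt a message's value field: the queue's own
// encryption key if set, otherwise the default key from Properties.
function selectDecryptionKey(queue, defaultKey) {
  return queue.encryptionKey ? queue.encryptionKey : defaultKey;
}

console.log(selectDecryptionKey({ encryptionKey: 'queue_specific_key' }, 'default_key')); // queue_specific_key
console.log(selectDecryptionKey({}, 'default_key')); // default_key
```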
Dashboard
The Dashboard tab contains an overview of your data exchanges with the Perspectium application in your Salesforce org.
The Dashboard shows the following information:
Queues - A list of the queues you have currently configured. You can toggle between seeing all queues, shared queues and subscribed queues.
Messages - A list of outbound messages. You can toggle between seeing all messages, those that have a status of Error (to troubleshoot issues further), those that have a status of Ready (messages ready to be sent to the Integration Mesh) and Sent (messages that have already been sent to the Integration Mesh).
Bulk Share History chart - A bar chart showing how much data you've bulk shared by date. If you have not executed any bulk shares, the chart will show the message No bulk share execution history found. Run some bulk shares to show history here. NOTE: Salesforce's charting capabilities will render an empty-looking chart if you share a similar amount of data on all days (since it will appear as a line at the bottom of the chart). Once your bulk sharing levels start to differ, the chart will show bars properly.
Bulk Shares - A list of the bulk shares you have currently configured.
Dynamic Shares - A list of the dynamic shares you have currently configured.
Salesforce attachments into ServiceNow
To read Salesforce attachments into ServiceNow, the u_sfdc_attachment_import import set table is provided as part of the Perspectium Salesforce Update Set for ServiceNow.
The update set comes with a subscribe configuration for u_sfdc_attachment_import so records can be read into the import set table. A script action will then run on records being inserted to properly add and delete attachments.
The Perspectium message that is shared out of Salesforce for an attachment will come with a ParentTable field that has the name of the Salesforce table that you can then use to determine which table this should map to in ServiceNow.
The Set parent table to attach to business rule on the u_sfdc_attachment_import table is provided so you can modify the corresponding field in the import set table (u_parenttable) to the appropriate ServiceNow table based on how you are subscribing Salesforce records into ServiceNow.
For example, if you are subscribing Salesforce Case records into ServiceNow Incident records, you can use the following business rule to update the field:
```javascript
if (current.u_parenttable == 'Case')
    current.u_parenttable = 'Incident';
```
ServiceNow attachments into Salesforce

Table maps
The SFDC Attachment table is included by default, therefore the table should already be set. Below is what it should look like:

Source scripts
Below are the source scripts for the required fields above.

attributes:
To configure the table map to receive File sObjects from ServiceNow, add the following instead:
body:
@ExternalIdValue:
table_sys_id to ParentId:
Dynamic share
After configuring the dynamic share from the ServiceNow and Salesforce Configuration page, the related list is necessary when sharing attachments. It may be added in the same dynamic share. Follow the images below to properly configure the related list for sharing attachments if it is not shown on the form. After configuring the PSP Share Table Map table to be visible on the dynamic share, proceed to the bottom and select the respective tab. Click New and add the table and table map as shown below.
Coalesce on an External ID field
By default, records subscribed into Salesforce will coalesce on the ID field to determine if we should insert a new record or update a current record. So in most cases, you will want to create JSON messages to be consumed into Salesforce with the record's ID as found in Salesforce.
For example, if you are replicating records between ServiceNow and Salesforce, you can use the table map feature of the Perspectium application to map out a field that holds the record's Salesforce Id. This Id field will generally be saved as the correlation_id field in ServiceNow tables. That way when the record is subscribed into Salesforce, it can use this Id field to find the record and update or insert as appropriate.
However there may be cases where you want records in Salesforce to coalesce on an External ID field (such as when you want Salesforce to coalesce on the replicated record's ServiceNow sys_id) since records may generally be created on another platform like ServiceNow versus starting in Salesforce. Another reason is you may prefer to have more control in the ServiceNow app and have it be your central location for all data transformation and thus want to control coalescing there as well.
Here's how:
Create a new custom field in Salesforce on the table where you want to coalesce on an External ID field. Make sure to select the Text field option and check the box next to External ID labelled Set this field as the unique record identifier from an external system. In the Perspectium application, navigate to your outbound messages to Salesforce. Add ExternalIDField and ExternalIDValue attributes so that when the message is received in Salesforce, it knows the name of the external ID field and what value to query for.
Here's an example to illustrate:
Using the above example, when the message is subscribed into Salesforce, the app will query the Case table and look for a record with ExternalID__c = 'f4e2e6104f120300b6a444b18110c726' and if it finds a record as such, use that record to update.
Here's how the table field map would look for the ExternalIDField attribute:
In this example, the ExternalIdField is scripted to always use the same value since the custom External ID field in Salesforce will always be the same field name while the record's sys_id is used for the External ID field's value.
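The coalescing decision described in this section can be sketched as follows. This is an illustrative model (names are ours, not the application's actual code): if both ExternalIDField and ExternalIDValue attributes are present on an inbound message, query on that field; otherwise fall back to the record's Id.

```javascript
// Decide which field/value pair an inbound message should coalesce on.
function buildCoalesceQuery(message) {
  var attrs = message.attributes || {};
  if (attrs.ExternalIDField && attrs.ExternalIDValue) {
    return { field: attrs.ExternalIDField, value: attrs.ExternalIDValue };
  }
  // Default behavior: coalesce on the Salesforce Id field.
  return { field: 'Id', value: message.Id };
}

var msg = {
  Id: '500xx0000001',
  attributes: {
    ExternalIDField: 'ExternalID__c',
    ExternalIDValue: 'f4e2e6104f120300b6a444b18110c726'
  }
};
console.log(buildCoalesceQuery(msg).field); // ExternalID__c
console.log(buildCoalesceQuery({ Id: '500xx0000002' }).field); // Id
```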
NOTE: The Salesforce app will automatically query for the specified External ID field and value if both attributes exist - otherwise, it will coalesce by the Id field with its normal default behavior.
Change Salesforce Job Intervals
You can schedule the MultiOutput Processing (share) and Replicator Subscriber (subscribe) jobs to run at predefined intervals, so that Salesforce data is effectively shared out. When you create a new Perspectium job for Salesforce (or edit an existing one), you can choose from a defined set of Job Intervals - the default settings include 30 seconds, 1 minute, 5 minutes, 15 minutes, 30 minutes, and 60 minutes. You may want to set your intervals to a custom time duration that isn't included in the default settings. Here's how:
In your Salesforce organization, go to Settings (cog icon) > Setup. Using the Quick Find window at the left, type custom settings and select Custom Settings (under Custom Code).
From the list of Custom Settings, click the one labelled ReplicatorJobSettings.
Under Custom Setting Definition Detail, click Manage.
Click New (located towards the bottom of the page).
Complete the fields on the Replicator JobSettings Edit form:
Subscribe to case comments
Salesforce stores comments in a separate table, so when you share out a Case's comments, you would share out the CaseComment table and then subscribe these records into an import set table that targets the sys_journal_field table.
However, to get these comments to refresh properly in the incident's activity log as well as fire notifications, the best way to save these comments is using an onBefore transform map script in the import set table you create that targets the sys_journal_field table. Doing this will ensure that the activity log refreshes properly and any notifications set for incident comments also fire as expected.
Based on a) the incoming CaseComment JSON containing a reference to the Case record as stored in the ParentId field (which is mapped to u_parentid when read into the import set table) and b) this Case record ID stored in the correlation_id field in the incident table in ServiceNow, this script will find the incident record to add the comment through the incident:
```javascript
var incsysid = '';
var inc = new GlideRecord('incident');
inc.addQuery('correlation_id', source.u_parentid);
inc.query();
if (inc.next()) {
    incsysid = inc.sys_id; // find the sys_id of the incident from the ParentId in the JSON
}
var gr = new GlideRecord('sys_journal_field');
gr.addQuery('name', 'incident');
gr.addQuery('element_id', incsysid);
gr.addQuery('value', source.u_commentbody);
gr.query();
if (gr.getRowCount() > 0) {
    ignore = true; // comment was already added previously, no need to re-add
} else {
    var igr = new GlideRecord('incident');
    if (igr.get(incsysid)) {
        igr.comments.setJournalEntry(source.u_commentbody);
        igr.update();
        ignore = true;
    }
}
```
Salesforce messages
Within the Perspectium application in your Salesforce instance, the top navigation menu includes tabs for Inbound Messages and Outbound Messages.
The Outbound Messages tab contains records that are queued to be sent to the Perspectium server. On this tab, you can view all messages, edit or delete an existing message, and create a new message.
NOTE: If the Status is at "Ready", the Time Processed field will be blank. Once the status is "Sent", you will be able to see how long it took for the outbound message to be sent.
The Inbound Messages tab contains records that were shared to this Salesforce instance by other sources, such as ServiceNow. The data flowing into this table will insert or update their respective tables and record once they are processed. On this tab, you can view all messages, edit or delete an existing message, and create a new message.
ServiceNow to Salesforce comment configuration
Table maps
Sending to the CaseComment table is also supported. Unlike Incident to Case, the target for this table is CaseComment and the source table is Journal Entry [sys_journal_field]. Below are attached images for examples:
Source scripts
Below are the source scripts for the required fields above:
attributes:
```javascript
answer = {"type":"CaseComment"};
```
isPublished:
```javascript
if (current.element.contains("comment"))
    answer = "true";
```
ParentId:
```javascript
var gr = new GlideRecord('incident');
gr.addQuery("sys_id", current.element_id.toString());
gr.query();
if (gr.next())
    answer = gr.correlation_id.toString();
```
@RecordId:
```javascript
var gr = new GlideRecord('incident');
gr.get(current.element_id.toString());
answer = gr.sys_id.toString();
```
CommentBody:
```javascript
var pspUtil = new PerspectiumUtil();
var gr = new GlideRecord('incident');
gr.addQuery("sys_id", current.element_id.toString());
gr.query();
gr.next();
var elementId = gr.sys_id;
var tgr = new GlideRecord("sys_journal_field");
tgr.addQuery("element_id", elementId);
tgr.orderByDesc("sys_created_on");
tgr.query();
if (tgr.next())
    pspUtil.addTag(tgr, "msp_client_incident_sent");
answer = current.value;
```
Transform maps
Receiving into the CaseComment table is also supported. The source for this table is CaseComment and the target table is Journal Entry [sys_journal_field]. Below are attached images for examples:
Source scripts:
Below are the source scripts for the required fields above:
element_id:
```javascript
var gr = new GlideRecord("incident");
gr.addQuery("correlation_id", source.u_parentid);
gr.query();
gr.next();
return gr.sys_id; // return the value to be put into the target field
```
element:
```javascript
return "comments"; // return the value to be put into the target field
```
name:
```javascript
return "incident"; // return the value to be put into the target field
```
Once the field mappings are created, an onBefore transform script is necessary for insertion. Refer to the image and script below for an example.
```javascript
var pspUtil = new PerspectiumUtil(); // required for the tag helpers below
var gr = new GlideRecord("incident");
gr.addQuery("correlation_id", source.u_parentid);
gr.query();
gr.next();
var elementId = gr.sys_id;
var tgr = new GlideRecord("sys_journal_field");
tgr.addQuery("element_id", elementId);
tgr.addQuery("value", source.u_commentbody);
tgr.orderByDesc("sys_created_on");
tgr.query();
if (tgr.next() && pspUtil.recordHasTag(tgr, "msp_client_incident_sent")) {
    ignore = true; // comment was already sent - skip re-adding it
    return;
}
pspUtil.addTag(tgr, "msp_client_incident_sent");
gr.comments.setJournalEntry(source.u_commentbody);
gr.setForceUpdate(true);
gr.update();
```
Dynamic share
After setting the table maps, the dynamic share will be able to reference them properly. The following images are an example of configuring the ServiceNow Journal Entry (sys_journal_field) table to share out to the Salesforce CaseComment object. Here is also a guide for how to set up a dynamic share.

NOTE: Keep in mind that comments made in the Incident table are created in the Journal Entry table. From there, the comments are then sent as CaseComment records to Salesforce.
Be sure to include the "before share script" provided below. This will defer any comments that do not have a correlation id and put them on standby.
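As a sketch of what such a before share script checks, here is the deferral idea in plain JavaScript (names and structure are ours, not the actual script shipped with the application):

```javascript
// Defer (skip sharing for now) any comment whose parent incident does not
// yet have a correlation id, i.e. the matching Salesforce Case id is unknown.
function shouldDeferComment(parentIncident) {
  return !parentIncident || !parentIncident.correlation_id;
}

console.log(shouldDeferComment({ correlation_id: '500xx0000001' })); // false
console.log(shouldDeferComment({ correlation_id: '' })); // true
```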