Perspectium DataSync Agents support the replication of data from your app to an Azure Blob Storage container via an Azure Blob Storage Subscriber Agent. By configuring your Azure Blob Storage Subscriber Agent, data from your app can be replicated and saved as .json or .xml files in your Azure Blob Storage container.
Prerequisites
First, you will need to set up the Perspectium DataSync Agent.
You should also stop running your DataSync Agent before making any Agent configuration changes.
Procedure
To configure your DataSync Agent to run as an Azure Blob Storage Subscriber Agent, follow these steps:
Navigate to the directory where you saved your agent.xml file when installing your DataSync Agent.
Open your agent.xml file in a text editing application and delete the following directives nested within the <task> tag:
- <database_type>
- <database_server>
- <database_port>
- <database_user>
- <database_password>
- <database_parms>
- <database_column_max_size>
- <database>
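As a sketch, a <task> block that originally included database directives such as the following (the values shown here are hypothetical examples) would have all of those lines deleted:

```xml
<!-- Before: database directives nested in <task> that must be removed
     (example values are hypothetical placeholders) -->
<task>
    ...
    <database_type>mysql</database_type>
    <database_server>localhost</database_server>
    <database_port>3306</database_port>
    <database_user>dbuser</database_user>
    <database_password>dbpassword</database_password>
    <database_parms></database_parms>
    <database_column_max_size>251</database_column_max_size>
    <database>psp_repl</database>
    ...
</task>
```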
Locate the <task_name> and <handler> directives nested within the <task> tag and update their values as follows:
Directive | Update value to... |
---|---|
<task_name> | azure_blob_storage_agent_subscribe |
<handler> | com.perspectium.replicator.file.AzureBlobStorage |
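After the update, the two directives inside your <task> tag should read:

```xml
<task_name>azure_blob_storage_agent_subscribe</task_name>
<handler>com.perspectium.replicator.file.AzureBlobStorage</handler>
```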
Within the <task> tag, nest the following directives:
Directive | Description | Required? |
---|---|---|
<file_format> | Format you want to save your data records in e.g., json or xml | Yes |
<abs_container> | Name of your Azure Blob Storage container, including any subdirectories, that specifies where the records will be uploaded, e.g., container/folder1/folder2. For example, <abs_container>pspcontainer</abs_container> will save records into the pspcontainer blob storage container. With <abs_container>pspcontainer/datasync-agent/tables/$table</abs_container> configured, if an incident record is processed and uploaded to the Azure Blob Storage container, the record will be saved in the pspcontainer container in the /datasync-agent/tables/incident directory, creating the directories datasync-agent, tables, and incident automatically. NOTE: The $table token is replaced by the table name of the record. | Yes |
<connection_string> | String provided by Azure to connect and upload to Azure Blob Storage containers. You can obtain this from the Azure Portal e.g. <connection_string>DefaultEndpointsProtocol=https;AccountName=test;AccountKey=YOURKEY;EndpointSuffix=core.windows.net</connection_string> | Yes |
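For example, the three required directives could be nested together as follows (the container path and connection string shown are placeholders to be replaced with your own values):

```xml
<!-- Required directives nested within <task>; values are placeholders -->
<file_format>json</file_format>
<abs_container>pspcontainer/datasync-agent/tables/$table</abs_container>
<connection_string>DefaultEndpointsProtocol=https;AccountName=test;AccountKey=YOURKEY;EndpointSuffix=core.windows.net</connection_string>
```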
Save the changes you've made to your agent.xml and close the file. An example agent.xml configuration for an Azure Blob Storage Subscriber Agent is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<config>
    <agent>
        <share/>
        <subscribe>
            <task>
                <task_name>azure_blob_storage_agent_subscribe</task_name>
                <message_connection password="password" user="user">https://mesh.perspectium.net</message_connection>
                <instance_connection password="password" user="user">https://myinstance.service-now.com</instance_connection>
                <handler>com.perspectium.replicator.file.AzureBlobStorage</handler>
                <decryption_key>Some decryption key here</decryption_key>
                <abs_container>pspcontainer</abs_container>
                <file_format>json</file_format>
                <connection_string></connection_string>
            </task>
        </subscribe>
        <polling_interval>40</polling_interval>
    </agent>
</config>
Files saved in the Azure Blob Storage container will be named <task_name>.<randomized_unique_identifier>.<file_format>. A randomized unique identifier is used to ensure there are no file naming collisions when saving to the Azure Blob Storage container. Using the above configuration example, a file would be named azure_blob_storage_agent_subscribe.00b470b7-901c-4447-9316-023a265d632f.json.
NOTE: In this configuration example, your data records will be saved in your Azure Blob Storage container as one file. To save each record from your app as an individual file in your Azure Blob Storage container, use the following agent.xml configuration example as a guide:
<?xml version="1.0" encoding="UTF-8"?>
<config>
    <agent>
        <share/>
        <subscribe>
            <task>
                <task_name>azure_blob_storage_agent_subscribe</task_name>
                <message_connection password="password" user="user">https://<customer>.perspectium.net</message_connection>
                <instance_connection password="password" user="user">https://<instance>.service-now.com</instance_connection>
                <handler>com.perspectium.replicator.file.AzureBlobStorage</handler>
                <decryption_key>Some decryption key here</decryption_key>
                <abs_container>pspcontainer</abs_container>
                <file_format>json</file_format>
                <connection_string></connection_string>
                <file_prefix>record_</file_prefix>
                <file_suffix>.json</file_suffix>
                <one_record_per_file/>
            </task>
        </subscribe>
        <polling_interval>40</polling_interval>
    </agent>
</config>
Saving one record per file supports the following configuration directives:
Directive | Example | Use | Required? |
---|---|---|---|
<file_prefix> | <file_prefix>record_</file_prefix> or <file_prefix>$table_$d{yyyyMMdd}_$i</file_prefix> NOTE: Use the value $table_$d{yyyyMMdd}_$i to set a dynamic file name, where $table is the record's table, yyyyMMdd is the date format, and $i is the file number, e.g., problem_20200530_1.json. You can replace yyyyMMdd with another date format of your choice; for example, hourly file names would need a yyyyMMddHH value. For other date formats, see Date Format. | A prefix for the file name of each record. If this directive is not specified, “psp.replicator.” will be used as the prefix. NOTE: The date format in this directive determines the time period reflected in each file name. | No |
<file_suffix> | <file_suffix>.xml</file_suffix> | A suffix for the file name of each record. If this directive is not specified, “.json” will be used as the suffix. | No |
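For example, to produce daily, per-table file names such as problem_20200530_1.json, the one-record-per-file directives could be configured as follows (a sketch; adjust the date format in the prefix as needed):

```xml
<!-- One file per record, named <table>_<yyyyMMdd>_<file number>.json -->
<one_record_per_file/>
<file_prefix>$table_$d{yyyyMMdd}_$i</file_prefix>
<file_suffix>.json</file_suffix>
```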
In this case, each record will be saved in its own file named <file_prefix><randomized_unique_identifier><file_suffix>. Using the above configuration example, a file would be named record_00b470b7-901c-4447-9316-023a265d632f.json.
After configuring your agent.xml file to support your Azure Blob Storage Subscriber Agent, start running your DataSync Agent again.