With batch processing, records are inserted or updated in batches instead of one at a time. The DataSync Agent executes the current batch, using <batch_insert/> and <batch_update/>, when either no new records arrive within the flush interval, <batch_flush_interval>, or the number of records reaches the maximum batch size, <max_batch_size>, whichever happens first.
Using batch processing can improve the DataSync Agent's overall performance.
To use batch processing with the DataSync Agent, configure your agent.xml with the following directives:
Directive | Example | Description |
---|---|---|
<batch_insert/> | <batch_insert/> | Self-closing tag that configures your Agent to batch SQL INSERT statements. |
<batch_update/> | <batch_update/> | Self-closing tag that configures your Agent to batch SQL UPDATE statements. |
<batch_flush_interval> | <batch_flush_interval>20</batch_flush_interval> | Number of seconds without new records that triggers execution of the current batch. Acceptable values are integers from 1 to 30 (seconds). |
<max_batch_size> | <max_batch_size>200</max_batch_size> | Number of SQL statements that triggers execution of the current batch. A suggested value is 200. |
Example of agent.xml
<?xml version="1.0" encoding="ISO-8859-1" standalone="no"?>
<config>
    <agent>
        <subscribe>
            <task>
                <task_name>example_subscribe</task_name>
                <message_connection password="encrypt:MbsPassword" user="MbsUser">amqps://example.perspectium.net</message_connection>
                <instance_connection password="encrypt:MbsPassword" user="SnUser">https://example.service-now.com</instance_connection>
                <handler>com.perspectium.replicator.sql.SQLSubscriber</handler>
                <decryption_key>Some decryption key here</decryption_key>
                <database_type>mysql</database_type>
                <database_server>localhost</database_server>
                <database_port>3306</database_port>
                <database_user>perspectium</database_user>
                <database_password>DbPassword</database_password>
                <database_parms>characterEncoding=UTF-8</database_parms>
                <database>psp_repl</database>
                <batch_insert/>
                <max_batch_size>200</max_batch_size>
            </task>
        </subscribe>
        <max_reads_per_connect>4000</max_reads_per_connect>
        <polling_interval>5</polling_interval>
        <skip_message_set_processing/>
    </agent>
</config>
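The example above batches only INSERT statements and flushes on batch size. As a sketch, the same <task> element could combine all four directives so that UPDATEs are also batched and a quiet period triggers a flush; the directive values shown here are illustrative, not required:

```xml
<!-- Illustrative fragment only: the other <task> child elements
     (task_name, connections, database settings, etc.) are omitted. -->
<task>
    ...
    <!-- Batch both INSERT and UPDATE statements. -->
    <batch_insert/>
    <batch_update/>
    <!-- Flush after 20 seconds with no new records (valid range: 1-30). -->
    <batch_flush_interval>20</batch_flush_interval>
    <!-- Or flush as soon as 200 statements accumulate, whichever comes first. -->
    <max_batch_size>200</max_batch_size>
</task>
```

With this configuration, a burst of records is written in chunks of 200 statements, while a trickle of records is still written within 20 seconds.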