There are several performance options you should review to tune your DPM configuration.
Network: DPM can be a network-intensive application, especially during initial full backup and express full backup creation.
- Check that your DPM servers have a fixed network speed on both the server and the switch side. Make sure there are no duplex mismatches.
- When the load on your production network is too high, consider a dedicated backup LAN.
- It is Microsoft best practice for DPM to use a pagefile size that is 0.2 percent of the size of all recovery point volumes combined, in addition to the recommended size (generally, 1.5 times the amount of RAM on the computer). For example, if the recovery point volumes on a DPM server total 2 TB, you should increase the pagefile size by 4 GB.
- Additionally, consider moving the pagefile to a different volume. The best performance gains are achieved when this volume is not only located on a separate hard drive but is also the only volume on that drive, with pagefile.sys as the only file on the volume. That last step is probably over the top for most configurations, but it is an option to consider; you can use Resource Monitor to find your bottleneck.
- On the DPM server, the volumes in the DPM storage pool are not visible to the virus scanner, so there is no need to exclude them from real-time protection scanning.
- Exclude the SQL database and its log files from real-time protection virus scanning; many virus scan programs temporarily lock a file while scanning it, which may cause data integrity issues. DO NOT run real-time virus scanning against any database files.
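The pagefile sizing rule above can be sketched as a quick calculation. The 1.5 × RAM baseline and the 0.2 percent figure come from the guidance; the 16 GB RAM value is a hypothetical example, and 1 TB is treated as 1000 GB to match the text's 2 TB → 4 GB example:

```python
def recommended_pagefile_gb(ram_gb, recovery_point_volumes_tb):
    """Recommended pagefile size per the DPM guidance:
    1.5 x RAM (the general recommendation) plus 0.2 percent of the
    combined size of all recovery point volumes."""
    baseline = 1.5 * ram_gb
    # 0.2 percent of the recovery point volumes, converted TB -> GB
    extra = 0.002 * recovery_point_volumes_tb * 1000
    return baseline + extra

# The text's example: 2 TB of recovery point volumes adds 4 GB
# on top of the baseline (here a hypothetical server with 16 GB RAM).
print(recommended_pagefile_gb(16, 2))  # 24.0 + 4.0 -> 28.0
```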
Disk configuration: For the DPM Storage pool
- Microsoft recommends that you use disks with a capacity of no more than 1.5 terabytes. Because a dynamic volume can span up to 32 disks, if you use 1.5-terabyte disks, DPM can create replica volumes of up to 48 terabytes.
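The capacity ceiling above follows from simple arithmetic; this small sketch just makes the relationship explicit (the 32-disk span limit for dynamic volumes and the 1.5 TB recommendation are from the text):

```python
# A dynamic volume can span up to 32 disks (per the guidance above)
MAX_DISKS_PER_DYNAMIC_VOLUME = 32

def max_replica_volume_tb(disk_capacity_tb):
    """Largest replica volume DPM can create from disks of a given size."""
    return disk_capacity_tb * MAX_DISKS_PER_DYNAMIC_VOLUME

# With the recommended 1.5 TB disks: 1.5 x 32 = 48 TB
print(max_replica_volume_tb(1.5))  # 48.0
```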
Optimize options: Within the DPM console
- On-the-wire compression: decreases the size of data transferred during replica creation and synchronization, allowing more data throughput with less impact on network performance. However, this option adds CPU load on both the DPM server and the protected computers. The amount of compression, and the resulting improvement in network performance, depends on the workload.
- Consistency Check: To ensure that replica data is kept consistent you can schedule a consistency check. To optimize performance, run the consistency check during off-peak hours.
- Synchronization time: DPM performs synchronization (of application log files) according to the protection group schedule. By default, all schedules start at 12:00 AM; with the optimize performance options you can select an offset to this start time. This prevents all synchronizations from starting at the same time.
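The offset idea above can be illustrated with a small sketch that staggers start times across protection groups. The group names and the 15-minute spacing are hypothetical; in practice you set the offset per protection group in the DPM console:

```python
from datetime import datetime, timedelta

def staggered_start_times(groups, base="00:00", step_minutes=15):
    """Assign each protection group an offset from the midnight default
    so synchronizations do not all start at once."""
    base_time = datetime.strptime(base, "%H:%M")
    return {
        name: (base_time + timedelta(minutes=i * step_minutes)).strftime("%H:%M")
        for i, name in enumerate(groups)
    }

# Hypothetical protection groups staggered 15 minutes apart
schedule = staggered_start_times(["FileServers", "SQL", "Exchange"])
print(schedule)  # {'FileServers': '00:00', 'SQL': '00:15', 'Exchange': '00:30'}
```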