We’ve got a 25GB database on a SQL Server 7 box that backs up locally at a steady 8MB/sec, with no variation. However, when the database is backed up to a networked server, the throughput varies dramatically from job to job, from roughly 6MB/sec down to under 1MB/sec.
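For reference, the jobs are plain BACKUP DATABASE statements, along the lines of the sketch below (MyDb and the paths are placeholders, not the real names):

    -- Local backup: sustains roughly 8MB/sec
    BACKUP DATABASE MyDb TO DISK = 'D:\Backup\MyDb.bak' WITH INIT

    -- Backup to the networked server over a UNC path: anywhere
    -- from roughly 6MB/sec down to under 1MB/sec
    BACKUP DATABASE MyDb TO DISK = '\\BackupServer\Backup\MyDb.bak' WITH INIT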
The server sits on a 100Mbit switched network comprising the database server, a backup server and a PDC. There are no other users connected to the server, and there’s no other activity at the time of the backup.
I’ve done the basic bottleneck checks using Performance Monitor, and the Backup Device throughput matches the Disk Read Bytes/sec figure, so whatever is being read from disk is going straight out to the device. Beyond that, I’m still none the wiser.
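For completeness, one way to take the network out of the picture entirely is a throwaway backup to the NUL device, which discards the stream and so measures pure read throughput; a sketch, again with MyDb as a placeholder:

    -- Backing up to NUL discards the output: any throughput ceiling
    -- seen here is on the read side, not the disk write or the network
    BACKUP DATABASE MyDb TO DISK = 'NUL' WITH INIT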
Any help would be gratefully received.
Thanks