vipperdigger
IS-IT--Management
Much debate here about which is better. One side says hard-set user ports at 10 Mb and server ports at 100 Mb so the servers don't get overwhelmed. The other side says set both at 100 Mb to get data off the wire quicker, since today's server NICs and switches have so much buffer that it doesn't hurt the servers. Are there any links or white papers with studies that prove this one way or the other? Does anyone have experience mass-changing speed on user ports, and what it did to their network performance?
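For what it's worth, if you do decide to mass-change user ports, most managed switches let you do it in one shot with a range command rather than per-port. A rough sketch in Cisco IOS syntax (assuming a Catalyst-style switch and that ports 1-24 are user ports; adjust the interface names and range to your hardware):

```
! Hard-set a block of user-facing ports to 100 Mb full duplex
! (both ends must match, or you risk a duplex mismatch)
configure terminal
 interface range FastEthernet0/1 - 24
  speed 100
  duplex full
 end
! Verify the result before and after the change
show interfaces status
```

One caveat worth noting: if you hard-set speed/duplex on the switch side but the PCs are left on auto-negotiate, the PC end will detect the speed but fall back to half duplex, which hurts performance far more than either speed choice in this debate.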