What are the acceptable test facility losses (Rx bad) and duration, specifically when using the interface loopback test?
Fundamentally, the loopback test should find zero errors, i.e., 100% good packets. The IEEE standards state that the error rate on Ethernet should be about 1 bit in every 10^13 (1 in 10 to the power of 13). That equates to about 1 errored packet in every 10^9 packets, taking the number of bits in a packet to be roughly 10,000. If you are seeing significantly more than 1 error in every 10^9 packets, that would indicate the Ethernet interface is out of spec.
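The conversion from bit error rate to packet error rate above can be sketched in a few lines of Python. The helper name and the 10,000-bit packet size are illustrative assumptions, not part of any standard API; the approximation holds only when errors are rare and independent:

```python
def packet_error_rate(ber: float, bits_per_packet: int = 10_000) -> float:
    """Approximate packet error rate from a bit error rate (BER).

    For small BER, P(packet contains >= 1 errored bit) ~= ber * bits_per_packet.
    The 10,000 bits-per-packet default is the rough figure used above.
    """
    return ber * bits_per_packet

# BER of 1 in 10^13 with ~10,000-bit packets gives about
# 1 errored packet in every 10^9 packets, matching the text.
rate = packet_error_rate(1e-13)
print(rate)
```

A measured error rate well above this figure suggests the interface, cabling, or optics are out of spec rather than ordinary background noise.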