The IPC/WHMA-A-620 Rev A standard states that, in the absence of an agreement on test requirements between manufacturer and user, the continuity specification for Class 3 assemblies should be 2 ohms, or 1 ohm plus the actual resistance. We agree that this is a reasonable method for determining continuity test specifications. However, when more stringent continuity test specs are dictated, certain guidelines should be followed to avoid setting a spec so tight that “in-specification” cables fail testing.
When testing a cable or harness for continuity, there are three factors that must be considered:
- Theoretical resistance of the “ideal” device under test (DUT). This includes both the wire and mating-contact resistance of the DUT.
- In-tolerance variations in the actual resistance of the DUT.
- Added resistance for measuring the DUT (test fixture resistance and measurement tolerance of the tester used to test the DUT).
These three factors combine into the following equation:

(Theoretical resistance of ideal DUT)
+ (Worst-case variances for in-spec wire and contacts)
+ (Practical variances of tester accuracy and fixturing resistance)
= Lowest practical resistance threshold for DUT test
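The sum above can be sketched as a simple calculation. The function name and all numeric values below are illustrative assumptions, not figures from the standard:

```python
def continuity_threshold(ideal_dut_ohms: float,
                         dut_variance_ohms: float,
                         tester_variance_ohms: float) -> float:
    """Sum the three factors to get the lowest practical pass/fail threshold."""
    return ideal_dut_ohms + dut_variance_ohms + tester_variance_ohms


# Illustrative example values (assumptions, not spec requirements):
#   0.50 ohm - theoretical wire + mating-contact resistance of the ideal DUT
#   0.15 ohm - worst-case in-tolerance variation of wire and contacts
#   0.10 ohm - fixture resistance plus tester measurement tolerance
threshold = continuity_threshold(0.50, 0.15, 0.10)
print(f"Lowest practical threshold: {threshold:.2f} ohms")
```

Setting the pass/fail limit any lower than this sum would cause some in-specification assemblies to fail the test.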
In a perfect world we would determine the “exact” resistance of the assembly and then add some margin of error to arrive at the test threshold. In reality, many additional factors must be added to this “ideal” resistance before harnesses can be tested practically in a production environment.
Read our full article for help in arriving at a practical test resistance number and setting effective, yet practical, continuity test specifications.
Setting Practical Resistance Specifications for Continuity Testing