BCI Testing: MIL-STD-461 CS114, RTCA DO160 Sec 20, and ISO 11452-4

All Bulk Current Injection (BCI) tests address the same risk: given a relatively long cable run mounted relatively close to conductive structure (e.g. a vehicle chassis), the cabling can pick up electromagnetic noise from the surrounding environment. Cabling is especially good at doing this at lower frequencies, below 200-ish MHz. Sources include crosstalk from co-routed cabling, noise from on-board RF transmitters, and noise from the external environment (radars, comms, HIRF, etc.). At higher frequencies, the risk is addressed by radiated susceptibility/immunity testing (e.g. RS103) in a semi-anechoic or reverb chamber. That approach has drawbacks at lower frequencies, however: the longer separation distances of radiated testing don't represent the crosstalk risk; reverb chambers are harder to spec at lower frequencies; and radiated testing at 1 m is in the near field at lower frequencies, which can lead to a lack of replicability. Additionally, BCI testing can be done in a shield room instead of a more specialized (and often harder to schedule) ALSE or reverb chamber. 

Thus BCI testing, where current is induced directly in the cable under test (CUT), is generally preferable below 200 MHz. Three comparable standards do this for the defense, aerospace, and automotive industries, and they are worth examining. 

Differences between the three can be seen immediately from their frequency ranges and maximum induced current levels:

  • CS114: 10 kHz - 200 MHz, up to 109 dBuA

  • DO160: 10 kHz - 400 MHz, up to 109 dBuA

  • ISO 11452-4: 1 MHz - 400 MHz, up to 100 dBuA
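
Since dBuA is decibels relative to 1 microamp, the limits above are easy to convert to absolute current. A minimal sketch of that conversion (the helper function name is my own, not from any of the standards):

```python
def dbua_to_ma(level_dbua: float) -> float:
    """Convert a current level in dBuA (dB relative to 1 uA) to milliamps."""
    microamps = 10 ** (level_dbua / 20)  # 20*log10 scale for current
    return microamps / 1000  # uA -> mA

# Limits from the three standards:
print(dbua_to_ma(109))  # CS114 / DO160: ~282 mA (often rounded to 300 mA)
print(dbua_to_ma(100))  # ISO 11452-4: 100 mA
```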

DO160 specifically mentions that its frequency range is meant to overlap with its radiated susceptibility testing range, which starts at 100 MHz. ISO, focused on the automotive industry, doesn't face as many strong threats in the kHz range (although given the increasing prevalence of HV systems in EVs switching in the kHz range, I wonder if this will change in the future). 

Another thing to compare is the maximum amplitudes. MIL-STD-461G has a good rule of thumb that for wiring suspended 5 cm above a ground plane (as is specified in all the test setups), an incident E-field of 1 V/m will result in 1.5 mA of induced current. CS114 and DO160 have maximum levels of 109 dBuA, equivalent to roughly 300 mA, which corresponds to an assumed maximum threat of 200 V/m. In both cases, HIRF (high intensity radiated field) is one of the driving concerns. If we apply the same formula to ISO 11452-4's 100 dBuA (100 mA), we get an incident field of about 67 V/m. This is a bit odd because the maximum vehicle-level radiated immunity test in this frequency range goes up to 100 V/m, but there may be a presumption that the vehicle chassis provides some level of shielding to the unit. 
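
The rule-of-thumb arithmetic above can be sketched directly (a simple linear relation, assuming the MIL-STD-461G 5 cm geometry; the function names are my own):

```python
MA_PER_V_PER_M = 1.5  # MIL-STD-461G rule of thumb: cable 5 cm above ground plane

def induced_current_ma(e_field_v_per_m: float) -> float:
    """Estimated induced common-mode current for a given incident E-field."""
    return MA_PER_V_PER_M * e_field_v_per_m

def implied_field_v_per_m(current_ma: float) -> float:
    """Incident E-field implied by a given injected current limit."""
    return current_ma / MA_PER_V_PER_M

print(induced_current_ma(200))     # 200 V/m HIRF threat -> 300.0 mA (~109 dBuA)
print(implied_field_v_per_m(100))  # ISO's 100 mA limit  -> ~67 V/m
```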

The other thing to consider is the modulation of the injected current and the dwell times. All methods recommend dwelling for the response time of the unit under test (UUT) but have different minimums:

  • CS114: Amplitude modulated (AM) only, dwell time 3 sec

  • DO160: Both AM and continuous wave (CW), dwell time 1 sec

  • ISO 11452-4: Both AM and CW, dwell time 1 sec

Generally speaking, a unit tested to DO160 BCI would be considered equivalent to one tested to CS114, but the shorter DO160 dwell time means there's a small chance the DO160 test could miss something, while the fact that CS114 doesn't cover 200 - 400 MHz means there's a small chance a CS114-tested unit could miss something in that band. 

In other respects, the test setups of the three standards are generally identical. They all suspend the cables 5 cm above a ground plane, with a monitor probe in addition to the injection probe during testing. They test cable bundles per connector, and if it is known that chassis will be used for current return, the ground lead is excluded from the cable bundle. For CS114 and DO160, if there are cables with redundant purposes, they should all be subjected to the BCI stimulus simultaneously; this situation generally does not arise on automotive modules. 

 

TIP: 

To save time and hassle later, test all cables unshielded, but have a shielded version ready to swap in if there's a serious failure. Especially on aerospace projects, it is not uncommon to spec that all cables will be shielded and to do EMC testing that way, then later in the project to ditch the shielding for cost/weight purposes, rendering the earlier EMC testing meaningless. Testing without shielding gives you good confidence that the unit will be OK if shields are removed, or it will give you hard data showing that shields are necessary. 

 

TIP:

GSFC-STD-7000b (as usual) has some excellent additional information on this test method in Section 2.5.3.3.6, which expands on some fundamental aspects beyond what you’ll find in the MIL-STD-461 appendices.

 
