Leakage Test Standard for Medical Devices

Udirn1

Starting to get Involved
Hello all.
I have been working with a Chinese laboratory to test our medical RF device and have reached a deadlock with them over their interpretation of the patient leakage current test, basically over how it should be done.
Our device has a BF type bipolar RF handpiece.

Per the standard, we need to connect the bipolar handpiece to a 200 Ω load and measure the leakage current flowing to earth via another 200 Ω resistor.
We have done this test both in our internal lab and at an official lab in our country and passed. We used a floating oscilloscope to measure the voltage across the 200 Ω resistor connected in series to ground, then found V and calculated I using Ohm's law.
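For clarity, the arithmetic behind this V/R method is just Ohm's law. A minimal sketch (the voltage reading below is illustrative, not our actual lab data):

```python
# Ohm's law check for the V/R method: I = V / R.
# The voltage reading here is illustrative only.
R_SENSE = 200.0   # ohms, series resistor to earth per the test circuit
LIMIT_MA = 50.0   # mA, acceptance limit in the standard

v_rms = 9.6       # V rms, example reading across the 200 ohm resistor
i_rms_ma = v_rms / R_SENSE * 1000.0

verdict = "PASS" if i_rms_ma < LIMIT_MA else "FAIL"
print(f"Leakage current: {i_rms_ma:.1f} mA ({verdict} against {LIMIT_MA:.0f} mA)")
# 9.6 V / 200 ohm = 48 mA, i.e. the kind of reading described above
```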

The Chinese lab, however, insists that since the standard says "HF current meter", they must use a current probe (a clamp), and with their method we fail by a small margin: for example, where we measure 48 mA, they measure 53 mA.
The standard says below 50 mA is OK.

All of our explanations to them have been rejected so far.
My questions:
1. Is there an addendum or an official paper I can use to show them that the illustrations and wording in the standard are not one-sided and mandatory, and that calculating the current via Ohm's law is acceptable?
2. Is there any professional paper explaining that using an oscilloscope to measure a 1 MHz signal can be more accurate than a current probe, which itself converts a magnetic field to a voltage internally and then to a current, and so has its own tolerances?
3. Is there any formal paper dealing with acceptable tolerances in such measurements, given all the tolerances that make up the DUT? The standard stipulates that under 50 mA is acceptable, but I am sure there is also an acceptable tolerance ratio.
4. Any other suggestions on how to mitigate the situation would be highly appreciated.

Udi
 

Peter Selvey

Leader
Super Moderator
There are a few issues here, it is an interesting topic.

First, any and all measurements with oscilloscopes, whether direct or with voltage and current probes, should be treated with a large grain of salt, or more formally, as uncalibrated. This includes your own measurements, your local lab and the Chinese lab. This is a controversial claim, so it's worth explaining.

Historically, accurate rms measurements at high frequency were made with "thermal" based meters. These meters use the basic principle that the amount of heat produced in a wideband resistor is the same regardless of the frequency. Famous old instruments like the Fluke 8912A and those big YEW black box HF ammeters used this method. The principle remains in use in calibration labs around the world when dealing with higher frequencies.
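As an aside, the "equal heating" idea is simply that the power dissipated in a resistor depends only on the true rms value, not on the waveform shape or frequency. A quick numerical illustration (the resistance and voltage levels are arbitrary):

```python
import numpy as np

# Equal-heating principle: average power in a resistor depends only on the
# true rms value, not on waveform shape or frequency. Values are arbitrary.
R = 200.0                              # ohms
t = np.linspace(0, 1e-3, 200_001)      # 1 ms window, 5 ns steps

def avg_power(v):
    """Average power dissipated in R for a sampled voltage waveform (= Vrms^2 / R)."""
    return np.mean(v**2) / R

v_dc     = 10.0 * np.ones_like(t)                        # 10 V DC
v_sine   = 10.0 * np.sqrt(2) * np.sin(2*np.pi*1e6*t)     # 10 Vrms sine at 1 MHz
v_square = 10.0 * np.sign(np.sin(2*np.pi*1e6*t))         # 10 Vrms square at 1 MHz

for name, v in [("DC", v_dc), ("1 MHz sine", v_sine), ("1 MHz square", v_square)]:
    vrms = np.sqrt(np.mean(v**2))
    print(f"{name:12s}: Vrms = {vrms:5.2f} V, P = {avg_power(v)*1000:5.1f} mW")
# All three dissipate ~500 mW, which is what a thermal converter responds to.
```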

Around the 1980s, digital oscilloscopes became popular. Being digital, they could display parameters such as Vmax, Vpp, Vrms and so on. At first these were 8-bit resolution, which is really very rough and not suitable for higher accuracy laboratory measurements. And since the core measurement is rough, there is no point designing laboratory quality front end electronics (buffers, gain stages/dividers), accessories such as voltage and current probes, or the software algorithms used to extract the measured data to a higher standard. Being rough and covering a wide frequency range, it was also not practical to develop support items such as reference sources at higher frequencies and accredited calibration over the range of frequencies measured.

To be fair, 8-bit scopes are excellent tools for timing, diagnostics and working in dB, where errors of say ±1 dB (≈±11%) are not a big issue, for example in audio.
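To put a rough number on the resolution part alone: an 8-bit digitiser has only 256 levels across the screen, so the quantisation floor is already a noticeable fraction of the reading when the signal does not fill the screen, and real scopes add noise, offset and gain-flatness errors on top of this. A sketch with illustrative fractions of full scale:

```python
# Quantisation floor of an 8-bit digitiser, expressed as % of the reading.
# This ignores noise, offset and gain errors, which only make things worse.
BITS = 8
CODES = 2 ** BITS   # 256 levels across the full vertical range

def worst_case_pct_of_reading(fraction_of_full_scale):
    """Worst-case +/-0.5 LSB quantisation error as a percentage of the reading."""
    lsb_fraction = 1.0 / CODES
    return 0.5 * lsb_fraction / fraction_of_full_scale * 100

for frac in (1.0, 0.5, 0.2, 0.1):
    err = worst_case_pct_of_reading(frac)
    print(f"signal at {frac*100:5.1f}% of full scale: +/-{err:.2f}% of reading")
```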

Given their use in laboratories for electrical safety, you might expect the industry to have evolved to improve the front end electronics, probes, reference sources and calibration, particularly in the frequency region needed by labs (≤1 MHz): for example, a 12-bit 20 MHz scope with guaranteed specifications up to 1 MHz, solid software algorithms, and low cost reference sources and accredited calibration systems.

But no. Rather than improving accuracy at lower frequencies (e.g. 1 MHz or less), the industry has largely stuck with 8-bit scopes at higher and higher frequencies (e.g. 50 MHz, 100 MHz, 300 MHz, 1 GHz).

Part of the problem is that calibration laboratories calibrate according to the manufacturer's specifications, and those specifications do not cover accuracy at high frequency apart from the "3 dB bandwidth" (i.e. ±30%), which is useless for lab measurements. Another problem is the sneaky way scope manufacturers present their specifications, which, if you are not looking carefully, imply the scope is accurate to 3%. That accuracy applies only at DC and is a percentage of full scale. A normal DMM, by contrast, gives accuracy specs at different frequencies and states the main accuracy as a function of reading (which is more meaningful), not of full scale. There are similar issues with the specifications of voltage probes and current probes. These are generally designed for a frequency flatness of ±0.5 dB (±5%) if adjustments are done properly, but again there is no traceability to support this, no formal specification, and the other specs can be misleading.
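To illustrate the full-scale trap with made-up numbers (not taken from any particular datasheet): a "3% of full scale" figure turns into a much bigger error on the actual reading when the trace only uses part of the screen.

```python
# Why a "% of full scale" spec is misleading when the reading is small.
# The range and readings below are illustrative, not from any datasheet.
def error_pct_of_reading(reading, full_scale, spec_pct_of_fs):
    """Convert a % of full scale spec into a % of the actual reading."""
    return spec_pct_of_fs * full_scale / reading

FULL_SCALE = 10.0   # V, e.g. 8 vertical divisions at some V/div setting
SPEC = 3.0          # % of full scale, the kind of headline DC figure quoted

for reading in (10.0, 5.0, 2.0, 1.0):
    err = error_pct_of_reading(reading, FULL_SCALE, SPEC)
    print(f"reading {reading:4.1f} V on a {FULL_SCALE:.0f} V range: "
          f"{SPEC:.0f}% of full scale = {err:.0f}% of reading")
```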

Thus, the combination of a calibration sticker, dodgy specifications and a digital readout of Vmax, Vpp and Vrms has lulled the industry into a major blind spot.

So, what to do for this issue? One possibility is to try to get your hands on a meter that is properly specified at 1 MHz, such as an old Fluke 8912A, and have it calibrated at 1 MHz. Or you could make a dedicated set-up (200 Ω + cables, fixed channel, range, time division) and have that calibrated by a lab at the current and frequency of interest, confirming first that the cal lab is accredited for that current and frequency. But take care: scopes are not great for making repeatable measurements either. I recommend doing a test when the scope is cold, then again 30 minutes later after warm-up, and moving the cables around, to detect any temperature or cable-movement issues.

Apart from the scope/probe, it also takes some effort to prove the 200 Ω resistor is non-inductive (or at least not inductive enough to affect the results). This is particularly important if you are using the I = V/R method. I recommend special thin film resistors. It is also possible to use, say, a 1 Ω resistor to measure the current rather than the 200 Ω, since a standard 1 Ω SMD resistor is unlikely to have any issues at 1 MHz while a 200 Ω power resistor might.
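A quick way to sanity-check how much series inductance matters at 1 MHz is to compare |Z| = sqrt(R² + (2πfL)²) against the nominal resistance. The inductance values below are assumptions for illustration only; measure your actual parts.

```python
import math

# Impedance error of a resistor with series parasitic inductance at 1 MHz.
# The inductance values are assumed for illustration; measure your actual parts.
F = 1e6  # Hz

def impedance_error_pct(r_ohm, l_henry, f_hz=F):
    """Percent increase of |Z| over the nominal resistance due to series L."""
    xl = 2 * math.pi * f_hz * l_henry
    return (math.hypot(r_ohm, xl) / r_ohm - 1) * 100

cases = [
    ("200 ohm wirewound power resistor, ~10 uH assumed", 200.0, 10e-6),
    ("200 ohm thin film resistor, ~50 nH assumed",       200.0, 50e-9),
    ("1 ohm SMD shunt, ~1 nH assumed",                   1.0,   1e-9),
]
for name, r, l in cases:
    print(f"{name}: |Z| error at 1 MHz ~ {impedance_error_pct(r, l):.4f} %")
```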

Finally, and perhaps most importantly, you should not be relying on the test results for compliance.

The HF leakage current from a generator is a function of the maximum voltage (Vrms), the frequency and the capacitance between earth and the isolated output. For example, if your generator is 300 Vrms at 1 MHz and the limit is 50 mArms, the maximum capacitance should be:

C = I/(2πVf) = 0.05/(2π × 300 × 10⁶) ≈ 26.5 pF

Thus, the design should guarantee the capacitance stays below this limit. Since capacitance varies (in production and with temperature), it makes sense to keep a margin of at least 20%, say around 20 pF. If your own test results show 48 mA against a 50 mA limit, then you have a problem anyway, since you won't be able to ensure the limit is met in regular production, which is the whole point of the test. And this is before the measurement tolerances above are taken into account.
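Putting numbers on that (300 Vrms, 1 MHz and 50 mA are just the example figures above; substitute your own generator specs):

```python
import math

# Capacitance budget for HF leakage: I = 2*pi*f*C*V  ->  C_max = I_limit / (2*pi*f*V)
# 300 Vrms / 1 MHz / 50 mA are the example figures above; substitute your own.
V_RMS   = 300.0    # V rms
F       = 1e6      # Hz
I_LIMIT = 50e-3    # A rms

c_max = I_LIMIT / (2 * math.pi * F * V_RMS)
print(f"Max isolation capacitance: {c_max*1e12:.1f} pF")   # ~26.5 pF

# With a ~20% design margin:
c_target = 0.8 * c_max
i_target = 2 * math.pi * F * c_target * V_RMS
print(f"Design target ~{c_target*1e12:.0f} pF -> leakage ~{i_target*1e3:.0f} mA")

# Effect of a few picofarads of stray capacitance in the test set-up itself:
for stray_pf in (2, 5, 10):
    extra_ma = 2 * math.pi * F * stray_pf * 1e-12 * V_RMS * 1e3
    print(f"{stray_pf:2d} pF of stray set-up capacitance adds ~{extra_ma:.1f} mA")
```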

This is a very small amount of capacitance, so it can be difficult to control, but it should be possible. Knowing that such a small capacitance has a big influence on the result also means the whole test set-up needs to be carefully inspected to make sure there is no additional capacitance between the non-isolated (earthed) circuit and the isolated circuit. An "isolated" oscilloscope is not likely to be isolated at high frequency, as there will be EMC capacitors and also primary-secondary capacitance in any isolation transformer. If a scope is the only option, a low capacitance differential probe is the best way to go, or a current probe that has been validated at the frequency of interest.

A big area!!
 

Tidge

Trusted Information Resource
I've been offered some small amount of professional criticism for describing some of my (self-categorized) "non-technical" questions during job interviews; this post reminds me of some of the more specific technical questions I've asked of candidates for positions that involve instrumentation, especially this bit:
To be fair, 8-bit scopes are excellent tools for timing, diagnostics and working in dB, where errors of say ±1 dB (≈±11%) are not a big issue, for example in audio.

"Just what percent accuracy of a static measurement can you expect from an 8-bit device?"

The rest of the post was personally triggering (pardon the o-scope pun) for a variety of reasons. Thanks for that!
 