Routine production tests - dielectric voltage withstand test

eldercosta

Involved In Discussions
I am using Annex K of IEC TR 62354:2014 as guidance for production tests.

I want to have a normative basis to support reducing the time that equipment tested in the factory undergoes these tests, so as to avoid unnecessary stress (see this topic).

Amongst other things, the Annex presents requirements regarding test equipment checking. In Annex K, K.1 e), we have the following requirement for the case where the dielectric voltage withstand test duration is one second.

e) The test equipment, when adjusted for production line testing, is to produce an output voltage that is not less than the factory test value specified, nor is the magnitude of the test voltage to be greater than 120% of the specified test potential when the tester is used in each of the following conditions:

1) If the test duration is 1 s, the output is to be maintained within the specified range:
- when only a voltmeter having an input impedance of at least 2 MOhms and a specimen of the product being tested are connected to the output terminals AND

- when a relatively high resistance is connected in parallel with the voltmeter and the product being tested, and the value of the resistance is gradually reduced to the point where an indication of unacceptable performance just occurs.

My question relates to the second dash. I am trying to work out what the test setup would look like to meet this description. My understanding is that I would have to put a variable resistance in parallel with a sample of the EUT, apply the test a few times while reducing the resistance until the hipot detects a fault, then increase it a little and measure the hipot voltage to verify it complies with the requirement.

The problem is that I could not find a variable resistor that would withstand 4 kVac at the necessary resistance (let alone one that could be varied safely). Is there something off-the-shelf that I could use? TBH, I thought of an electronic circuit that could do it, but then it occurred to me that I could be reinventing the wheel, which I would like to avoid as much as possible. I could also be overthinking and/or overcomplicating the test setup.
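Just to put rough numbers on what that parallel resistance would have to withstand (assuming, purely as an illustration, a trip current around 1 mA at the 4 kVac test, i.e. a resistance in the 4 MΩ range):

```python
# Back-of-the-envelope stress on the parallel check resistance.
# The 1 mA trip current and the 4-resistor split are assumptions, not from the TR.
v_test = 4000.0                 # Vrms, mains-to-applied-part test voltage
i_trip = 1e-3                   # A, assumed trip threshold
r_total = v_test / i_trip       # total resistance at the trip point (4.0 MOhm)
p_total = v_test**2 / r_total   # total dissipation (4.0 W)
n = 4                           # e.g. four equal HV resistors in series
print(f"{r_total/1e6:.1f} MOhm, {p_total:.1f} W total; "
      f"{v_test/n:.0f} V and {p_total/n:.1f} W per series resistor")
# -> 4.0 MOhm, 4.0 W total; 1000 V and 1.0 W per series resistor
```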

FWIW, the 60 s test is much simpler:
2) If the test duration is 1 min, the output voltage is to be maintained within the specified range, by manual or automatic means, throughout the 1 min duration of the test or until there is an indication of unacceptable performance.

Any suggestions?
 

Peter Selvey

Leader
Super Moderator
I think in practice it would involve two resistors set just above and below the trip point. For example, if the target is 0.5 mA @ 1500 Vrms, use 2.85 MΩ for the trip resistor and 3.15 MΩ for the no-trip one. This tests ±5% around the trip point, which is more than enough.

The real tolerance would depend on your margin. For example, if the actual medical device typically has 0.1 mA @ 1500 V, and you set the trip point to 0.2 mA just to steer clear, then the test could be done at ±10% or ±20% around the 0.2 mA trip point and that would be fine from an engineering and safety point of view.
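If it helps, the arithmetic is just Ohm's law with the margin applied to the nominal resistance at the trip point (a rough sketch in Python; the 0.5 mA / ±5% figures are only the example values above):

```python
def check_resistors(test_voltage_v, trip_current_a, margin=0.05):
    """Pick 'trip' / 'no trip' check resistors around the nominal trip resistance."""
    r_nominal = test_voltage_v / trip_current_a   # 1500 V / 0.5 mA = 3.0 MOhm
    r_trip = r_nominal * (1 - margin)             # lower R -> more current -> should trip
    r_no_trip = r_nominal * (1 + margin)          # higher R -> less current -> should not trip
    return r_trip, r_no_trip

r_trip, r_no_trip = check_resistors(1500, 0.5e-3, margin=0.05)
print(f"trip: {r_trip/1e6:.2f} MOhm, no trip: {r_no_trip/1e6:.2f} MOhm")
# -> trip: 2.85 MOhm, no trip: 3.15 MOhm
```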
 

eldercosta

Involved In Discussions
Hello, Peter.

Thank you very much for your input. It makes a lot of sense. TBH, I was thinking of 4000 Vrms, between mains and AP - one of the tests required by the Annex, the other being between primary and low-voltage metal parts (a.k.a. secondary in my case), where 1500 Vrms applies. I wonder whether making this check at only 4000 Vrms suffices.

I understood from your answer that the EUT would not be connected. Did I get that right?
 

Peter Selvey

Leader
Super Moderator
Yes, I don't have the standard, but I can guess the intent is just to check that the trip point is working OK, especially for a 1 s test. For that, the EUT should not be connected. Also, it looks like a set-up validation, i.e. not intended to be done on every test. It might be reasonable to do it once a day at most.
 

eldercosta

Involved In Discussions
The first quote in my OP is verbatim; I just emphasized the "AND" at the end of the first dash.

e) The test equipment, when adjusted for production line testing, is to produce an output voltage that is not less than the factory test value specified, nor is the magnitude of the test voltage to be greater than 120% of the specified test potential when the tester is used in each of the following conditions:

1) If the test duration is 1 s, the output is to be maintained within the specified range:
- when only a voltmeter having an input impedance of at least 2 MOhms and a specimen of the product being tested are connected to the output terminals AND

- when a relatively high resistance is connected in parallel with the voltmeter and the product being tested, and the value of the resistance is gradually reduced to the point where an indication of unacceptable performance just occurs.

I agree the EUT shouldn't be connected, as a matter of good sense, since we do not want to compromise the product by overstressing it - that is the point of the production tests not being exactly the same as the type tests.

The text of the second dash suggests otherwise to me, though. I do hope I am misinterpreting it.

What do you think?
 