volt-nuts@lists.febo.com

Discussion of precise voltage measurement


What is “verification” in metrology?

Steve - Home
Sat, Jun 6, 2020 10:30 PM

Yes, for sure. I should have mentioned the difference between metrology as in instrument calibration versus user calibration of the system to account for the environment. The calibration procedure for the 8510 is far more complex, for sure.

I had a good laugh in our metrology lab’s DC voltage section a few decades ago. After lots of manual math to account for uncertainties in our transfer from standard cells to DVMs, the “meter beaters” as we nicknamed them would then check a dry cell battery they kept in a drawer as a final check of the DVM! They never had a failure until we drained their precious “standard” dry cell over a weekend and left it for them. Monday by noon time they were sure something was horribly wrong with their standard cell ensemble because the dry cell test failed three consecutive DVMs!

Steve
WB0DBS

On Jun 6, 2020, at 5:10 PM, Florian Teply <usenet@teply.info> wrote:

On Sat, 6 Jun 2020 07:59:42 -0500,
Steve - Home <steve-krull@cox.net> wrote:

Along those very lines, HP/Agilent/Keysight sold both calibration
kits and verification kits for VNAs. We did a full calibration using
the mechanical calibration kit at the start of a measuring session
where you needed the absolute best accuracy your system was capable
of and used the verification kit as a sanity check during the daily
operation.

Yes, and no. Unfortunately the term "calibration" is used differently
in the realms of metrology and Network Analyzers. What's typically
referred to as calibration when network analyzers are concerned is
aimed at canceling the effect of the environment (test leads
and their frequency response, time delay/phase shift), not actual
calibration of the instrument. You don't care to know how big the error
introduced by that actually is, you just want to have the instrument
subtract it from the real measurement values. As you mention, the
verification kits serve as sanity checks if the "calibration" actually
was okay. But there are a number of parameters that none of the
calibration and verification kits I've seen so far addresses:
neither frequency accuracy nor absolute power accuracy is checked.

Florian
DH7FET

Steve
WB0DBS

On Jun 6, 2020, at 6:21 AM, Florian Teply <usenet@teply.info> wrote:

Funny thing how things work out time-wise: I had a discussion
yesterday on the very topic during re-audit for ISO 9001.

In basic terms, verification in metrology is a very slimmed down
calibration: For a calibration, you essentially check every range of
your instrument at usually five or more spots within that range in
order to determine the accuracy of your instrument in each range.
For a verification, you do this only at the spot where you intend to
measure. So if you were to measure a nominal 7.2V source, you'd
compare the reading of your meter with your, say, known good 7.5V
reference instead of doing a full calibration of the meter. It
doesn't tell you anything about, say, the offset error of your
meter, or how big the deviation is at the 1000V range, it just
tells you if your meter meets requirements of the one measurement
you intend to do.
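A minimal sketch of that single-point check in Python (the readings, reference value, and error budget below are made-up illustration numbers, not taken from any real instrument's datasheet):

```python
# Single-point verification sketch: does the meter read a known-good
# reference within the error budget of the one measurement you intend
# to do? All values here are hypothetical examples.

def verify_at_point(reading, reference_value, allowed_error):
    """Return True if the meter's reading of the reference is
    within the error budget for the intended measurement."""
    deviation = reading - reference_value
    return abs(deviation) <= allowed_error

# Known-good 7.5 V reference, meter reads 7.4991 V, and the intended
# 7.2 V measurement tolerates +/-2 mV:
ok = verify_at_point(reading=7.4991, reference_value=7.5, allowed_error=0.002)
print(ok)  # deviation is -0.9 mV, within budget
```

Note that, exactly as described above, a passing check says nothing about other ranges or about offset errors; it only tells you the meter is good enough at this one spot.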

So, in order to determine whether or not your Chinese voltage
reference meets its specs, you'd check your meter against, say, the
well-characterized LTZ1000A you happen to have in your lab.

Strictly speaking, you still have to do it as carefully as you
would do a real calibration, taking all known effects into account,
but it's still much less time-consuming than a full calibration as
you check only one single point instead of all possible ranges with
five points each.

Does this help answer your questions or did I just bring up more
questions than answers?

best regards,
Florian

On Wed, 3 Jun 2020 09:58:59 +0100,
"Dr. David Kirkby" <drkirkby@kirkbymicrowave.co.uk> wrote:

I am trying to work out what the BIPM definition of verification
means

https://jcgm.bipm.org/vim/en/2.44.html

“provision of objective evidence that a given item fulfils
specified requirements”

Let’s assume that I wanted to verify if the voltage reference meets
the Chinese specifications. Would consulting the 3457A manual and
voltage reference specifications to determine if the meter is good
be considered verification?

Or does verification only apply to an instrument? For example
comparing the 3457A to a Fluke voltage reference?

The one-sentence definition in VIM leaves me wondering what the
intention of the entry is.


volt-nuts mailing list -- volt-nuts@lists.febo.com
To unsubscribe, go to
http://lists.febo.com/mailman/listinfo/volt-nuts_lists.febo.com and
follow the instructions there.

Florian Teply
Sun, Jun 7, 2020 9:14 AM

On Sat, 6 Jun 2020 17:30:08 -0500,
Steve - Home <steve-krull@cox.net> wrote:

Yes, for sure. I should have mentioned the difference between
metrology as in instrument calibration versus user calibration of the
system to account for the environment. The calibration procedure for
the 8510 is far more complex for sure.

I had a good laugh in our metrology lab’s DC voltage section a few
decades ago. After lots of manual math to account for uncertainties
in our transfer from standard cells to DVMs, the “meter beaters” as
we nicknamed them would then check a dry cell battery they kept in a
drawer as a final check of the DVM! They never had a failure until we
drained their precious “standard” dry cell over a weekend and left it
for them. Monday by noon time they were sure something was horribly
wrong with their standard cell ensemble because the dry cell test
failed three consecutive DVMs!

Wow, that's mean. ;-) But certainly something to laugh at after a
while. And of course also a good example of poor execution of a good
idea. Checking for consistency is a good thing. You just have to
keep in mind that this can also fail. Having another independent check
is more resilient, but adds complexity.
That's btw exactly what these VNA verification kits provide: an
independent check of the correction introduced by the calibration kit
as they have different known good values of reflection and attenuation
than those provided by the calibration kit.

That's also what makes calibration fun for nuts: you're always
chasing effects most people don't even know exist.

BTW @ David: Thanks a bunch for pointing out the NPL metrology courses.
They're pretty good for starters, and the fact they're free right now
makes it even better...

Florian
DH7FET

Miguel Yepes
Sun, Jun 7, 2020 8:16 PM

Well, in a VNA a calibration kit makes corrections, while a verification kit does not; it just compares values. I remember that long ago oscilloscope calibration manuals had adjustments, because most of the instruments went out of specification. Now I don't see calibration manuals, just verification manuals. Most equipment now doesn't have internal adjustments, and some are done via software, so I don't think what you suggest is going to happen. The dictionary definition of calibration is:
In information technology and other fields, calibration is the setting or correcting of a measuring device or base level, usually by adjusting it to match or conform to a dependably known and unvarying measure. For example, the brightness or black level of a video display can be calibrated using a PLUGE pattern.


From: volt-nuts <volt-nuts-bounces@lists.febo.com> on behalf of Florian Teply <usenet@teply.info>
Sent: Saturday, June 6, 2020 4:52 PM
To: volt-nuts@lists.febo.com <volt-nuts@lists.febo.com>
Subject: Re: [volt-nuts] What if “verification” in metrology?

On Sat, 6 Jun 2020 15:58:55 +0000,
Miguel Yepes <theamberco@hotmail.com> wrote:

But in calibration you have to make some type of corrections, and in
verification you don't; you just compare to see that you're within the
manufacturer's ranges. Now, for instance, Tektronix has verification
manuals to check their equipment; before, they called them calibration
manuals and included adjustments to meet the specifications of the
manufacturer. Now the user can't adjust easily because most adjustments
are via software that is not commercially available.

That's a very popular misconception here: Calibration by itself does
not include adjustment. You can have a calibrated instrument which is
one order of magnitude off, or an instrument which fails manufacturer
specifications for measurement accuracy, and still the calibration is
valid. Also you can have an instrument which meets specification but
is not calibrated. As Ilya mentioned, calibration is essentially just a
quantified comparison to a known good reference, giving you an estimate
of how much the instrument reading is off under well-defined conditions.

Of course performing a calibration requires you to take all known
systematic effects into account, and it would be wise to do the same
for verification, but there's no need at all to touch the instrument.
If you happen to know through calibration that your volt meter has an
offset of 1.8236V in all ranges, there's no need to change what the
instrument displays. You just take this offset into account whenever
you use the value you read. The same goes for all the other types of
corrections: if you know the instrument has a temperature dependency of
-50mV/K, and it has been calibrated at 23°C, but you're doing the
measurements in summer at 28°C, you'd better add 250mV to your reading to
cancel that temperature drift. And you do exactly the same for a
verification. In the end it doesn't matter if these influences are
taken into account before the instrument displays its reading or before
you write down the measurement result in the report.
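As a rough sketch, the external corrections described above boil down to a couple of lines of arithmetic (the 1.8236 V offset and -50 mV/K temperature coefficient are the hypothetical figures from the text, not data for any real meter):

```python
# Apply known corrections outside the instrument, as described above.
# Offset and temperature coefficient are the example values from the
# text, not characteristics of a real meter.

CAL_OFFSET_V = 1.8236      # constant offset found during calibration
TEMPCO_V_PER_K = -0.050    # reading drifts -50 mV per kelvin
CAL_TEMP_C = 23.0          # temperature at which it was calibrated

def corrected(reading_v, ambient_temp_c):
    """Remove the known offset and temperature drift from a raw reading."""
    drift = TEMPCO_V_PER_K * (ambient_temp_c - CAL_TEMP_C)
    return reading_v - CAL_OFFSET_V - drift

# At 28 °C the drift is -50 mV/K * 5 K = -250 mV, so the reading is
# 250 mV low and the correction adds it back:
print(corrected(9.0236, 28.0))  # approximately 7.45 V after both corrections
```

Whether this arithmetic lives in the instrument's firmware or in your report makes no difference to the result, which is exactly the point made above.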

Personally, I'd rather include these considerations in measurement
reports than adjusting the instrument for two reasons: a) it
demonstrates that you're aware of these effects and can compensate for
them, and b) as Ilya mentioned it doesn't change the instrument. Even
though nowadays I consider argument b) only half valid: It's true that
there's a certain risk of changing the instrument's behaviour by turning
trim pots and the like, thereby invalidating prior knowledge about the
instrument's behaviour. But purely mathematical corrections do not
change the behaviour of the instrument. There's still the issue of
comparing readings taken at different points in time: the same readings
will have different meanings before and after adjustment, while
properly calculated corrections should lead to the same result even at
different points in time.

I consider it good measurement practice to not only identify the
sources of error and their respective contributions, but also the
uncertainties associated with these, as it clearly shows where
improvements have the most effect on your quality of measurement.

And if you implement all these corrections outside your instrument,
there's no need to buy a calibration manual as you take the instrument
as is.

HTH,
Florian


From: volt-nuts <volt-nuts-bounces@lists.febo.com> on behalf of Dr.
David Kirkby <drkirkby@kirkbymicrowave.co.uk>
Sent: Saturday, June 6, 2020 9:19 AM
To: Discussion of precise voltage measurement <volt-nuts@lists.febo.com>
Subject: Re: [volt-nuts] What if “verification” in metrology?

On Sat, 6 Jun 2020 at 12:18, Florian Teply <usenet@teply.info> wrote:

Funny thing how things work out time-wise: I had a discussion
yesterday on the very topic during re-audit for ISO 9001.

Yes, it is.

In basic terms, verification in metrology is a very slimmed down

calibration: For a calibration, you essentially check every range of
your instrument at usually five or more spots within that range in
order to determine the accuracy of your instrument in each range.
For a verification, you do this only at the spot where you intend to
measure. So if you were to measure a nominal 7.2V source, you'd
compare the reading of your meter with your, say, known good 7.5V
reference instead of doing a full calibration of the meter.

So, in order to determine whether or not your Chinese voltage
reference meets its specs, you'd check your meter against, say, the
well-characterized LTZ1000A you happen to have in your lab.

Strictly speaking, you still have to do it as carefully as you
would do a real calibration, taking all known effects into account,
but it's still much less time-consuming than a full calibration as
you check only one single point instead of all possible ranges with
five points each.

Does this help answer your questions or did I just bring up more
questions than answers?

Yes. I wish VIM was a bit more explicit about it. A single sentence
just does not do it justice.

That's true; it also took me some time and many discussions with
colleagues to wrap my head around the concept. Verification can be a
pretty powerful tool to save cost: at work we use it to significantly
extend the calibration periods of our test equipment. And it helps keep
measurement uncertainty in check.

Florian


