I have bought an HP 4284A precision LCR meter. This is an old model with a
basic accuracy of 0.05% and covers 20 Hz to 1 MHz.
Converting the specifications into an uncertainty for a given measurement is
nontrivial, but I think it is reasonable to assume the uncertainty will
always be >0.05%.
Surprisingly, the current precision LCR meter from Keysight, the E4980A (20
Hz to 2 MHz), offers the same basic accuracy. So while fairly old, the 4284A
doesn't seem to be miles behind the current crop of LCR meters from the top
manufacturers.
The recommended calibration period on the 4284A is 6 months, which would
get rather expensive - on the current E4980A the calibration period is a
more respectable 12 months.
I am looking for suggestions on how I can get "reasonable" confidence in
the instrument at "reasonable" cost, without returning it to Keysight every
6 months.
I have a 3457A DVM, but not much else in the way of precision low-frequency
equipment.
It has 4 BNC connectors for Kelvin probes.
I suspect that getting precision resistors and keeping them as a house
standard might be worthwhile, but I am looking for suggestions on the best
approach.
I will send it to Keysight once, when it arrives, to ensure that there are
no faults on it, but I don't currently feel I can justify getting it
calibrated every 6 months.
Maybe I can make some stable "standards", measure them soon after the
LCR meter has been calibrated, and then periodically re-measure their values.
Any suggestions about how to approach that?
Dave
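One way to make that periodic re-measurement useful is to fit a drift line to the readings, so a step change (instrument fault) stands out from slow, predictable drift of the standard itself. A minimal sketch in plain Python, with entirely made-up readings of a hypothetical 10 nF house standard:

```python
# Track a house standard over time: fit a linear drift line to periodic
# readings and report the drift rate. All values here are invented for
# illustration; real data would come from a measurement log.

def linear_fit(x, y):
    """Ordinary least-squares fit y = a + b*x (pure stdlib, no numpy)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx          # slope: drift per day
    a = my - b * mx        # intercept: value at day 0
    return a, b

# Hypothetical readings of a nominal 10 nF standard (day number, value in pF)
days = [0, 90, 180, 270, 360]
c_pf = [10000.12, 10000.15, 10000.19, 10000.21, 10000.25]

c0, drift_per_day = linear_fit(days, c_pf)
drift_ppm_per_year = drift_per_day * 365 / (c0 / 1e6)
print(f"drift: {drift_ppm_per_year:.1f} ppm/year")
```

A reading that falls well off the fitted line (relative to the scatter of past residuals) is then a prompt to investigate the meter rather than blame the standard.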
I have a similar issue, so this is interesting to read. Mine concerns older gear like the 4192A and 4275A meters, which I am using. All these old meters, if properly adjusted and calibrated, are very good: you can repair them yourself, and they are very cost efficient (what did you pay for your 4284A, if I may ask? I still find them pretty expensive). Newer gear, as you stated, did not add much precision, if any (a professional seller of used gear told me that this is why they are in such demand; he cannot get enough of them).
I too do not want to spend the money to have these meters calibrated on a regular basis; it would cost a fortune over time. The same applies to my other gear, so I have decided to have a few basic standards calibrated externally on a regular basis, and to calibrate the rest from there:
My standards are voltage (10 V, 732A/4910), resistance (10 kOhm, SR104), RF level (NRP-Z55 power sensor, DC to 40 GHz), and a lower-frequency thermal converter. The rest is then calibrated by transfer measurements (in a traceable manner). The 3458A, for example, is always good for ratio measurements because of its excellent linearity, so with a stable voltage source one can calibrate resistors from 1 Ohm to 1 MOhm easily and precisely, just as an example. On a side note, I have only the 10 V output of my 732A calibrated externally (to <0.5 ppm accuracy) and do the 1 V calibration with the 3458A. One can do that more often, as this output is less stable, and it saves money on external calibration too.
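The ratio-transfer idea above can be sketched in a few lines: put the calibrated standard and the unknown in series so they carry the same current, measure the voltage across each with the DMM, and only the DMM's linearity enters the result, not its absolute accuracy. All numbers below are invented for illustration:

```python
# Sketch of a DMM ratio transfer: the standard resistor Rs and the unknown Rx
# carry the same current in series; the DMM measures the voltage across each.
# Rx = Rs * Vx / Vs, so the DMM's absolute gain error cancels and only its
# linearity matters. Values are made up for illustration.

def ratio_transfer(rs_ohm, v_std, v_unk):
    """Unknown resistance from the calibrated standard and the two voltages."""
    return rs_ohm * (v_unk / v_std)

rs = 10_000.0123           # 10 kOhm standard value, from its cal report
v_std = 1.0000054          # volts across the standard
v_unk = 0.99998712         # volts across the nominal 10 kOhm unknown

rx = ratio_transfer(rs, v_std, v_unk)
print(f"Rx = {rx:.4f} Ohm")
```

Stepping up or down in decades (10:1 resistor pairs, or a Hamon-style series/parallel network) extends the same principle across the 1 Ohm to 1 MOhm span mentioned above.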
To your question: today capacitance is derived in national standards labs from the quantum Hall ohm using quadrature bridges. There is also a calculable capacitor approach (used earlier), but it requires precise machining, and I have not considered it further. Quadrature bridges as used in standards labs are accurate to <1 ppm. These are lab setups with cables flying around, built up from a set of special transformers, DDS generators, lock-in detector amplifiers and so on, so they are not all-in-one meters. Others have shown/reported that fairly basic lab bridges using standard parts (7-digit ratio transformers, standard DDS generators, precision AC level meters) are still good to 1e-5, which is pretty precise. Don't get me wrong, this still requires expensive precision gear, but nothing very special (e.g. no custom-made transformers and the like).
So I am currently preparing a setup for a coaxial quadrature bridge. A 1000 pF coaxial capacitor (GR type) will be one standard; these are reasonably cheap and very stable. The other will be a 10 kOhm coaxial resistor, which can be built from an RF case and a good, low-ppm/K resistor. I am still trying to work out the error-propagation math (e.g., how a phase error in the DDS generator used to produce the quadrature signal translates into a measurement tolerance). There is some data online, some behind a download fee, and also books.
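As a rough first-order look at that phase-error question: for a symmetric quadrature bridge the balance condition is w^2 * R1 * R2 * C1 * C2 = 1, which for two 10 kOhm arms and two 1000 pF arms fixes the operating frequency near 16 kHz. To first order, a small phase error theta in the 90-degree reference shows up in the loss (quadrature) component, while the magnitude balance is only disturbed in second order. This is a simplified model I am assuming here, not a full uncertainty budget:

```python
import math

# Simplified quadrature-bridge sensitivity model (an assumption, not a full
# error budget). Balance: w^2 * R1 * R2 * C1 * C2 = 1, so for equal arms
# w = 1/(R*C). A small reference phase error theta shifts the apparent
# dissipation factor by ~theta, while the magnitude balance only moves by
# ~theta^2 / 2 (cosine of a small angle).

R = 10e3      # ohms, each resistive arm
C = 1000e-12  # farads, each capacitive arm

w = 1.0 / math.sqrt(R * R * C * C)   # = 1/(R*C) for equal arms
f = w / (2 * math.pi)

theta = 100e-6                       # 100 urad phase error in quadrature drive
mag_error = theta ** 2 / 2           # relative magnitude error (2nd order)
d_error = theta                      # shift in apparent dissipation factor

print(f"balance frequency ~ {f/1e3:.2f} kHz")
print(f"relative magnitude error ~ {mag_error:.1e}")
print(f"dissipation-factor shift ~ {d_error:.1e}")
```

If this first-order picture holds, even a 100 microradian phase error perturbs the capacitance magnitude at only the parts-in-10^9 level, which is one reason quadrature bridges are forgiving of modest generator phase errors.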
So with this setup one can precisely calibrate a 1000 pF capacitor. The gear needed may sound a little expensive, but I have it anyway, amongst other reasons to calibrate my other equipment (e.g. the ratio transformer is used for AC meter and calibrator calibration), so all I needed in addition were the capacitors (I actually bought GR types from 1 pF to 1000 pF) and the coaxial cabling setup, which is a little special where it connects to the ratio transformer; the rest is just BNC cabling. These GR capacitors drift very little: they are air capacitors built from sheet metal with a temperature expansion coefficient of essentially zero. That allows them to be calibrated only every 2 years or more (cal labs do that too).
Some time ago I also bought some Russian hermetic glass-mica capacitors, which I had calibrated externally, so it will be interesting to see how they have aged. (These should be fairly stable; if you want one, let me know.) I installed them in an RF box with 4 BNC connectors (similar to the ones Agilent uses as standard cal devices). For the GR capacitors you need an adapter between the GR874 connectors and the 4 BNCs, which you also have to build.
That's all for the 1000 pF. The other capacitors can then be calibrated with a similar setup, again using the ratio transformer in a capacitance-bridge configuration. And one needs the math to derive the accuracies of the measurements. As you plan an initial external cal of your meter, you could use it to validate the bridge tolerance calculations. (Actually, we could later also exchange capacitors to validate results; I did that with a voltnut here in Germany on some other equipment some time ago.)
Then you also need shorts, opens, and resistors in shielded cases; this is straightforward though, and I had already built them some time ago for my meters.
cheers
Sent: Friday, 06 February 2015 at 23:10
From: "Dr. David Kirkby (Kirkby Microwave Ltd)" drkirkby@kirkbymicrowave.co.uk
To: "Discussion of precise voltage measurement" volt-nuts@febo.com
Subject: [volt-nuts] Checking an LCR meter
volt-nuts mailing list -- volt-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts
and follow the instructions there.
Hi Dave:
The 4-terminal-pair measurement method used on many of the HP (and other) LCR meters and impedance analyzers is much
more accurate over many decades of impedance value than network analyzers, which are only accurate near 50 Ohms.
This is explained in the Impedance Measurement Handbook. In my mind the key to an accurate measurement is knowing which
of a number of impedance measurement techniques to use. The handbook has contour maps showing the achievable accuracy
as a function of frequency and impedance for a number of different measurement techniques.
http://www.prc68.com/I/Z.shtml#KeyDocs
The other key idea is that LCR measurements are made on components without connectors, so the fixture
parasitics must be handled. This is far from trivial. Again, the Impedance Measurement Handbook is essential and the
Measurement Accessories Selection Guide is very handy.
And finally, by making impedance measurements over a broad frequency range you can fit a model to the data instead of
using the simple series or parallel two-component model that's standard in an LCR meter. This simple model may be
fine for many applications, but it is not good for more complex cases. For example, the crystal-resonator model
built into the E4915, E4916 & E5100 is based on an S21 measurement and the use of a low-Z PI test fixture. What
all these instruments have in common is frequency sweep and DSP IF processing. See my Crystal Equivalent Circuit web
page for an example of a Z-transform (Z:T in the upper left of the screen shots) measurement.
http://www.prc68.com/I/Xec.shtml
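To illustrate the two-component model mentioned above: an LCR meter takes one complex impedance reading at one frequency and maps it onto either a series (Rs, Cs) or parallel (Rp, Cp) pair; the two disagree for lossy parts, and neither captures a real component over a wide band. A sketch of those standard conversions, with invented example values:

```python
import math

# Convert one measured complex impedance into the two-element equivalents an
# LCR meter would display: series (Rs, Cs) and parallel (Rp, Cp), plus the
# dissipation factor D. A real capacitor with leads fits neither model
# exactly across a wide band, which is why fitting a fuller model to swept
# data helps. Example values are invented.

def series_parallel(z, f):
    w = 2 * math.pi * f
    rs, xs = z.real, z.imag
    cs = -1.0 / (w * xs)                 # series-equivalent C (xs < 0)
    y = 1.0 / z                          # admittance
    g, b = y.real, y.imag
    rp = 1.0 / g                         # parallel-equivalent R
    cp = b / w                           # parallel-equivalent C
    d = -rs / xs                         # dissipation factor
    return rs, cs, rp, cp, d

# Hypothetical 10 nF capacitor with 2 Ohm ESR, measured at 10 kHz
f = 10e3
z = complex(2.0, -1.0 / (2 * math.pi * f * 10e-9))
rs, cs, rp, cp, d = series_parallel(z, f)
print(f"Cs = {cs*1e9:.4f} nF, Cp = {cp*1e9:.4f} nF, D = {d:.5f}")
```

Here Cp = Cs / (1 + D^2), so for low-loss parts the two models nearly agree; as D grows the reported capacitance depends on which model is selected, which is exactly the ambiguity a swept-frequency model fit removes.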
--
Have Fun,
Brooke Clarke
http://www.PRC68.com
http://www.end2partygovernment.com/2012Issues.html
http://www.prc68.com/I/DietNutrition.html
Dr. David Kirkby (Kirkby Microwave Ltd) wrote:
We have a Wayne Kerr B905 (0.05% basic accuracy), ESI 2150 (0.02%) and
HP 4284A (0.1%) at work. They have the same working principle of
course but are very different from the calibration point of view.
The WK and ESI units have no calibration trimmers inside or user
accessible software constants. The service manuals have no calibration
instructions either, just a few checking procedures to make sure that
there is no fault.
The accuracy relies on the crystal frequency and stable range resistors.
WK uses Vishay bulk metal foil and ESI their own mica card resistors
(similar to SR1 resistance standards). The lifetime drift of these
resistors is an order of magnitude lower than the best accuracy of the meter.
Both units were manufactured in the early 80's and are still within
the specs. No repairs or adjustments ever made. They are regularly
checked against a set of 3 terminal bulk metal foil resistors
(calibrated with a 3458A) and a GR 1404A nitrogen filled standard
capacitor (which visits the national lab).
There is a huge difference with our HP 4284A (and, as far as I know,
other HP/Yokogawa impedance-line products), which needs to be adjusted
on a yearly basis. It could be the 1 MHz frequency range, which requires
a number of trimmers and more complex circuitry but also seems to ruin
the stability.
We also use the ESI 2150 to get a traceable inductance calibration
based on traceable resistance and capacitance (and frequency). The ESI
2150 inductance specifications (after the checking procedures for
capacitance and resistance) are similar to the best uncertainty
available from the high level labs.
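I don't know the ESI 2150's internal method, but the textbook way to derive inductance from traceable resistance and capacitance is the Maxwell-Wien bridge, where at balance Lx = R2 * R3 * C4 and Q = w * R4 * C4. A minimal sketch with made-up component values:

```python
import math

# Classic Maxwell-Wien balance relations for deriving inductance from
# resistance and capacitance (the textbook bridge; the ESI 2150's internal
# method may differ). At balance:
#   Lx = R2 * R3 * C4
#   Rx = R2 * R3 / R4
#   Q  = w * Lx / Rx  (= w * R4 * C4)
# All component values below are invented for illustration.

def maxwell_wien(r2, r3, r4, c4, f):
    lx = r2 * r3 * c4
    rx = r2 * r3 / r4
    q = 2 * math.pi * f * lx / rx
    return lx, rx, q

lx, rx, q = maxwell_wien(r2=1000.0, r3=1000.0, r4=10_000.0, c4=100e-9, f=1e3)
print(f"Lx = {lx*1e3:.3f} mH, Rx = {rx:.1f} Ohm, Q = {q:.3f}")
```

Since Lx depends only on two resistances and a capacitance, the inductance uncertainty follows directly from the R and C calibrations plus the bridge's residuals, which is why a well-characterized RC bridge can rival a dedicated inductance standard.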
2015-02-07 0:10 UTC+02.00, Dr. David Kirkby (Kirkby Microwave Ltd)
drkirkby@kirkbymicrowave.co.uk: