Hi
If you collect data over the entire second and average that down for a single point, then no, your ADEV will not be correct.
There are a number of papers on this. What ADEV wants to see is a single phase “sample” at one second spacing. This is
also at the root of how you get 10 second ADEV. You don’t average the ten 1 second data points. You throw nine data points
away and use one of them ( = you decimate the data ).
What happens if you ignore this? Your curve looks “too good”. The resultant curve is below the real curve when plotted.
A quick way to demonstrate this is to do ADEV with averaged vs decimated data ….
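Here’s a minimal numpy sketch of that comparison (the noise level and sample counts are invented, purely to show the effect):

    import numpy as np

    def adev_from_phase(x, tau):
        """Overlapping ADEV at one tau from phase samples spaced tau apart (seconds)."""
        d = x[2:] - 2*x[1:-1] + x[:-2]
        return np.sqrt(0.5 * np.mean(d**2)) / tau

    rng = np.random.default_rng(1)
    x = rng.normal(scale=1e-10, size=100_000)      # white phase noise, 1 s spacing

    # Decimate: keep every 10th phase sample, throw the other nine away.
    sigma_dec = adev_from_phase(x[::10], tau=10.0)

    # Average: boil each 10 s window of phase data down to a single point.
    x_avg = x[:len(x)//10*10].reshape(-1, 10).mean(axis=1)
    sigma_avg = adev_from_phase(x_avg, tau=10.0)

    print(sigma_dec, sigma_avg)   # the averaged number comes out lower ("too good")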
Bob
On May 10, 2018, at 4:46 AM, Oleg Skydan olegskydan@gmail.com wrote:
Hi
I have got a pair of not so bad OCXOs (Morion GK85). I did some measurements, and the results may be of interest to others (sorry if not), so I decided to post them.
I ran a set of 5-minute-long counter runs (the two OCXOs were measured against each other); each point is a 1 s gate frequency measurement, with a different number of timestamps used in the LR calculation (from 10 up to 5e6). The counter provides continuous counting. As you can see, I reach the HW limitations at 5..6e-12 ADEV (1 s tau) with only 1e5 timestamps. The results look reasonable: the theory predicts 27 ps equivalent resolution with 1e5 timestamps, and the sqrt(N) law is clearly seen on the plots. I do not know what the limiting factor is, whether it is the OCXOs or some counter HW.
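(If anyone wants to play with the sqrt(N) law itself, here is a tiny simulation sketch; the 100 ps stamp jitter is an assumed number for illustration, not my hardware:)

    import numpy as np

    rng = np.random.default_rng(0)

    def lr_freq_offset(n_stamps, jitter=100e-12, gate=1.0):
        """LS slope of n time stamps over the gate -> fractional frequency offset."""
        t = np.linspace(0.0, gate, n_stamps)             # nominal stamp times
        x = t + rng.normal(scale=jitter, size=n_stamps)  # measured phase (s)
        return np.polyfit(t, x, 1)[0] - 1.0

    for n in (10, 1_000, 100_000):
        runs = [lr_freq_offset(n) for _ in range(100)]
        print(n, np.std(runs))   # scatter shrinks roughly as 1/sqrt(n)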
I know there are HW problems; some of them were identified during this experiment. They were expected, since the HW is still just an ugly construction made from boards left in the "radio junk box" from other projects/experiments. I am going to move to a well-designed PCB with some improvements in the HW (and a more or less "normal" analog front end with a good comparator, ADCMP604 or something similar, for the "low frequency" input). But I want to finish my initial tests first; they should help with the HW design.
Now I have some questions. As you know, I am experimenting with a counter that uses LR calculations to improve its resolution. The LR data for each measurement is collected during the gate time only, and the measurements are continuous. Will the ADEV be calculated correctly from such measurements? I understand that any averaging over a time window larger than the single measurement time will spoil the ADEV plot. I also understand that using LR can result in an incorrect frequency estimate for a signal with large drift (this should not be a problem for the discussed measurements, at least for the numbers we are talking about).
Do the ADEV plots I got look reasonable for the "mid range" OCXOs used (see the second plot for the long run test)?
BTW, I see I can interface a GPS module to my counter without additional HW (except the module itself; do not worry, it will not be another DIY GPSDO, probably :-) ). I will try to do it. The initial idea is not to try to lock the reference OCXO to GPS; instead I will just measure GPS against REF and make corrections using pure math in SW. I see some advantages in such a design - no high resolution DAC, no reference for the DAC, no loop, no additional hardware at all - only the GPS module and software :) (it is in the spirit of this project)... Of course I will not have a reference signal that can be used outside the counter; I think I can live with that. It is worth doing some experiments.
Best!
Oleg UR3IQO
<Снимок экрана (1148).png> <Снимок экрана (1150).png> <Снимок экрана (1149).png>
I'm a bit fuzzy, then, on the definition of ADEV. I was under the
impression that one measured a series of
"phase samples" at the desired spacing, then took the RMS value of that
series, not just a single sample,
as the ADEV value.
Can anybody say which it is? The RMS approach seems to make better sense
as it provides some measure
of defense against taking a sample that happens to be an outlier, yet
avoids the flaw of tending to average
the reported ADEV towards zero.
Dana (K8YUM)
Hi
More or less:
ADEV takes the difference between phase samples and then does a standard
deviation on them. RMS of the phase samples makes a lot of sense and it was
used back in the late 50’s / early 60’s. The gotcha turns out to be that it is an
ill-behaved measure. The more data you take, the bigger the number you get
( = it does not converge ). That problem is what led NBS to dig into a better
measure. The result was ADEV.
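For reference, in the usual notation (x = phase, y = fractional frequency, samples tau apart), the two-sample (Allan) variance is:

    \sigma_y^2(\tau)
      = \tfrac{1}{2}\,\big\langle (\bar{y}_{k+1} - \bar{y}_k)^2 \big\rangle
      = \frac{1}{2\tau^2}\,\big\langle (x_{k+2} - 2x_{k+1} + x_k)^2 \big\rangle,
    \qquad
    \bar{y}_k = \frac{x_{k+1} - x_k}{\tau}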
The point about averaging vs decimation relates to what you do to the data before
you ever compute the ADEV. If you have 0.1 second samples, you have to do something
to get to a tau of 1 second or 10 seconds or … The process you use to get the data
to the proper interval turns out to matter quite a bit.
Bob
Bob, thanks for the clarification!
From: "Bob kb8tq" kb8tq@n1k.org
If you collect data over the entire second and average that down for a
single point, then no, your ADEV will not be correct.
That probably explains why I got so nice (and suspicious) plots :)
There are a number of papers on this. What ADEV wants to see is a single
phase “sample” at one second spacing.
After I read your answer I remembered some nice papers from prof. Rubiola - my bad, I could have answered my own question. Taking a single phase "sample" at the start and at the end of each tau is equivalent to summing all the timestamp intervals I collect during that tau, but by doing LR processing I calculate a weighted sum, so the results will differ. So it appears the "ADEV" calculation is really PDEV (parabolic), because of the current firmware processing.
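(A quick numerical way to see the parabolic weighting hiding inside the LR slope - illustrative only, the sample count is arbitrary:)

    import numpy as np

    n = 1001
    t = np.linspace(0.0, 1.0, n)
    # The LS slope is a linear functional of the phase samples: slope = sum(c_k * x_k)
    c = (t - t.mean()) / np.sum((t - t.mean())**2)
    # In terms of the frequency samples y_k = (x_{k+1} - x_k)/dt, the equivalent
    # frequency weighting is (up to scale) minus the running sum of c:
    w = -np.cumsum(c)[:-1]
    print(w[0], w[n//2], w[-1])   # ~0 at the gate edges, maximal mid-gate: a parabola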
I made a test with two plots for illustration - one is the classical ADEV (with 2.5 ns time resolution), the second one uses the LR-processed data (5e6 timestamps per second). Both plots are made from the same data. It is obvious that the classical ADEV is limited by the counter resolution in the left part of the plot. It is interesting whether it is possible to use the 4999998 extra points per second to improve the counter resolution in ADEV measurements without affecting the ADEV?
Thanks!
Oleg UR3IQO
<Снимок экрана (1151).png>
Hi
The most accurate answer is always “that depends”. The simple answer is no. If you take a look at some of the papers from the 90’s, you can find suggestions on filtering the first point in the series. The gotcha is that it does impact that first point. The claim is that if you do it right, it does not impact the rest of the points in the series.
Bob
The most accurate answer is always “that depends”. The simple answer is
no.
I spent yesterday evening and quite a bit of the night :) reading many interesting papers and several related discussions in the time-nuts archive (the Magnus Danielson posts in the "Modified Allan Deviation and counter averaging" and "Omega counters and Parabolic Variance (PVAR)" topics were very informative and helpful, thanks!).
It looks like the trick of combining averaging with the possibility of correct ADEV calculation in post processing exists. There is a nice presentation by prof. Rubiola [1], with a suitable solution on page 54 (at least that is how I understood it, maybe I am wrong). I can switch to usual averaging (Lambda/Delta counter) instead of the LR calculation (Omega counter); the losses should be very small in my case. With such averaging the MDEV can be correctly computed. If ADEV is needed, the averaging interval can be reduced and several measurements (more than eight) can be combined into one point, creating a new weighting function which resembles the usual Pi one, as shown in [1] p. 54 and sketched below; it should then be possible to calculate the usual ADEV from such data. As far as I understand, the filter formed by the resulting weighting function will have a wider bandwidth, so the impact on the ADEV will be smaller and it can be computed correctly. Am I missing something?
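(Here is the little sketch I mean, with arbitrary numbers: each Delta/Lambda estimate weights the frequency data with a triangle, and averaging several of them, each shifted by half a triangle width, builds a trapezoid that approaches the flat Pi weighting:)

    import numpy as np

    def lambda_weights(m):
        """Triangular frequency weighting of one Delta/Lambda estimate (2*m samples)."""
        w = np.concatenate([np.arange(1, m + 1), np.arange(m, 0, -1)]).astype(float)
        return w / w.sum()

    def combined_weights(m, n):
        """Average n Lambda estimates, each shifted by m samples: a trapezoid."""
        w = np.zeros(2*m + (n - 1)*m)
        for k in range(n):
            w[k*m:k*m + 2*m] += lambda_weights(m)
        return w / n

    print(np.round(combined_weights(m=4, n=8), 4))   # flat plateau, short end ramps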
I have made the necessary changes in the code: the firmware now computes the Delta averaging, and it also computes combined Delta-averaged measurements (resulting in a trapezoidal weighting function); both numbers are computed with continuous stamping and optimal overlapping. Everything is done in real time. I did some tests. The results are very similar to the ones made with LR counting.
[1] E. Rubiola, "High resolution time and frequency counters", updated version. http://www.rubiola.org/pdf-slides/2012T-IFCS-Counters.pdf
All the best!
Oleg UR3IQO
Hi
If you do the weighted average as indicated in the paper and compare it to a “single sample” computation,
the results are different for that time interval. To me that’s a problem. To the authors, the fact that the rest of
the curve is the same is proof that it works. I certainly agree that once you get to longer tau, the process
has no detrimental impact. There is still the problem that the first point on the graph is different depending on the technique.
The other side of all this is that ADEV is really not a very good way to test a counter. It has its quirks and its
issues. They are impacted by what is in a counter, but that’s a side effect. If one is after a general test of
counter hardware, one probably should look at other approaches.
If you are trying specifically just to measure ADEV, then there are a lot of ways to do that by itself. It’s not
clear that re-inventing the hardware is required to do this. Going with an “average down” approach ultimately
will have problems for certain signals and noise profiles.
Bob
Oleg,
On 05/10/2018 10:46 AM, Oleg Skydan wrote:
Hi
Now I have some questions. As you know I am experimenting with the
counter that uses LR calculations to improve its resolution. The LR data
for each measurement is collected during the gate time only, and the
measurements are continuous. Will the ADEV be calculated correctly from
such measurements?
Many assume yes; the actual answer is no - or, well, it depends.
ADEV assumes brick-wall filtering up to the Nyquist frequency as a result of the sample rate. When you filter the data, as you do with a Linear Regression / Least Squares estimation, the actual bandwidth will be much less, so the ADEV measures will be biased for lower taus; but for higher taus less of the kernel of the ADEV is affected by the filter, and thus the bias reduces.
It was when investigating this that Prof. Enrico Rubiola and Prof. Francois Vernotte invented the parabolic deviation, PDEV.
I understand that any averaging over a time window larger than the single measurement time will spoil the ADEV plot.
Correct.
Also I understand that using LR can result in an incorrect frequency estimate for
the signal with large drift (should not be a problem for the discussed
measurements, at least for the numbers we are talking about).
That is a result of not using an LR/LS method that supports drift; with such a method, the drift's effect on the frequency estimate is greatly reduced.
An LR fit of phase and frequency only, i.e. only the linear components, cannot correctly handle the quadratic phase of a linear frequency drift. Thus, by using an LR model that includes the quadratic term, the drift's influence on the frequency estimate can be reduced.
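A toy example of that point (all numbers invented): fit the phase with and without the quadratic term and compare the recovered frequency offset.

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 1.0, 10_000)
    f0, drift = 1e-9, 1e-10                 # true offset and linear drift (1/s)
    x = f0*t + 0.5*drift*t**2 + rng.normal(scale=1e-10, size=t.size)

    f_lin = np.polyfit(t, x, 1)[0]          # linear model: drift leaks into the slope
    f_quad = np.polyfit(t, x, 2)[1]         # quadratic model: slope term is cleaner
    print(f_lin - f0, f_quad - f0)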
Do the ADEV plots I got look reasonable for the used "mid range"
OCXOs (see the second plot for the long run test)?
You probably want to find the source of the wavy response in the orange and red traces.
Cheers,
Magnus
Hi Dana,
On 05/10/2018 06:17 PM, Dana Whitlow wrote:
I'm a bit fuzzy, then, on the definition of ADEV. I was under the
impression that one measured a series of
"phase samples" at the desired spacing, then took the RMS value of that
series, not just a single sample,
as the ADEV value.
You cannot use RMS here, as the noise does not converge as you build the average. This was a huge issue before the Allan processing approach, which essentially averages 2-point RMS measurements, each of which boils down to the difference of two frequency measures.
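A quick illustration with simulated random-walk FM (arbitrary noise level):

    import numpy as np

    rng = np.random.default_rng(0)
    y = np.cumsum(rng.normal(scale=1e-12, size=200_000))      # random-walk FM

    for n in (1_000, 10_000, 100_000):
        seg = y[:n]
        rms = np.std(seg)                                     # keeps growing with n
        adev = np.sqrt(0.5 * np.mean(np.diff(seg)**2))        # two-sample: converges
        print(n, rms, adev)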
Can anybody say which it is? The RMS approach seems to make better sense
as it provides some measure
of defense against taking a sample that happens to be an outlier, yet
avoids the flaw of tending to average
the reported ADEV towards zero.
Forget about RMS here.
Cheers,
Magnus
Oleg,
On 05/11/2018 04:42 PM, Oleg Skydan wrote:
The most accurate answer is always “that depends”. The simple answer
is no.
I spent yesterday evening and quite a bit of the night :)
reading many interesting papers and several related discussions in the
time-nuts archive (the Magnus Danielson posts in "Modified Allan
Deviation and counter averaging" and "Omega counters and Parabolic
Variance (PVAR)" topics were very informative and helpful, thanks!).
You are welcome. Good that people have use for them.
It looks like the trick to combine averaging with the possibility of
correct ADEV calculation in the post processing exists. There is a nice
presentation made by prof. Rubiola [1]. There is a suitable solution on
page 54 (at least I understood it so, maybe I am wrong). I can switch to
usual averaging (Lambda/Delta counter) instead of LR calculation (Omega
counter), the losses should be very small in my case. With such averaging
the MDEV can be correctly computed.
In fact, you can build an Omega-style counter you can use for PDEV; you just need the right approach to be able to decimate the data. Oh, there's a draft paper on that:
https://arxiv.org/abs/1604.01004
Need to update that one.
If ADEV is needed, the averaging
interval can be reduced and several measurements (more than eight) can
be combined into one point (creating the new weighting function which
resembles the usual Pi one, as shown in [1] p. 54), it should be
possible to calculate usual ADEV using such data. As far as I
understand, the filter which is formed by the resulting weighting
function will have wider bandwidth, so the impact on ADEV will be
smaller and it can be computed correctly. Am I missing something?
Well, you can reduce the averaging interval to 1 and then compute the ADEV, but then it no longer behaves like the MDEV.
What you can do is calculate the MDEV or PDEV, and then apply the suitable bias function to convert the level to that of ADEV.
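For example, a minimal sketch of estimating such a bias ratio empirically on simulated noise (white FM here; the ratio depends on the dominant noise type):

    import numpy as np

    def adev(x, m, tau0=1.0):
        d = x[2*m:] - 2*x[m:-m] + x[:-2*m]
        return np.sqrt(0.5 * np.mean(d**2)) / (m * tau0)

    def mdev(x, m, tau0=1.0):
        xm = np.convolve(x, np.ones(m)/m, mode="valid")   # average the phase over m first
        d = xm[2*m:] - 2*xm[m:-m] + xm[:-2*m]
        return np.sqrt(0.5 * np.mean(d**2)) / (m * tau0)

    rng = np.random.default_rng(3)
    x = np.cumsum(rng.normal(scale=1e-11, size=500_000))  # white FM (phase = sum of y)
    print(mdev(x, 100) / adev(x, 100))   # roughly sqrt(1/2) for white FM at large m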
I have made the necessary changes in code, now firmware computes the
Delta averaging, also it computes combined Delta averaged measurements
(resulting in trapezoidal weighting function), both numbers are computed
with continuous stamping and optimal overlapping. Everything is done in
real time. I did some tests. The results are very similar to the ones
made with LR counting.
Yes, they give relatively close values of deviation, with PDEV going somewhat lower, indicating a slight advantage of the LR/LS frequency estimation over that of the Delta counter, as given by its MDEV.
Cheers,
Magnus