MD
Magnus Danielson
Mon, Nov 27, 2017 11:44 PM
Hi Jim,
On 11/28/2017 12:03 AM, jimlux wrote:
On 11/27/17 2:45 PM, Magnus Danielson wrote:
>> There is nothing wrong about attempting new approaches, or even just
>> testing an idea to see how it pans out. You should then compare it to a
>> number of other approaches, and as you test things, you should analyze
>> the same data with different methods. Prototyping that in Python is
>> fine, but in order to analyze it, you need to be careful about the
>> details.
>>
>> I would consider one paper just doing the measurements and then trying
>> different post-processings to see how those vary.
>> Another paper then takes up on that and attempts an analysis that matches
>> the numbers from actual measurements.
>>
>> So, we might provide tough love, but there is a bit of experience
>> behind it, so it should be listened to carefully.
>
> It is tough to come up with good artificial test data - the literature
> on generating "noise samples" is significantly thinner than the
> literature on measuring the noise.
Agree completely. It's really the 1/f flicker noise that is hard.
The white phase and frequency noise forms are trivial in comparison, but
they also need care in the details.
Sufficiently Gaussian is sometimes harder to achieve than expected; I always
treat it as a possible limitation.
Sufficiently random is another issue: what is the length of the noise
source, and what are its characteristics?
> When it comes to measuring actual signals with actual ADCs, there are also
> a number of traps - you can design a nice approach, using the SNR/ENOB
> data from the data sheet, and get seemingly good data.
>
> The challenge is really in coming up with good *tests* of your
> measurement technique that show that it really is giving you what you
> think it is.
>
> A trivial example is this (not a noise measuring problem, per se) -
>
> You need to measure the power of a received signal - if the signal is
> narrow band, and high SNR, then the bandwidth of the measuring system
> (be it an FFT or a conventional spectrum analyzer) doesn't make a lot of
> difference - the precise filter shape is non-critical. The noise power
> that winds up in the measurement bandwidth is small, for instance.
>
> But now, let's say that the signal is a bit wider band or lower SNR, or
> you're uncertain of its exact frequency; then the shape of the filter
> starts to make a big difference.
>
> Now, let's look at a system where there's some decimation involved - any
> decimation raises the prospect of "out of band" signals aliasing into
> the post-decimation passband. Now, all of a sudden, the filtering
> before the decimator starts to become more important. And the number of
> bits you have to carry starts being more important.
There is a risk of wasting bits by decimating too early. The trouble
comes when the actual signal is way below the noise and you want to
bring it out in post-processing; the limited dynamic range will haunt you.
This has been shown many times before.
Also, noise and quantization have an interesting interaction.
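The aliasing hazard Jim describes is easy to reproduce: decimating without an anti-alias filter folds an out-of-band tone into the post-decimation passband (a minimal numpy sketch with made-up frequencies, not anyone's actual system):

```python
import numpy as np

fs, n, r = 8000.0, 8000, 4                 # original rate, length, decimation ratio
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 2700.0 * t)         # 2700 Hz tone, above the 1000 Hz
                                           # Nyquist of the decimated stream

naive = x[::r]                             # decimate with NO filtering first
freqs = np.fft.rfftfreq(len(naive), d=r / fs)
peak = freqs[np.argmax(np.abs(np.fft.rfft(naive)))]
print(peak)                                # the tone folds to 2700 - 2000 = 700 Hz
```

With a proper lowpass ahead of the decimator the tone would be attenuated instead of folded; and as Magnus notes, the word length carried through that filter then sets how far below the noise you can still dig.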
> It actually took a fair amount of work to *prove* that a system I was
> working on
> a) accurately measured the signal (in the presence of other large signals),
> b) that there weren't numerical issues causing the strong signal to show
> up in the low-level signal filter bins, and
> c) that the measured noise floor matched the expectation.
It's tricky business indeed. The cross-correlation technique could
potentially measure below its own noise floor. It turns out to be very,
very, VERY hard to do that safely. It remains a research topic. At best
we just barely managed to work around the issue. That is indeed a
high-dynamic-range setup.
Cheers,
Magnus
AK
Attila Kinali
Tue, Nov 28, 2017 8:27 AM
> Every experimentalist supposes ergodicity for this kind of noise; otherwise
> you get nowhere.
Err.. no. Even if you assume that the spectrum tops off at some very
low frequency and does not increase anymore, i.e. that there is a finite
limit to the noise power, even then ergodicity is not given.
Ergodicity breaks because the noise process is not stationary.
And assuming it for any kind of 1/f noise would be wrong.
Addendum: the reason this is wrong is that assuming the noise is
ergodic implies it is stationary. But the reason we have to
treat 1/f noise specially is precisely that it is not stationary.
I.e., we lose the one property in our model that we need to make
the model realistic.
Attila Kinali
--
<JaberWorky> The bad part of Zurich is where the degenerates
throw DARK chocolate at you.
MR
Mattia Rizzi
Tue, Nov 28, 2017 8:52 AM
> This is true. But then the Fourier transformation integrates time from
> minus infinity to plus infinity. Which isn't exactly realistic either.
That's the theory. I am not arguing that it's realistic.
> Ergodicity breaks because the noise process is not stationary.
I know, but see the following.
> Well, any measurement is an estimate.
It's not so simple. If you don't assume ergodicity, your spectrum analyzer
does not work, because:
1) The spectrum analyzer takes several snapshots of your realization to
estimate the PSD. If the process is not stationary, the estimate does not
converge.
2) You have just a single realization, so even a flat signal could be a
realization of 1/f flicker noise. Your measurement has *zero* statistical
significance.
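For contrast, here is what snapshot averaging does accomplish in the stationary case: averaging the periodograms of consecutive segments of one white-noise realization (Bartlett's method, a simplification of what an FFT analyzer averages; my sketch, not from the thread) tightens the PSD estimate around the true flat spectrum.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1 << 16)          # one realization of stationary white noise
seg = 1024

# Bartlett's method: average the periodograms of consecutive snapshots
segments = x.reshape(-1, seg)
avg_psd = np.mean(np.abs(np.fft.rfft(segments, axis=1)) ** 2, axis=0) / seg

single_psd = np.abs(np.fft.rfft(x[:seg])) ** 2 / seg

# With 64 averages the scatter around the true flat PSD (= 1) shrinks markedly
print(np.std(single_psd[1:]), np.std(avg_psd[1:]))
```

The whole construction leans on stationarity: every segment is statistically the same, so the average converges to a fixed target. That is precisely what 1/f noise does not grant.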
2017-11-27 23:50 GMT+01:00 Attila Kinali <attila@kinali.ch>:
> > > To make the point a bit more clear. The above means that for noise with
> > > a PSD of the form 1/f^a for a >= 1 (i.e. flicker phase, white frequency
> > > and flicker frequency noise), the noise (aka the random variable) is:
> > > 1) Not independently distributed
> > > 2) Not stationary
> > > 3) Not ergodic
> >
> > I think you got too deep into theory. If you follow the statistics
> > theory strictly, you get nowhere.
> > You can't even talk about a 1/f PSD, because the Fourier transform
> > doesn't converge for infinite-power signals.
>
> This is true. But then the Fourier transformation integrates time from
> minus infinity to plus infinity. Which isn't exactly realistic either.
> The power in 1/f noise is actually limited by the age of the universe.
> And quite strictly so. The power you have in 1/f noise is the same for
> every decade in frequency (or time) you go. The age of the universe is
> about 1e10 years, that's roughly 3e17 seconds, i.e. 17 decades of
> possible noise.
> If we assume something like a 1k carbon resistor, you get around
> 1e-17 W/decade of noise power (a guesstimate, not an exact calculation).
> That means that resistor, had it been around ever since the universe was
> created, would have converted 17*1e-17 ≈ 2e-16 W of heat into
> electrical energy, on average, over the whole lifetime of the universe.
> That's not much :-)
>
> > In fact, you are not allowed to take a realization, make several FFTs
> > and claim that that's the PSD of the process. But that's what the
> > spectrum analyzer does, because it's not a multiverse instrument.
>
> Well, any measurement is an estimate.
>
> > Every experimentalist supposes ergodicity for this kind of noise,
> > otherwise you get nowhere.
>
> Err.. no. Even if you assume that the spectrum tops off at some very
> low frequency and does not increase anymore, i.e. that there is a finite
> limit to the noise power, even then ergodicity is not given.
> Ergodicity breaks because the noise process is not stationary.
> And assuming it for any kind of 1/f noise would be wrong.
>
> Attila Kinali
> --
> <JaberWorky> The bad part of Zurich is where the degenerates
> throw DARK chocolate at you.
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
AK
Attila Kinali
Tue, Nov 28, 2017 1:12 PM
> > Well, any measurement is an estimate.
>
> It's not so simple. If you don't assume ergodicity, your spectrum analyzer
> does not work, because:
> 1) The spectrum analyzer takes several snapshots of your realization to
> estimate the PSD. If it's not stationary, the estimate does not converge.
I do not see how ergodicity has anything to do with a spectrum analyzer.
You are measuring one single instance, not multiple.
And no, you do not need stationarity either. The spectrum analyzer has
a lower cutoff frequency, which is given by its update rate and the
inner workings of the SA.
> 2) It's just a single realization, therefore also a flat signal can be a
> realization of 1/f flicker noise. Your measurement has *zero* statistical
> significance.
A flat signal cannot be the realization of a random variable with
a PSD ~ 1/f. At least not for a statistically significant number
of time samples. If it were, then the random variable would not
have a PSD of 1/f. If you go back to the definition of the PSD of
a random variable X(ω,t), you will see it is independent of ω.
And about statistical significance: yes, you will have zero statistical
significance about the behaviour of the population of random variables,
but you will have a statistically significant number of samples of *one*
realization of the random variable. And that's what you work with.
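Attila's claim can be checked numerically: even a single finite realization of spectrally shaped 1/f noise yields a periodogram whose log-log slope sits near -1, nothing like a flat signal (a hedged numpy sketch; the FFT-shaping generator is my illustration, not anyone's tool from the thread):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1 << 14
f = np.fft.rfftfreq(n)
f[0] = f[1]                                 # keep the DC bin finite
# one realization with PSD ~ 1/f, via frequency-domain shaping of white noise
x = np.fft.irfft(np.fft.rfft(rng.standard_normal(n)) / np.sqrt(f), n)

# log-log slope of this single realization's periodogram
psd = np.abs(np.fft.rfft(x)) ** 2
k = np.arange(1, len(psd))
slope = np.polyfit(np.log(k), np.log(psd[1:]), 1)[0]
print(slope)                                # near -1 for this realization
```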
Attila Kinali
--
It is upon moral qualities that a society is ultimately founded. All
the prosperity and technological sophistication in the world is of no
use without that foundation.
-- Miss Matheson, The Diamond Age, Neil Stephenson
BK
Bob kb8tq
Tue, Nov 28, 2017 2:05 PM
> > > Well, any measurement is an estimate.
> >
> > It's not so simple. If you don't assume ergodicity, your spectrum analyzer
> > does not work, because:
> > 1) The spectrum analyzer takes several snapshots of your realization to
> > estimate the PSD. If it's not stationary, the estimate does not converge.
>
> I do not see how ergodicity has anything to do with a spectrum analyzer.
> You are measuring one single instance. Not multiple.
> And no, you do not need stationarity either. The spectrum analyzer has
> a lower cutoff frequency, which is given by its update rate and the
> inner workings of the SA.
Coming back to ADEV: just as an SA has an upper frequency, a lower frequency,
a couple of bandwidths, and persistence, other devices have associated specs.
As these specs change, so do the readings you get. A noise floor measured with
a 100 kHz bandwidth is obviously different from one measured with a 1 kHz
bandwidth. That's easy to spot looking at noise. If you are looking at a sine
wave tone, it may not be so obvious. Finding the "right" signal to show this
or that with ADEV is not always easy. With some of these techniques, people
have been digging into this and that for decades.
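The bandwidth comparison is simple arithmetic for a flat (white) noise floor: measured noise power scales with the resolution bandwidth, so 100 kHz vs 1 kHz is a 20 dB difference (illustrative density value, not a measurement):

```python
import math

n0 = 1e-17                      # W/Hz, an assumed white noise density
p_wide = n0 * 100e3             # power in a 100 kHz measurement bandwidth
p_narrow = n0 * 1e3             # power in a 1 kHz measurement bandwidth
delta_db = 10 * math.log10(p_wide / p_narrow)
print(delta_db)                 # 20.0 dB difference in apparent noise floor
```

A sine-wave tone, by contrast, keeps (nearly) the same measured power as the bandwidth shrinks, which is why the discrepancy is harder to spot on a tone.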
Bob
> > 2) It's just a single realization, therefore also a flat signal can be a
> > realization of 1/f flicker noise. Your measurement has *zero* statistical
> > significance.
>
> A flat signal cannot be the realization of a random variable with
> a PSD ~ 1/f. At least not for a statistically significant number
> of time samples. If it were, then the random variable would not
> have a PSD of 1/f. If you go back to the definition of the PSD of
> a random variable X(ω,t), you will see it is independent of ω.
>
> And about statistical significance: yes, you will have zero statistical
> significance about the behaviour of the population of random variables,
> but you will have a statistically significant number of samples of *one*
> realization of the random variable. And that's what you work with.
>
> Attila Kinali
> --
> It is upon moral qualities that a society is ultimately founded. All
> the prosperity and technological sophistication in the world is of no
> use without that foundation.
> -- Miss Matheson, The Diamond Age, Neil Stephenson
D
djl
Tue, Nov 28, 2017 7:23 PM
It's true that the models depend on the noise statistics being i.i.d., that
is, ergodic. That's the first assumption, and while it makes the math
tractable, it is the worst assumption.
Don
--
Dr. Don Latham
PO Box 404, Frenchtown, MT, 59834
VOX: 406-626-4304
MR
Mattia Rizzi
Thu, Nov 30, 2017 11:44 AM
> True that the models depend on the noise statistics to be iid, that is
> ergodic. That's the first assumption, and, while making the math tractable,
> is the worst assumption.
I am not talking about intractable math. I'm talking about experimental
hypotheses vs. theory.
I said that if you follow the theory strictly, you cannot claim anything
about any quantity you're measuring that may contain a flicker process.
Therefore, any experimentalist supposes ergodicity in his measurements.
@Attila
> I do not see how ergodicity has anything to do with a spectrum analyzer.
> You are measuring one single instance. Not multiple. [...] And about
> statistical significance: yes, you will have zero statistical significance
> about the behaviour of the population of random variables, but you will
> have a statistically significant number of samples of one realization of
> the random variable. And that's what you work with.
Let me emphasize your sentence: "you will have a statistically significant
number of samples of one realization of the random variable.".
This sentence is the meaning of ergodic process [
https://en.wikipedia.org/wiki/Ergodic_process]
If it's ergodic, you can characterize the stochastic process using only one
realization.
If it's not, your measurement is worthless, because there's no guarantee
that it contains all the statistical information.
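[The distinction can be sketched numerically. This is a toy illustration of my own, not anything measured in the thread: for an ergodic process the time average of one realization recovers the ensemble mean; for a non-ergodic process it does not.]

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 200, 20_000             # M realizations ("universes"), N time samples

# Ergodic case: stationary white Gaussian noise, ensemble mean 0.
white = rng.normal(0.0, 1.0, size=(M, N))
print(white[0].mean())         # time average of ONE realization: ~0
print(white[:, 0].mean())      # ensemble average at one instant:  ~0

# Non-ergodic case: each realization carries a frozen random offset.
offsets = rng.normal(0.0, 1.0, size=(M, 1))
nonerg = offsets + rng.normal(0.0, 0.01, size=(M, N))
print(nonerg[0].mean())        # ~offsets[0]: the time average of one
                               # realization never recovers the ensemble mean 0
```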
> A flat signal cannot be the realization of a random variable with
> a PSD ~ 1/f. At least not for a statistically significant number
> of time-samples
Without ergodicity you cannot claim that. You have to suppose ergodicity.
> And no, you do not need stationarity either. The spectrum analyzer has
> a lower cut-off frequency, which is given by its update rate and the
> inner workings of the SA.
You need stationarity. Your SA takes several snapshots of the realization,
with an assumption: the characteristics of the stochastic process are not
changing over time. If the stochastic process is stationary, the
autocorrelation function doesn't depend on time. So you are authorized to
take several snapshots, compensate for the observation time (low cut-off
frequency) (*), and be sure that the estimated PSD will converge to
something meaningful.
If it's not stationary, it can change over time, therefore you are not
authorized to use a SA. It's like measuring the transfer function of a
time-varying filter (i.e. an LTV system): the estimate doesn't converge.
cheers,
Mattia
(*) You can compensate the measured PSD to mimic the stochastic process
PSD, because the SA is an LTI system.
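[The snapshot averaging described above is essentially Bartlett's method. A minimal sketch of my own, on made-up unit-variance white noise where the true answer is known, shows why stationarity is what makes the average converge:]

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1.0                                  # sample rate (arbitrary units)
nseg, seglen = 256, 1024
x = rng.normal(0.0, 1.0, nseg * seglen)   # one realization of stationary white noise

# Average the periodograms of many snapshots (Bartlett's method).  This is
# only legitimate because every snapshot has the same statistics, i.e. the
# process is stationary.
segs = x.reshape(nseg, seglen)
per = np.abs(np.fft.rfft(segs, axis=1)) ** 2 / (seglen * fs)
psd = 2.0 * per.mean(axis=0)              # averaged one-sided PSD estimate

# Unit-variance white noise has a flat one-sided PSD of sigma^2/(fs/2) = 2.
print(psd[1:-1].mean())                   # converges near 2.0
```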
2017-11-28 20:23 GMT+01:00 djl <djl@montana.com>:
> True that the models depend on the noise statistics to be iid, that is
> ergodic. That's the first assumption, and, while making the math tractable,
> is the worst assumption.
> Don
> On 2017-11-28 01:52, Mattia Rizzi wrote:
>> Hi
>>> This is true. But then the Fourier transformation integrates time from
>>> minus infinity to plus infinity. Which isn't exactly realistic either.
>> That's the theory. I am not arguing that it's realistic.
>>> Ergodicity breaks because the noise process is not stationary.
>> I know, but see the following.
>>> Well, any measurement is an estimate.
>> It's not so simple. If you don't assume ergodicity, your spectrum analyzer
>> does not work, because:
>> 1) The spectrum analyzer takes several snapshots of your realization to
>> estimate the PSD. If it's not stationary, the estimate does not converge.
>> 2) It's just a single realization, therefore even a flat signal can be a
>> realization of 1/f flicker noise. Your measurement has *zero* statistical
>> significance.
>> 2017-11-27 23:50 GMT+01:00 Attila Kinali <attila@kinali.ch>:
>>> Hoi Mattia,
>>>>> To make the point a bit more clear. The above means that for noise with
>>>>> a PSD of the form 1/f^a for a >= 1 (ie flicker phase, white frequency
>>>>> and flicker frequency noise), the noise (aka random variable) is:
>>>>> 1) Not independently distributed
>>>>> 2) Not stationary
>>>>> 3) Not ergodic
>>>> I think you got too much into theory. If you follow the statistics
>>>> theory strictly, you get nowhere.
>>>> You can't even talk about a 1/f PSD, because the Fourier transform
>>>> doesn't converge for infinite-power signals.
>>> This is true. But then the Fourier transformation integrates time from
>>> minus infinity to plus infinity. Which isn't exactly realistic either.
>>> The power in 1/f noise is actually limited by the age of the universe.
>>> And quite strictly so. The power you have in 1/f is the same for every
>>> decade in frequency (or time) you go. The age of the universe is about
>>> 1e10 years, that's roughly 3e17 seconds, ie 17 decades of possible noise.
>>> If we assume something like a 1k carbon resistor, you get something around
>>> 1e-17 W/decade of noise power (a guesstimate, not an exact calculation).
>>> That means that resistor, had it been around ever since the universe was
>>> created, would have converted 17*1e-17 ~ 2e-16 W of heat into
>>> electrical energy, on average, over the whole lifetime of the universe.
>>> That's not much :-)
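[That back-of-envelope can be written out explicitly. The 1 Hz upper edge of the flicker region here is my own assumption, chosen only to make the decade count concrete:]

```python
import math

age_universe_s = 3e17            # ~1e10 years in seconds
f_low = 1.0 / age_universe_s     # lowest Fourier frequency ever observable
f_high = 1.0                     # assumed upper edge of the flicker region (Hz)
p_per_decade = 1e-17             # W/decade, the guesstimate for a 1k resistor

decades = math.log10(f_high / f_low)     # ~17.5 decades of 1/f noise
total_power = decades * p_per_decade     # ~2e-16 W, matching the estimate above
print(decades, total_power)
```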
>>>> In fact, you are not allowed to take a realization, make several FFTs
>>>> and claim that that's the PSD of the process. But that's what the
>>>> spectrum analyzer does, because it's not a multiverse instrument.
>>> Well, any measurement is an estimate.
>>>> Every experimentalist supposes ergodicity on this kind of noise,
>>>> otherwise you get nowhere.
>>> Err.. no. Even if you assume that the spectrum tops off at some very
>>> low frequency and does not increase anymore, ie that there is a finite
>>> limit to noise power, even then ergodicity is not given.
>>> Ergodicity breaks because the noise process is not stationary.
>>> And assuming so for any kind of 1/f noise would be wrong.
>>> Attila Kinali
>>> --
>>> <JaberWorky> The bad part of Zurich is where the degenerates
>>> throw DARK chocolate at you.
AK
Attila Kinali
Thu, Nov 30, 2017 2:40 PM
> Let me emphasize your sentence: "you will have a statistically significant
> number of samples of *one* realization of the random variable."
> That sentence is the definition of an ergodic process
> [https://en.wikipedia.org/wiki/Ergodic_process].
> If it's ergodic, you can characterize the stochastic process using only one
> realization.
> If it's not, your measurement is worthless, because there's no guarantee
> that it contains all the statistical information.
You are mixing up ergodicity and reproducibility.
Also, you are moving the goalpost.
We usually want to characterize a single clock or oscillator,
not a production lot. As such, we only care about the statistical
properties of that single instance. If you want to verify that your
production lot has consistent performance metrics, then this is a
completely different goal and requires a different methodology. But
in the end it will boil down to measuring each clock/oscillator
individually to make sure it fulfils the specs.
>> A flat signal cannot be the realization of a random variable with
>> a PSD ~ 1/f. At least not for a statistically significant number
>> of time-samples
> Without ergodicity you cannot claim it. You have to suppose ergodicity.
If you demand ergodicity, you cannot have 1/f.
You can have only one or the other. Not both.
And if you choose ergodicity, you will not faithfully model a clock.
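[The "one or the other" point can be illustrated with a toy simulation of my own, with flicker faked by 1/sqrt(f) spectral shaping: for white FM the time average converges as you average longer; for flicker FM it does not.]

```python
import numpy as np

rng = np.random.default_rng(2)

def flicker(n, rng):
    """Crude flicker (PSD ~ 1/f) generator: shape white noise in the
    frequency domain with a 1/sqrt(f) magnitude, then keep half the
    record to reduce circular-convolution artifacts."""
    X = np.fft.rfft(rng.normal(size=2 * n))
    f = np.fft.rfftfreq(2 * n)
    X[1:] /= np.sqrt(f[1:])
    X[0] = 0.0
    return np.fft.irfft(X)[:n]

def adev(y, m):
    """Non-overlapped Allan deviation of fractional-frequency data y
    at averaging factor m."""
    yb = y[: len(y) // m * m].reshape(-1, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(yb) ** 2))

n = 2 ** 16
white_y = rng.normal(size=n)     # white FM noise
flick_y = flicker(n, rng)        # flicker FM noise

for m in (1, 4, 16, 64):
    print(m, adev(white_y, m), adev(flick_y, m))
# White FM: ADEV falls as 1/sqrt(m), so longer averaging converges.
# Flicker FM: ADEV stays roughly flat -- averaging longer buys nothing.
```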
> If it's not stationary, it can change over time, therefore you are not
> authorized to use a SA. It's like measuring the transfer function of a
> time-varying filter (e.g. LTV system), the estimate doesn't converge.
Please take one of the SAs you have at CERN, measure an oscillator
for a long time and note down the center frequency with each measurement.
I promise you, you will be astonished.
Attila Kinali
--
It is upon moral qualities that a society is ultimately founded. All
the prosperity and technological sophistication in the world is of no
use without that foundation.
-- Miss Matheson, The Diamond Age, Neil Stephenson
MD
Magnus Danielson
Thu, Nov 30, 2017 4:10 PM
On 11/30/2017 03:40 PM, Attila Kinali wrote:
>> Let me emphasize your sentence: "you will have a statistically significant
>> number of samples of *one* realization of the random variable."
>> That sentence is the definition of an ergodic process
>> [https://en.wikipedia.org/wiki/Ergodic_process].
>> If it's ergodic, you can characterize the stochastic process using only one
>> realization.
>> If it's not, your measurement is worthless, because there's no guarantee
>> that it contains all the statistical information.
> You are mixing up ergodicity and reproducibility.
> Also, you are moving the goalpost.
> We usually want to characterize a single clock or oscillator,
> not a production lot. As such, we only care about the statistical
> properties of that single instance. If you want to verify that your
> production lot has consistent performance metrics, then this is a
> completely different goal and requires a different methodology. But
> in the end it will boil down to measuring each clock/oscillator
> individually to make sure it fulfils the specs.
>>> A flat signal cannot be the realization of a random variable with
>>> a PSD ~ 1/f. At least not for a statistically significant number
>>> of time-samples
>> Without ergodicity you cannot claim it. You have to suppose ergodicity.
> If you demand ergodicity, you cannot have 1/f.
> You can have only one or the other. Not both.
> And if you choose ergodicity, you will not faithfully model a clock.
>> If it's not stationary, it can change over time, therefore you are not
>> authorized to use a SA. It's like measuring the transfer function of a
>> time-varying filter (e.g. LTV system), the estimate doesn't converge.
> Please take one of the SAs you have at CERN, measure an oscillator
> for a long time and note down the center frequency with each measurement.
> I promise you, you will be astonished.
After tons of measurements and attempts at theory, a model was formed
that was sufficiently consistent with measurements.
The model that fits observation makes much of the traditional
statistical measures and definitions "tricky" to apply.
Flicker, that is a PSD of 1/f, is still tricky to hunt down to its real
root cause and model, so we just use an approximation in its place because
we need something to work with. Even without flicker, the white
frequency noise messes with us.
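[One common engineering stand-in of that kind (a sketch under my own assumptions, not anything specific from the thread) approximates 1/f by summing first-order relaxation (Lorentzian) processes with corner frequencies spaced one per decade:]

```python
import numpy as np

rng = np.random.default_rng(3)

def flicker_approx(n, decades=5):
    """Approximate 1/f noise as a sum of AR(1) (Lorentzian) processes whose
    corner frequencies are spaced one per decade.  Between the lowest and
    highest corners the summed PSD is roughly proportional to 1/f."""
    y = np.zeros(n)
    for k in range(decades):
        fc = 10.0 ** (-k - 1)              # corner frequency, cycles/sample
        a = np.exp(-2.0 * np.pi * fc)      # AR(1) pole for this corner
        w = rng.normal(size=n)
        s = np.empty(n)
        acc = 0.0
        for i in range(n):                 # one-pole low-pass: s = a*s + w
            acc = a * acc + w[i]
            s[i] = acc
        y += s * np.sqrt(fc)               # weight so each decade contributes equally
    return y

x = flicker_approx(2 ** 15)

# Averaged periodogram: low-frequency power should dominate, roughly as 1/f.
segs = x.reshape(16, 2048)
psd = (np.abs(np.fft.rfft(segs, axis=1)) ** 2).mean(axis=0)
print(psd[2] / psd[200])                   # strong low-frequency excess
```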
This thread seems to have lost contact with these aspects.
Cheers,
Magnus
BK
Bob kb8tq
Thu, Nov 30, 2017 5:04 PM
>>> Let me emphasize your sentence: "you will have a statistically significant
>>> number of samples of *one* realization of the random variable."
>>> That sentence is the definition of an ergodic process
>>> [https://en.wikipedia.org/wiki/Ergodic_process].
>>> If it's ergodic, you can characterize the stochastic process using only one
>>> realization.
>>> If it's not, your measurement is worthless, because there's no guarantee
>>> that it contains all the statistical information.
>> You are mixing up ergodicity and reproducibility.
>> Also, you are moving the goalpost.
>> We usually want to characterize a single clock or oscillator,
>> not a production lot. As such, we only care about the statistical
>> properties of that single instance. If you want to verify that your
>> production lot has consistent performance metrics, then this is a
>> completely different goal and requires a different methodology. But
>> in the end it will boil down to measuring each clock/oscillator
>> individually to make sure it fulfils the specs.
>>>> A flat signal cannot be the realization of a random variable with
>>>> a PSD ~ 1/f. At least not for a statistically significant number
>>>> of time-samples
>>> Without ergodicity you cannot claim it. You have to suppose ergodicity.
>> If you demand ergodicity, you cannot have 1/f.
>> You can have only one or the other. Not both.
>> And if you choose ergodicity, you will not faithfully model a clock.
>>> If it's not stationary, it can change over time, therefore you are not
>>> authorized to use a SA. It's like measuring the transfer function of a
>>> time-varying filter (e.g. LTV system), the estimate doesn't converge.
>> Please take one of the SAs you have at CERN, measure an oscillator
>> for a long time and note down the center frequency with each measurement.
>> I promise you, you will be astonished.
> After tons of measurements and attempts at theory, a model was formed that was sufficiently consistent with measurements.
> The model that fits observation makes much of the traditional statistical measures and definitions "tricky" to apply.
> Flicker, that is a PSD of 1/f, is still tricky to hunt down to its real root cause and model, so we just use an approximation in its place because we need something to work with.
I believe that was roughly the third thing the prof said when he introduced
1/f noise back when I was in school. It *might* have been
the fourth thing … that was a long time ago …
Bob