time-nuts@lists.febo.com

Discussion of precise time and frequency measurement


Re: [time-nuts] "direct" RS-232 vs. RS-232 via USB vs. PPS decoding cards

Hal Murray
Wed, Feb 22, 2017 11:37 AM

tvb@LeapSecond.com said:
> It seems to me that if the read path and the write path are different it
> breaks down. Given how USB works, not to mention all the layers of software
> involved I can't imagine the paths are equal. There's no doubt you can get
> great consistency from your method. But turning that into precise time
> requires some kind of calibration of the actual code path delays.

There are 3 clocks: PC, USB, and Gizmo on the end of the USB.

We could get a USB logic analyzer.  That would give us some more data on the
USB timings.  (There are probably some limits in the specs, but I won't be
surprised if somebody doesn't get it right..)

I think the problem is how to get accurate timing on a write from the PC.  With
that, we could collect some data, wave our hands, and make an estimate on the
error bars.

The USB firmware has a schedule with slots for the Gizmo.  (I don't know the
USB details, so this is a bit handwavy.)  You can get synced with the USB
schedule by doing a read.  Then pause a bit, do a write, then a read.  The
Gizmo returns time stamps (its clock) for the write and read.  Now adjust the
length of the pause to minimize the PC time between the start of the write and
the finish of the read.  That gives you an upper bound on the time error
between the Gizmo clock and the PC clock.
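
A rough Python sketch of that loop, assuming a hypothetical Gizmo that answers
a short ASCII query ("T?" here) with a time stamp from its own clock - the
command, port name, and serial settings are invented for illustration, and the
real device could stamp the write and the read separately as described above:

import time
import serial   # pyserial

def bound_gizmo_offset(port="/dev/ttyUSB0", pauses_us=range(0, 1000, 25)):
    gizmo = serial.Serial(port, 115200, timeout=1)
    best = None
    for pause in pauses_us:
        gizmo.reset_input_buffer()
        gizmo.write(b"T?\n")            # throwaway round trip to get synced
        gizmo.readline()                #   with the USB schedule
        time.sleep(pause * 1e-6)        # the adjustable pause
        t_start = time.monotonic()      # PC time at start of the write
        gizmo.write(b"T?\n")
        reply = gizmo.readline()        # Gizmo's time stamp (its clock)
        t_end = time.monotonic()        # PC time at finish of the read
        rtt = t_end - t_start
        if best is None or rtt < best[0]:
            best = (rtt, pause, reply)
    return best   # smallest round trip found, the pause that gave it, reply

The smallest round trip found this way bounds the offset between the Gizmo
clock and the PC clock to within that round-trip time.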

We could subtract off the USB data transfer times, and maybe some of the idle
times.  With some kernel hacking, we could get some PC time stamps for when the
reads and writes finish.  On the read side, that just saves a bit of CPU time -
interrupt to running user code.  I don't know how to wait for a buffer to get
sent.  There may be an ioctl for that.
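
For the "wait for a buffer to get sent" part, POSIX tcdrain() blocks until the
tty layer believes all queued output has been transmitted (termios.tcdrain() in
Python); with a USB adapter that only covers the kernel-side buffer, not when
the bits actually leave the adapter.  Linux also has the TIOCOUTQ ioctl, which
reports how many bytes are still queued for output.  A minimal sketch, assuming
the port is already open as a raw file descriptor:

import os, termios, time

def timed_write(fd, data):
    t0 = time.monotonic()
    os.write(fd, data)
    termios.tcdrain(fd)      # block until queued output has drained
    t1 = time.monotonic()
    return t1 - t0           # rough PC-side duration of the write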

I'm not sure it would ever be good enough for serious time-nuts work, but it
would be an interesting experiment.

--
These are my opinions.  I hate spam.

jimlux
Thu, Feb 23, 2017 12:20 AM

On 2/22/17 3:37 AM, Hal Murray wrote:

> We could get a USB logic analyzer.  That would give us some more data on the
> USB timings.  (There are probably some limits in the specs, but I won't be
> surprised if somebody doesn't get it right..)

USB 2.0 high speed has a basic timing interval of 125 microseconds (the 8 kHz
microframe; full-speed frames are 1 ms) and most devices follow that in some
way (e.g. they'll burp out a buffer every 125 microseconds, or do a bulk
transfer, etc.)

The stuff I've done with microcontrollers with USB onboard (mostly
Teensy devices, which use a Freescale micro) seems to have this sort of
125 microsecond granularity in timing.

I just ran across this, which has lots of useful timing info
http://www.usbmadesimple.co.uk/ums_6.htm#frames_and_microframes

There's a whole lot of complexity in how the bus switches directions
(since there are only two data wires), how it detects the speed, etc.

(and let's not even get started on hub and device power management,
which was my particular bête noire a few years ago...)

I make no claims about USB 3.0..
