time-nuts@lists.febo.com

Discussion of precise time and frequency measurement

Software

Hal Murray
Wed, Oct 3, 2018 3:39 AM

richard@karlquist.com said:

> At least for me. I took 1 course in Fortran 50 years ago, and that was the
> extent of my software education. During my whole career, I have been too busy
> being well paid to design hardware to have any time left over to learn
> software. After Fortran was over, there was the Pascal fad, then the C fad,
> etc.; now I guess Python is the latest. Never got involved in any of that.

Interesting.

All the hardware people I've worked with have been reasonably happy working on
software.  That may be more common in the digital world.

As an example, most people write PAL code as logic equations rather than
schematics.
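
(For the uninitiated: the source for a PAL is literally a list of sum-of-products equations. A hedged illustration in C - a stand-in, since real PAL source would be PALASM or ABEL - of a 2:1 mux written as an equation rather than drawn as gates:

    #include <stdint.h>

    /* A 2:1 mux as a sum-of-products equation, the way PAL source
     * expresses it (roughly Y = SEL*A + /SEL*B), rather than as a
     * drawn schematic.  Illustrative C, not actual PAL code. */
    static uint8_t mux2(uint8_t sel, uint8_t a, uint8_t b)
    {
        return (uint8_t)((sel & a) | (~sel & b));
    }

)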

It would be interesting to compare the costs of hardware vs software for a big
chip project over time.

--
These are my opinions.  I hate spam.

jimlux
Wed, Oct 3, 2018 12:22 PM

On 10/2/18 8:39 PM, Hal Murray wrote:

> richard@karlquist.com said:
>> At least for me. I took 1 course in Fortran 50 years ago, and that was the
>> extent of my software education. During my whole career, I have been too busy
>> being well paid to design hardware to have any time left over to learn
>> software. After Fortran was over, there was the Pascal fad, then the C fad,
>> etc.; now I guess Python is the latest. Never got involved in any of that.
>
> Interesting.
>
> All the hardware people I've worked with have been reasonably happy working on
> software. That may be more common in the digital world.
>
> As an example, most people write PAL code as logic equations rather than
> schematics.
>
> It would be interesting to compare the costs of hardware vs software for a big
> chip project over time.

Like most things, "it depends" - here, I'm going to talk about embedded
applications (as would be typical for a "time-nuts" widget of some sort).

From a "first unit" cost standpoint, in general software is
cheaper/easier than FPGA code. It might also be cheaper in hardware on
recurring cost basis. Purpose designed processors tend to be faster and
lower power, and cheaper than FPGA instantiated processors - ultimately,
it takes less sand to make them.

There are orders of magnitude more C programmers available than FPGA folks,
and there are more of both than folks who can design an ASIC.

The maturity of the development tools for "software" is far greater than for
FPGAs, and they are cheaper. There are things like documentation generators,
debuggers, code analyzers, integrated development environments, and on and on
for software, with multiple vendors for each.

In the FPGA world, we're just starting to get analysis capabilities that
look for things like clock domain crossings that might cause trouble, and
decent library management tools.  I don't know that there are FPGA
equivalents of static code analysis tools like Coverity, CodeSonar,
Semmle, etc.  There probably are, but I'm going to bet that they've only
been out for a few years and don't have decades of use behind them like
most software tools do.

In terms of debugging - whether it's "printf() to the console" or
embedded hardware debugger support like the SPARC/GRMON combination -
there's a lot more support for debugging software than for debugging an
FPGA.  Part of this is that FPGA designs tend to be timing critical - you
can't just add a block of code that dumps a bunch of values to a device or
file.  Tools like ChipScope in the Xilinx family do let you look at some
stuff (like having an oscilloscope or logic analyzer in your design) - but
the scale and practicality are limited.
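
On the software side, even the crudest version of this is nearly free to add.
A minimal sketch in plain C (TRACE is a made-up name, and the ## comma-swallowing
form is a GCC/Clang extension):

    #include <stdio.h>

    /* Compile-time-switchable trace output: the software analogue of
     * probing a node, except it can stay in the source forever. */
    #ifdef DEBUG
    #define TRACE(fmt, ...) \
        fprintf(stderr, "%s:%d: " fmt "\n", __FILE__, __LINE__, ##__VA_ARGS__)
    #else
    #define TRACE(fmt, ...) ((void)0)
    #endif

    /* usage: TRACE("phase accumulator = %ld", (long)phase_acc); */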

Just the turnaround time for a new design after fixing a bug is typically
shorter with software.  On the Xilinx Virtex 6 designs I'm working with,
resynthesizing the bitstream takes on the order of 30-45 minutes.  I haven't
had a C compile take that long in years, except when I was compiling some
package from source on a Beagle.  For most software applications, a recompile
is a matter of <1 minute (if you're not working in an interactive or JIT
language, where the time is effectively zero).

Maturity of software libraries vs. embedded components for FPGAs: for a
given function, it is far more likely that there are multiple high quality
software implementations than FPGA ones. Look at something like BLAS (for
linear algebra) or FFTW - folks have been optimizing numerical libraries
for decades. If you want a radix-17 FFT for some reason, there's probably
one out there in 3 different languages, either platform independent or
with a well defined path to optimizing/configuring it for a given platform.
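
To make that concrete: pulling an oddball transform length out of FFTW is a
few lines of C. A sketch against the documented FFTW3 API (a 17-point DFT;
the planner figures out how to factor the odd length, so the caller never
has to care):

    #include <fftw3.h>   /* link with -lfftw3 -lm */

    int main(void)
    {
        const int n = 17;
        fftw_complex *in  = fftw_malloc(sizeof(fftw_complex) * n);
        fftw_complex *out = fftw_malloc(sizeof(fftw_complex) * n);
        fftw_plan p = fftw_plan_dft_1d(n, in, out, FFTW_FORWARD,
                                       FFTW_ESTIMATE);

        for (int i = 0; i < n; i++) {     /* unit impulse as test input */
            in[i][0] = (i == 0);
            in[i][1] = 0.0;
        }
        fftw_execute(p);                  /* out[] = DFT(in[]): all ones */

        fftw_destroy_plan(p);
        fftw_free(in);
        fftw_free(out);
        return 0;
    }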

In FPGA land, there are some standard building blocks: FFTs, FIR
filters, NCOs, various common interfaces (Ethernet, serial ports) - but
often there are only one or two instances, they're somewhat tied to a
specific architecture, and they tend to be straightforward implementations.
And they're not necessarily supported cross-platform - I made the mistake a
few years ago of thinking that I could move a digital down converter
from Virtex 2 to Virtex 6, but discovered that the IP cores (from
Xilinx) it used were supported on Virtex 2 but not Virtex 6, and the new
cores worked entirely differently. Over a span of a few years, we got
virtually no code reuse - for a very unsophisticated, non-platform-specific
design: an oscillator, multiplier, CIC decimator, and FIR filter.
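
The galling part is how little that chain amounts to behaviorally. The CIC
stage alone is about a dozen lines of C as a model (an illustrative
fixed-point sketch, not the pipelined FPGA implementation):

    #include <stdint.h>

    /* Behavioral model of a 3-stage CIC decimator, decimate-by-8,
     * differential delay 1.  Bit growth is 3*log2(8) = 9 bits, so a
     * 16-bit input fits comfortably in int32_t. */
    #define STAGES 3
    #define DECIM  8

    static int32_t integ[STAGES], comb_prev[STAGES];
    static int     phase;

    /* Feed one input sample; returns 1 and sets *y every DECIM-th call. */
    static int cic_push(int32_t x, int32_t *y)
    {
        for (int i = 0; i < STAGES; i++) {   /* integrators, input rate */
            integ[i] += x;
            x = integ[i];
        }
        if (++phase < DECIM)
            return 0;
        phase = 0;
        for (int i = 0; i < STAGES; i++) {   /* combs, decimated rate */
            int32_t d = x - comb_prev[i];
            comb_prev[i] = x;
            x = d;
        }
        *y = x;
        return 1;
    }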

I can still use the original 10-15 line FORTRAN radix-2 FFT from that
IEEE Transactions paper in the late 60s without much trouble.
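
Transliterated into C it is still only a screenful. A sketch of the same
compact radix-2 decimation-in-time routine (not the exact published listing;
M_PI is POSIX, not ISO C):

    #include <complex.h>
    #include <math.h>

    /* In-place radix-2 decimation-in-time FFT; n must be a power of 2. */
    static void fft(double complex *x, unsigned n)
    {
        /* bit-reversal permutation */
        for (unsigned i = 1, j = 0; i < n; i++) {
            unsigned bit = n >> 1;
            for (; j & bit; bit >>= 1)
                j ^= bit;
            j |= bit;
            if (i < j) {
                double complex t = x[i]; x[i] = x[j]; x[j] = t;
            }
        }
        /* butterfly passes */
        for (unsigned len = 2; len <= n; len <<= 1) {
            double complex wlen = cexp(-2.0 * M_PI * I / (double)len);
            for (unsigned i = 0; i < n; i += len) {
                double complex w = 1.0;
                for (unsigned k = 0; k < len / 2; k++) {
                    double complex u = x[i + k];
                    double complex v = x[i + k + len / 2] * w;
                    x[i + k]           = u + v;
                    x[i + k + len / 2] = u - v;
                    w *= wlen;
                }
            }
        }
    }
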
An awful lot of people reuse CRC calculation code for 8-bit
microcontrollers that originated in a paper by Aram Perez, "Byte-wise CRC
Calculations," in IEEE Micro, June 1983. It works; why change it?
