This is for ...

  • curious people


A replacement for commercial FPGA design suites?

The current open source FPGA tools are in no way a replacement for commercial offerings, just like in the year 2000 Linux was in no way a replacement for Solaris from Sun Microsystems.

Open Source gives any interested developer the possibility to look under the hood and tailor the current offering to their own needs. So the open source FPGA toolchain is able to fill some needs currently not addressed by commercial tools.

Just as Python boosted the accessibility and ease of programming, we need progress in the FPGA field towards easier and more powerful tools, so that not only electronics engineers, but also software developers and even researchers themselves can build FPGA solutions.

How about Xilinx, Altera and others?

We can only move forward where documentation about the chips is available, and there are huge differences in the strategic interests of the chip manufacturers. We currently consider it very unlikely that Xilinx, as the market leader, will make any chip documentation available, just as NVidia can still afford not to publish documentation about its graphics chips.

Other FPGA chip companies might have different interests, and the market is moving: Intel recently bought Altera, and there are new contenders.

Clifford knows how to legally reverse engineer Xilinx chips (especially the Series 7). It's just that it is a lot of boring work, and Clifford is not interested in doing it. If someone is willing to put in the effort, Clifford might help you out with some hints. But be prepared: you need serious knowledge in both CS and EE. Some first hints you find here.

Why Lattice chips?

The Lattice iCE40 series was simple enough to be reverse engineered with reasonable effort; on the other hand, the chips are cheap, modern, and large enough to be useful for real designs.

Will VHDL be supported by Yosys?

There are currently no plans by Clifford to support VHDL. An unfinished attempt to implement it can be found at

Why do we need an open source FPGA toolchain?

Let me quote Google:

"If TensorFlow is so great, why open source it rather than keep it proprietary? The answer is simpler than you might think: We believe that machine learning is a key ingredient to the innovative products and technologies of the future. Research in this area is global and growing fast, but lacks standard tools. By sharing what we believe to be one of the best machine learning toolboxes in the world, we hope to create an open standard for exchanging research ideas and putting machine learning in products"

So we, too, think that innovation in the field of FPGA programming requires a common platform. And for economic and strategic reasons, this common platform must not be owned by one private company, but needs to be a common good: open, hackable, and affordable for everyone.

What are small and slow FPGAs (with little SRAM) used for?

Today most embedded problems can be solved with a cheap 32-bit MCU.

Those are also much easier to program than FPGAs.

But small FPGAs are an absolute must if you have a problem which requires very fast reaction times, very precise synchronisation of multiple sensors or actuators, or very accurate timing of events.

This is the case when you need fast control loops controlling very fast processes (like controlling fast-spinning motors or regulating voltage or current), when you want to filter data very fast (like a logic analyser looking in real time for a trigger condition), or when driving lots of nozzles in an inkjet printhead or the laser in a laser printer (in case you do not want your printed documents to be trackable).

You need precise synchronisation when you have a system with lots of sensors where the data acquisition needs to happen at exactly the same time, with precise repetition (= low jitter), to be able to do a good measurement (like in an MRI scanner or a phased array antenna).

Here you can find a good video explaining when an FPGA might be a better solution than an embedded controller.

Generate great music

FPGAs are very well suited for high level math operations, especially for complex algorithms or when operating on data streams at a high data rate (in the MHz range). The strength of an FPGA is its ability to pipeline operations, which means that the algorithm is broken up spatially into different steps. Each cycle new data flows into the top of the pipe and the processed data exits at the end. In this way, each step of the process occurs simultaneously, like an assembly line, so that no single iteration of the algorithm is blocked by the iteration before. By contrast, a processor is a sequential machine, so each iteration of an algorithm has to wait until the previous one finishes before starting.
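As a sketch of this assembly-line idea (a hypothetical module with made-up names and widths, not taken from any of the tools discussed here), a pipelined multiply-accumulate in Verilog: every clock cycle a new sample enters stage 1 while older samples are still being worked on in stages 2 and 3, so after a short start-up latency one result leaves the pipeline on every clock edge.

```verilog
// Hypothetical 3-stage pipelined multiply-accumulate.
// A new sample enters every cycle; after a 3-cycle
// latency, one result exits every cycle.
module mac_pipeline (
    input  wire        clk,
    input  wire [15:0] sample,
    input  wire [15:0] coeff,
    output reg  [31:0] acc
);
    reg [15:0] sample_r, coeff_r;  // stage 1: register the inputs
    reg [31:0] product_r;          // stage 2: multiply

    always @(posedge clk) begin
        sample_r  <= sample;             // stage 1
        coeff_r   <= coeff;
        product_r <= sample_r * coeff_r; // stage 2
        acc       <= acc + product_r;    // stage 3: accumulate
    end
endmodule
```

All three stages are active on the same clock edge, each working on a different sample; a CPU would have to finish the multiply before starting the add.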

A CPU does one thing after the other, no matter how fast it is clocked. And high-end CPUs are massively pipelined, which increases the latency of any code execution.
FPGAs are needed when you want to be sure that things happen at exactly the same time on a microsecond scale. This might be required when processing several sensor inputs (like in an MRI or other sensor arrays where timing is critical, or a phased array antenna), or on the output side, like controlling the currents and voltages of several motors to generate one smooth and precise movement of a robot arm or CNC machine with minimal electrical loss.
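To make that simultaneity concrete, here is a minimal sketch (hypothetical names): in an FPGA, several outputs can be latched by the same clock edge, so all channels change together instead of one after another as they would in a software loop.

```verilog
// Hypothetical: four motor-control outputs latched on the
// same clock edge, so all channels update simultaneously.
module sync_outputs (
    input  wire       clk,
    input  wire [7:0] duty0, duty1, duty2, duty3,
    output reg  [7:0] out0, out1, out2, out3
);
    always @(posedge clk) begin
        out0 <= duty0;  // all four assignments take effect
        out1 <= duty1;  // on the very same clock edge --
        out2 <= duty2;  // there is no "first" or "last"
        out3 <= duty3;  // output as in a software loop
    end
endmodule
```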

If you want small things to happen really fast (like a trigger in a logic analyser), CPUs with their task switching and instruction pipelining just react too slowly. You want the trigger to be implemented in hardware.
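As an illustration (a minimal sketch with made-up names and widths), such a hardware trigger is little more than a comparator on the sampled inputs, so it reacts within a single clock cycle of the sample rate:

```verilog
// Hypothetical logic analyser trigger: asserts one clock
// cycle after the masked input bits match the programmed
// pattern -- no task switching, no instruction pipeline.
module trigger #(
    parameter WIDTH = 8
)(
    input  wire             clk,
    input  wire [WIDTH-1:0] probe,    // sampled input signals
    input  wire [WIDTH-1:0] pattern,  // value to trigger on
    input  wire [WIDTH-1:0] mask,     // which bits to compare
    output reg              triggered
);
    always @(posedge clk)
        triggered <= ((probe & mask) == (pattern & mask));
endmodule
```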

If you want things to be reliable and safe, you implement them in hardware. Simple hardware circuits are much easier to prove error free than a whole CPU with its operating system software stack and the simple application on top. Complicated software cannot be implemented in hardware, so there you need to use the complex stack. But if you do not have a complex problem, you can keep it simple by implementing it in programmable hardware.

Small FPGAs can move closer to the sensors and make the edge smart. This simplifies system design and software development a lot. The signal conditioning, validation, and aggregation could be done directly at the sensor.

Very often this could be done by an MCU, but when low latency is important, FPGAs are a better choice.

They can also be used for indoor navigation systems, where the propagation time of radio waves determines the location of devices.

What can large and fast FPGAs (with GB of RAM attached to them) be used for?

The European Defence Agency identified the following application areas for FPGAs:

  • Optronics (image processing)
  • Software Defined Radio (radar, signal intelligence and electronic warfare)
  • Missile (control)
  • Modem (data transmission, data encryption)
  • Navigation and guidance (GPS, GPS spoofing)

Other areas are:

  • High Performance Computing
  • Low Latency Trading
  • Multi-core FPGA Prototyping, SoC Prototyping, and ASIC Prototyping
  • Code verification
  • Chip design verification and simulation
  • Cloud computing
  • Video and Audio compression and filtering
  • Algorithm acceleration
  • Deep Learning
  • Network Encryption
  • Network Analysis (deep packet inspection)
  • Radio Communication (Software Defined Radio)
  • Simulation and supercomputing (Oil and Gas)
  • High frequency trading (making money out of thin air)
  • Network Switching (Software Defined Networking)
  • Video processing (Broadcast solutions)
  • Streamed Data filtering (CERN)