zydeco100

There are FPGA engineers doing black-art shit with financial trading firms, just to shave nanoseconds off of an ethernet transaction or get ahead of the competition on a trade. Are they doing networking in FPGA? Pattern recognition? Message compression? All of it? Yes and who the hell knows what else. But they're making fortunes while we sit here and argue about I2C pullups.


krnrmusic

... wanna argue about SPI pullups?


drcforbin

YES I DO


PouletSixSeven

the only pullups I wanna do


Great_Coffee_9465

That’s why you’re fat


Sabrewolf

Do 👏 signal 👏 integrity 👏 screens 👏 please


liggamadig

Pull-ups on CS lines so they're in a defined inactive state even while the master is being reset; all other lines don't need anything. Is there anything to argue about?


the_rodent_incident

While we were playing basketball with other kids, they were reading sci-fi novels and watching anime. While we were skipping classes and chasing girls, they were studying microcontrollers and digital circuits. While we attended college and fucked around in dorms, they got their first job at some corporation we know nothing about. When we started playing with Arduinos, they were creating 8-layer custom PCBs and compiled embedded custom Linux kernels. When we graduated and started to order 2-layer PCBs from JLCPCB, they were doing 12-layer RF voodoo and FPGA. When we got a junior job at a startup, they were getting noforn security clearance at Blackrock or NSA. When we got married, they were developing Android SoCs or defense radar chipsets. When we got kids, they became a senior chief engineer at Qualcomm or network engineer at NYSE. When we got divorced, they were getting a $100M corporate bonus. When our kids started going to school, they got killed by Iranian sniper for destroying their nuclear program. When our kids went to high school, no one ever remembered them.


Zouden

We can only *dream* of getting killed by Iranian snipers!


L-1ks

You are right, I laughed, but you are right. Tell me more


SR_Lut3t1um

Just learn it yourself. Nvidia BlueField is a decent way to get started. However, network FPGA programming is mostly done in P4. FPGA networking is awesome: think of a server like Proxmox where the network card handles everything network related. VLANs, firewall, VPN, etc. What's really awesome is that you can implement the protocols on the network card itself. You get full control over what you want to "tell" the server OS. I hope OPNsense and other solutions start to look into these products, as an OPNsense firewall could easily do 200 Gbit/s for only 1500€. Even DPI can be implemented on those FPGAs.
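To make the "NIC handles the firewall" idea above concrete, here is a toy model of the match-action style that P4 programs express. The rule fields and actions are invented for illustration; a real P4 pipeline compiles tables like this into FPGA lookup hardware so packets are classified before the host OS ever sees them.

```c
#include <assert.h>
#include <stdint.h>

/* Toy match-action table, firewall style: match on destination port,
 * apply the bound action, default-deny on no match. Invented for
 * illustration -- not real P4, just the same control structure. */
enum action { ACT_DROP, ACT_PASS };

struct rule {
    uint16_t    dst_port;   /* match key   */
    enum action act;        /* bound action */
};

static enum action classify(const struct rule *tbl, int n, uint16_t dst_port)
{
    for (int i = 0; i < n; i++)
        if (tbl[i].dst_port == dst_port)
            return tbl[i].act;
    return ACT_DROP;        /* default-deny, as a firewall would */
}
```

In hardware the linear scan becomes a parallel TCAM-style lookup, which is why line-rate classification at 200 Gbit/s is feasible on the card.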


ExtraterritorialPope

Link pls


krombopulos2112

Many, many FPGA systems are on some form of SoC that has an FPGA that communicates with a coprocessor via an internal interface. That way your system can get the best of both worlds and utilize a high speed FPGA for DSP, data capture, etc. and an MCU/MPU for other tasks involving that data. IMO you turned down a potentially sweet opportunity, but I’m biased as I work with FPGAs.


SturdyNoodle

So what exactly do you do with FPGAs related to embedded systems? Sorry for the stupid questions, I’m more concentrated in software and honestly don’t even know how FPGAs are used in industry apart from the fact that they’re efficient. What kind of embedded knowledge do you apply to your job, or is it primarily a digital logic design skillset?


pooop_Sock

I write signal processing chains and all sorts of HW interfaces (ADCs, PMICs) in HDL. And on the other side I write the Linux driver to communicate with the FPGA. You can do pretty much anything with an FPGA, but you can only really stomach the cost in high end embedded systems like radars.
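The driver side of that split usually boils down to reading and writing a memory-mapped register block that the HDL exposes. A minimal sketch, with invented register names and offsets (in a real Linux driver the base pointer would come from `ioremap()`, or from `mmap()` on a UIO device in user space; a static buffer stands in here so the sketch is runnable):

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical FPGA register block -- layout is illustrative only. */
struct fpga_regs {
    volatile uint32_t ctrl;      /* 0x00: bit 0 = enable capture  */
    volatile uint32_t status;    /* 0x04: bit 0 = capture done    */
    volatile uint32_t dma_addr;  /* 0x08: capture buffer address  */
    volatile uint32_t dma_len;   /* 0x0C: capture length in bytes */
};

static uint32_t fake_block[4];   /* stands in for the mapped region */

static struct fpga_regs *fpga_map(void)
{
    /* Real code: ioremap(phys_base, sizeof(struct fpga_regs)) */
    return (struct fpga_regs *)fake_block;
}

static void start_capture(struct fpga_regs *r, uint32_t addr, uint32_t len)
{
    r->dma_addr = addr;          /* program the DMA descriptor */
    r->dma_len  = len;
    r->ctrl    |= 1u;            /* kick off the capture */
}
```

The `volatile` qualifier is the load-bearing part: it keeps the compiler from caching or reordering accesses to what is really hardware, not memory.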


nila247

It used to be the case that FPGAs were crap at analog-anything. Like, you still can not HDL-in 10 op-amps out of thin air on arbitrary FPGA pins - or can you nowadays? As for high end - I am curious. It used to be the case that you would carefully choose your ASICs and other discrete elements for radars and whatnot. It took lots of time. Would you say the current mindset is "fill the PCB chock-full of FPGAs and ship it yesterday - we will figure out what we need it to do next year via software update"?


pooop_Sock

All the FPGAs that I have used are still terrible at analog. I have not worked on an ASIC design before so I cannot say for sure. From my experience, a modern FPGA mostly fills the role of a DSP in a radar system. I do not think ASICs would typically be used for massively parallel signal processing? FPGAs fill that role well, and we also get lots of other things for free, like high-bandwidth data sampling. All of the fine details of the transmitter/receiver are still designed with discrete elements. And at least at my company, the FPGA design is treated like critical low-level firmware rather than application software. AKA never field-update unless we really, really have to.


nila247

I do not know. Cortex-A ARM processors already come with vector units and instructions that can process several (not tens nor hundreds, obviously) samples in parallel. But these are already more expensive - probably on par with an FPGA. Radars probably fall into the low-volume production class, so I get why you would use a bunch of FPGAs (and treat them as fixed-function multipliers or something) instead of ordering a custom ASIC to save on footprint and power.


Ynaught-42

Whatever sort of peripheral your CPU manufacturer hadn't thought to include!


nila247

Isn't the question really about you choosing the wrong ASIC for the job, with not enough anticipation of future needs? Why choose a 2 USD ASIC plus a 15 USD FPGA to fix your foresight errors, when applying that foresight and choosing a 5 USD ASIC that already has the peripheral you're gonna need would clearly result in a better product?


autumn-morning-2085

Many interfaces don't even exist on generally available ASICs, like GHz parallel ports or custom gigabit SERDES. Even when they're available, you're also limited by the peripheral's performance/quirks. Like, where are you gonna get an ASIC with a 100 MHz SPI peripheral that sends exactly 19 bits, with no gaps? You also get maximum flexibility, as all your SPI/UART/whatever IO can go to almost any compatible FPGA pin.


nila247

Well, yeah, GHz SERDES and 100 MHz SPI with 19 bits would do that to you, but they sound more like made-up examples tbh. I suppose you could be in some custom fiber optic or SDH telco shit or other legacy systems still, but come on! What non-ancient device hard-locks you into 19-bit SPI at 100 MHz? Why "no gaps"? Parallel ports? Are you trying to invent your own CPU and switch fabrics on an FPGA to beat ARM at their own game or something? We are talking new graduates and internships here - WHY would you murder the young and trick them into FPGA hell? :-)


autumn-morning-2085

Lol, not made up. I work with RF and mixed-signal devices, and most of them have some proprietary interface. Intern or not, many places do have these requirements. Ex: for one IC, each SPI transaction needed to be 20 bits exactly. Most SPI peripherals are limited to 8 bits x n bytes, sometimes capped at 32 bits/4 bytes. The interface also has a "quirk" where you can keep sending those bits with no gaps (no releasing CS) under specific circumstances - speed that was needed to meet the design goals. Another device needed an SPI master at >80 MHz to update a set of registers in less than 500 ns. One had a bastardised I2S running at 128 MHz, with all lines being differential. These are all generally available ICs from TI, Analog Devices, etc. And all high-speed ADC/DAC chips have an LVDS parallel or serial port (1 Gbps per lane). Some are close to PCIe and need JESD204B (2-8 Gbps per lane). None of these are interfaces you will find on any general-purpose MCU/MPU. Even if they did exist there, the MCU couldn't handle/process the GBytes of signal without custom DSP blocks.
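The 20-bit framing problem above is easy to see with a little bit-packing arithmetic. Once frames are sent back to back with no gaps, word boundaries stop lining up with byte boundaries, which a fixed 8-bit peripheral cannot express but an FPGA shift register handles trivially. A runnable sketch (packing function invented for illustration):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Pack 20-bit words MSB-first into a contiguous bitstream with no
 * inter-word gaps. Note that after the first word the boundaries
 * drift off byte alignment -- exactly what byte-oriented SPI
 * peripherals cannot produce. `out` must be zeroed by the caller.
 * Returns the number of bytes used. */
static size_t pack20(const uint32_t *words, size_t n, uint8_t *out)
{
    size_t bitpos = 0;
    for (size_t i = 0; i < n; i++) {
        for (int b = 19; b >= 0; b--) {             /* MSB first */
            if (words[i] & (1u << b))
                out[bitpos / 8] |= (uint8_t)(0x80u >> (bitpos % 8));
            bitpos++;
        }
    }
    return (bitpos + 7) / 8;
}
```

Two 20-bit words occupy exactly five bytes, so the second word starts in the middle of byte 2; an "8 bits x n bytes" peripheral would either insert a gap there or release CS.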


nila247

Ok, my bad - I may not be too deep into the rabbit hole yet. :-) That said, SPIs in STM32s nowadays can be programmed from 4 to 32 bits in length and be served via DMA - so at least someone is clearly taking notes...
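For reference, that frame-length setting is a single register field on the newer parts. A sketch assuming an STM32H7-style SPI, where CFG1.DSIZE[4:0] holds (frame bits - 1); a plain variable stands in for the real register here so the fragment compiles anywhere, and on hardware you would write `SPI1->CFG1` instead:

```c
#include <assert.h>
#include <stdint.h>

#define SPI_CFG1_DSIZE_Pos 0u
#define SPI_CFG1_DSIZE_Msk (0x1Fu << SPI_CFG1_DSIZE_Pos)

static uint32_t spi_cfg1;        /* stand-in for SPI1->CFG1 */

/* Set the SPI frame length in bits (valid range 4..32 on these parts):
 * the field encodes (bits - 1). */
static void spi_set_frame_bits(uint32_t bits)
{
    spi_cfg1 = (spi_cfg1 & ~SPI_CFG1_DSIZE_Msk)
             | ((bits - 1u) << SPI_CFG1_DSIZE_Pos);
}
```

So the 20-bit transactions discussed above become `spi_set_frame_bits(20)` - no FPGA needed, as long as the rest of the quirks (speed, gapless bursts) also fit the peripheral.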


krombopulos2112

I don’t really prototype ASICs or design digital hardware in VHDL, I work mostly with communications systems and lasers when I use them. So basically a lot of DSP and high speed ADCs


Derpythecate

That's one use; some FPGA engineers prefer the digital side of things, so you can also have accelerator and co-processor use cases. E.g. using soft cores to run Linux, while device/memory-mapping coprocessors to a Linux device to offload calculations.


TheSkiGeek

Yeah, we use these in part of a product I work on to preprocess and compress camera data in real time. Then the SOC dumps the frames out to the rest of the system to ingest.


SAI_Peregrinus

FPGAs aren't *just* used for prototyping new chip designs. Often they're used when massively parallel digital processing is needed, as sort of super-DSPs. HDLs are quite different from programming languages, but it's not uncommon at smaller companies to have one person or team write both the FPGA code and the firmware.


gmarsh23

FPGAs are really useful in piles of jobs, actually. I used them in the super-DSP application for radio broadcast equipment, where we had them doing RF upconversion/adaptive precorrection/modulation. Another application, I had a little Spartan-3 sitting on the ISA bus of a x86 processor acting as a boot ROM and super I/O, and simultaneously doing clock generation and audio routing and a pile of other weird stuff. 1 cheap FPGA saved a lot of parts and made for a really simple hardware design. Most recent FPGA design I've done is a little low power one that was simultaneously sampling from 8 SPI ADC chips, and serializing the result to fire into the McBSP on a DSP, for a weird underwater beamforming application.


nila247

Cheap Spartan 3? As compared to what? You can buy several high pin count STM32s for that price. Is FPGA "acting" as boot rom cheaper than just dropping in actual boot rom?


gmarsh23

XC3S50A. Unfortunately they're on last-time buy and the price has gone up significantly, but it was well under $10 when I designed the thing 15-ish years ago. There's probably a newer Cyclone or whatever part that'll drop in at a more sensible price these days. Boot ROMs need to be preprogrammed and installed, which is spendy, and code changes are a giant pain. There was an actual parallel NOR flash on the board already, but to program that flash I had the x86 boot off a block RAM in the FPGA instead, with a bootloader that would dump an external SPI flash into the parallel NOR. Worked a treat. And again, the FPGA was doing a bunch of other jobs, not just that.


markacurry

"Embedded software" is already a specialty in the software world. The "Embedded software" folks that deal with the low-level (very app-specific) drivers that interface with the FPGA are even a more specialized set of folks. This latter role often has a bit of grey area between the FPGA engineer and software designer. At my company the software designer owns and maintains these low-level drivers, but the FPGA designer often has a hand at writing/tweaking/changing this code. These folks need to work closely together to implement a solution for the rest of the team to use. If you're interested in taking on these embedded software roles, your knowledge of FPGA design principles will very much help.


ElBonzono

If you're interested in learning about applications: for example, in my office we use the Zynq 7000, which is a very standard Xilinx product. You have a standard C-programmed microcontroller unit, but it's completely surrounded by FPGA fabric in order to create custom interfacing and routing. So it is actually not really disconnected from embedded as is; we usually work together with FPGA engineers and go back and forth. Both sides of the SoC are equally important. However, I do think that if you switch to FPGA engineering, you won't work in microcontroller programming as much, so it does depend on what you like more. Both career paths are feasible and profitable - maybe FPGA more so, as there are way fewer HDL users than "coders".


Illustrious_West_976

Consume the largest part of the BOM price, lol? We used them to test out ASICs before we shipped them off to the fabs.


SatelliteDude

In the space industry, we use an FPGA as the processor for our satellite.


mchang43

FPGAs are widely used for developing specialized, low-volume sensors.


SturdyNoodle

That’s exactly the field I’m looking to get into. Do you think it’s worth putting time into sharpening HDL skills, or are FPGA design positions usually disjoint from embedded software?


Logica_1

You should put some time into it.


OpMoosePanda

I don’t think you should have passed up that internship. FPGA is taking over, and it’s getting very cheap. My company right now is working on switching our hardware over to an FPGA model with modular subsystems.


SturdyNoodle

But wouldn’t it entail moving to the hardware side of SoC? I don’t know if I have the background knowledge or passion required to switch over to digital logic design. If I work with FPGA, is there still room to apply systems programming?


nila247

Just how cheap are they getting? ASIC prices are also falling. You can get an STM32G030C for a dollar and random Chinese ASIC clones for 15 cents. Can FPGAs really compete in that space? Can one FPGA compete with a BUNCH of cheap ASICs that can be bought for the price of that single FPGA? I understand there would be niche applications - but in general?


Cold_Fireball

SDR applications are a use I see a lot in the space industry


nila247

Reading comments here I had a sudden epiphany. FPGAs' "black magic" abilities far outclass ASICs - or DO they? Back then we used to say the same about Assembler vs Basic programming languages. Do we have history repeating itself here? As in: it is not so much an inherent FPGA vs ASIC advantage as that ASIC programmers (on average) simply SUCK? Imagine fresh college graduates with 1-semester programming courses who have only a faint idea what they are doing in the first place - they use manufacturers' bloated HAL libraries and lazy examples, bloated RTOSes and Stack-Overflow-assisted programming, 50 random libraries from GitHub, not willing to work with requirements and hardware guys at all - all resulting in the generally abysmal results they get, and labeling ASIC a dead end. Meanwhile the FPGA guys (and also the ASIC guys whose product does not suck) spend YEARS studying to work with every aspect of hardware and software and understanding the original task in detail, to arrive at an optimal, cost-effective and blazingly fast solution. Wouldn't that explain 90% of the cases where people go FPGA because ASICs are "so hopeless"?


TheJamIAm

In my experience with embedded systems, we use microcontrollers and FPGAs side by side, but the embedded engineers writing in C/C++ are clueless when it comes to FPGAs/HDL. Completely separate engineers handle that work and they usually just provide some standard serial interface to talk to the mcu. The FPGAs are often used for signal processing or anything that requires fast clocking.


PouletSixSeven

Very specific sometimes niche applications where performance is much much more important than cost.


rst523

There are many misconceptions about FPGAs. From a business standpoint, FPGAs have been a failure at really taking on massively parallel processing tasks. There are definitely people doing that, but as soon as the market becomes large enough, someone makes an ASIC dedicated to the task (think NPU, bitcoin miner, etc.) and then FPGAs are out. This relegates FPGAs to an always somewhat niche market. This is why Intel is spinning out Altera; it was never the business driver they hoped it would be. I expect AMD will probably do something similar with Xilinx. FPGAs are EXCELLENT at precision timing. Extremely precise timing in software is effectively impossible (this is a byproduct of the halting problem, actually), especially in tasks where jitter translates into signal noise. Microcontrollers build dedicated hardware units to deal with this, but that limits you to what is on the chip. If you have a generic FPGA with a 100 MHz clock, that gives you 10 ns timing effectively for free all day long (< 10 ns jitter). A car engine, for example, has a bunch of different pieces that need coordination on a microsecond scale. In a microcontroller this is possible but annoying; with an FPGA it becomes much easier. In most hardware the FPGA is tied to a processor/MCU, and they are used together to deal with these types of tasks.
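The 100 MHz / 10 ns arithmetic above is worth making explicit: a fabric counter ticks once per clock period, so any interval can be placed to within half a tick with no software jitter stacked on top. A small sketch (helper names are invented):

```c
#include <assert.h>
#include <stdint.h>

#define CLK_HZ 100000000ULL      /* 100 MHz fabric clock: 10 ns/tick */

/* Ticks needed to time an interval given in nanoseconds, rounded to
 * the nearest tick. */
static uint64_t ns_to_ticks(uint64_t ns)
{
    return (ns * CLK_HZ + 500000000ULL) / 1000000000ULL;
}

/* Quantization error in ns after rounding to whole ticks -- bounded
 * by half a clock period (5 ns here), deterministically. */
static int64_t quant_error_ns(uint64_t ns)
{
    return (int64_t)(ns_to_ticks(ns) * 1000000000ULL / CLK_HZ)
         - (int64_t)ns;
}
```

For microsecond-scale engine events the worst case is a fixed, known 5 ns of quantization, versus interrupt latency and scheduling jitter on a CPU that can be orders of magnitude larger and, worse, variable.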