Hello, I would like to know what people think about my concerns over the increasing complexity of today's FPGAs. This will date me, but I'm just now learning TimeQuest. I admit it is a very powerful tool, but it does take some training to get proficient, and many of my co-workers have stayed with Quartus 9.1 for that reason. With the recent launch of devices with embedded hard processor cores came still more tools one must pick up and learn: the SoC Embedded Design Suite, the Altera SDK for OpenCL, SOPC Builder, Qsys.

With all these new tools to struggle through, I'm beginning to think that maybe FPGAs are not worth the trouble. Maybe the tools will become so complex, and so costly in terms of labor, that designers will opt for something else, such as the new DSPs that have just about every peripheral interface one might need, multiple cores running at 1 GHz and higher, and everything under one IDE. I am anxious about the Stratix 10, but from what I've seen so far of the complexity of the tools and the large megafunctions that require a Nios processor or embedded Linux, I'm taking a closer look at other options.
What are others thinking about the complexity of the newer FPGAs, and where Altera is going? Thanks, joe
Altera are only going down these roads because customers are asking for them, and Xilinx are doing the same things too.
With more complexity, I think engineers are being allowed to specialise in specific FPGA aspects. Until recently, a lot of the people doing FPGA design were the same engineers doing the board design and schematics, but more and more these two roles are being separated. I myself have been working in FPGAs since I graduated (nearly 9 years ago) and have done minimal schematic entry (I hate it anyway): a simple backplane (with many circuits copied and pasted from another) and a small voltage-level-translation test PCB. The rest of my time has been spent on VHDL, synthesis, fitting and timing constraints, with some time interfacing with algorithms teams in MATLAB. In all that time I had never used Qsys until now.

It's also coming from the other direction: a lot of engineers labelled "software" are moving into the world of Qsys and system design. There are still companies out there looking for engineers who can do everything, even using all the latest tools, but these tend to be the smaller companies that can only afford to take on one new engineer to manage entire projects. Larger companies are already separating out firmware design, board design, software, unit test and verification teams.
Hi Joe:
Just to add to Tricky's answer: although the performance and capabilities of processors are increasing, there will always be a need for real-time operations that are outside the capabilities of processors. FPGAs still have a place, and I think a larger one with the advent of the SoC FPGAs. Is it a one-chip-fits-all? Not yet, but it's fitting more and more designs. Is the expertise required more than it used to be? Yes, but so are the capabilities.

Once you get used to TimeQuest, it's a powerful tool. It's also more of an industry standard: if you have designs going to ASICs, the SDC file translates more directly into the timing tools required for the ASIC flow, so I HIGHLY recommend you take the time to learn it rather than sticking with 9.1. As with everything in this industry, change is occurring quickly; if you don't keep up with the tools and capabilities, you quickly become obsolete. I remember it taking 3 days to compile a design that can now be synthesized in an hour or so. Pete
Not all FPGA applications have increasing complexity. There's also a need, e.g., for 5k to 15k LE signal-processing solutions, which can presently be handled by Cyclone III or IV family devices. I presume they are produced in large volumes and that Altera will continue to serve that market.
Regarding the TimeQuest "obstacle": if you are presently satisfied with Quartus 9.1 and the Classic Timing Analyzer, because your design has a neat clocking scheme and no critical external timing, it's quite easy to translate the constraints to TimeQuest with just a mouse click. You can explore the advanced TimeQuest features little by little.
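For a design with a neat clocking scheme, the resulting TimeQuest SDC file really is short. A minimal sketch of what such converted constraints might look like (the port names, the 100 MHz period and the delay values here are hypothetical placeholders for your own design):

```tcl
# Base clock on the hypothetical input pin "clk" -- 100 MHz (10 ns period)
create_clock -name clk -period 10.000 [get_ports clk]

# Let TimeQuest create clocks for all PLL outputs automatically
derive_pll_clocks

# Apply Altera's recommended clock-uncertainty values
derive_clock_uncertainty

# Example external timing relative to the base clock (placeholder values)
set_input_delay  -clock clk -max 2.0 [get_ports data_in]
set_output_delay -clock clk -max 3.0 [get_ports data_out]
```

The advanced features (multicycle paths, false paths, generated clocks) can then be layered on top of this baseline as needed.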
One reason we use FPGAs is that they don't go obsolete in the same way as other complex chips do.
At worst the board will need minor re-tracking (and maybe power-supply changes) for a different package, but all the logic can be carried forward and should 'just work'. Modern FPGAs are large enough to replace several big chips from only a few years ago. We nearly wrote a 6-port Ethernet switch because the part we had been using went obsolete, but in the end found an alternative.
FPGAs aren't getting too complex, but both Altera and Xilinx have gotten so caught up in the high end that they are forgetting a lot of customers don't need a massive Stratix part. In fact, in the last five years at my current job, we haven't used anything more powerful than a 3C40 (though that is about to change). We build test sets that often just need a blob of logic to speak some non-standard protocol, generate a waveform to a DAC, packetize the output from an ADC, or implement a custom algorithm faster than a CPU can.
I am more worried about the disappearance of the *QFP packages than I am about complexity. Virtually all of my designs, save one or two, are 4- to 6-layer boards with either a 144-pin EQFP or a 240-pin PQFP FPGA. I believe a lot of designs could be done on a 4-layer board with a QFP, but once you start talking about BGAs, it's next to impossible to break out on a 4-layer board. 6-layer is doable, but to get at every I/O pin you generally have to go up to 8 layers, which increases cost. Given how often the technicians manage to burn out some of these boards, that can get pricey fast.

Then there is the fact that BGAs are hard to repair or replace. We have a BGA machine, and we are looking at getting a reballer, but we only have one tech who knows how to operate it. With QFP parts, any of the technicians can replace a faulty part. I know these packages are as old as the hills, and they slow down designs because of the large inductance in the pins and bond wires, but there are still a lot of shops like ours that don't need 200+ MHz I/O, fancy transceivers, etc. Thus, I'm really hoping that either the C3/C4 parts stick around for a long time, or Altera releases a QFP version of the C5 at some point.
The problem is that the high end is where the margin is....
Hi,
in my opinion this is the "old question" of selling high-end parts with high margins (but fewer devices) versus the "bread and butter" types that are cheap enough to be used in many applications with less margin (but a lot of units). You need to have high-end solutions, otherwise you are not recognized as an innovative company; but if all companies are only innovative, that leaves the normal users "lost in space". I also don't know whether these devices with hard-core processors are really well named "FPGA": many years ago there were already companies combining such things on a chip, and those devices were called "system on chip".

Having worked with FPGAs for 20 years, I started with the FLEX 8K devices, and every new generation offered more logic at lower cost, targeting more and more applications that had less money to spend on the hardware, sometimes even replacing DSP implementations because the FPGA offered much more flexibility. This "offer cheap, powerful FPGAs to empower a lot of applications" approach now seems to be partially replaced by the intention to provide the highest integration at the highest speed and performance. For many applications the Stratix, Arria, ... were never an option; if the trend of FPGAs is towards these devices, FPGA penetration into cost-sensitive applications will shrink (leaving space for other devices, or ?)
--- Quote Start --- The problem is that the high end is where the margin is.... --- Quote End ---

I don't think anyone has a problem with a company selling high-margin products. The problem comes when they abandon everything else in favor of only high-margin products and leave a void in the market. I actually like to see improvements in the high-end parts, because I know those improvements will eventually trickle down. Having recently redone a design in a Cyclone part, I found myself really missing the flexibility of the Cyclone III/IV PLLs, for example. I'm clearly no sales guy, but I can't imagine that the volume of "low-margin" sales to folks like us isn't profitable either. I suspect that for every high-end Stratix part sold, there are dozens or more Cyclones or Arrias.
I think it simply comes down to numbers. It costs a fortune to develop a new part, and all the new technology goes into the new Stratix. You'll see that Cyclone and Arria have almost identical blocks to the Stratix, just fewer of them and less flexible. With that high NRE, they want to recoup as much as they can as soon as possible.
If they launched a new cheap part first, they would have to sell it in large volumes from the start to pay back some of that cost. But it's many months or years before a customer starts shipping in any decent volume, as they need to do their own engineering first. The top parts can be sold as engineering samples at a high premium, but a Cyclone couldn't be. So it only makes sense to sell the top-end parts first.
Well, I agree with the basic idea of bringing out the more complex (and thus higher-priced) parts first, so as to reach the break-even point between NRE and sales revenue earlier.
But there is one other issue besides this, related to the regulations of the intended target application's domain. Assume you are not entirely free in choosing your architecture but have to observe regulations on design, validation, verification, and so on. Especially if there are different rules for logic design (like an FPGA) and controller design (like a µC/µP), the idea of an SoC combining the technological "best of both" turns into a huge increase in work, as both domains are combined. Under such regulations you would use either a µC/µP or an FPGA, but never combine both (unless really required by the application); and even if you had to combine them, you would not do it on one chip, as that chip's design gets highly complex in V&V.

Finally, the time gap for developing new designs with the Cyclone V was made much worse by (IMHO) Altera's strange timing. I did a design that I would have liked to do with the newly announced Cyclone V (it now uses a mid-size Cyclone IV). When we did this design, Altera had already announced the Cyclone V some months earlier, but it was years between the announcement and the first production units becoming available. Maybe the "there are no engineering samples available" statement that frustrated me at the time saved me from being blamed by my boss and customers, as none of the units we produced over the following years could have been built with the missing Cyclone V. So it is one side of the coin to blame developers for not placing orders for the new chips because of their own development latency; the other side is that devices should not be announced if there are no units to ship in time to support development, prototyping and the first batch of electronics. Just my two cents :-)