
NIOS II vs 100% hardware implementation

Altera_Forum
Honored Contributor II

Hi all, 

 

I'm working on a video processing application and am now faced with an important design choice: implement the whole system in hardware, or create a SoC (a Nios II system). 

 

Most important to me is that the application is maintainable, modular, and delivers enough real-time performance. 

 

I know the basic premise: a hardware solution is as fast as possible and can be designed with modularity in mind, but it may be more of a hassle to maintain than software. On the other hand, a Nios II system should be able to meet real-time requirements without too much trouble, and it also provides the modularity of the Avalon bus and good maintainability. A drawback is the possible IP cost, of course, but I'll set that aside for the time being. 

 

What I'd like to ask is: what other factors should I take into consideration (e.g. power consumption), and how do the two solutions compare against each other?  

 

Thanks in advance
Altera_Forum
Honored Contributor II

It really depends on what you're doing to the video. 

For real-time manipulation, the FPGA is going to be cleaner and faster, especially with higher-speed, higher-resolution video. If you need to do some decision-based processing (like target tracking), then you probably want the Nios II for the software.
Altera_Forum
Honored Contributor II

The hybrid approach Tricky suggests is typically what I go with. Normally I start with a software-only approach, identify bottlenecks, then move those bottlenecks into hardware. Implementing an entire application in hardware is normally overkill, as certain types of algorithms are no faster in dedicated logic than on a CPU. Here is an example: 

 

do { 
    a = b * 2.3;   /* each iteration depends on the b carried over from the last */
    c = b * 6.2; 
    b = a;         /* feed the result back into the next pass */
} while (c < 100000); 

 

In that case a CPU with a floating-point unit could keep up with dedicated hardware, since there isn't much opportunity for parallelization (in its current form). In cases like that it makes more sense to use reusable hardware (a CPU) than to waste LEs/ALUTs that will sit underutilized. 
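
The flip side is something like scaling every pixel by a constant: there is no dependency between iterations, so logic can pipeline the multiply and retire one pixel per clock, while a CPU spends several instructions per pixel. A minimal fixed-point sketch (the entity name, widths, and the 8.8 encoding of the gain are my own illustration, not from any particular design): 

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity pixel_scale is
  port (
    clk          : in  std_logic;
    pixel        : in  std_logic_vector(7 downto 0);
    pixel_valid  : in  std_logic;
    scaled       : out std_logic_vector(7 downto 0);
    scaled_valid : out std_logic
  );
end entity;

architecture rtl of pixel_scale is
  -- gain of 2.3 in 8.8 fixed point: 2.3 * 256 = 588.8, rounded to 589
  constant GAIN : unsigned(9 downto 0) := to_unsigned(589, 10);
begin
  process (clk)
  begin
    if rising_edge(clk) then
      -- one multiply per clock; truncate back to 8 bits
      -- (saturation omitted for brevity)
      scaled <= std_logic_vector(resize(shift_right(unsigned(pixel) * GAIN, 8), 8));
      scaled_valid <= pixel_valid;
    end if;
  end process;
end architecture;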

 

And since you want something modular and maintainable, I recommend microcore-based design. If you haven't heard of this before, it is basically a design approach where you build smaller cores and stitch them together for a specific purpose. It has many positive attributes, such as: 

 

- use standard interfaces and let the tools handle adapting the differences (SOPC Builder/Qsys) 

- smaller cores are easier to verify 

- microcores can be replaced while leaving everything else intact 

- microcores will be easier for someone else to understand than a gigantic core 

- Qsys/SOPC Builder let you create components made up of smaller components 
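
As a concrete example, a thresholding microcore needs nothing more than an Avalon-ST sink and an Avalon-ST source. The sketch below is just an illustration (it assumes 8-bit symbols, ready latency 0, no packet signals, and made-up names): 

library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity threshold_core is
  generic (
    THRESHOLD : natural := 128
  );
  port (
    clk       : in  std_logic;
    reset     : in  std_logic;
    -- Avalon-ST sink: incoming pixel stream
    asi_data  : in  std_logic_vector(7 downto 0);
    asi_valid : in  std_logic;
    asi_ready : out std_logic;
    -- Avalon-ST source: binarized pixel stream
    aso_data  : out std_logic_vector(7 downto 0);
    aso_valid : out std_logic;
    aso_ready : in  std_logic
  );
end entity;

architecture rtl of threshold_core is
begin
  -- no internal buffering: pass downstream backpressure straight through
  asi_ready <= aso_ready;

  process (clk)
  begin
    if rising_edge(clk) then
      if reset = '1' then
        aso_valid <= '0';
      elsif aso_ready = '1' then
        aso_valid <= asi_valid;
        if unsigned(asi_data) >= THRESHOLD then
          aso_data <= (others => '1');  -- pixel at or above threshold
        else
          aso_data <= (others => '0');
        end if;
      end if;
    end if;
  end process;
end architecture;

Wire a handful of cores like this together in Qsys, and each one stays small enough to verify in isolation. 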

 

I can go on and on about this but I think you can see why I like microcore design :)
Altera_Forum
Honored Contributor II

I'm not too sure whether the Nios II can meet the timing requirements for a real-time frame rate (typically 30 frames per second); it depends on your video resolution. At some point you will want to do some of the processing in hardware as well. 
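
As a rough sanity check (assuming a Nios II/f at around 100 MHz, which is a typical figure, not something from your design): VGA 640x480 at 30 fps is about 9.2 million pixels per second, leaving on the order of 10 CPU cycles per pixel. Even a trivial per-pixel operation in C costs more than that once you count the load, the arithmetic, and the store, so a pure software pipeline gets tight very quickly. 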

 

On the other hand, the difficulty of a pure hardware implementation has a lot to do with your interfaces. What kind of camera are you using? A CCD camera connected to GPIOs is much easier than a USB 2.0 webcam (for which a driver coded in Verilog would be insanely difficult).
Altera_Forum
Honored Contributor II

Thanks for the informative responses; this is very helpful to me since I'm new to programmable logic. 

 

@BadOmen 

I definitely see your point. As a software engineer I've always liked divide-and-conquer methods (e.g. writing small but very manageable functions). :-) 

 

@Pilipala  

The resolution is not too large, just 320x240 pixels. The 'camera' I'm using is a bolometer connected to an external ADC (it has no integrated ADC back end). So as far as the interface is concerned, it hasn't been too troublesome to write the driver in VHDL.
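
In case it helps anyone searching later: stripped to its essence, the capture side just latches the sample on the converter's data-ready strobe. The signal names and the 14-bit width below are invented for this post, not my actual code: 

library ieee;
use ieee.std_logic_1164.all;

entity adc_capture is
  port (
    clk         : in  std_logic;
    adc_drdy    : in  std_logic;                      -- data-ready strobe from the ADC
    adc_data    : in  std_logic_vector(13 downto 0);  -- parallel sample bus
    pixel       : out std_logic_vector(13 downto 0);
    pixel_valid : out std_logic
  );
end entity;

architecture rtl of adc_capture is
  signal drdy_q : std_logic_vector(1 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      drdy_q <= drdy_q(0) & adc_drdy;  -- synchronize the strobe into our clock domain
      pixel_valid <= '0';
      if drdy_q = "01" then            -- rising edge of data-ready detected
        -- assumes the ADC holds the sample stable after asserting DRDY
        pixel       <= adc_data;
        pixel_valid <= '1';
      end if;
    end if;
  end process;
end architecture;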