Intel® Quartus® Prime Software
Intel® Quartus® Prime Design Software, Design Entry, Synthesis, Simulation, Verification, Timing Analysis, System Design (Platform Designer, formerly Qsys)

Schematic entry (4th generation) vs. VHDL/Verilog (3rd generation)

Altera_Forum
Honored Contributor II

Over the past 30 years I've seen languages come and go. I started programming in assembler, and have since graduated to many 2nd and 3rd generation languages. I tend to shy away from 4th generation tools because I dislike graphical interfaces during programming. Meanwhile the world is going wild using ever more graphical tools to describe processes to make it easier to model/manipulate/program. 

 

Enter the electronic design world. I designed (mostly digital) circuits in the eighties, using schematic entry. Now I've returned to digital design and find that schematic entry is frowned upon, and using VHDL/Verilog is deemed "more professional".

 

Can someone explain to me why schematic entry (which, from a pure programming standpoint could be considered a 4th generation entry-tool) is considered inferior to VHDL/Verilog (which, from a pure programming standpoint, could be considered 3rd generation entry-tools)? 

 

I've looked at VHDL/Verilog, and even though they work for me, I find that they make it more difficult for me to envision (gate)delay characteristics and clock domains than when I'd use schematic entry/building blocks. Schematic entry supports a fairly straightforward left-to-right/top-to-bottom processing workflow; VHDL/Verilog however tend to result in spaghetti-like codeflows (because of the parallel nature of the hardware description), which makes design errors more likely... 

Unless someone is able to envision the schematic equivalent in his head while cranking out VHDL, but then using VHDL is more of a nuisance than an advantage.

 

VHDL has its place for complicated table driven logic which isn't easily described in functional blocks, but other than that, it makes it more difficult to envision the hardware equivalent than using schematic entry and thus hinders productivity. 

 

Could anyone more proficient than I am in VHDL/Verilog, comment on my assertions above?
Altera_Forum
Honored Contributor II

Thanks. Are these used only with schematic entry, or do they show up when HDL is compiled?

Altera_Forum
Honored Contributor II

They describe the schematic and the (graphic) symbol, respectively. They are needed for schematic entry; they are pure source files and are not generated during compilation. The bsf file can be generated manually from an existing bdf file if the bdf file is to be used as a hierarchical submodule in a larger schematic.
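
(If memory serves, Quartus II can generate the .bsf via File > Create/Update > Create Symbol Files for Current File while the .bdf is open - worth double-checking the exact menu entry in your version.)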

Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

And, did Altera drop it for lack of interest? I don't know AHDL and cannot offer any worthwhile comment. Just a logic/systems guy that can put together some code and get it to run. Would like your comments, especially if the similarity is undesirable. Thanks. 

--- Quote End ---  

 

 

There's nothing wrong with it as a language. The big problem is the lack of support from other vendors (especially as it is Altera HDL). 

 

The problem with the language, and anything similar, is that you can't get away from the fact that different vendors have slightly different technologies.  

 

IIRC, Altera registers all have an async reset input, while Xilinx prefers sync resets. Also, Altera registers have both reset and preset inputs. This comes out in the code like this: 

 

node my_reg : dffe;
my_reg.clk = clk;
my_reg.en = enable;
my_reg.reset = rst;
my_reg.prn = something;
my_reg.d = input1 & input2;
output = my_reg.q;

...etc. etc. It also has libraries for the multipliers, FIFOs, memories etc. that come from the vendor. But again, the problem is that the interfaces for these differ from vendor to vendor. Writing like this forces you to think about the target technology, rather than writing code that is vendor independent. In VHDL, I can create a RAM that any synthesiser should pick up and place (sorry for the VHDL, but I don't know Verilog):

 

type ram_t is array(0 to 1023) of std_logic_vector(7 downto 0);
signal RAM : ram_t;

process(clk)
begin
    if rising_edge(clk) then
        if we = '1' then
            RAM(wr_addr) <= input;
        end if;
        read_addr_r <= read_addr;
    end if;
end process;

output <= RAM(read_addr_r);  -- for a registered read

With this code, I can run it through any synthesiser without modifying it, because it defines a behaviour that matches the behaviour of a RAM, without worrying about different vendors' port names.
Altera_Forum
Honored Contributor II

I believe Altera is going to 'orphan' AHDL, or already has, as it looks like they are developing all their new IP in Verilog. 

We used to do a lot in AHDL (it being a concise 'low level' language in comparison with verbose VHDL and others), but have switched to VHDL, mainly because that's what the younger developers learn at university (over here in Europe).
Altera_Forum
Honored Contributor II

I don't think it will die for a long time. A huge amount of their low-level IP is in AHDL. Look in your \DB\ folder in a design - it is full of .tdf (Text Design File) files.

Altera_Forum
Honored Contributor II

There are so many concerns that the situation seems hopeless, except for the fact that there are successful projects. So ...  

1) There is a seed that contains the gist of the function, and likely a block diagram ... just general "goes in, comes out" stuff, with an idea of the data manipulation. 

2) At this stage, there is not really enough detail to effectively do design entry, but it would be nice to have a way to apply some inputs and see some output. 

3) There are controls, registers, and memory, and each has a name. The regs and memory hold data. Boolean combinations of controls, sampled by a clock, control the data movement among the regs and memory. 

4) There is no need to draw lines to connect the logic; do it with name.block resolution. A coding style for HDL is unnecessary. Everything is text, so a text editor can be used to select the appropriate text for the level of functional detail for each block in a folder. Here is a place where source control can be used to see exactly what is different at each level of detail. If the interface between two blocks changes, then both blocks have to be at compatible levels. 

5) The function can be simulated to the level of detail where there is value added by doing detailed design entry with the tool and vendor of choice. 

6) RTL applies to programs as well as hardware, and to this methodology as well. Register contents are transferred directly, or combined by operators during the transfer. It should be possible to generate internal RTL syntax to get the netlist, which is common to all forms of design entry and serves as the graphic image of the logic, but that would bypass synthesis, placement, and routing, which require conventional design entry.  

7) Is it reasonable for young engineers to gain experience by using their HDL skills for design entry, and by looking at RTL/netlists and relating them back to the intended logic implementation? 

8) Now the sticky questions. This would be a tool independent of vendors. Who would own and support it? Will anyone try it out on some simple little design?
Altera_Forum
Honored Contributor II

I wonder if your question was answered. It seems like HDL has the momentum. A big reason is that it is a standard across the industry. Also, the tools are written by people who are not concerned about path delay and clock domains. Even when we used schematic entry, the placement and routing used a sort of HDL to map from logic block to physical block. Our logic diagrams required placement and routing for printing as well as placement and routing for wiring. Text input is the natural form for program data entry, and the netlist view does show a schematic. HDL does work, just not too efficiently. 

 

I have a simulator prototype attached along with a short description of the input syntax and a test case that initializes a couple of registers and moves some bit fields from one to the other. The control logic input is Boolean rather than if/else. The register assignments controlled by each clock(domain) follow the clock definition in the text input. 

 

The .exe is in the .zip and the syntax description and test case example are in the .txt. 

 

--- Quote Start ---

Enter the electronic design world. I designed (mostly digital) circuits in the eighties, using schematic entry. Now I've returned to digital design and find that schematic entry is frowned upon, and using VHDL/Verilog is deemed "more professional".

--- Quote End ---

 

There are some religious issues, converts would say "more professional". 

 

--- Quote Start ---

Can someone explain to me why schematic entry (which, from a pure programming standpoint could be considered a 4th generation entry-tool) is considered inferior to VHDL/Verilog (which, from a pure programming standpoint, could be considered 3rd generation entry-tools)? 

I've looked at VHDL/Verilog, and even though they work for me, I find that they make it more difficult for me to envision (gate)delay characteristics and clock domains than when I'd use schematic entry/building blocks. Schematic entry supports a fairly straightforward left-to-right/top-to-bottom processing workflow; VHDL/Verilog however tend to result in spaghetti-like codeflows (because of the parallel nature of the hardware description), which makes design errors more likely... 

--- Quote End ---

 

That is because the hardware is parallel, and the objective of HDL is to use sequential ifs to define parallel AND gates. In a schematic you can follow a clock line and see everything it triggers. In HDL you have to find all the always blocks where the clock appears in the text, and there has to be an always block for practically every register. 

Why would anyone want to look at a block symbol, see 4 inputs, and immediately realize that 4 signals are being AND'd? It is more of a challenge to count the if statements. 
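
For comparison, this is roughly what that single symbol looks like in VHDL (a minimal sketch; the signal names are made up): the 4-input AND and the register it feeds become a concurrent assignment plus a clocked process.

and_out <= a and b and c and d;   -- the 4-input AND gate

process(clk)
begin
    if rising_edge(clk) then      -- the register the gate feeds
        q <= and_out;
    end if;
end process;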

 

 

 

--- Quote Start ---

Unless someone is able to envision the schematic equivalent in his head while cranking out VHDL, but then using VHDL is more of a nuisance than an advantage.

--- Quote End ---

 

Remember also that synthesis is in there anxious to throw away the logic that you don't have totally connected so that it can see that it drives an output pin. And it may or may not generate the logic that you wanted. And placement and routing also waste compile time by running while the design is not logically complete. I have been called an idiot for complaining about synthesis throwing logic away that did not drive an output pin because all I was doing was wasting precious power. So I cannot put in incomplete logic to get a resource count for sizing purposes.  

 

--- Quote Start ---

VHDL has its place for complicated table driven logic which isn't easily described in functional blocks, but other than that, it makes it more difficult to envision the hardware equivalent than using schematic entry and thus hinders productivity. 

Could anyone more proficient than I am in VHDL/Verilog, comment on my assertions above? 

--- Quote End ---  

Altera_Forum
Honored Contributor II

From my POV - drawing schematics becomes extremely unwieldy as designs become more complex (and I have seen large, complex designs created with schematics). You could argue that you just break it up into smaller chunks, but then you get into a massive hierarchy, which can be just as bad. 

 

The great thing about VHDL/Verilog is that it allows you to write more behavioural-style code to avoid the stupidly complex diagrams or large hierarchies. Synthesisers and parts are pretty good at laying this out as logic on chips at a decent clock speed (if you're working at <100 MHz, you can get away with quite long logic chains between registers). I wouldn't want to do a large design with multiple FIR filters, downsamplers, upsamplers and histograms with just schematics (and I have tried, with Simulink). Besides, HDLs have nice methods to tidy away parallel and identical bits of hardware. 
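
As a minimal sketch of what I mean by tidying away identical hardware (the names are made up, and 'pipe' is assumed to be declared elsewhere as an array of nine std_logic_vectors), a single for ... generate loop in VHDL describes eight identical pipeline registers that would be eight separate symbols and wires on a schematic:

gen_pipe : for i in 0 to 7 generate
    process(clk)
    begin
        if rising_edge(clk) then
            pipe(i+1) <= pipe(i);   -- each iteration creates one identical register stage
        end if;
    end process;
end generate;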

 

Plus at the end of the day, HDL is standardised text. Schematics have to be visualised.
Altera_Forum
Honored Contributor II

I do not agree, again, with Tricky's view. A well-crafted schematic shows the data and control flow in his 'large' design, where now that view only exists in his mind. HDLs may be standardised text, but try reading a large text design ... it may take days before you understand what the hell is going on! Of course you can resort to the RTL viewer here, but that is only half-baked in comparison to a properly drafted schematic. 

 

Of course one doesn't use schematic design to build the lowest-level (no belittling intended here) building elements like FIR filters, etc., because here the HDLs are at their best.
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

From my POV - drawing schematics becomes extremely unwieldy as designs become more complex (and I have seen large, complex designs created with schematics). You could argue that you just break it up into smaller chunks, but then you get into a massive hierarchy, which can be just as bad. 

--- Quote End ---  

 

I realise that any method of describing things can be used in improper ways, but I would suspect that if done with equal care and consideration, a properly created schematic hierarchy will be more readable than an equally proper equivalent piece of HDL (yes, I know this doesn't hold for all cases, there undoubtedly are situations where HDL is superior in clarity). 

 

--- Quote Start ---  

 

The great thing about VHDL/Verilog is that it allows you to write more behavioural-style code to avoid the stupidly complex diagrams or large hierarchies. Synthesisers and parts are pretty good at laying this out as logic on chips at a decent clock speed (if you're working at <100 MHz, you can get away with quite long logic chains between registers). 

 

--- Quote End ---  

 

I wholeheartedly agree that certain pieces of logic are more easily described using HDL; however, that doesn't preclude describing the same functionality in a hierarchical schematic, with the knowledge that the same compiler/optimiser as for HDL will reduce and integrate the schematic into something more efficient than the bare schematic would indicate. The difference between HDL and schematic entry in this case, though, is that when I use schematic entry I can largely predict what the total delay of the logic is going to be; when using HDL it is more of a surprise as to what comes out. 

 

But, yes, at the end, the current portability of HDL is something which is hard to beat with schematic entry.
Altera_Forum
Honored Contributor II

Portability is highly overrated.  

A few questions / observations:
  • Are you using a portable toolchain, i.e. only using Quartus II for place and route?
  • Do you develop IP for third-party (commercial) use?
  • Do you plan on switching between competitors? 

I am absolutely sure that 80 percent of the forum visitors will answer no three times.
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

I do not agree, again, with Tricky's view. A well-crafted schematic shows the data and control flow in his 'large' design, where now that view only exists in his mind. HDLs may be standardised text, but try reading a large text design ... it may take days before you understand what the hell is going on! Of course you can resort to the RTL viewer here, but that is only half-baked in comparison to a properly drafted schematic. 

Of course one doesn't use schematic design to build the lowest-level (no belittling intended here) building elements like FIR filters, etc., because here the HDLs are at their best. 

--- Quote End ---  

 

 

Actually, when I wrote my post it was from the POV of someone who implements FIRs, downsamplers, etc. I HATE implementing individual bits with schematics; connecting individual registers and logic gates is a waste of time. This is what I think Simknutt is trying to talk about, rather than higher-level connections. 

 

But when it comes to higher up the hierarchy, I can see the advantages. I'm just limited to HDLs so I can actually simulate a connected design. I will often create a document of the basic layout schematic though, so others can try and see the layout - be it something I draw or the existing Simulink model.
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

I have a simulator prototype attached... 

--- Quote End ---  

 

I have to admit that I probably won't have time to properly look into this until after my current project is finished. I like toying with new ideas, but sometimes work comes first. 

 

--- Quote Start ---  

... 

That is because the hardware is parallel, and the objective of HDL is to use sequential ifs to define parallel AND gates. In a schematic you can follow a clock line and see everything it triggers. In HDL you have to find all the always blocks where the clock appears in the text, and there has to be an always block for practically every register. 

Why would anyone want to look at a block symbol, see 4 inputs, and immediately realize that 4 signals are being AND'd? It is more of a challenge to count the if statements. 

 

--- Quote End ---  

 

No argument here, I think we approach this in similar ways. Even though I have vast experience in designing (large-project) software and am basically proficient in reading through large amounts of source, I prefer the schematic approach in the case of circuit design. 

 

--- Quote Start ---  

 

Remember also that synthesis is in there anxious to throw away the logic that you don't have totally connected so that it can see that it drives an output pin. And it may or may not generate the logic that you wanted. And placement and routing also waste compile time by running while the design is not logically complete. I have been called an idiot for complaining about synthesis throwing logic away that did not drive an output pin because all I was doing was wasting precious power. So I cannot put in incomplete logic to get a resource count for sizing purposes. 

 

--- Quote End ---  

 

This I solve using a wildcard virtual pin assignment, which makes sure that any output I don't assign to a real pin is at least not optimised away.
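
(For anyone wanting to try the same trick: as far as I know this is the assignment set_instance_assignment -name VIRTUAL_PIN ON -to * in the .qsf file, with the wildcard narrowed down to the dangling outputs you actually care about.)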
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

Actually, when I wrote my post it was from the POV of someone who implements FIRs, downsamplers, etc. I HATE implementing individual bits with schematics; connecting individual registers and logic gates is a waste of time. 

--- Quote End ---  

 

The lowest level I normally go to during entry is the individual LPMs Quartus II already provides. The trick, obviously, is to build up sane logic blocks (usually meaning ones with few inputs and outputs) which can be reused as larger blocks at the next hierarchy level. That way, most of your design is done at the higher levels.
Altera_Forum
Honored Contributor II

HDL's standardization is key, agreed. We are certainly not going back to build a cross-industry standard for schematics. "Schematics have to be visualized" is something I am thinking about. I am used to visualizing hardware from block diagrams or schematics, but have a hard time getting the big picture from HDL. (Maybe a lack of education.)  

You describe a design that has multiples of different types, and can run at a moderate clock speed. 

Other designs may have complex functions with only moderate amounts of identical bits of hardware, other than the basic megafunction blocks. Usually an embedded processor is used and the function is programmed in a programming language which is very different from HDL. The program source code is text, the HDL is text, but what connects (interfaces) the two? 

Also, if performance is key and the physical long path is known, what information is needed to fix it?
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

Actually, when I wrote my post it was from the POV of someone who implements FIRs, downsamplers, etc. I HATE implementing individual bits with schematics; connecting individual registers and logic gates is a waste of time. This is what I think Simknutt is trying to talk about, rather than higher-level connections. 

But when it comes to higher up the hierarchy, I can see the advantages. I'm just limited to HDLs so I can actually simulate a connected design. I will often create a document of the basic layout schematic though, so others can try and see the layout - be it something I draw or the existing Simulink model. 

--- Quote End ---  

 

 

I think it's the basic data flow that you generate, and I won't argue against HDL for that part, although grabbing the megafunctions and LPMs onto a bdf and sliding them around to form the picture is my preference. It is OK that they are coded in HDL. Then I can connect them with buses and only draw physical buses where I like. 

 

The point about being limited to HDLs and the connected design is what I think of as verbose text: define a module, list the input ports and output ports, the edge of the clock used, I don't know what else, then an if condition (or several) to describe the logic, then the name of the register. And that still does not connect the module's ports to the other modules' ports. Hopefully Simulink helps with this.  

 

We used to pencil in Boolean expressions on complex control lines to have a concise description of the control, then probe to see which condition was incorrect. It worked! 

 

OOP languages have scope resolution operators for connecting objects; maybe it is as simple as a composite name of the form module name '.' register name. So the syntax I have suggested is: the name of the register or bus to be assigned, followed by ':', then the name of the register or bus used for input, followed by '?' and the Boolean condition that controls/enables the assignment. Then back to the block diagram that has the clocks and data inputs: add the control Boolean to the enable and the functional connection is done. For simulation, the conditional assignment statement is used as input to a rather simple simulator that draws a functional waveform, much as the designer might draw by hand to understand a particular event. 
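
To illustrate that suggested syntax with made-up names, a line such as

ctrl.shift_reg : io.data_in ? load & ready

would read as "assign io.data_in to ctrl.shift_reg whenever load and ready are both true", with the module-qualified names doing the connecting instead of drawn wires.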

 

Then comes the messy part: the cloud of combinatorial control logic that leads to spaghetti tangles.
Altera_Forum
Honored Contributor II

Just looked at the Qsys docs and it looks like a good approach. Too bad it is not available in the Web Edition, so that a poor boy like me could try the beta.

Altera_Forum
Honored Contributor II

You can install the subscription version for a 30 day free trial.

Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

 

You forget that one of the biggest reasons for creating VHDL/Verilog was ostensibly to get a bunch of S/W people coding H/W designs. That, at least, was the "pie in the sky" vision being pitched by Cadence 10+ years ago. 

--- Quote End ---  

 

 

This is still an important, possibly overriding factor. I remember quite well the visionaries promoting the repurposing of C++ commodity programmers into hardware "designers". The takeover just happened to be with VHDL etc. 

 

HDLs make designs less dependent on "gurus" who actually understand hardware, unlike many young pups just out of school, reared on VHDL or Verilog, who couldn't design a 7-segment decoder by hand to save their life, or understand how a JKFF state machine works (actually, I thought JKFFs sucked: once you have DFFs there's no going back IMO).  

 

The gurus only design compilers now. Rooms full of cheap "paper clip" pseudo-HW engineers crank out the HDL, having only a hazy idea of how their stuff will be expressed in the actual hardware. 

 

//A
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

This is still an important, possibly overriding factor. I remember quite well the visionaries promoting the repurposing of C++ commodity programmers into hardware "designers". The takeover just happened to be with VHDL etc. 

 

HDLs make designs less dependent on "gurus" who actually understand hardware, unlike many young pups just out of school, reared on VHDL or Verilog, who couldn't design a 7-segment decoder by hand to save their life, or understand how a JKFF state machine works (actually, I thought JKFFs sucked: once you have DFFs there's no going back IMO).  

 

The gurus only design compilers now. Rooms full of cheap "paper clip" pseudo-HW engineers crank out the HDL, having only a hazy idea of how their stuff will be expressed in the actual hardware. 

 

//A 

--- Quote End ---  

 

 

 

I think that's rather unfair. Technology today is much different. People have to deal with generating large multi-dimensional filters with complex memory controllers and other complex interfaces (as an example). I'd hate to see anyone try to create large FPGA designs using 74-series chips. The best engineers I know still understand all the old school stuff, but are much happier today in the HDL world. They all know what kind of resource usage they will be looking at without even writing a word of HDL code.
Altera_Forum
Honored Contributor II

 

--- Quote Start ---  

The best engineers I know still understand all the old school stuff, but are much happier today in the HDL world. They all know what kind of resource usage they will be looking at without even writing a word of HDL code. 

--- Quote End ---  

 

 

And that's where the trouble starts with software engineers in general (including HDL-only types): the majority has extreme difficulty thinking about problems in a parallel-processing fashion. Old-school hardware engineers are very skilled at thinking in terms of parallel processing. If someone trained in parallel processing (like an old-school hardware engineer) then turns to using HDL to implement the more complex designs, he'll be fine, because he's able to extrapolate from software into hardware and back; it should result in balanced and efficient HDL which can be turned into efficient hardware. 

 

Newer-generation HDL engineers (or even old-generation software engineers) are more likely to produce HDL which does not necessarily translate into efficient hardware. They basically depend on the quality of the compiler to get it right, which is not bad in itself, just a little unpredictable (at least with the current quality of compilers).