Hello all,
I am trying to reproduce an actual design bug in simulation. I can identify the bug in my real system by performing some statistical tricks, e.g. exciting the design with controlled stimuli. However, for some reason, I cannot reproduce the bug in an RTL-level simulation, where it would be very easy to spot the problem. I am now planning to run a gate-level simulation to see whether I can reproduce it there.

For the RTL-level simulation, I have used ModelSim's signal_force function to set initial values in registers that are normally programmed by the NIOS II system, which in turn is driven by PC software over a USB connection. With signal_force, I can set up my simulation without having to run the NIOS II processor; in the real system, the NIOS II is basically only used to set up those registers.

The question now is: how can I use signal_force in a gate-level simulation? In RTL, the register names are clearly known, so they are easy to find. In the gate netlist of my Cyclone III design, I can see that every bit of my registers is made up of at least two instances: "dffeas" and "cycloneiii_lcell_comb". Is there an easier trick to force those internal register values without having to model the whole system (USB interface protocol, NIOS II software, etc.)? Any suggestion is really appreciated. Cheers,
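As an illustration, a minimal sketch of this kind of register preload might look like the following; ModelSim's Signal Spy package (modelsim_lib.util) provides signal_force, and the hierarchical paths and values here are placeholders, not the actual design's names.

```vhdl
-- ModelSim's Signal Spy package provides signal_force.
library modelsim_lib;
use modelsim_lib.util.all;

entity tb_force_regs is
end entity;

architecture sim of tb_force_regs is
begin

  -- DUT instantiation omitted for brevity.

  -- Preload the registers that the NIOS II would normally program over
  -- USB. The paths and values below are placeholders; replace them with
  -- the register paths from your own design hierarchy.
  init_regs : process
  begin
    signal_force("/tb_force_regs/dut/ctrl_reg", "00000001");
    signal_force("/tb_force_regs/dut/gain_reg", "00101100");
    wait;  -- one-shot process; the forced values stay in place
  end process;

end architecture;
```

With the defaults, the force is applied at time zero and held, so no further NIOS II or USB modelling is needed in the testbench.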
5 Replies
Timing simulations are unlikely to show these types of problems. (Heck, timing sims often miss stuff that would have been picked up by static timing analysis.) If you're passing static timing, I would suggest Signal Tap (or Signal Probe). There are many things timing sims won't show, but probing the actual design is pretty much guaranteed to eventually show the problem.
Hi Rysc,
Thanks very much for your suggestion, and I understand your point. The problem is that my design takes up nearly 90% of the available SRAM and 80% of the logic cells, so it is a big design. Also, my bug is very subtle, not a catastrophic one. In order to spot the problem, I need to acquire quite a lot of data onto a PC via USB; only after performing some statistical tests in MATLAB can the problem be seen. When I run the RTL (or gate-level, for that matter) simulation, I use textio to write the results to text files and then process the output in MATLAB to check whether the error is present. So far, in RTL, I have not been able to reproduce the error. In short, the error is only identified after approximately 100 ms of run/simulation time, and the amount of output data is significant; the problem cannot be spotted in the Signal Tap waveforms because there is not enough data. I hope the problem is a static bug that can be spotted by the gate-level simulation. If not, I will be in real trouble trying to reproduce the bug and eventually correct it. To do a gate-level simulation, I will probably need to go through the gate netlist and find all the required registers manually, which is a huge amount of work. Any other suggestion? Cheers!
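For illustration, a textio dump process along these lines might look like the sketch below; the entity, port names, and file name are assumptions, not taken from the actual design.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;
use std.textio.all;

entity result_dumper is
  generic (
    FILE_NAME : string := "sim_results.txt"  -- read by MATLAB afterwards
  );
  port (
    clk          : in std_logic;
    sample_valid : in std_logic;
    sample_data  : in std_logic_vector(15 downto 0)
  );
end entity;

architecture sim of result_dumper is
begin
  -- Write one integer per line whenever a sample is valid, so MATLAB can
  -- load the file with a simple load/dlmread and run the statistical tests.
  dump : process (clk)
    file     result_file : text open write_mode is FILE_NAME;
    variable l           : line;
  begin
    if rising_edge(clk) then
      if sample_valid = '1' then
        write(l, to_integer(signed(sample_data)));
        writeline(result_file, l);
      end if;
    end if;
  end process;
end architecture;
```

Keeping the file format to one value per line makes the MATLAB post-processing independent of whether the data came from the RTL or the gate-level run.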
People,
I have gone through the gate-level netlist and figured out the names of all the registers I need to force for simulation. It turns out the names are very similar to the RTL ones, except for some signals that were synthesized away. My gate-level simulation is now running without too much trouble. The only downside: it will take 25 days to complete the required 100 ms! How much faster would ModelSim PE be than the ModelSim-Altera Starter Edition? Does anyone know? Thanks.
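For illustration, forcing a gate-level register bit by bit might look roughly like the sketch below; the register path, bit naming, and value are hypothetical, and the real per-bit instance names (e.g. the dffeas outputs) have to be taken from the netlist itself.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

library modelsim_lib;
use modelsim_lib.util.all;

entity tb_force_gate is
end entity;

architecture sim of tb_force_gate is

  -- Value the NIOS II would normally program into this register.
  constant GAIN_INIT : std_logic_vector(7 downto 0) := "00101100";

  -- Helper: render a single bit as the string value signal_force expects.
  function to_str(b : std_logic) return string is
  begin
    if b = '1' then
      return "1";
    else
      return "0";
    end if;
  end function;

begin

  -- In the gate-level netlist every register bit is a separate dffeas
  -- instance, so the register is forced one bit at a time. The path is a
  -- placeholder; use the instance names found in your own netlist.
  init_regs : process
  begin
    for i in GAIN_INIT'range loop
      signal_force("/tb_force_gate/dut/gain_reg[" & integer'image(i) & "]",
                   to_str(GAIN_INIT(i)));
    end loop;
    wait;  -- forces remain applied for the rest of the simulation
  end process;

end architecture;
```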
Rysc,
Having given up on the gate-level sim, I decided to go through the painful task of modifying my design to leave some space for Signal Tap. This turned out to be the winning approach, although I was reluctant... After several hours of observing the Signal Tap waveforms, by chance, I spotted the problem. I can't get over it! Thanks for pushing me toward this approach. You were totally right: Signal Tap should be the way to go every time. Cheers.
Nice work. SignalTap is a fantastic debug tool, but even with it, you can spend hours or days tracking down the really low-level stuff (or, in your case, the stuff that isn't apparent without viewing the data at a higher level). But I think users have significantly better odds with it than with timing sims.
One trick I've seen done on designs where most of the memory is used is to put some hierarchy that is irrelevant to the problem area into a partition and set it to Empty. That removes that logic from the design, freeing up any RAM resources it was using, but the hierarchy boundary is preserved, so nothing is ripped out upstream or downstream. Of course, anything talking to that block is now unreliable, and in many cases this won't work, but I have seen a number of people succeed with this flow without ever having to modify their HDL.