I am trying to simulate the Intel DisplayPort Arria 10 sample project (Quartus Prime v17.0) in Aldec Active-HDL (v10.3, 64-bit), and I don't think the simulation is working correctly.
The testbench executes all the way to the end, but the reported CRCs are all 0's for R, G, and B, and the vbid NoVideoStream_Flag stays high for the entire simulation.
All of the signals shown in Figure 36 of UG-01131 (RX Video Waveform) are static for the entire simulation. I would expect the received video to look similar to the timing diagram in that figure.
Is there something wrong with the simulation or am I not setting it up correctly?
The "rx_vid_clk" signal is an input, and its frequency is, for example, 162 MHz.
So, are you driving rx_vid_clk at 162 MHz?
This module (dp_sink) generates several signals (e.g. rx_vid_sol, rx_vid_eol, rx_vid_sof, rx_vid_eof, and so on) from rx_vid_clk.
By the way, if rx_vid_clk is already toggling, did you wait for at least one full frame?
These signals are derived from the MSA, VBID, and so on.
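The two checks above (is rx_vid_clk actually driven at 162 MHz, and has at least one frame elapsed before sampling the video outputs) can be sketched in a small testbench fragment. This is only an illustration: the signal names rx_vid_clk/rx_vid_sof follow UG-01131, but the clock period, timescale, and the connection to the dp_sink instance are assumptions, not taken from the Intel example design.

```verilog
// Hedged sketch: verify the video clock is driven and wait one frame.
// Assumptions: rx_vid_clk at ~162 MHz, rx_vid_sof pulses at start of
// frame; hookup to the actual dp_sink instance is omitted here.
`timescale 1ps/1ps
module tb_rx_vid_check;
  reg  rx_vid_clk = 1'b0;
  wire rx_vid_sof;   // would be driven by the dp_sink instance

  // ~162 MHz => period of roughly 6173 ps, so toggle every ~3086 ps.
  always #3086 rx_vid_clk = ~rx_vid_clk;

  initial begin
    // Wait for the first start-of-frame before checking CRCs or the
    // NoVideoStream flag; MSA/VBID decoding needs a full frame to
    // settle (e.g. ~16.7 ms of simulated time for a 60 Hz source).
    @(posedge rx_vid_sof);
    $display("first rx_vid_sof at %0t ps", $time);
  end
endmodule
```

If rx_vid_sof never pulses even after a full frame's worth of simulated time, the problem is upstream of the video interface (link training or the MSA/VBID decode), not the testbench timing.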
Could you also check whether the rx_parallel_data signal on native_phy_rx is toggling, if it exists?
I'm using the DP Sink IP on Arria V, so this module might not exist in your design, or it may have a different name.
I ran the simulation in ModelSim SE and I see the same behavior (all zeros). Let me check further on this.
But frankly speaking, for DP we usually validate the IP directly on actual hardware and seldom run the simulation.