I have run into an issue while simulating an Arria 10 transceiver configured for RX with the Deserialize x2 option. It is configured for 6 Gbit/s operation with a 40-bit FPGA fabric interface. The problem is that rx_clkout on one of the lanes stops toggling for some cycles (staying either high or low) and then resumes. A valid stream appears to be arriving on all lanes, with what looks like proper 8b/10b coding at the 6 Gbit/s rate, and rx_clkout on the other lanes does not show this behavior. If I turn off the Deserialize x2 option and run the FPGA fabric interface at 20 bits, the problem goes away. Likewise, if I run the lane at 4 Gbit/s with everything else unchanged, I don't see the issue.
Configuration:
- Basic/Custom (Standard PCS)
- PMA configuration: Basic
- Transceiver mode: RX Simplex
- Number of data channels: 2
- Data rate: 6000 Mbps
- Simplified data interface: Enabled
- CDR reference clock: 100 MHz
- PPM detector threshold: 1000 PPM
- Enable rx_is_lockedtodata port
- Enable rx_is_lockedtoref port
- Standard PCS/PMA interface width: 20
- FPGA fabric/Standard RX PCS interface width: 40
- RX FIFO mode: low_latency
- RX byte deserializer mode: Deserialize x2
- RX rate match FIFO mode: Disabled
- RX word aligner mode: bitslip
- RX word aligner pattern length: 7
- RX word aligner pattern: 0x0

Matt
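For reference, the rx_clkout frequency implied by each configuration follows directly from the serial data rate divided by the fabric interface width. This is just a quick arithmetic sketch (the function name `rx_clkout_mhz` is mine, not from the IP), but it shows which parallel clock rates are in play in the working and failing cases:

```python
def rx_clkout_mhz(data_rate_mbps: float, fabric_width_bits: int) -> float:
    """Parallel-side clock = serial data rate / fabric interface width."""
    return data_rate_mbps / fabric_width_bits

# 6 Gbit/s, Deserialize x2 (40-bit fabric interface): the failing case
print(rx_clkout_mhz(6000, 40))  # 150.0 MHz
# 6 Gbit/s, no byte deserializer (20-bit fabric interface): works
print(rx_clkout_mhz(6000, 20))  # 300.0 MHz
# 4 Gbit/s, 40-bit fabric interface: works
print(rx_clkout_mhz(4000, 40))  # 100.0 MHz
```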
I've tried a few different reference clock frequencies for both the ATX PLL and the CDR reference clock. What I have found is that 100 MHz and 125 MHz do not work, but with 150 MHz the problem goes away. Are there simulation constraints that I'm not aware of?
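The three reference clocks can be compared by the multiplication ratio each one needs to reach the CDR VCO rate. This is illustrative arithmetic only (the helper `cdr_ratio` is my own name, and I'm assuming a half-rate VCO, i.e. VCO = data rate / 2, as is typical; the real counter settings come from the IP parameter editor). One thing worth noticing is that the working 150 MHz reference happens to equal the 40-bit parallel clock rate (6000 / 40 = 150 MHz), whereas 100 MHz and 125 MHz do not:

```python
def cdr_ratio(data_rate_mbps: float, refclk_mhz: float) -> float:
    """Reference-clock multiplication factor to reach a half-rate CDR VCO."""
    return (data_rate_mbps / 2) / refclk_mhz

for ref_mhz in (100, 125, 150):
    print(ref_mhz, cdr_ratio(6000, ref_mhz))
# 100 -> 30.0, 125 -> 24.0, 150 -> 20.0
```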
Matt