When I run compliance testing on the Stratix V FPGA card, with the LeCroy PERT and WaveMaster scope, I only get it running Gen3 at preset 7. Can anyone say what causes that constraint? To pass PCI-SIG compliance I believe the card / device has to pass with at least 3 presets. Best Regards, Bob.
I'm no PCIe expert, but my understanding after reading the spec is that a Gen3 endpoint is required to operate at the highest speed supported by the root complex. It does not imply that this is selectable. Assuming your device is a card that plugs into a PC, for full compliance testing you would need to verify it with 3 different motherboards: one that supports Gen1, one Gen2, and one Gen3 speeds. Conversely, if your device is a motherboard/root complex, you would need to test it with Gen1, Gen2, and Gen3 cards.
Thanks Galfonz for the reply... My understanding is that the 11 presets are equalization settings (at Tx and Rx) that relate specifically to Gen3 speed. I have attached the pre-sets definition... I am no expert either, and preset 7 is a "nominal" setting as I understand it. In compliance mode or normal mode, I understand the test equipment or the RC can negotiate with the EP to try another preset in order to get to a better bit error rate (BER). My colleague indicates the Stratix V kit as an EP is stuck at preset 7 and refuses to move to another preset when requested... I believe there is an easy answer and that person is probably on this forum. Best Regards, Bob.
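For anyone following along, here is a rough sketch of the 11 Gen3 Tx presets and the kind of preset-sweep the RC or test gear attempts. The preshoot/de-emphasis dB values are recalled from the base spec's preset encoding table, so double-check them against the spec itself; the `sweep_presets` loop and its `measure_ber` callback are purely illustrative stand-ins for the PERT/scope measurement, not real link-training code:

```python
# Sketch of the 11 PCIe Gen3 Tx presets as (preshoot dB, de-emphasis dB).
# Values recalled from the PCIe 3.0 base spec preset table -- verify them
# against the spec. P10's de-emphasis is bounded by the device's LF/FS
# advertisement rather than a fixed value, so it is left as None here.
PRESETS = {
    0: (0.0, -6.0),
    1: (0.0, -3.5),
    2: (0.0, -4.4),
    3: (0.0, -2.5),
    4: (0.0, 0.0),
    5: (1.9, 0.0),
    6: (2.5, 0.0),
    7: (3.5, -6.0),   # the "nominal" preset the Stratix V kit stays on
    8: (3.5, -3.5),
    9: (3.5, 0.0),
    10: (0.0, None),
}

def sweep_presets(measure_ber, target_ber=1e-12):
    """Toy illustration of the equalization idea: request each preset in
    turn and keep the one whose measured BER meets the target.
    measure_ber is a hypothetical callback standing in for the
    PERT/scope measurement of the link at that preset."""
    best = None
    for preset in PRESETS:
        ber = measure_ber(preset)
        if ber <= target_ber and (best is None or ber < best[1]):
            best = (preset, ber)
    return best  # (preset, ber) or None if nothing met the target
```

The point of the sweep is exactly what compliance testing exercises: the link partner asks the EP to switch presets, and an EP that stays parked on P7 never lets the sweep find a better operating point.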
I didn't work on the stuff at this level, but my colleagues who did said that the higher levels of the IP are in control of these settings and they can't be changed by the user application. The IP negotiates them between root and endpoint and then locks them. As I recall, those settings could only be changed when we were using the transceivers directly to talk between 2 FPGAs on a board, which wasn't using PCIe.