
Avalon burst mode not working with LPDDR2 IP core

Altera_Forum

I'm experimenting with a Terasic demo board (Cyclone V) and want to pass data from a module to the LPDDR2 memory in bursts. I created my memory interface in Qsys using the LPDDR2 SDRAM Controller with UniPHY, specifying the hard EMIF and setting the maximum Avalon-MM burst length to 16. Next, I wrote a custom data acquisition module that implements an Avalon-MM master with a five-bit burstcount output, packaged it as a Qsys component, and connected the Avalon-MM bus between the two components in Qsys.

I can find nothing in any literature to indicate this should not work, but when I generate the simulation files and run the simulation, the LPDDR2 controller appears to ignore the burst request entirely and deasserts the Avalon waitrequest for only one word at a time. The Qsys-generated Verilog for the Avalon-MM slave also has its USE_BURSTCOUNT parameter set to 0, which agrees with what I'm seeing in SimVision.
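For reference, the write side of my master looks roughly like the sketch below (simplified; the module, port, and parameter names are illustrative, not my exact acquisition RTL, and the payload here is just a counter). It presents address, burstcount, write, and the first data word together, then holds address and burstcount steady while advancing writedata on each cycle that waitrequest is low:

// Simplified sketch of my burst-write master (illustrative names,
// dummy counter payload). Address and burstcount are driven on the
// first beat and held for the whole burst; write stays asserted and
// a beat is accepted on each cycle where waitrequest is low.
module burst_write_master #(
    parameter ADDR_W  = 32,
    parameter DATA_W  = 32,
    parameter BURST_W = 5,                  // 5 bits to encode a burstcount of 16
    parameter [BURST_W-1:0] BURST_LEN = 16
)(
    input  wire                clk,
    input  wire                reset_n,
    // Avalon-MM master interface
    output reg  [ADDR_W-1:0]   avm_address,
    output reg                 avm_write,
    output reg  [DATA_W-1:0]   avm_writedata,
    output reg  [BURST_W-1:0]  avm_burstcount,
    input  wire                avm_waitrequest,
    // local control
    input  wire                start,
    input  wire [ADDR_W-1:0]   start_addr,
    output reg                 busy
);
    reg [BURST_W-1:0] beats_left;

    always @(posedge clk or negedge reset_n) begin
        if (!reset_n) begin
            avm_write      <= 1'b0;
            avm_address    <= {ADDR_W{1'b0}};
            avm_burstcount <= {BURST_W{1'b0}};
            avm_writedata  <= {DATA_W{1'b0}};
            beats_left     <= {BURST_W{1'b0}};
            busy           <= 1'b0;
        end else if (!busy && start) begin
            // First beat: drive address, burstcount, write, and data together.
            avm_address    <= start_addr;
            avm_burstcount <= BURST_LEN;
            avm_writedata  <= {DATA_W{1'b0}};       // dummy first word
            avm_write      <= 1'b1;
            beats_left     <= BURST_LEN;
            busy           <= 1'b1;
        end else if (busy && !avm_waitrequest) begin
            // A beat was accepted this cycle; keep address/burstcount fixed.
            if (beats_left == 1) begin
                avm_write <= 1'b0;                  // burst complete
                busy      <= 1'b0;
            end else begin
                avm_writedata <= avm_writedata + 1'b1;  // next dummy word
            end
            beats_left <= beats_left - 1'b1;
        end
    end
endmodule

In simulation the master does hold burstcount at 16 for the duration of the transfer, so the problem appears to be on the slave side.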


How do I get the LPDDR2 core to accept the input bursts?