Hi forum,

I'm trying to load a text file containing 1,350,000 32-bit words into the SDRAM of a DE0-Nano dev board, but System Console keeps freezing or crashing. First I read the whole file into a variable (`set data [split $file_data "\n"]`) and wrote it to the SDRAM in a single command (`master_write_32 $m $sdram $data`), but that crashed System Console instantly. Then I tried reading the file line by line, writing each word to the SDRAM as I went:

```tcl
while { [gets $fp line] >= 0 } {
    foreach word [split $line] {
        master_write_32 $m [expr {$SDRAM + ($i * 4)}] $word
        incr i
    }
}
```

I left the code running for a whole day, only to find that System Console had frozen :( The same code worked perfectly on a file with only 10k lines. So I came to ask for help: is there another way to do this? Is there a way to break the big file into smaller files using Tcl? Thanks for the help.
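On the side question of splitting the big file, a plain-Tcl sketch along these lines should work; the input/output file names and the chunk size are assumptions, not anything from this thread:

```tcl
# Sketch: split a large one-word-per-line text file into smaller chunk
# files so each chunk can be loaded separately.
set chunk_lines 100000              ;# lines per chunk (assumed size)
set in    [open "data.txt" r]       ;# hypothetical input file name
set chunk 0
set count 0
set out [open [format "chunk_%03d.txt" $chunk] w]
while {[gets $in line] >= 0} {
    puts $out $line
    if {[incr count] >= $chunk_lines} {
        close $out
        set count 0
        set out [open [format "chunk_%03d.txt" [incr chunk]] w]
    }
}
close $out
close $in
```

Each `chunk_*.txt` file can then be fed to the write loop one at a time, which at least bounds how much work System Console does per session.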
4 Replies
I analyzed the JTAG-to-Avalon-ST/MM protocol (http://www.ovro.caltech.edu/~dwh/correlator/pdf/altera_jtag_to_avalon_analysis.pdf) and found that master_read/write_memory works significantly faster than master_read/write_32. If you are using an older version of Quartus, you can use master_read/write_memory to improve your performance. If you are using the latest version of Quartus, look in the latest version of the handbook (http://www.altera.com/literature/hb/qts/qts_qii53028.pdf) for master_write_from_file and master_read_to_file (see p. 38 of that PDF). I suspect they are optimized for file transfers too.

Cheers, Dave
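For what it's worth, master_write_from_file takes a binary file, so a text file of words has to be converted first. A hedged sketch, assuming one 32-bit hexadecimal word per line and little-endian byte order on the Avalon side (the file names are placeholders, and `$m`/`$sdram` are the service handle and base address from the original post):

```tcl
# Convert a text file of hex words (one per line, an assumption)
# into a raw little-endian binary image.
set in  [open "data.txt" r]
set out [open "data.bin" w]
fconfigure $out -translation binary
while {[gets $in line] >= 0} {
    scan $line %x w                          ;# parse hex word
    puts -nonewline $out [binary format i $w] ;# 32-bit little-endian
}
close $in
close $out

# Then, inside System Console (Quartus 13.1 or later):
master_write_from_file $m "data.bin" $sdram
```

The `i` format code in `binary format` emits a 32-bit little-endian integer; use `I` instead if the target expects big-endian words.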
I was using version 13.0, so I was not aware of those new commands, master_write_from_file and master_read_to_file, in 13.1.
I will upgrade to the latest version and try. Thanks!
- Bookmark
- Subscribe
- Mute
- Subscribe to RSS Feed
- Permalink
- Report Inappropriate Content
It worked wonders: it took 55 seconds to load 6M lines of 4 bytes each.
Thanks dwh!
- Bookmark
- Subscribe
- Mute
- Subscribe to RSS Feed
- Permalink
- Report Inappropriate Content
> It worked wonders: it took 55 seconds to load 6M lines of 4 bytes each.

Yeah, I noticed that master_read/write_memory was significantly faster too :)

Cheers, Dave
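For anyone on an older Quartus without master_write_from_file, the speedup with master_write_memory comes from batching: it takes a flat list of bytes, so one call can carry many words and the JTAG transaction overhead is amortized. A sketch assuming hex words and little-endian byte order, reusing `$fp`, `$m`, and `$sdram` from the earlier posts:

```tcl
# Accumulate the whole file as a byte list, then issue one bulk write.
set bytes {}
while {[gets $fp line] >= 0} {
    scan $line %x w                 ;# parse one 32-bit hex word
    lappend bytes [expr {$w & 0xff}]         [expr {($w >> 8)  & 0xff}] \
                  [expr {($w >> 16) & 0xff}] [expr {($w >> 24) & 0xff}]
}
master_write_memory $m $sdram $bytes
```

If a single call over the full file still causes trouble, the byte list can be flushed every few hundred kilobytes instead, which keeps each JTAG transfer large but bounded.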
