Intel® Quartus® Prime Software

Does SDK 16.1 support host channels, as described in AN 831 (Host Pipelined Multithread)?

Altera_Forum
Honored Contributor II

Hi,

I found a new pipelined framework for high-throughput designs described in AN 831: Intel FPGA SDK for OpenCL Host Pipelined Multithread. I am using an Arria 10 board with OpenCL SDK 16.1. Does my platform support this framework, in particular the host channel streaming feature?

The document uses data compression as an example and mentions read_channel_intel(host_to_dev) and write_channel_intel(dev_to_host, ...), but how can I implement these channels? Where can I find the related BSP, and also the related code?
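From the document, my understanding of the kernel-side pattern is roughly the following sketch. The channel names host_to_dev and dev_to_host come from AN 831; the depth attribute and the passthrough kernel are my own assumptions, and the exact declaration syntax for host-accessible channels presumably depends on the SDK version and BSP:

```c
// OpenCL C kernel-side sketch (device code only, not standalone host code).
#pragma OPENCL EXTENSION cl_intel_channels : enable

// Channel names taken from AN 831; the depth is an assumed example value.
channel uint host_to_dev __attribute__((depth(64)));
channel uint dev_to_host __attribute__((depth(64)));

__kernel void passthrough(uint n) {
    for (uint i = 0; i < n; i++) {
        uint v = read_channel_intel(host_to_dev);   // word streamed in from the host
        write_channel_intel(dev_to_host, v);        // result streamed back out
    }
}
```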

Thanks in advance. :p
3 Replies
Altera_Forum
Honored Contributor II

The actual code is probably not available to the public; you can ask Intel directly to see whether they will share it. Host channels are only supported in Quartus 17.1 and above, and a compatible BSP currently exists only for Altera's reference Arria 10 board.

Altera_Forum
Honored Contributor II

Starting in 17.0, you can emulate host channels with named pipes if your BSP supports IO channels. In 17.0+, reading from or writing to an IO channel during emulation creates a file containing the binary data at runtime; that file can then be replaced with a named pipe if you want to emulate communication from one FPGA to another over IO channels. For actual IO channel use (without emulation), you must use the IO channel name specified by the BSP. Beyond that, I have not seen host channels used outside of emulation, and support for this feature currently seems limited.

Altera_Forum
Honored Contributor II

Thanks a lot.

Terasic has not provided a 17.1 BSP for my board... Maybe I have to consider other options...