GPU Compute Software

GPU with native hardware support for binary128

elliotl
Beginner

Does anyone know if Intel or any other vendor is contemplating creating GPU hardware that supports binary128 (IEEE 754 quadruple-precision floating point)?

There is an interesting article published on IEEE.org describing a university research project that built essentially what I am looking for. It is titled "High Performance High-Precision Floating-Point Operations on FPGAs using OpenCL".

OpenCL does not currently support binary128, but the Intel FPGA SDK for OpenCL lets an OpenCL kernel call a custom hardware module packaged as a library function. This enabled the researchers to access their custom binary128 hardware from inside OpenCL kernels.
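
As a rough sketch of what such a call looks like from the kernel side (the function name quad_mul and the packing of one binary128 value into a ulong2 are my own illustrative assumptions, not details from the paper):

    // Declaration of a custom library function whose body is the
    // researchers' binary128 hardware block rather than OpenCL C code.
    // One binary128 value is assumed to travel as a ulong2 (high/low words).
    ulong2 quad_mul(ulong2 a, ulong2 b);

    __kernel void scale_by(__global ulong2 *x, ulong2 factor)
    {
        size_t i = get_global_id(0);
        // To the kernel this is an ordinary function call; the SDK's
        // library tooling binds it to the custom module at compile time.
        x[i] = quad_mul(x[i], factor);
    }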

6 Replies
cw_intel
Moderator

For Intel GPUs, I will check with the relevant team and post an update here.

cw_intel
Moderator

For Intel GPUs, I have escalated this to the development team and am waiting for their feedback. I will let you know as soon as I hear back.


cw_intel
Moderator

Hi Elliot,


Could you tell us about the business impact of a GPU supporting binary128? Your input will help us evaluate this work.


Thanks,

Chen


elliotl
Beginner

Hello Chen,

 
I wish I could say that I needed 1,000,000 GPUs with native binary128 support to populate a server farm for Bitcoin mining, but that is not the case.
 
I calculate very large Mandelbrot fractal images. A typical image contains about a trillion pixels and takes about a week to calculate and store. I currently do this primarily with NVIDIA GPUs using OpenCL. The OpenCL kernel code computes the fractal pixel values using 64-bit floating-point numbers to represent locations in Mandelbrot space. If I zoom in deep enough, the distance between adjacent pixels becomes too small to represent with a 64-bit float: adding the pixel spacing to a coordinate simply rounds back to the same value. My preferred solution is to use 128-bit floating-point numbers to represent pixel locations in Mandelbrot space, but GPUs with native binary128 support do not seem to exist yet. That is why I asked about this.
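
To make that limit concrete, here is a small standalone C sketch (the numbers are illustrative, not my actual render settings) showing the zoom depth at which binary64 stops distinguishing adjacent pixels:

    /* Pixel spacing under deep zoom: once center + spacing rounds back
     * to center, adjacent pixels collapse onto the same binary64 value. */
    #include <stdio.h>

    int main(void)
    {
        double center = -0.75;      /* point of interest on the real axis */
        double span   = 3.0;        /* width of the initial, unzoomed view */
        double width  = 1000000.0;  /* pixels across a very large image    */

        for (int zoom = 0; zoom <= 60; zoom += 10) {
            double view  = span / (double)(1ULL << zoom); /* view width at 2^zoom   */
            double pixel = view / width;                  /* spacing between pixels */
            printf("zoom 2^%-2d  pixel spacing %.3e  distinct: %s\n",
                   zoom, pixel, (center + pixel != center) ? "yes" : "NO");
        }
        return 0;
    }

In this sketch the "NO" rows begin around 2^40; binary128, with its 113-bit significand, would push that wall vastly deeper.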
 
My specific use case almost certainly will not lead to a lot of sales. But surely I am not the only person who would like to do a large volume of binary128 calculations very quickly, and supporting binary128 on a GPU would give Intel some serious bragging rights.
Someday, I am sure, GPUs will have native hardware support for binary128.
 
Thanks for listening,
Elliot Leonard
cw_intel
Moderator

Hi Elliot,

Thanks for the details. I will forward your feedback to the relevant team.


Regards,

Chen


cw_intel
Moderator

Hi Elliot,


Thanks for the details. After our discussion, we have decided that this request will not be supported.


Thanks,

Chen

