Hello
I'm starting with OpenVINO and the Intel FPGA AI Suite. I have a pretrained model imported from TF and working with OpenVINO, so I already have the .bin and the .xml files. However, when I try to compile the model for the Arria 10 SoC FPGA I get the following error:
"
Starting compilation
Error Code: 78
Error Description: Error: Filter cache is too small. Maximum filter channels is too small to be bumped down to a multiple of sb_c_banks. Must be multiple of 32 that does not exceed 20
File: ../compiler/core/src/dla_pass_slice_analysis.cpp
Function: computeChannelSliceFactor
Line #: 2053
Error occurred.
../compiler/aot_plugin/src/dla_executable_network.cpp:60 DLA has thrown an exception: Error: Filter cache is too small. Maximum filter channels is too small to be bumped down to a multiple of sb_c_banks. Must be multiple of 32 that does not exceed 20
"
I have used the given A10_Performance.arch file. Could you help me?
Thank you in advance
Hello @JohnT_Intel ,
The NN was trained with TF 2.6.0 and exported in SavedModel format (.pb). The NN works properly as an IR model once it is imported.
Hello @JohnT_Intel ,
In the attached file you will find the following files:
- SavedModel: Model exported from TF
- IR: Model converted from TF to Intermediate Representation (IR) with the Model Optimizer (see the command sketch after this list), where:
- input_shape = [1,28,28,1]
- path_savedModelPath: Path to SaveModel files
- str_modelName = "CustomModel"
- path_irTargetPath: Target path to save .xml and .bin files
- A10_Generic.arch: IP configuration
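For reference, a minimal sketch of the conversion step, assuming the standard OpenVINO Model Optimizer (mo) CLI flags, with the variables above as placeholders:

    mo --saved_model_dir <path_savedModelPath> \
       --input_shape [1,28,28,1] \
       --model_name CustomModel \
       --output_dir <path_irTargetPath>

This produces CustomModel.xml and CustomModel.bin in path_irTargetPath.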
After getting the .bin and .xml files I checked that the IR model works properly, and then I tried to compile it with the following dlaCommand invocation (see the sketch after the parameter list), where:
- path_archPath: Path to "A10_Generic.arch" file
- path_xmlPath: Path to IR .xml file from previous step
- path_binPath: Path to IR .bin file from previous step
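A sketch of the compile step, assuming the FPGA AI Suite dla_compiler front end; the flag names below follow typical AI Suite examples and stand in for the exact dlaCommand I used (the IR .bin is normally located automatically next to the .xml):

    dla_compiler --march <path_archPath> \
                 --network-file <path_xmlPath> \
                 --foutput-format=open_vino_hetero \
                 --o <compiled_output_path>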
Thank you for your help
Hi,
I have checked with engineering and, based on their feedback, the error you observed means the model cannot be implemented with the AI Suite on the FPGA. The reason is that not all models are supported.
