segmentation_demo.py runs fine with the FP16 and FP32 models, but the FP16-INT8 model crashes with a "core dumped" error.
It may be an instruction-set problem, since another computer with an Intel Core processor runs the FP16-INT8 model without issues (a quick flag check is sketched after the system details below).
Other demos also work fine with FP16-INT8 on this machine. Could some layers in this model be incompatible with the Celeron processor?
OS: Ubuntu 18.04
OpenVINO version: 2020.2
CPU: Intel 8th Gen Celeron Processor N4100
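To check the instruction-set theory: as far as I understand, OpenVINO's INT8 CPU kernels are tuned for AVX2/AVX-512, while Gemini Lake Celerons such as the N4100 only expose SSE4.2. A minimal sketch to list the relevant CPU flags (assuming Linux with /proc/cpuinfo, as on Ubuntu 18.04):

```python
# Minimal sketch (assumption: Linux with /proc/cpuinfo, as on Ubuntu 18.04).
# Lists the SIMD-related CPU flags; OpenVINO's INT8 CPU path is reportedly
# tuned for AVX2/AVX-512, which Gemini Lake Celerons (e.g. N4100) lack.
with open("/proc/cpuinfo") as f:
    flags = set()
    for line in f:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            break

for flag in ("sse4_2", "avx", "avx2", "avx512f", "avx512_vnni"):
    print(f"{flag:12s} {'yes' if flag in flags else 'no'}")
```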
As you have mentioned, this seems to be an issue with device support for the INT8 model format.
Could you please answer the following?
- Are you using the semantic-segmentation-adas-0001 model?
- Are you able to run the FP16-INT8 model using the benchmark tool on the Celeron processor? (If that is not handy, see the sketch after this list for a minimal alternative check.)
- Are you able to use any other FP16-INT8 model on the Celeron processor?
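As an alternative to the benchmark tool, a minimal load-and-infer check with the Inference Engine Python API can show whether the crash happens inside the INT8 CPU path rather than in the demo's own pre/post-processing. This is only a sketch: the IR file paths are placeholders and should point at your FP16-INT8 semantic-segmentation-adas-0001 files.

```python
# Minimal sketch against the OpenVINO 2020.2 Python API.
# The IR paths below are assumptions -- replace them with your FP16-INT8 files.
import numpy as np
from openvino.inference_engine import IECore

model_xml = "semantic-segmentation-adas-0001.xml"  # hypothetical path
model_bin = "semantic-segmentation-adas-0001.bin"  # hypothetical path

ie = IECore()
net = ie.read_network(model=model_xml, weights=model_bin)

# Query the input name and shape from the IR instead of hard-coding them.
input_blob = next(iter(net.inputs))
n, c, h, w = net.inputs[input_blob].shape

# If the crash is in the INT8 CPU kernels, it should reproduce here,
# independently of the demo application.
exec_net = ie.load_network(network=net, device_name="CPU")
result = exec_net.infer({input_blob: np.zeros((n, c, h, w), dtype=np.float32)})
print("Inference completed; output blobs:", list(result.keys()))
```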
Of course.
1. Yes.
2. Not tested.
3. Yes, I've tried the face recognition Python demo with an FP16-INT8 model and it works well.
