Hello to all,
I have a question for you....
I have trained a model on my custom dataset to recognise two different objects, each with its own label. Everything was fine: I got the files and converted them to IR format to run on the NCS2, so I have frozen_inference_graph.xml and .bin etc...
To run this model I used object_detection_demo_ssd_async.py and got output on video, which is OK, but I only see one of my trained objects (while in a Jupyter notebook I can see both, each with its label)...
How can I see both trained objects on video, each with its own label? Should I use another .py file, or modify object_detection_demo_ssd_async.py to get this kind of output?
many thanks for your answer ...
Could you please answer the following questions.
Hi Jesus E.
many thanks for replying to me....
I solved the problem above: I just fixed the .pbtxt label file and now I can see both of my trained objects.
I have two more questions for you... but first I'll answer your questions below:
How did you convert the model to IR format? Please provide full command used.
.../object_detection$ mo_tf.py \
--input_model ~/<my path>/tensorflow/models/object_detection/glass_graph/frozen_inference_graph.pb \
--tensorflow_use_custom_operations_config /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/extensions/front/tf/ssd_v2_support.json \
--tensorflow_object_detection_api_pipeline_config ~/<my path>/tensorflow/models/research/object_detection/glass_graph/pipeline.config
Are you using the latest OpenVINO toolkit 2019 R3? --> yes
Did you try lowering the probability threshold? By default it's set to 0.5. --> no, I left it at the default
What base model did you use to custom train your model? Could you provide the customs trained model or a link to the base model used? --> ssd_mobilenet_v1_coco_11_06_2017 , ssd_mobilenet_v1_pets.config
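For reference, my understanding of how the default threshold filters detections, as a rough sketch (the detection dict layout and variable names here are assumptions, not the demo's exact code):

```python
# Default -pt value of the demo; detections below this confidence are dropped.
PROB_THRESHOLD = 0.5

# Hypothetical parsed detections, one dict per detected object.
detections = [
    {'class_id': 1, 'confidence': 0.9},
    {'class_id': 2, 'confidence': 0.3},  # dropped at the default threshold
]

# Keep only detections above the probability threshold.
kept = [d for d in detections if d['confidence'] > PROB_THRESHOLD]
print(len(kept))
```

So if one of my objects were always detected below 0.5, lowering -pt would make it appear.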
My questions are:
1) I would like a different bounding-box color for each trained object (in my case just two objects). Is it possible to change something in object_detection_demo_ssd_async.py? At the moment I have the same color for both objects.
2) I tried to run the same model on a Raspberry Pi 3+ that runs fine on my PC, but I got this error:
<my path>$ python3 object_detection_demo_ssd_async.py \
-m ~/<my path>/frozen_inference_graph.xml \
-i cam
[ INFO ] Initializing plugin for MYRIAD device...
[ INFO ] Reading IR...
Traceback (most recent call last):
File "object_detection_demo_ssd_async.py", line 187, in <module>
sys.exit(main() or 0)
File "object_detection_demo_ssd_async.py", line 61, in main
net = IENetwork(model=model_xml, weights=model_bin)
File "ie_api.pyx", line 266, in openvino.inference_engine.ie_api.IENetwork.__cinit__
RuntimeError: Error reading network: cannot parse future versions: 6
I run other trained models on my Raspberry Pi without this problem. Do you know how I can solve it, please?
many thanks for your help
I'm glad you were able to solve your initial problem. You should be able to change the color of the detection boxes by adding some logic to the python code using the "class_id" on line 159.
The error you are seeing on the Raspberry Pi is due to the OpenVINO version installed. You need to use the same version that is installed on the system with the model optimizer. You can find the OpenVINO toolkit 2019 R3 for Raspbian OS here:
Hi Jesus E.
finally I had time to follow your tips; I have now updated OpenVINO on the Raspberry Pi and my custom-trained model runs well....
now I just need to understand how to change the color of the detection boxes....
many thanks for your help
Sorry for the delay, did you figure out how to change the color of the detection boxes? This question is not specific to OpenVINO and I have not tried it myself. However, as I mentioned before, you should be able to use the "class_id" when calculating the color of the detection boxes. Since you only have two classes, you can probably use an if statement with the "class_id" and set a specific color of your choice.
cv2.rectangle(frame, (obj['xmin'], obj['ymin']), (obj['xmax'], obj['ymax']), color, 2)
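Something like the following sketch should work: map each "class_id" to its own BGR color and pass that to cv2.rectangle instead of the fixed color. The class_id values 1 and 2 and the colors chosen are assumptions matching a two-label .pbtxt; adjust to your model.

```python
# Hypothetical per-class colors (BGR order, as OpenCV expects).
CLASS_COLORS = {
    1: (0, 255, 0),   # first object: green
    2: (0, 0, 255),   # second object: red
}

def box_color(class_id, default=(255, 255, 255)):
    """Return the BGR color for a class; white for anything unexpected."""
    return CLASS_COLORS.get(class_id, default)

# In the demo's drawing loop this replaces the fixed color, e.g.:
# color = box_color(obj['class_id'])
# cv2.rectangle(frame, (obj['xmin'], obj['ymin']),
#               (obj['xmax'], obj['ymax']), color, 2)
print(box_color(1))
```

A dict lookup scales to more classes later without adding more if branches.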