Hi,
I have run the asl_recognition_demo, but the program only does person tracking; it does not read any ASL hand signs.
The demo window shows the user being tracked while walking around, with a square bounding box and the ID number '0', but when the user makes an ASL sign the program shows no detection and no reading of the sign.
For reference, the command I ran:
python asl_recognition_demo.py -m_a asl-recognition-0004.xml -m_d person-detection-asl-0001.xml -i 1 -c classes.json
from the directory
C:\Program Files (x86)\IntelSWTools\openvino_2021.1.110\deployment_tools\open_model_zoo\demos\python_demos\asl_recognition_demo\
I have copied the .xml and .bin files for both asl-recognition-0004 and person-detection-asl-0001 into the same directory.
Observed behavior: the demo window opens and shows the video stream, and the program tracks the user, but it does not perform any ASL hand-sign detection or read the user's hand signs.
Does anyone know what needs to be corrected so that the asl_recognition_demo's ASL hand-sign detection and reading work?
Please note what is written in the demo description:
The demo starts in person tracking mode and to switch it in the action recognition mode you should press 0-9 button with appropriate detection ID (the number in top-left of each bounding box). After that you can switch back to tracking mode by pressing space button.
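A minimal sketch of that mode switching (illustrative only, not the demo's actual source code; the names are mine): a digit key 0-9 selects the detection ID to run sign recognition on, and the space key returns to person tracking mode.

TRACKING_MODE = "tracking"
RECOGNITION_MODE = "recognition"

def handle_key(key_char, state):
    """Update the demo state for a single key press."""
    if key_char == " ":            # space -> back to person tracking mode
        state["mode"] = TRACKING_MODE
        state["person_id"] = None
    elif key_char.isdigit():       # '0'-'9' -> recognize signs of that detection ID
        state["mode"] = RECOGNITION_MODE
        state["person_id"] = int(key_char)
    return state

if __name__ == "__main__":
    state = {"mode": TRACKING_MODE, "person_id": None}
    print(handle_key("0", state))  # {'mode': 'recognition', 'person_id': 0}
    print(handle_key(" ", state))  # {'mode': 'tracking', 'person_id': None}

In other words, if you never press a digit key after a person is detected, the demo stays in tracking mode and no signs are recognized, which matches the behavior reported above.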
Hi Vladimir,
Thanks for the reply and for pointing out MS-ASL-100 in the description.
"in model description it stated that MS-ASL-100 dataset used in model training. From dataset specification and from supplied with model classes.json file you may clearly see what gestures/signs."
I have tested using the 100 hand signs and the ASL demo program works. It is able to predict and read the user's ASL hand signs defined in the classes file.
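For anyone else reading, a quick way to check which signs the demo can recognize is to inspect classes.json. A small sketch, assuming it is a flat JSON list of class names as supplied with the model:

import json

# Print the gestures/signs the action recognition model was trained on,
# assuming classes.json is a flat JSON list of class names.
with open("classes.json", "r", encoding="utf-8") as f:
    classes = json.load(f)

print(len(classes), "supported signs")
for class_id, name in enumerate(classes):
    print(class_id, name)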
Question:
Can the ASL demo program be used for more than one user simultaneously, instead of needing to press 0-9 to set up detection on a particular user?
Can the program detect and read ASL signs from several users, one by one, sequentially?
Hello,
The demo application was not designed to support actions from multiple persons simultaneously.
Regards,
Vladimir
Hi Vladimir,
Thanks for the reply.
I have another question:
How do I build a Visual Studio demo project for asl_recognition_demo and person-detection-asl-0001?
I am able to build the Human-Pose-3D and Speech Recognition Python demo projects, but not the other Python demos.
Screenshots are attached for reference.
Details:
I followed the Open Model Zoo Demos page, section "To Build the Demos Applications on Microsoft Windows* OS", to build the demo projects.
PC: OS Windows 10
Python 3.6.5
The demo projects were created in
C:\Users\User\Documents\Intel\OpenVINO\omz_demos_build
This directory contains both intel64 and python_demos folders. Please see the attached screenshots.
Inside the python_demos folder, only 2 demo projects were created.
But there are around 22 Python demos in C:\Program Files (x86)\IntelSWTools\openvino_2021.1.110\deployment_tools\open_model_zoo\demos\python_demos
Question 1:
Why were only 2 demo projects (Human-Pose-3D and Speech Recognition) created in the python_demos folder, while the other Python demos were not built?
And why was no demo project created in the python_demos folder for asl_recognition_demo and person-detection-asl-0001?
Does anyone know what should be done in order to build the individual Python demos in that folder?
Hi Chiu Hsien Hsiang,
You don’t need to build the Python demos.
You can edit the Python script (asl_recognition_demo.py) directly from the demo folder.
(C:\Program Files (x86)\Intel\openvino_2021\deployment_tools\open_model_zoo\demos\python_demos\asl_recognition_demo)
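For example, assuming the OpenVINO environment has already been initialized with setupvars.bat and the model .xml/.bin files and classes.json have been copied into that folder (as earlier in this thread), the demo can be started directly with:

cd "C:\Program Files (x86)\IntelSWTools\openvino_2021.1.110\deployment_tools\open_model_zoo\demos\python_demos\asl_recognition_demo"
python asl_recognition_demo.py -m_a asl-recognition-0004.xml -m_d person-detection-asl-0001.xml -i 1 -c classes.json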
The steps to run the asl_recognition_demo are available here:
Regards,
Munesh
Hi Chiu Hsien Hsiang,
This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.
Regards,
Munesh
