Hi Team,
How can we measure GPU usage (% memory usage / RAM usage) while inference is running on the GPU?
intel_gpu_tools doesn't seem to provide this information.
Thanks and Regards,
Harsha Shetty
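For anyone landing on this thread: below is a minimal sketch of one way to watch host-side memory while inference runs, assuming a host with Python and the psutil package available (the PID argument, polling interval, and sample count are placeholders). Note that this only reports the inference process's RSS and overall system RAM, not memory resident on the GPU itself, which needs driver-specific tooling.

```python
import sys
import time
import psutil

def sample_process_memory(pid, interval=1.0, samples=10):
    """Poll the RSS of a running inference process and overall system RAM usage."""
    proc = psutil.Process(pid)
    for _ in range(samples):
        rss_mb = proc.memory_info().rss / (1024 ** 2)   # host RAM held by the process, in MiB
        sys_pct = psutil.virtual_memory().percent       # system-wide RAM usage, in percent
        print(f"process RSS: {rss_mb:.1f} MiB | system RAM: {sys_pct:.1f}%")
        time.sleep(interval)

if __name__ == "__main__":
    # Pass the PID of the process running inference, e.g. `python monitor.py 12345`
    sample_process_memory(int(sys.argv[1]))
```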
Dear Shetty, Harsha,
This is not an OpenVINO issue. This forum is dedicated to Model Optimizer and Inference Engine support. From this perspective, measuring memory/RAM usage while inference is running on a GPU is no different than measuring it while using an Intel GPU for gaming. Kindly post your question to the forum below:
https://forums.intel.com/s/topic/0TO0P00000018NKWAY/graphics
Thanks,
Shubha