Hello,
I was doing some performance tests on MobileNet with the Movidius stick. I used the pretrained mobilenet-v1-1.0-224 model downloaded with the Open Model Zoo downloader. I made a pie chart of the inference time for each layer type; this is the result:
What I don't understand is why the non-linearities (bias & ReLU) are taking 27% of the inference time. How is it possible that these operations take so long?
Thanks in advance,
Emiel Deprost
The MAC (memory access cost) of element-wise operations like bias and ReLU can't be ignored in small networks: they do almost no arithmetic, but still have to read and write the entire activation tensor. There is some analysis of this in the ShuffleNetV2 paper: https://arxiv.org/abs/1807.11164
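To make this concrete, here is a rough back-of-the-envelope sketch (hypothetical tensor sizes and fp16 assumed, not measured on the Movidius) comparing the arithmetic intensity of a 1x1 convolution with that of the ReLU following it. Element-wise ops land far below the compute unit's break-even point, so their runtime is dominated by memory traffic:

```python
# Rough arithmetic-intensity comparison (hypothetical sizes, fp16 = 2 bytes/element)
# illustrating why element-wise ops such as bias/ReLU are memory-bound:
# ~1 FLOP per element, but a full read and write of the activation tensor.

def arithmetic_intensity(flops, bytes_moved):
    """FLOPs performed per byte of memory traffic."""
    return flops / bytes_moved

# Example activation tensor: 112 x 112 x 64 (a typical early MobileNet shape).
elems = 112 * 112 * 64
bytes_per_elem = 2  # fp16

# Pointwise 1x1 conv, 64 -> 64 channels: 64 multiply-accumulates per output element.
conv_flops = elems * 64 * 2  # counting 2 FLOPs per multiply-accumulate
conv_bytes = elems * bytes_per_elem * 2 + 64 * 64 * bytes_per_elem  # in + out + weights

# ReLU: one op per element, yet a full read and write of the tensor.
relu_flops = elems
relu_bytes = elems * bytes_per_elem * 2  # in + out

conv_ai = arithmetic_intensity(conv_flops, conv_bytes)
relu_ai = arithmetic_intensity(relu_flops, relu_bytes)
print(f"1x1 conv: {conv_ai:.1f} FLOPs/byte")  # ~32 FLOPs/byte
print(f"ReLU:     {relu_ai:.2f} FLOPs/byte")  # 0.25 FLOPs/byte
```

With roughly 100x lower arithmetic intensity, the ReLU's time is set almost entirely by memory bandwidth, so on a small network like MobileNet its share of total inference time can be surprisingly large.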