Dear Team,
We have installed Intel MCU v5 (software accelerated) on CentOS 7.2, in a NATed environment, on a single machine.
Type of NAT: static
Number of network interfaces: 1
Ports opened: TCP 443, 80, 3000-3004, 8080, 27017, 5672; UDP range 49152-65535
1. When we configure the public IP in webrtc_agent/agent.toml, streams from the local network will not publish at all; when we change it back to the default [], it works from the local network without any issues.
From scenario (1): streams from the public network sometimes publish, but most of the time they do not. We have also configured STUN/TURN servers, which sit directly on the public network with no restrictions.
2. Whatever configuration we put in webrtc_agent/agent.toml, the server just accepts it, and it still does not work from the public network.
Please let us know the best configuration for a NATed environment.
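For reference, a sketch of the two variants we tried in webrtc_agent/agent.toml. The section and field names here (network_interfaces with replaced_ip_address, minport/maxport) are based on the comments in the shipped agent.toml and may differ between releases, so treat them as assumptions; the interface name and address are placeholders:

```toml
[webrtc]
# Variant A: announce the NATed public IP in ICE candidates.
# "eth0" and the address below stand in for our single interface.
network_interfaces = [
    { name = "eth0", replaced_ip_address = "203.0.113.10" }
]

# Variant B (the shipped default): no address replacement.
# network_interfaces = []

# UDP media port range, matching the range opened on the firewall.
minport = 49152
maxport = 65535
```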
Thanks
Naresh
A similar configuration works well with MCU v3.3 in the environment described above.
Thanks
Naresh
We will try to reproduce this issue; stay tuned.
Hi Qiujiao,
Did you manage to reproduce the scenario? Even when MCU v4 or MCU v5 is installed directly on a public network, the video sometimes stops at the end of ICE candidate gathering; this issue does not occur with version 3.3.
Thanks
Naresh
We did not reproduce this issue with v3.4. For v3.5, we will have a 3.5.1 release that fixes it; stay tuned.
Hi Qiujiao,
Can we get a quick patch to fix this issue in 3.5?
Thanks
Naresh
Naresh, version 3.5.1 will be released no later than next week; stay tuned.
Hi Naresh, 3.5.1 has been released; please give it a try.
Hi Qiujiao,
Thanks for the update; it seems to be working now. However, the TCP ports and the woogeen_webrtc PID are not released or closed even after the session disconnects.
I have raised this issue earlier, too.
Thanks
Naresh
Naresh, regarding the TCP port and woogeen_webrtc PID issue, could you provide more detailed information on how to reproduce it?
Hi Qiujiao,
Scenario:
We have installed Intel MCU v5.1 (software accelerated) on Ubuntu 14.04 LTS 64-bit (14.04.1-Ubuntu SMP Wed Jul 13 01:07:32 UTC 2016 x86_64, codename trusty), in a NATed environment, on a single machine, with all components such as MongoDB and RabbitMQ installed on the same machine.
Type of NAT: static
Number of network interfaces: 1
Ports opened: TCP 443, 80, 3000-3004, 8080, 27017, 5672; UDP range 49152-65535
We simply opened the basic example on a single machine in multiple tabs, refreshed multiple times, and closed the tabs.
Attached are the webrtc agent.toml and portal.toml configuration files, plus a screenshot of netstat -pltun after the session ends; you can see the TCP ports and their PIDs are still active, and the service has to be restarted to close all the PIDs.
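To make the check easy to repeat, this is roughly what we run after the last client disconnects (woogeen_webrtc is the process name as it appears on our machine):

```sh
# List listening TCP/UDP sockets with their owning PIDs (as in the screenshot).
sudo netstat -pltun | grep -i woogeen

# The same check with ss, in case netstat is not installed:
sudo ss -ltunp | grep -i woogeen

# Expected after a clean teardown: no output. In practice the agent's
# sockets and PID remain until the service is restarted.
```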
Thanks
Naresh
Also, in the above scenario, when I restart the rabbitmq service those PIDs and ports stop listening and are closed. I believe the connection between RabbitMQ and the webrtc component is not being closed even after the session ends.
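For what it's worth, the lingering AMQP connections can be seen like this (5672 is the RabbitMQ port listed above):

```sh
# Show established TCP connections to RabbitMQ with their owning PIDs;
# entries owned by the webrtc component persist after the session ends.
sudo netstat -ptn | grep ':5672'
```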
Thanks
Naresh
Naresh, thanks for reporting this issue. We will try to reproduce it; stay tuned.
Hi Qiujiao,
Did you manage to reproduce the scenario?
Thanks
Naresh
Yes, we reproduced it in our environment. We are looking into this issue; stay tuned.
Hi Qiujiao,
We are also facing another issue.
We have installed Intel MCU v5.1 (or any MCU version from v3.3 onwards; software accelerated) on Ubuntu 14.04 LTS 64-bit (14.04.1-Ubuntu SMP Wed Jul 13 01:07:32 UTC 2016 x86_64, codename trusty), in a NATed environment, on a single machine, with all components such as MongoDB and RabbitMQ installed on the same machine.
Type of NAT: static
Number of network interfaces: 1
Ports opened: TCP 443, 80, 3000-3004, 8080, 27017, 5672; UDP range 49152-65535
We provided the public IP in webrtc_agent.
1. When users connect to the server from within the MPLS network, we observed with a network tool that the session also hits the public IP. Because of this, MPLS users sometimes get blacked-out subscriber videos, and the user has to refresh multiple times to get video.
The MPLS users are in different geographical locations, connected to the central server over MPLS point-to-point links.
Local network users have no access to the public IP in a NATed environment, which is expected,
but public users have no issue subscribing to videos.
2. When I remove the public IP from webrtc_agent, internet users are not able to log in to the session and the stream fails.
I believe there is a configuration issue at the webrtc_agent level that has to be fixed for NATed environments.
Please test this scenario internally with at least 10 users inside the local network in a NATed environment, with the public IP set in webrtc_agent, and with some users also connecting from the internet.
Thanks
Naresh
Also, regarding the above issue: when we provide the local IP in the webrtc_agent toml file, it works well for MPLS and LAN users,
but it does not work for internet users. I believe there is a configuration issue in how the call is routed.
Thanks
Naresh
We will try to reproduce this bug; stay tuned.
Naresh, for the MPLS internal and external access issue, you can launch two webrtc agents and set two different isp values for them, so that one serves internal network users and the other serves external network users. Users can then specify the isp in the createToken API depending on their network.
Qiujiao, do you mean we have to deploy a cluster across two separate machines? How can we launch two webrtc agents on a single machine in a NATed environment?
Also, did you reproduce the issue I mentioned?
Thanks
Naresh
Yes, we have reproduced the issue you mentioned in a NAT network, and it works well when launching two webrtc agents with different isp values. Please follow the steps below (see the sketch after this list):
1. Set isp to "internal" in Release-v3.5.1/webrtc-agent/agent.toml along with the other internal network settings, then launch webrtc agent 1.
2. Copy webrtc-agent, bin, and package.json from Release-v3.5.1 to a new folder, test; set isp to "external" in test/webrtc-agent/agent.toml along with the other external network settings, then launch webrtc agent 2.
3. User 1, in the internal network, calls createToken with isp set to "internal" in the preference option, then joins the room.
4. User 2, in the external network, calls createToken with isp set to "external" in the preference option, then joins the room.
Internal and external network access should then both work. You can also deploy two portals with the same isp settings as the webrtc agents.
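For illustration, a rough sketch of the two agent configurations and the matching createToken calls. The placement of the isp key in agent.toml and the exact createToken signature are assumptions based on the sample server and config files shipped with the release; check your own copies for the authoritative names:

```toml
# Sketch only -- the key's exact section/placement may differ in your file.
# Agent 1 (Release-v3.5.1/webrtc-agent/agent.toml), serving internal users:
isp = "internal"

# Agent 2 (test/webrtc-agent/agent.toml), serving external users:
# isp = "external"
```

On the server side, roughly (N.API stands in for the sample REST wrapper; roomId and the user names are placeholders):

```javascript
// Internal-network user: the token is scheduled onto the "internal" agent.
N.API.createToken(roomId, 'user1', 'presenter', { isp: 'internal' },
  function (token) {
    // hand this token to the client inside the MPLS/LAN network
  },
  function (err) { console.error('createToken failed:', err); });

// External-network user: the token is scheduled onto the "external" agent.
N.API.createToken(roomId, 'user2', 'presenter', { isp: 'external' },
  function (token) {
    // hand this token to the internet client
  },
  function (err) { console.error('createToken failed:', err); });
```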