I have a few questions.
I can run the TexturesDepthAndColor sample to get depth and RGB frames in Unity on an Android device.
If I want a Unity app to run SLAM and stream camera frames to a video server at the same time, how can I achieve this?
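To make this question concrete, here is a minimal sketch of the pattern I have in mind: one capture loop reads each frame from a single camera pipeline and fans it out to both a SLAM consumer and a streaming consumer. This is only an assumption about how it might be structured; the capture call is stubbed with a fake frame source (in a real app it would be something like the RealSense pipeline's frame wait), and the SLAM and streaming consumers are hypothetical placeholders, so the sketch runs without camera hardware.

```python
import queue
import threading

def run_capture(get_frame, consumers, stop_event):
    """Read frames from a single source and fan them out to every consumer queue.

    get_frame is a stand-in for the real capture call (e.g. waiting on a
    RealSense pipeline); it is stubbed here so the pattern can be shown
    without camera hardware.
    """
    while not stop_event.is_set():
        frame = get_frame()
        if frame is None:  # source exhausted
            break
        for q in consumers:
            q.put(frame)  # every consumer sees every frame

# --- demo with a fake frame source ---
frames = iter(range(5))    # pretend these are 5 camera frames
slam_q = queue.Queue()     # would feed the (hypothetical) SLAM loop
stream_q = queue.Queue()   # would feed the (hypothetical) video streamer
stop = threading.Event()

t = threading.Thread(
    target=run_capture,
    args=(lambda: next(frames, None), [slam_q, stream_q], stop),
)
t.start()
t.join()

print(slam_q.qsize(), stream_q.qsize())  # both consumers received all 5 frames
```

The point of the question is whether a single Unity app can own the camera once and share frames like this between SLAM and the streamer, instead of each feature trying to open the sensor separately.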
Another question:
If I have two Unity apps, can both of them open the D435 depth camera sensor at the same time? (I am not sure whether this requirement is similar to ARCore's shared camera access.)