I recorded several volumetric videos with RealSense. Now I want to trim the beginnings and the ends of them, put them into Unity so they play one after another, and export them to Magic Leap. Is there a place online that would show how to do that, step by step?
It sounds as though RealSense will not be involved in your setup, in the sense that you are aiming to send pre-recorded videos to Unity instead of live camera data. Is that correct, please? If so, the guide in the link below for creating a Unity app for Magic Leap may be a useful starting point.
https://www.pubnub.com/blog/getting-started-with-magic-leap-and-unity-developer-tutorial/
If you wanted to build a system that integrates the RealSense SDK directly, that may be a bit more complicated. It looks as though Lumin OS, the operating system used by Magic Leap, is a Linux-based system running on 64-bit ARM (AArch64) hardware. So following the procedures for setting up a RealSense Unity environment on an ARM64 Linux platform may be the way to go, though it will likely need some adaptations given Lumin OS's unusual structure.
I recorded the videos with RealSense already. So now I have volumetric video files in the RealSense recording format, these odd ".bag" container files. So I need to know how to edit the bag files and arrange them in Unity.
Thank you for the info on Lumin. In the future I would like to make it real-time.
Thanks for mentioning that you are using bag files, as it helps me to understand your project better. Although the RealSense SDK makes extensive use of bag files, they are not proprietary to RealSense. They are also known as 'rosbags' and come from the Robot Operating System (ROS) robotics software platform.
ROS has a 'rosbag' API, with Python and C++ versions, for reading, modifying and writing bag files, including tasks such as cropping a recording down to a chosen time range.
https://wiki.ros.org/rosbag/Cookbook
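To give a rough idea of that route, the sketch below uses the rosbag C++ API to copy a time-cropped selection of messages into a new bag. The file names and trim amounts are assumptions, as is the filter that keeps device/sensor description topics: RealSense-recorded bags appear to need those topics for playback, so the sketch copies them through regardless of timestamp.

// trim_bag.cpp - a minimal sketch using the rosbag C++ API (file names,
// trim amounts and the metadata-topic filter are assumptions).
#include <string>
#include <rosbag/bag.h>
#include <rosbag/view.h>
#include <ros/time.h>

// Copy messages from in_path to out_path, dropping trim_start_sec from the
// start of the recording and trim_end_sec from the end.
int trim_bag(const std::string& in_path, const std::string& out_path,
             double trim_start_sec, double trim_end_sec)
{
    rosbag::Bag in(in_path, rosbag::bagmode::Read);
    rosbag::View view(in);  // a view over every message in the bag

    const ros::Time start = view.getBeginTime() + ros::Duration(trim_start_sec);
    const ros::Time end   = view.getEndTime()   - ros::Duration(trim_end_sec);

    rosbag::Bag out(out_path, rosbag::bagmode::Write);
    int written = 0;
    for (const rosbag::MessageInstance& m : view) {
        // Assumption: RealSense-recorded bags also contain device/sensor
        // description topics (".../info", ".../option") that the SDK needs
        // to replay the file, so they are kept regardless of timestamp.
        const bool metadata = m.getTopic().find("/info")   != std::string::npos ||
                              m.getTopic().find("/option") != std::string::npos;
        if (metadata || (m.getTime() >= start && m.getTime() <= end)) {
            out.write(m.getTopic(), m.getTime(), m, m.getConnectionHeader());
            ++written;
        }
    }
    return written;
}

int main()
{
    // Hypothetical file names; adjust to your own recordings.
    trim_bag("capture.bag", "capture_trimmed.bag", 2.0, 1.5);
    return 0;
}

Building this requires a ROS installation that provides the rosbag libraries, and the trimming itself would be done on a desktop machine before the files go anywhere near Unity or Magic Leap.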
Getting that working inside Unity would be more complicated though, as the rosbag API is not written in Unity's native C#. A well-established approach to running C++ code in Unity is to build it as a native "plugin" that is placed in your Unity project. Details can be found in the Unity documentation link below.
https://docs.unity3d.com/Manual/NativePlugins.html
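As a rough illustration of that plugin route (the library and function names here are hypothetical, not an established API): the native side is a small C-linkage wrapper compiled as a shared library and placed under the Unity project's Assets/Plugins folder, which Unity's C# scripts can then bind to by its exported name.

// trim_plugin.cpp - sketch of the native side of a hypothetical Unity plugin.
// Build as a shared library (e.g. trimbag.dll / libtrimbag.so) and place it
// under Assets/Plugins/ in the Unity project.
#include <string>

// Implemented elsewhere, e.g. the rosbag-based trim_bag() sketched earlier.
int trim_bag(const std::string& in_path, const std::string& out_path,
             double trim_start_sec, double trim_end_sec);

extern "C" {

// Exported with C linkage so Unity scripts can resolve it by name, e.g.
//   [DllImport("trimbag")] static extern int TrimBag(string inPath,
//       string outPath, double trimStart, double trimEnd);
// Returns the number of messages written, or -1 on failure.
int TrimBag(const char* in_path, const char* out_path,
            double trim_start_sec, double trim_end_sec)
{
    if (in_path == nullptr || out_path == nullptr)
        return -1;
    try {
        return trim_bag(in_path, out_path, trim_start_sec, trim_end_sec);
    } catch (...) {
        return -1;  // keep native exceptions from crossing into managed code
    }
}

}  // extern "C"

Note that this only covers running the trim step from the Unity Editor on a desktop machine; the ROS libraries themselves would not be deployed to the Magic Leap headset.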
An alternative approach might be to use the ROS bag_tools package to edit the bag, and to access bag_tools from within Unity using the free ROS# package from the Unity Asset Store.
https://assetstore.unity.com/packages/tools/physics/ros-107085
RealSense is a very developer-centric platform, and bag files cannot really be edited like traditional video.
Your best bet is to use DepthKit to capture RGB-D videos with your RealSense camera, and use their Unity plugin to play your captures back on a Magic Leap.
Unfortunately I do not think you will be able to convert your .bag files to DepthKit's format.
