<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic experiment setup for good depth map in Items with no label</title>
    <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621289#M13425</link>
    <description>Forum thread: setting up a counter-movement-jump experiment with an Intel RealSense D435 and the Nuitrack skeleton tracking SDK, and how to obtain an accurate depth map for joint-angle analysis.</description>
    <pubDate>Wed, 05 Sep 2018 17:41:16 GMT</pubDate>
    <dc:creator>FDing3</dc:creator>
    <dc:date>2018-09-05T17:41:16Z</dc:date>
    <item>
      <title>experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621289#M13425</link>
      <description>&lt;P&gt;Hi guys,&lt;/P&gt;&lt;P&gt;I want to set up an experiment on the &lt;B&gt;counter-movement jump using a D435 and the Nuitrack skeleton tracking SDK&lt;/B&gt;. The subject will jump 2 m in front of the camera (see figure_1). My aim is to plot various joint angles and displacements, using the xyz data at each joint during the jumping phase, as accurately as possible. This obviously &lt;B&gt;requires an accurate depth map&lt;/B&gt;.&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;I have decided to use the &lt;B&gt;down-sample &amp;amp; spatial filters&lt;/B&gt; to increase the depth map precision. However, after reading all of Intel's white papers, I still have some questions:&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;1/ What is the &lt;B&gt;ultimate aim&lt;/B&gt; of the spatial filter? My understanding is: reduce RMS error while preserving edges.&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;2/ As the paper says, the point cloud is the best way to view spatial noise. I can use the RealSense LabVIEW code to view the point cloud of the subject in front of me (see figure_2).&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;But how would you suggest I &lt;B&gt;find a good set of parameters&lt;/B&gt; for the spatial/subsample filters for my experiment environment using the point cloud? In other words, what is a good standard for the filtered depth map? Lower RMS error and clear edges without any over-smoothing? Or something else?&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;3/ I read in the paper that &lt;B&gt;RGB color&lt;/B&gt; can help the depth calculation. What does this mean? My understanding: if I want to improve depth precision, I should wear red, blue, or green, and it is better to wear long trousers and a long-sleeved shirt. Am I right?&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;4/ The paper says the RMS error will be smaller if we &lt;B&gt;turn off the projector&lt;/B&gt;. How should I understand this point?&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;And following this logic, can I improve the depth quality by using an &lt;B&gt;external projector&lt;/B&gt; that projects semi-random dots on the subject's body? Do the dots have to be infrared? Where can I buy a special &lt;B&gt;IR projector&lt;/B&gt; that projects IR dots? I can't find this kind of product on the market.&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;5/ I remember that &lt;B&gt;reflective markers&lt;/B&gt; reduce the quality of the Kinect depth map: there is a hole at each marker. Does RealSense have a similar problem around reflective markers?&lt;/P&gt;&lt;P&gt; &lt;/P&gt;&lt;P&gt;And if we have to attach reflective markers during the experiments, is there any method to &lt;B&gt;mitigate this problem&lt;/B&gt;?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Wed, 05 Sep 2018 17:41:16 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621289#M13425</guid>
      <dc:creator>FDing3</dc:creator>
      <dc:date>2018-09-05T17:41:16Z</dc:date>
    </item>
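The down-sample (decimation) step the poster plans to use can be sketched in plain NumPy. This is a hedged illustration of the idea from Intel's post-processing white paper, not the librealsense implementation: each k-by-k block of depth pixels collapses to its median, which reduces resolution and per-pixel noise together. The function name and block handling are assumptions of mine.

```python
import numpy as np

def decimate(depth, k=2):
    # Crop to a multiple of k, then group pixels into k-by-k blocks.
    h = (depth.shape[0] // k) * k
    w = (depth.shape[1] // k) * k
    blocks = (depth[:h, :w]
              .reshape(h // k, k, w // k, k)
              .swapaxes(1, 2)
              .reshape(h // k, w // k, k * k))
    # Median of each block: robust to outlier depth values.
    return np.median(blocks, axis=-1)
```

The real decimation filter in librealsense additionally gives special treatment to invalid (zero-depth) pixels; this sketch ignores that detail.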
    <item>
      <title>Re: experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621290#M13426</link>
      <description>&lt;P&gt;Hi jakeding,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thank you for your interest in the Intel RealSense D435 camera.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please let me look into it and I will get back to you later.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Alexandra&lt;/P&gt;</description>
      <pubDate>Thu, 06 Sep 2018 15:32:57 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621290#M13426</guid>
      <dc:creator>idata</dc:creator>
      <dc:date>2018-09-06T15:32:57Z</dc:date>
    </item>
    <item>
      <title>Re: experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621291#M13427</link>
      <description>&lt;P&gt;Thanks! Hope to hear from you soon.&lt;/P&gt;</description>
      <pubDate>Mon, 10 Sep 2018 16:56:59 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621291#M13427</guid>
      <dc:creator>FDing3</dc:creator>
      <dc:date>2018-09-10T16:56:59Z</dc:date>
    </item>
    <item>
      <title>Re: experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621292#M13428</link>
      <description>&lt;P&gt;Hello jakeding,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;We will answer your questions individually. It seems that you have read the Depth Post-Processing paper from &lt;A href="https://www.intel.com/content/www/us/en/support/articles/000028866/emerging-technologies/intel-realsense-technology.html"&gt;https://www.intel.com/content/www/us/en/support/articles/000028866/emerging-technologies/intel-realsense-technology.html&lt;/A&gt;. We recommend also reading the BKMs for Tuning white paper at &lt;A href="https://www.intel.com/content/www/us/en/support/articles/000027833/emerging-technologies/intel-realsense-technology.html"&gt;https://www.intel.com/content/www/us/en/support/articles/000027833/emerging-technologies/intel-realsense-technology.html&lt;/A&gt;. It may answer more of your questions and give you further guidance.&lt;/P&gt;&lt;P&gt;1. Yes, the spatial filter applies edge-preserving smoothing to the depth data while minimizing the RMS error.&lt;/P&gt;&lt;P&gt;2. Please review the BKMs for Tuning white paper for more guidance on using these filters.&lt;/P&gt;&lt;P&gt;3. RGB color is not currently incorporated into the post-processing, so wearing red, green, and blue would not currently affect depth.&lt;/P&gt;&lt;P&gt;4. You can use visible or infrared projectors. So yes, you can use a regular front projector that you can buy on Amazon, for example:&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.amazon.com/AAXA-M6-Projector-Built-Battery/dp/B06ZZ3MPR3/ref=sr_1_1?s=electronics&amp;amp;ie=UTF8&amp;amp;qid=1536685349&amp;amp;sr=1-1&amp;amp;keywords=aaxa+m6"&gt;https://www.amazon.com/AAXA-M6-Projector-Built-Battery/dp/B06ZZ3MPR3/ref=sr_1_1?s=electronics&amp;amp;ie=UTF8&amp;amp;qid=1536685349&amp;amp;sr=1-1&amp;amp;keywords=aaxa+m6&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Or try contacting AMS about their various IR projectors.&lt;/P&gt;&lt;P&gt;5. To the extent that a reflective spot creates a hot spot that saturates the detector, then yes, these cameras will also give zero depth at that point. But in general, you should be able to manage the exposure or laser power and avoid this.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Let me know if you have further questions.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Alexandra&lt;/P&gt;</description>
      <pubDate>Wed, 12 Sep 2018 08:15:00 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621292#M13428</guid>
      <dc:creator>idata</dc:creator>
      <dc:date>2018-09-12T08:15:00Z</dc:date>
    </item>
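The edge-preserving behavior described in answer 1 can be illustrated with a minimal one-dimensional alpha/delta pass, similar in spirit to the librealsense spatial filter. The function name and the single left-to-right pass are simplifications of mine; the real filter runs multiple bidirectional passes over rows and columns.

```python
import numpy as np

def spatial_pass(depth, alpha=0.5, delta=20.0):
    # One left-to-right exponential smoothing pass per row.
    # alpha: smoothing weight; delta: edge threshold in depth units.
    out = depth.astype(float).copy()
    rows, cols = out.shape
    for r in range(rows):
        for c in range(1, cols):
            step = abs(out[r, c] - out[r, c - 1])
            # Smooth only across small steps; large steps are treated
            # as true object edges and left untouched.
            if delta >= step:
                out[r, c] = alpha * out[r, c] + (1.0 - alpha) * out[r, c - 1]
    return out
```

A large depth discontinuity (the edge between subject and background) passes through unchanged, while small fluctuations within a surface are averaged away, which is exactly the "reduce RMS error while preserving edges" goal from question 1.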
    <item>
      <title>Re: experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621293#M13429</link>
      <description>&lt;P&gt;Hi Alexandra,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Your reply is very helpful! After reading all of the Intel papers you recommended, I would still like to discuss the following questions:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;1. Since RGB color won't help the depth measurement, do we need to care about clothing color at all when trying to increase accuracy?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;2. I am confused about the definition of "active stereo". Could you look at figure 1 in the attachment and tell me which active stereo technology you are using for the D4XX?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;In other words, what is the role of the laser pattern in the D4XX? Does it act as a structured-light pattern, or does it just add texture to help the two cameras distinguish different points in space?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;3. You said in your last post: "To the extent that the reflective spot gives a hot spot that saturates the detector, then yes, these cameras will also give zero depth at that point. But in general you should be able to manage the exposure or laser power and avoid this."&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I don't fully understand this explanation. Could you explain it in more detail? Thanks.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;4. Can I put a square box 2 meters away from the camera and then tune the spatial filter until it neither over-smooths nor leaves too much noise in the point cloud?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I appreciate it!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Best,&lt;/P&gt;&lt;P&gt;Jake&lt;/P&gt;</description>
      <pubDate>Thu, 13 Sep 2018 14:21:09 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621293#M13429</guid>
      <dc:creator>FDing3</dc:creator>
      <dc:date>2018-09-13T14:21:09Z</dc:date>
    </item>
    <item>
      <title>Re: experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621294#M13430</link>
      <description>&lt;P&gt;Hi jakeding,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please find below the answers to each question.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;1. The texture is more important than the color. For example, it is better to wear differently colored clothes than a completely blue shirt.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;2. In the D400 datasheet linked below, Section 2.3 on page 12 answers your question. The laser pattern from the IR projector adds texture to help the two cameras distinguish different points in space.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/Intel-RealSense-D400-Series-Datasheet.pdf"&gt;https://www.intel.com/content/dam/support/us/en/documents/emerging-technologies/intel-realsense-technology/Intel-RealSense-D400-Series-Datasheet.pdf&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;3. A reflective spot may cause over-exposure on the sensors. It is the same as taking a picture of a bright light with a regular camera. When there is bright, uniform light on a reflective surface, such as glare from the sun or a reflective spot, depth cannot be detected because there is no texture. Remember the importance of texture. You can adjust the exposure settings to reduce this effect.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;4. Yes, there is no reason why you would not be able to do this.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Alexandra&lt;/P&gt;</description>
      <pubDate>Fri, 14 Sep 2018 18:58:03 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621294#M13430</guid>
      <dc:creator>idata</dc:creator>
      <dc:date>2018-09-14T18:58:03Z</dc:date>
    </item>
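On point 3, saturation at a reflective marker shows up as zero-depth (invalid) pixels. A crude way to patch such holes, similar in spirit to the librealsense hole-filling filter, is to propagate the nearest valid depth value; this left-neighbor version is an illustrative assumption of mine, not the library's algorithm.

```python
import numpy as np

def fill_holes(depth):
    # Replace zero (invalid) pixels with the nearest valid value
    # to their left in the same row. A leading zero in a row stays.
    out = depth.astype(float).copy()
    rows, cols = out.shape
    for r in range(rows):
        for c in range(1, cols):
            if out[r, c] == 0.0:
                out[r, c] = out[r, c - 1]
    return out
```

Filled values are interpolations rather than measurements, so for joint-angle analysis it is usually safer to manage exposure or laser power as suggested above, or to keep markers out of the depth camera's view, than to rely on hole filling.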
    <item>
      <title>Re: experiment setup for good depth map</title>
      <link>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621295#M13431</link>
      <description>&lt;P&gt;Thank you very much! You perfectly solved all my questions. &lt;/P&gt;</description>
      <pubDate>Fri, 14 Sep 2018 19:24:45 GMT</pubDate>
      <guid>https://community.intel.com/t5/Items-with-no-label/experiment-setup-for-good-depth-map/m-p/621295#M13431</guid>
      <dc:creator>FDing3</dc:creator>
      <dc:date>2018-09-14T19:24:45Z</dc:date>
    </item>
  </channel>
</rss>

