Software Archive
Read-only legacy content

New RealSense SDK - Old Perceptual Camera

tony2
Beginner

Hi there!

The RealSense SDK has finally been released. That is great, but does it work with the old Perceptual Camera? Or will we have to wait until the release of the developer kit...

 

Thank you.

Mike_V_
Beginner

Hi tony,

The old camera (Senz3D) will not work with the new RealSense SDK. Have a look at this comment: https://software.intel.com/en-us/forums/intel-perceptual-computing-sdk/topic/330215#comment-1797407

Mike

SoftKinetic
Beginner

Hello Mike and Tony,

To continue using your Creative Senz3D camera, feel free to download iisu 3.6 here:

http://www.softkinetic.com/en-us/support/download.aspx

It includes the original hand tracking features plus additional tools and samples.

If you're excited by VR, please check out our www.reachVR.com challenge. You'll find everything you need to use your camera with the Oculus Rift!

Best,

The SoftKinetic team

MartyG
Honored Contributor III

While it's great that there's another source for the Perceptual Computing SDK for those who want it, there is an inherent risk to developing an application for the RealSense App Challenge 2014 contest purely in that SDK.  The rules of the contest state over and over that it's a contest to leverage the RealSense camera and the RealSense SDK. 

So whilst the 2013 SDK is fine for prototyping code while waiting for the RealSense camera, and then porting the code over, any entrant who submits an application made with the 2013 SDK and the 2013 camera is probably not going to win, as the contest would be rendered pointless if the new camera or the new SDK was not leveraged at all.

On the subject of compatibility of the 2013 SDK with the 2014 camera, Intel's view has been that the only features of the new SDK that should be incompatible with the 2013 cam are those that involve IR feeds and depth sensing, as they are features specific to the 2014 cam.
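For anyone who wants to prototype against that claim, the sketch below shows the shape of a colour-only pipeline in the new SDK. It is only a minimal sketch under assumptions: the 2014 SDK's C++ PXCSenseManager interface with its standard headers and libraries, and no depth or IR stream requested; whether it behaves identically on a Senz3D is exactly the open question in this thread.

// Minimal sketch (assumption: 2014 RealSense SDK, C++ interface).  Only the
// colour stream and the face module are requested, i.e. the kind of features
// Intel says should not depend on the new camera's depth/IR hardware.
#include <pxcsensemanager.h>
#include <cstdio>

int main() {
    PXCSenseManager *sm = PXCSenseManager::CreateInstance();
    if (!sm) { std::printf("RealSense SDK runtime not found\n"); return 1; }

    sm->EnableStream(PXCCapture::STREAM_TYPE_COLOR, 640, 480);  // plain RGB feed
    sm->EnableFace();                                           // face-tracking module

    if (sm->Init() < PXC_STATUS_NO_ERROR) {                     // negative status = error
        std::printf("Pipeline init failed - is a supported camera attached?\n");
        sm->Release();
        return 1;
    }

    for (int i = 0; i < 300; ++i) {                             // grab ~300 frames, then stop
        if (sm->AcquireFrame(true) < PXC_STATUS_NO_ERROR) break;
        PXCCapture::Sample *sample = sm->QuerySample();         // colour image lives in here
        (void)sample;                                           // a real app would process it
        sm->ReleaseFrame();
    }

    sm->Release();
    return 0;
}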

Regarding other specifications: the ground has shifted a bit on the Windows version and CPU.  Whilst at first Windows 8.1 and a 4th Generation (Haswell) CPU were presented as absolute prerequisites, members of the Intel team have recently communicated that they are very strong recommendations for getting the best performance from the RealSense camera, but that older tech will work with the SDK and camera.

What does remain true, though, is that the RealSense camera needs to be connected to a USB 3.0 port (not the old 2.0 kind), and it has to be a direct connection to the computer, not via a USB hub.

daflippers
Beginner

Hi Marty,

Are these your views on Intel's policies, or have they been communicated by Intel?

Are you absolutely certain the RealSense camera needs to be connected directly to the computer and not via a hub?  That requirement was probably put out to stop people using hubs that do not support SuperSpeed.  USB 3.0 is really a USB 2 port with added SuperSpeed TX and RX pairs, and there will be a root hub inside the PC anyway.

David         

MartyG
Honored Contributor III

Quote on the USB 3.0 requirement:

PUBUDU S. (Intel): "You need to have a USB 3.0 connection for the RS Camera and for SDK beta you should connect it directly to a USB 3.0 port (i.e. not through a USB hub etc)."

https://software.intel.com/en-us/forums/topic/531946

In the same article:

PUBUDU S. (Intel): "With the built in normal web cam, samples that require depth cam input won't work.  As it is stated in RealSense website and in installation, you need an Intel RealSense Camera to use most of the SDK features.  So without that cam you are yet to experience the full power of the SDK".

"While some of the features will work with older processors, it is only with an Intel 4th generation (Haswell) processor or above, we can guarantee the best possible performance and smooth operation of SDK algorithms."

daflippers
Beginner

Hi Marty,

8<  Intel's view has been that the only features of the new SDK that should be incompatible with the 2013 cam are those that involve IR feeds and depth sensing

Perhaps I should have snipped that last time.  Is that Intel's stated view, or just speculation that, because some aspects of the SDK don't use depth, compatibility was an actual intention?

As for USB 3, if the SDK will not work with a camera connected via a SuperSpeed hub then the SDK is either incredibly demanding or very inefficient, and in either case coded to look for a hub.  If RealSense is to run on an Atom-based Android tablet it cannot have high computing and data requirements, otherwise the tablet UX would be awful.

Intel will push the latest-generation CPU and no external hub because that gives a much better platform for development, and the internal cameras will be connected to a root hub anyway; however, you can already see backtracking on the >= Haswell requirement.  Remember, it is Intel's business to sell product.

When I was chatting to an Intel rep last week at an exhibition, it seemed no-one has considered the effect the tablet/laptop segmentation will have on developer resources.  Should you concentrate on gestures etc. for desk use, or on real-world augmentation type apps for tablets?  Also, on a tablet with a forward-facing RealSense camera, how is the user supposed to see what they are taking a picture of if the screen and camera are on the same side of the tablet?

David

MartyG
Honored Contributor III

There seem to be variations of opinion on exactly how compatible the 2013 camera is with the RealSense SDK, depending on who is talking.  The one thing they all seem to agree on is that IR feeds and depth sensing should not work with the 2013 cam.

Features such as object tracking, face tracking, emotion recognition, and speech recognition and synthesis should work, according to another Pubudu (Intel) quote in a different article on this forum:

"Unfortunately most of the cool features won't work without the RealSense camera...However you can still try out features that don't rely on the depth and IR feeds, like face related features such as emotion recognition, face tracking and other features like object tracking etc.. with a usual RGB web cam such as one comes with laptops.  You certainly don't need the cam for speech recognition and synthesis.  Even for features that do require the RealSense cam, you can still read the manual and play with the sample code to get used to API and start writing your apps, so you can right away test them when you get the cam (which won't be that long)."

In Intel's RealSense webinar earlier in the year, which I attended (hosted by Intel's 'RealSense Evangelist' Eric Mantion, known as CaptGeek on this forum - he of the sticky at the top of the forum), I asked whether the 2013 SDK gesture functions were supported by the 2014 SDK, and the reply was that they were.  Comparing that statement with the one above from Pubudu, you could read Eric's statement as being true in the real sense (bad pun intended) that the 2013 SDK functions will work in the 2014 one.  However, Eric's reply didn't promise that the reverse would also be true and that 2014-specific features would work with the 2013 camera.  :)

As the 2013 cam had depth sensing and IR too, I would guess that the 2014 SDK requires the 2014 RealSense camera's particular implementation of these two features, which may be different from those of 2013 models like the Creative Senz3D.

In regard to USB 2.0: my Creative Senz3D only works when I plug it into the PC's USB 2.0 port directly and not a hub, though perhaps it is just my admittedly elderly hub that it dislikes!

RealSense is kinda in an arms race with other camera tech like Kinect 2 (which announced its own finger-tracking implementation the other day), so it's in Intel's interest to encourage users to create applications that push it to its max and become 'killer apps' that can persuade people to adopt RealSense instead of competing platforms.  Applications created with older technology that are hobbled in their capabilities would not meet that goal of driving adoption.  Still, as I said in another article, being on the cutting edge means that users sometimes get nasty paper-cuts.

daflippers
Beginner

Hi Marty,

My points are:

  • If RealSense is running on Atom-based tablets, the >= Haswell requirement cannot be based on instruction set etc. and is more likely a marketing decision.  The SDK may check for the processor and may only run on selected CPU families (see the CPUID sketch at the end of this post). 
  • I bet you will find that the RealSense camera will work with a good SuperSpeed hub.
  • Using a tablet or laptop with a front-facing RealSense camera means it is ideal for gesture-based apps.  It also makes it very difficult to use for real-world analysis applications such as the sample picture at http://www.engadget.com/2014/09/09/dell-venue-8-7k-hands-on/ as the user cannot see the screen when they take the picture.  Has anyone thought that through?
  • PerC/RealSense are very interesting technologies; look at what has been done with the Kinect on and off the Xbox.  However, the technology in desktops/laptops/tablets is still looking for a killer application, which is where the developers come in.  If there was already a killer app then there wouldn't have been the PerC and now the RealSense competitions.  What is not required are ways to replicate a mouse.  What Intel and everyone else is looking for is an application with a useful and intuitive UX that carries out a task the user can't live without.  

That last point is the whole purpose of Intel releasing the SDK.  If you do come up with that killer app make sure you retain the IP rights.
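Purely as an illustration of that first bullet, and not a claim that Intel's SDK actually does this: a crude processor check could simply read the CPUID brand string and look at the model number.  A hypothetical, MSVC-flavoured sketch:

// Hypothetical sketch only - whether the SDK inspects the CPU at all is speculation.
// Reads the CPUID brand string (leaves 0x80000002-0x80000004), which on a Haswell
// part looks something like "Intel(R) Core(TM) i5-4xxx ...".
#include <intrin.h>
#include <cstdio>
#include <cstring>

int main() {
    char brand[49] = {0};                                    // 48 chars + terminator
    int regs[4];
    for (int leaf = 0; leaf < 3; ++leaf) {
        __cpuid(regs, (int)(0x80000002u + leaf));            // each leaf returns 16 bytes in EAX..EDX
        std::memcpy(brand + 16 * leaf, regs, 16);
    }
    std::printf("CPU brand: %s\n", brand);
    // A robust check would first confirm leaf 0x80000000 reports these leaves as
    // valid, and would parse the family/model rather than eyeball the string.
    return 0;
}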

David

MartyG
Honored Contributor III

*  Regarding checking for processor: someone else asked about that, and another Intel representative on here (I can't find the post to confirm who it was) said that while they thought that the camera would not check for processor type, there was a clear advantage in using a Haswell processor.  

*  Even if the camera did not work with a hub, I would not be surprised if some enterprising hardware modder finds a way around it quite quickly.

*  Staying on the subject of modding, some people will also undoubtedly rip the camera out of the tablet, combine it with the PC camera and create applications that use both forward and rear cameras in one seamless application.  The number of people who modded the Kinect hardware is evidence of this.  And such hardcore modding tends to drive interest in the tech among non-modders who are inspired by what they see.  Such modifications also tend to influence the official feature-set in the next iteration of the camera, so that people do not have to be hardware geeks to replicate the cool things people before them created.

*  Regarding making killer apps and retaining IP: IP retention is how most dev studios make a profit, as they get a pittance if they sell it off and then make applications for the new IP owner under licence.  A good killer tech is one where you own the core technology, so even if you make a product for one publisher and they want exclusivity on that product, you can still use the core engine that you DO own to make more hit products for other companies.

A killer app is probably going to be one where the user can sit down whilst using it, as people are often inherently lazy when it comes to physical exertion.  If they don't even have to wave their arms around much whilst sitting, even better.

Early Kinect 1 games used to really push the physical exertion aspect, asking the user to contort themselves into poses a yoga master would be proud of, and those games didn't do too well commercially!  'Sonic Free Riders' from 2010 is a classic example.

https://www.youtube.com/watch?v=n08-J-8AF3A

Meanwhile, a great example of motion detection is Sony's 'EyePet'.

https://www.youtube.com/watch?v=YZvxIjdyyII

daflippers
Beginner

Hi Marty,

  • Using the latest and greatest CPU has to be better, but as I have suggested, the >= Haswell requirement will be marketing-driven rather than technical, especially if RealSense is designed to run on an Atom-based tablet running Android. 
  • The operation of a hub is defined in the USB specification and, depending on the hub, it might be limited by the devices connected to it.  That is why I don't believe it is a hub issue per se but a 'requirement' to ensure there aren't people needing support because they used the wrong hub or similar.
  • No-one would rip a camera out of a tablet if the camera exists as a standalone device, although I agree with your point about modders.
  • The problem with a competition is that you have to read the T&Cs very carefully.  You may also have a problem obtaining a patent if you have placed the idea in the 'public domain'.

 Just my opinions of course,

David  

 

MartyG
Honored Contributor III

* Whether the modders would rip a tablet camera out depends on the particular hardware project.  The tablet camera is just the size of a coin on images Intel have shown of its circuit board, so if space inside a hardware mod-project casing is an issue then it is better to have one full-size PC cam and one small tablet one than two full-size desktop cams.

* My own opinion is that protections like trademarking are pointless, because it costs a huge amount of money for worldwide protection and it obligates you by law to sue anyone who even slightly infringes it, even if you don't want to sue, or you risk losing the trademark (e.g. ZeniMax, parent company of 'The Elder Scrolls' publisher Bethesda, suing 'Minecraft' creator Mojang for calling their card game 'Scrolls').  Then there was 'Candy Crush Saga' creator King considering suing people for any game that had the word 'Candy' in it. 

I can see cases where PATENTING something - which is clearly different from trademarking - is of value, but again, the cost can be prohibitive and defending the patent a pain in the butt.

It's much better in my opinion, and cheaper, to just keep on developing and "eating your own product" with new iterations of it, like the iPod Touch replacing the first iPods, and always staying a few steps ahead technologically of what your rivals are releasing at any given time.

* In the case of public sharing, when my own company enters contests we always make the tech we developed publicly available in comprehensive developer diaries once the contest deadline has passed.  We don't mind giving away our older secrets, because we're always staying ahead of the pack by developing something bigger and better.

But yeah, while we tend to share our tech with others, we keep character and story-world IP firmly in our ownership so we can re-use that IP in future products, including ones we might sell on a marketplace like the Apple App Store.

daflippers
Beginner

Hi Marty,

I am just trying to highlight possible pitfalls and assumptions others may make.

David

MartyG
Honored Contributor III

I know, David.  Your thoughts are very appreciated and useful.  For instance, you know stuff about USB specifications that I don't.  So when we air our opinions freely and pool the knowledge together, we all win.  :)

daflippers
Beginner

What most people don't realise is that USB 3 is in effect a USB 2 port with additional connections for the SuperSpeed TX and RX pairs.  The connector and the USB 2 wiring are there purely to retain backwards compatibility with USB 2 and earlier, and the speed comes from the additional two pairs.
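If anyone wants to see that negotiation for themselves, the quickest place I know of is sysfs on a Linux box, where each USB device reports the link speed it actually came up at.  A throwaway sketch, assuming C++17 and the standard /sys/bus/usb/devices layout; nothing RealSense-specific about it:

// Lists each USB device with its negotiated speed.  480 Mbit/s = High Speed over
// the legacy USB 2 wires; 5000 = SuperSpeed over the extra TX/RX pairs.
#include <filesystem>
#include <fstream>
#include <iostream>
#include <string>

int main() {
    namespace fs = std::filesystem;
    for (const auto &dev : fs::directory_iterator("/sys/bus/usb/devices")) {
        std::ifstream product(dev.path() / "product");   // human-readable device name
        std::ifstream speed(dev.path() / "speed");        // negotiated speed in Mbit/s
        std::string name, mbps;
        if (std::getline(product, name) && std::getline(speed, mbps))
            std::cout << name << ": " << mbps << " Mbit/s\n";
    }
    return 0;
}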

David

MartyG
Honored Contributor III

Yeah, I remember someone on the internet asking if there was a way to update USB 2 ports to USB 3 with a software driver and they were told no, because of the additional connections that were needed.

daflippers
Beginner

Exactly.

David
