I'm working on a facial recognition application which requires using multiple machines.
One of the machines uses the F200 camera from the dev kit (henceforth system A); the other is a Dell Inspiron All-In-One desktop with a built-in camera that, in Device Manager, also identifies itself as an F200 (henceforth system B).
The application uses system B to register a user (i.e., generate the face recognition data), while the recognition itself is done on system A. The issue I'm having is that recognition data generated by system B (which performs well on system B) performs poorly when used on system A.
Has anyone else run into this issue?
I have also experimented by taking system A's camera, attaching it to a third machine (C), capturing the recognition data on system C, and then moving the camera back to system A. When I do that, the recognition on system A performs well.
Lighting conditions differ somewhat between the three systems, though all are decent, and as I mentioned, recognition data always seems to perform well on the system that generated it.
Does anyone have any thoughts on why this might be, or what might be done to improve the portability?
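One thing I've been considering trying (this is my own guess, not something from the SDK docs) is normalizing away per-camera exposure/contrast differences before enrollment, on the theory that the recognition data is overfitting to one sensor's response. A minimal sketch of histogram equalization in plain NumPy, applied to a grayscale frame before it is fed to registration:

```python
import numpy as np

def equalize_histogram(gray):
    """Histogram-equalize an 8-bit grayscale image (HxW uint8 array).

    Spreads intensities over the full 0-255 range, which can reduce
    per-camera exposure/contrast differences before enrollment.
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first nonzero CDF value
    total = gray.size
    # Standard equalization mapping: rescale the CDF to 0..255
    lut = np.clip(
        np.round((cdf - cdf_min) / (total - cdf_min) * 255), 0, 255
    ).astype(np.uint8)
    return lut[gray]

# Example: a dim, low-contrast image gets stretched to full dynamic range
dim = np.random.default_rng(0).integers(40, 80, size=(120, 160)).astype(np.uint8)
eq = equalize_histogram(dim)
print(eq.min(), eq.max())  # prints: 0 255
```

In practice you'd apply the same normalization on both the enrolling machine and the recognizing machine, so both cameras present the recognizer with comparably scaled input.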
I will try that when I get a chance.
My initial thought, though, is that I'm fairly certain the variation in lighting at system A over the course of the day is significantly greater than the variation in lighting between any of the systems at the same time of day, yet data captured on system A performs well on system A at all times of day.