EDTracker vs DelanClip?
-
If you use 2 sensors, then the inertial tracker can work along 5 axes.
-
+1
You can get zoom by adding a piece of software called FaceTrackNoIR. It is tracker software that tracks both a primary and a secondary input source. You can use EDTracker as the primary source and then use this software's built-in webcam face tracking to track the z-axis. Problem solved!
-
Can you please elaborate a bit more? Seems interesting…
-
FaceTrackNoIR can be found here: http://www.facetracknoir.nl/
This tracker program has built-in support for many of the most popular head tracking solutions. What it uniquely offers is face tracking via a standard webcam, which is the main reason this software exists. While not flawless, the face tracking does a reasonable job of 6 DOF head tracking by identifying the positions of your eyes, nose, and mouth relative to each other via webcam. The most problematic tracking is left/right head twist (yaw), and to a lesser extent up/down movement (pitch). Obviously, near the extremes of the yaw and pitch axes the webcam will lose tracking on some of your facial features. But zoom on the z-axis and translation on the x and y axes are not affected, since forward/back and left/right/up/down translation of the head keeps all the facial features fully in view of the webcam. You can work around these limitations by modifying the sensitivity of the response curves on the yaw and pitch axes, limiting the physical head movement needed so your face stays in view of the webcam. This works reasonably well, although it somewhat restricts the range of head movement on those axes.
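The curve-shaping idea described above can be sketched in code. This is illustrative only: in FaceTrackNoIR you would set this in the mapping-curves dialog rather than write it yourself, and the function name and all numbers here are my own assumptions, not values from the program.

```python
import math

def yaw_response(raw_deg, max_head_deg=25.0, max_virtual_deg=160.0):
    """Map a limited physical yaw range onto a wide virtual range.

    Clamping the physical input to max_head_deg keeps the face inside
    the webcam's view; the ease-in curve gives fine control near
    center and faster virtual movement toward the edges.
    """
    clamped = max(-max_head_deg, min(max_head_deg, raw_deg))
    t = clamped / max_head_deg                 # normalize to [-1, 1]
    curved = math.copysign(abs(t) ** 1.5, t)   # ease-in response curve
    return curved * max_virtual_deg            # virtual yaw in degrees
```

Turning your head only 25 degrees then sweeps the full 160-degree virtual range, while small head movements near center stay precise.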
FaceTrackNoIR provides an alternative solution to this issue by supporting dual tracking inputs. So I normally set up EDTracker as the primary tracker, emulating x, y axis joystick inputs for yaw and pitch, and use the built-in face tracker as a secondary input for the z-axis and x, y translation movements. This works very well and cheaply gets you a near-perfect approximation of a more expensive 6 DOF TrackIR solution in terms of range of movement and overall performance. I say near-perfect because while TrackIR may be used in a darkened room, you will need a well-lit room for the webcam tracking to work. But other than that it is a very good solution.
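The dual-input setup amounts to picking axes per source. A minimal sketch, assuming the rotation axes come from the inertial tracker (EDTracker also senses roll) and the translations from the webcam face tracker; the dict layout and function name are my own illustration, not FaceTrackNoIR's actual API:

```python
def merge_axes(primary, secondary):
    """Combine two pose readings into one 6 DOF pose:
    yaw/pitch/roll from the inertial primary tracker,
    x/y/z translation from the webcam secondary tracker."""
    return {
        "yaw":   primary["yaw"],
        "pitch": primary["pitch"],
        "roll":  primary["roll"],
        "x":     secondary["x"],
        "y":     secondary["y"],
        "z":     secondary["z"],   # the zoom axis the IMU cannot provide
    }
```

Each tracker contributes only the axes it is good at, which is exactly why the combination beats either device alone.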
Alternatively, you can use the pseudo 6 DOF solution provided by one of the EDTracker engineers:
Essentially, what this does is partially map the z-axis forward/back and x, y translation movements onto the yaw and pitch axes, so that as you start to look down your virtual head gradually leans forward. Of course, this solution only works well in sims with simpler cockpit instrumentation, where you only need to lean in to the cockpit for a closer view. It is really not usable in the case of BMS or DCS, where you need full 6 DOF to focus on specific areas of the cockpit instrumentation and interact with them.
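The pitch-to-lean idea could be sketched like this. This is my own guess at the mapping, not the engineer's actual code; the sign convention (negative pitch means looking down), threshold, and gain are all assumptions:

```python
def pseudo_lean(pitch_deg, start_deg=10.0, lean_gain_cm=0.3):
    """Derive a forward-lean (z) offset from pitch alone: once you
    look down past start_deg, the virtual head moves forward in
    proportion to how far past the threshold you have pitched."""
    down = max(0.0, -pitch_deg - start_deg)  # degrees past the threshold
    return down * lean_gain_cm               # forward lean in cm
```

Because z is derived from pitch rather than measured, you can never lean in while looking straight ahead, which is why this breaks down in BMS or DCS cockpits.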