I have used both FaceTrackNoIR with a webcam and the SmoothTrack app on iOS with OpenTrack. My experience clearly showed that SmoothTrack with the iPhone camera is much, much better than FaceTrackNoIR with a webcam. The latter was extremely unstable and I could never enjoy it. Once I switched to SmoothTrack on the iPhone, it met my expectations, although I found that tuning some of the mapping curves made it more enjoyable.
e2eSoft VCam is a webcam emulator: it emulates a webcam in your system and works like a real one. It can be used in most applications that use a webcam, such as IM software, video broadcasting, video conferencing, video teaching, remote education, and video chatting.
I just can't get it to work... v4l2loopback just doesn't get detected. (Nothing happens even when I pipe all the video to /dev/video0 using ffmpeg.) I can't join any web meetings because my webcam is broken. Any help?
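In case it helps: a common cause on Linux is that browsers only pick up v4l2loopback devices created with `exclusive_caps=1`. A minimal sketch of the usual fix, assuming the virtual device should be /dev/video0 and `input.mp4` stands in for your real video source:

```shell
# Reload the module so the virtual device announces itself as a plain
# capture device; Chrome/Firefox ignore loopback devices created
# without exclusive_caps=1.
sudo modprobe -r v4l2loopback
sudo modprobe v4l2loopback devices=1 video_nr=0 exclusive_caps=1 card_label="Virtual Cam"

# Feed video into the device in a pixel format browsers understand.
ffmpeg -re -i input.mp4 -vf format=yuv420p -f v4l2 /dev/video0
```

Note that the feed must already be running when the web meeting tries to open the camera, since with `exclusive_caps=1` the device only shows up as a capture device while something is writing to it.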
The Webcam Play Mode uses a webcam to simulate tracking of Vuforia targets at your desk. It requires your laptop's built-in webcam or an external camera connected to your PC via USB. Vuforia Engine automatically detects available webcams; if you have multiple connected, you can choose one from the Camera Device dropdown menu.
Webcam play mode supports all Vuforia target types except Area Targets, which are supported only in Simulator Mode. Once in play mode, the webcam simulates detection and tracking of your target. Test your target and your content as you would if you had built the Unity project to your device.
To test Vuforia Ground Plane with a webcam, it is necessary to use an Image Target as the ground. A webcam cannot detect and track the ground on its own, so we instead emulate the ground with an Image Target. If you are using the Vuforia Core Samples, the EmulatorGroundPlane.pdf can be found in Packages\VuforiaEngineAR\Vuforia\Databases\ForPrint\Emulator or in Packages\com.ptc.vuforia.engine\Vuforia\Database\ForPrint\Emulator.
This play mode purely simulates tracking and detection of your Vuforia targets. When enabled, simulator mode creates a virtual space that you can move the ARCamera around in. Area Targets work well in this play mode, as you can navigate the space as a user would and preview your content and interactive components.
Go to Vuforia Configuration and set the Play mode type to SIMULATOR. Simulator mode lets you adjust the virtual walking speed, the detection distance for simulated tracking, and the key bindings for moving around (see the image below).
To test one or more of your Vuforia targets in your build, simply add the Vuforia Engine GameObjects to your scene and set them up with the appropriate targets or databases. Upon pressing Play, you will be able to navigate the space starting at the origin of the ARCamera GameObject.
NOTE: This simulated detection does not actually use computer vision, so it does not replace testing the real detection and tracking quality of your targets on mobile devices.
Unity's Editor Play Mode can also be used with video recordings from the Session Recorder API. It is a convenient way to continue authoring with your Vuforia targets even when a target is not available or nearby. Using a recording in Play Mode requires a video recording made with the Session Recorder API; see Recordings and Playback for more information on creating recordings.
You can then continue working by exchanging content on the Vuforia target and testing it against the recorded sessions and scenarios. We encourage you to integrate this feature into your development process and use recordings to tune your AR app for specific scenarios, environments, Vuforia targets, and even lighting conditions.
SOLVED! Since Apple doesn't let iOS simulators use the macOS camera, what you can do is add a "Mac (Designed for iPad)" supported destination, which runs your app built for iPad natively on the Mac. Since it then runs as a macOS app, it can use the macOS camera.
The two programs work in conjunction with each other, and yes, it does eat a bit of your CPU, but not as much as you might think; it's downright reasonable, in fact, when you consider the work it's doing. You run AITrack first, adjust a couple of settings and enable the webcam, then run OpenTrack and set a few settings there too (both use your local IP address and a port number, so the connection never leaves your machine; no need to fret about that).
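For the curious, the local connection between the two programs is just UDP pose packets: OpenTrack's "UDP over network" input listens (by default on port 4242) for six little-endian doubles per update, x/y/z position followed by yaw/pitch/roll. A minimal Python sketch of what a tracker like AITrack sends each frame (the pose values here are made up for illustration):

```python
import socket
import struct

def build_pose_packet(x, y, z, yaw, pitch, roll):
    """Pack a head pose as six little-endian doubles (48 bytes),
    the wire format OpenTrack's UDP input expects."""
    return struct.pack("<6d", x, y, z, yaw, pitch, roll)

def send_pose(packet, host="127.0.0.1", port=4242):
    """Send one pose update to OpenTrack on the local machine."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, (host, port))

if __name__ == "__main__":
    # Look 10 degrees right and 5 degrees up, head centred.
    pkt = build_pose_packet(0.0, 0.0, 0.0, 10.0, 5.0, 0.0)
    print(len(pkt))  # each pose update is one 48-byte datagram
    # send_pose(pkt)  # uncomment with OpenTrack running and its UDP input selected
```

This is why no internet access is involved: the packets go straight to 127.0.0.1 and never leave the machine.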
With a little reading and maybe a video or two, I quickly learned how to set it up to connect to MSFS and adjust some of the available options (like inverting the Y axis). I spent a good amount of time tweaking the mapping (which is really a set of curves controlling how fast and how far you want the view to move). There are also smoothness and dead-zone settings, which I fiddled with a bit, mostly to see how they work.
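To make the mapping and dead-zone settings concrete, here is a rough sketch of the math behind them. The power-curve shape and all the default values are assumptions for illustration; OpenTrack actually uses freely editable spline curves rather than a fixed formula:

```python
def map_axis(angle_deg, dead_zone_deg=2.0, max_in=60.0, max_out=180.0, expo=2.0):
    """Map a raw head angle to a camera angle: ignore tiny movements
    inside the dead zone, then apply a power curve so small inputs move
    the view slowly and large inputs swing it quickly."""
    sign = 1.0 if angle_deg >= 0 else -1.0
    mag = abs(angle_deg)
    if mag <= dead_zone_deg:
        return 0.0  # small jitters inside the dead zone don't move the view
    # Re-scale so the output ramps up from zero at the dead-zone edge.
    t = min((mag - dead_zone_deg) / (max_in - dead_zone_deg), 1.0)
    return sign * max_out * t ** expo
```

Widening `dead_zone_deg` is what stops little head movements from jerking the view around, and a steeper `expo` keeps the view calm near centre while still letting a full head turn check six.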
The verdict? WOW. I had no idea what a difference head tracking could make while flying. I still need to play with the settings much more to get it to where I'd be comfortable with it, probably mostly increasing the dead zone so my little head movements don't jerk it around so much. I did experience slight queasiness, though, which I think will subside once I zero in on the settings and get used to it. I can even zoom in on the instrument panel simply by leaning in! (Towards the webcam, that is).
1) Using your mouse to look around your cockpit. All you do is bind your middle mouse button to your view, then just hold it down to look around. If you're ambidextrous, this works great for you.
2) Using your WASD keys to look around and over the nose and such. This requires some keybinding and tweaking, but it works WONDERFULLY whether you're right-handed or left-handed (because you can bind whatever keys you want to this), or if your hand is tiny like mine. I have a hat switch on my Logitech Extreme 3D Pro, and it hurts my wrist to use it because I have to constantly look around. So what I did was enable my view axes in view controls, then bound the W and S keys to look up and down and the A and D keys to look left and right. Then I bound Head Movement: Up and Down to Alt+W and Alt+S, and Head Movement: Left and Right to Alt+A and Alt+D. This option is great for most players without head tracking software.
On topic:
Personally, I'm not ambidextrous, but I use C or MB5 (the side button) to look around, as my middle mouse button is occupied and I really need that finger free to fire the gun.
One thing I've noticed, and it helped my friend quite a lot as well, is using keys for pitching the plane up or down.
In other words, I use Shift and Ctrl to pull the nose up and down.
This lets me turn and control the plane without relying on mouse tracking, which leaves my mouse free for looking around with the camera, as well as its buttons for firing the gun when aimed.
I'm playing SB with mouse and keyboard. It's pretty easy: the mouse basically controls the stick, though sometimes you roll with A/D. I've set throttle up/down to W/S and pitch to LShift/Ctrl, and I look around by holding C. While holding C I can't control the stick with my mouse anymore, but I can still roll/turn/climb/dive with my keyboard.
This one (FaceTrackNoIR) is fairly popular, but you'll need a decent webcam. The PS3 Eye works fairly well with it and can be picked up on Amazon for less than $10. There are some other interesting options that use webcams, some of which don't involve head tracking; this thread had some interesting setups, if you're interested.
However, you don't need head tracking or any kind of face-tracking solution to be competitive; binding your view to virtual axes or the POV hat on your joystick is all you need, provided you practice using it.