VR and clicking: a possible solution
-
Well, I guess one would have to try different input methods to see how they feel. For that, BMS would need to run in VR in the first place. Is there something I am not aware of?
Also, personally I still use the mouse in VR, e.g. in DCS. It doesn't seem to be a critical problem.
But I don't have much experience with it, as I only have the free demo and DCS gives me poor framerates in VR. Different input methods are certainly welcome. I think in Elite Dangerous the cursor was controlled by head tracking, but it was a while ago and I could be wrong. However, there might be examples to learn from.
But how to run Falcon in VR? Does this work?
-
Among other requirements, I think VR requires DirectX 11 as a minimum.
I’d love to be wrong about that:
https://steamcommunity.com/app/358720/discussions/0/535151589883544694/ but this link is rather old.
-
I’m glad we have some new eyes looking at the challenge of getting BMS in VR. Right now it’s possible to get a 2D image of the pit in your headset using a combination of OpenTrack for head tracking and VR Toolbox / VR Desktop. Getting the pit in 3D is more challenging and causes a big performance hit, since you need a 3D injector. The only one we’ve found that works is TriDef, and it caps you at something like 30-45 FPS, far too low for a comfortable VR experience. Also, the HMCS and HUD are projected at the wrong distance.

VR Toolbox has an FPS mode that fills the headset with the image from the computer monitor and also makes the cursor track your head movements. Binding HOTAS buttons to left click, right click, and mouse wheel up and down would let you control the entire cockpit without using a mouse. Some of the switches are in inconvenient places (e.g. FLCS reset) and are a bit challenging to turn your head far enough to reach. I also use the mouse in DCS and it’s not too bad. It feels quicker than a gaze-based cursor, especially when the targets are very small (OSB buttons).
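As a sketch of how such a gaze-driven cursor could map head angles to the virtual screen (the function name, FOV numbers, and resolution here are my own illustration, not taken from VR Toolbox or any of the tools mentioned):

```python
import math

def gaze_to_cursor(yaw_deg, pitch_deg, screen_w, screen_h,
                   hfov_deg=90.0, vfov_deg=60.0):
    """Map head yaw/pitch (degrees, 0 = screen centre) to pixel coordinates.

    Uses a simple tangent projection onto a flat virtual screen that spans
    hfov_deg x vfov_deg of the view, then clamps to the screen edges so
    extreme head angles stay usable.
    """
    half_w = math.tan(math.radians(hfov_deg / 2))
    half_h = math.tan(math.radians(vfov_deg / 2))
    x = math.tan(math.radians(yaw_deg)) / half_w    # -1 .. 1 across the screen
    y = math.tan(math.radians(pitch_deg)) / half_h  # -1 .. 1, pitch up = top
    px = (x + 1) / 2 * screen_w
    py = (1 - (y + 1) / 2) * screen_h
    return (min(max(px, 0), screen_w - 1), min(max(py, 0), screen_h - 1))

# Looking straight ahead lands on the screen centre.
print(gaze_to_cursor(0, 0, 1920, 1080))  # (960.0, 540.0)
```

The HOTAS side would then just be the usual left/right-click and wheel bindings; only the cursor position comes from the head pose.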
-Rabbit
-
Logitech is trying something new on the Vive for a virtual keyboard.
Sent from my MI 5 using Tapatalk
-
the HMCS and HUD are projected at the wrong distance.
Can you explain it? I might do something about it with another wrapper.
Among other requirements, I think VR requires DirectX 11 as a minimum.
I’d love to be wrong about that:
https://steamcommunity.com/app/358720/discussions/0/535151589883544694/ but this link is rather old.
DirectX 11 performance is awful. Developers are going for Vulkan or DX12. I’d very much prefer the former to win the “API war”.
Logitech is trying something new on the Vive for a virtual keyboard.
That’s something an overlay over SteamVR can do without additional devices. Of course, many people will jump on a device solving a nonexistent problem.
-
The HMCS (in the F-16C block 50, anyway) is actually a monocle, so it should only appear in one eye, meaning that convergence shouldn’t be much of an issue (there may be other systems by now). The HUD is rendered on the HUD glass in the cockpit. It doesn’t appear to float at infinity like it does in the real jet; instead it looks like it’s about 50 cm from you. This means that if you look at the HUD, things out the window look doubled, and if you look at things out the window, the HUD looks doubled. This makes using CCIP or TD boxes very challenging, and the constant focus switching is distracting. I tried editing the cockpit definitions to move the HUD further out, but it didn’t seem to work.
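The size of that mismatch is easy to put a number on. A rough back-of-the-envelope sketch (the IPD value is a typical adult figure I'm assuming, not anything measured from BMS):

```python
import math

IPD = 0.064  # interpupillary distance in metres (typical adult, assumed)

def vergence_deg(distance_m):
    """Angle between the two eyes' lines of sight for a point at distance_m."""
    return math.degrees(2 * math.atan((IPD / 2) / distance_m))

hud = vergence_deg(0.5)     # HUD symbology rendered ~50 cm away
world = vergence_deg(1000)  # scenery, effectively at infinity
print(f"HUD: {hud:.2f} deg, world: {world:.2f} deg, "
      f"mismatch: {hud - world:.2f} deg")
```

A vergence difference of several degrees between the HUD and the scenery behind it is exactly what produces the doubling described above.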
Perhaps the most promising approach is FlyInside3D. This is a third-party utility that adds very good VR support to X-Plane and FSX (including Leap Motion integration). Several of us have asked for BMS support, but this being a small niche of a very small niche product, I don’t think it would be worth the developer’s time. Any experience or help you could bring would be much appreciated!
-Rabbit
-
FlyInside FSX has something like an HMCS cursor: you aim at the switches you want to click with the centre of your view. Or, if you prefer, you can also move the mouse over things. The HMCS cursor is really easy to use.
-
if you look at the HUD, things out the window look doubled, and if you look at things out the window, the HUD looks doubled.
I wonder at what depth the collimated HUD image is projected. A program like “apitrace” would give a definitive answer. But when moving around the 3D pit, the collimated display looked as if it were projected on the right surface.
There are multiple “fake 3D” modes, for instance, drawing a very wide image and then cutting out the center portion of it. Are you able to configure it to any degree?
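As an illustration of the “draw a very wide image and cut out the centre” mode (a hypothetical helper of my own, not any injector's actual API): the two eye views are just two fixed-width crops shifted apart, which gives every pixel the same disparity, and that is precisely why the result still looks flat.

```python
def fake_3d_crops(wide_w, out_w, separation_px):
    """Given a wide render of width wide_w, return the (left, right)
    x-offsets of two out_w-wide centre crops shifted apart by
    separation_px, clamped to stay inside the wide image."""
    centre = (wide_w - out_w) // 2
    half = separation_px // 2
    left = max(centre - half, 0)
    right = min(centre + half, wide_w - out_w)
    return left, right

# A 2560-wide render, 1920-wide eye views, 40 px of fake separation.
print(fake_3d_crops(2560, 1920, 40))  # (300, 340)
```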
I was looking at HMD prices. Those things are almost as expensive as my workstation…
-
I wonder at what depth the collimated HUD image is projected. A program like “apitrace” would give a definitive answer. But when moving around the 3D pit, the collimated display looked as if it were projected on the right surface.
There are multiple “fake 3D” modes, for instance, drawing a very wide image and then cutting out the center portion of it. Are you able to configure it to any degree?
I was looking at HMD prices. Those things are almost as expensive as my workstation…
They do a good job faking the HUD in 2D, but the illusion falls apart in stereoscopic 3D: It appears to be on the HUD glass, rather than floating outside of the cockpit.
The Rift will be on sale for $350 (link) with touch controllers as a Black Friday special. Still quite pricey, especially if you don’t have a graphics card suitable for it yet. However, it’s definitely in the realm of an (expensive) consumer product.
-
Can you locate the draw call using an “apitrace” recording? Given XYZ coordinates for that quad (or a triangle strip, etc.), its size, or other characteristic, a DLL wrapper can rewrite the call before it goes to the GPU. It’s very similar to the headless dedicated server thing.
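As a sketch of the geometry such a wrapper would apply (illustrative only; a real wrapper would have to patch the vertex data inside the intercepted DirectX call, not Python tuples): scale the quad's view-space X/Y by the depth ratio so its angular size is unchanged while its convergence distance moves out.

```python
def push_quad_to_depth(vertices, new_depth):
    """Re-project view-space quad vertices to new_depth along their view
    rays, keeping the same angular size, so the quad looks identical in
    each eye but converges further away.

    vertices: list of (x, y, z) in view space, z = distance from the eye.
    """
    out = []
    for x, y, z in vertices:
        s = new_depth / z          # scale factor along the view ray
        out.append((x * s, y * s, new_depth))
    return out

# A 20 cm HUD quad at 0.5 m pushed out to 100 m ("optical infinity").
hud = [(-0.1, -0.1, 0.5), (0.1, -0.1, 0.5),
       (0.1, 0.1, 0.5), (-0.1, 0.1, 0.5)]
print(push_quad_to_depth(hud, 100.0))
```

Each vertex lands 200x further out and 200x larger, so it subtends the same angle as before; only the stereo convergence changes.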
I won’t buy any VR headset unless it’s below $200, but hardware is very expensive in Poland.