TrackIR moving HMD only, for simpit, not moving with the rendering camera? Possible?
-
Well, coding effort is one thing; I am not afraid of that when it comes to building something cool long term. And TrackIR is just the entry-point library for the game: even if you use something better like the EdTracker (didn't know it existed, thanks!), you would still use the TrackIR library as your input, as that is what most games support. What is needed here is to disable the TrackIR inputs changing the camera position, and to code in a TrackIR-provided "angle to X,Y position" translation for the HMD GUI. Plus, inside the engine, substituting the camera angles with the new angle of the HMD (the angle of the camera source via the X,Y of the HMD as the new camera axis) for target locking.
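The "angle to X,Y position" translation mentioned above can be sketched in a few lines. This is a minimal, hypothetical mapping - the function name, the display resolution, and the HMD field-of-view figures are all made-up assumptions, not anything from BMS or TrackIR:

```python
def hmd_screen_pos(yaw_deg, pitch_deg, screen_w=640, screen_h=480,
                   fov_h_deg=40.0, fov_v_deg=30.0):
    """Map a TrackIR-style yaw/pitch pair onto HMD display coordinates.

    A simple linear mapping: the centre of the HMD display corresponds
    to boresight (0, 0); the display edges correspond to half the
    assumed display field of view.
    """
    x = screen_w / 2 + (yaw_deg / (fov_h_deg / 2)) * (screen_w / 2)
    y = screen_h / 2 - (pitch_deg / (fov_v_deg / 2)) * (screen_h / 2)
    # Clamp so the symbol pegs at the edge instead of vanishing
    x = max(0, min(screen_w, x))
    y = max(0, min(screen_h, y))
    return x, y
```

A rectilinear (tan-based) projection would be more faithful at wide angles, but for a narrow HMD field of view the linear version is close enough to start with.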
However, the second half here is that in times like these I regret that FreeFalcon didn't survive. There it would at least be remotely possible for me to get the code and, on lonely nights, attempt to code such stuff in. With the secretive BMS development it looks like I am out of luck, and I do not have the time to become an active developer under an NDA with someone. But I do believe that putting BMS on GitHub would bring a fresh impulse of life here. But again, this is offtopic.
-
Yeah - using the TrackIR interface I have no issue with…I just don’t like the device itself. I think if BMS went open source it would descend into chaos, though…
It’s far easier to just blank the reticle from the screen and display it on a display on your head - I’m going to do that to implement a working HUD…and I’m going to knock out that silly nose model - I’ve never had a seat in a real jet where you could see the nose from the seat. Why build a fighter where the nose would obscure the view? I just have to come up with a way to align the reticle in my on-head display, which I don’t think will be that difficult using the existing interface. I just need to make my routine talk to BMS.
What I don’t know yet is if the reticle can be treated like an MFD or the HUD for display extraction…I may need to use something like YAME to do that.
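Whatever the extraction path turns out to be, the capture side itself is simple. Here is a minimal sketch, assuming the reticle is drawn in a known rectangle of the BMS window (all coordinates here are made-up placeholders, and the third-party `mss` screen-grab package is just one convenient way to do it - not anything BMS- or YAME-specific):

```python
def reticle_bbox(win_left, win_top, rx, ry, rw, rh):
    """Convert a reticle rectangle given relative to the BMS window into
    the absolute desktop bbox that mss expects."""
    return {"left": win_left + rx, "top": win_top + ry,
            "width": rw, "height": rh}

def grab_reticle(bbox):
    """Grab one frame of the reticle region; the raw pixels can then be
    pushed to the head-mounted display. Import kept local so the
    geometry helper above works without mss installed."""
    import mss  # pip install mss
    with mss.mss() as sct:
        return sct.grab(bbox)
```

Locating the reticle rectangle in the first place would follow the same process people already use to find and extract the HUD/MFD regions.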
-
So about this "add-on routine to interface with BMS" you want to write for your HMD input - how does that work? I am not aware that you can write inputs like that for BMS. Ergo, you will get a vector from your head orientation; how do you plan to give this vector to BMS to use as the in-game HMD vector?
-
Hi do you know whether the edtrackerpro will work with arma3?
Thanks.
Regards, Metalhead
-
Hi do you know whether the edtrackerpro will work with arma3?
Thanks.
Regards, Metalhead

It will work with any game/sim that will work with TrackIR.
-
So about this "add-on routine to interface with BMS" you want to write for your HMD input - how does that work? I am not aware that you can write inputs like that for BMS. Ergo, you will get a vector from your head orientation; how do you plan to give this vector to BMS to use as the in-game HMD vector?
So, what I would have to do is:
1) blank the HMCS reticle from the BMS OTW.
2) extract the BMS HMCS reticle and display it on a head/helmet mounted display over my eye.
3) extract my look angle from my head mounted device and feed that to BMS, presumably through the BMS TrackIR interface.
4) find a way to align that look angle to the BMS world - similar to boresighting a MAV. (This is how it's done in RL, BTW.)
5) implement #4 in such a manner that I can use the existing HMCS Align options on the DED to make it all work; i.e. I'll also have to read SM from BMS to drive my alignment.
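On the SM side, reading BMS shared memory from outside the sim is at least well-trodden ground. A minimal sketch of attaching and parsing - note the byte offsets below are placeholders I made up; the real layout is the FlightData struct in FlightData.h from the BMS tools, and must be checked against it before use:

```python
import mmap
import struct

# Hypothetical byte offsets -- verify against FlightData.h before use.
YAW_OFFSET = 0x0
PITCH_OFFSET = 0x4

def read_angles(buf, yaw_off=YAW_OFFSET, pitch_off=PITCH_OFFSET):
    """Unpack two little-endian floats (yaw, pitch) from a shared-memory
    snapshot. Works on any bytes-like object, so it is testable offline."""
    (yaw,) = struct.unpack_from("<f", buf, yaw_off)
    (pitch,) = struct.unpack_from("<f", buf, pitch_off)
    return yaw, pitch

def open_flight_data(size=4096):
    """Attach to the BMS flight-data shared memory (Windows only, with
    BMS running). "FalconSharedMemoryArea" is the long-standing tag name
    used by third-party tools."""
    return mmap.mmap(-1, size, "FalconSharedMemoryArea")
```

Writing back to SM - if and where BMS allows it - would use the same mapped buffer, which is exactly why the read/write question below matters.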
I seem to recall that we now have some limited ability to write to BMS SM…I may not be able to do any of the above without that ability.
-
“3) extract my look angle from my head mounted device and feed that to BMS, presumably through the BMS TrackIR interface.”
This point shares exactly my problem from the opening post. The moment you feed this data to the TrackIR interface, the whole game camera starts moving with it. You need to keep the camera fixed, just like I do.
-
That goes straight to my point about the requirement to do an alignment - you have to de-couple the reticle from the OTW and then blank it from the OTW. Then deselect TIR as your View control.
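The alignment itself amounts to measuring a constant offset between the HMD's frame and the sim's frame while the user stares at a known reference - the MAV-boresight idea from step 4. A minimal sketch, with all names and the flat yaw/pitch-offset model being my own assumptions (a real HMCS alignment would likely use a full rotation, and the DED Align pages on the BMS side):

```python
def compute_boresight_offset(hmd_samples, reference):
    """hmd_samples: (yaw, pitch) pairs recorded while the user holds
    their gaze on a known reference direction; reference: that
    direction's (yaw, pitch) in the sim frame. Averaging over a short
    hold-steady period smooths out sensor jitter."""
    n = len(hmd_samples)
    avg_yaw = sum(s[0] for s in hmd_samples) / n
    avg_pitch = sum(s[1] for s in hmd_samples) / n
    return (reference[0] - avg_yaw, reference[1] - avg_pitch)

def align(raw_yaw, raw_pitch, offset):
    """Apply the stored boresight offset to raw HMD angles."""
    return (raw_yaw + offset[0], raw_pitch + offset[1])
```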
Even though you tell BMS not to use the TIR input to move the OTW, that input data is still present, and the angles wrt the target can be calculated from it. It's this calculation that is needed to slave sensors and bug the target you're looking at. This is also where there may be a requirement to be able to write to SM, and not just read it, in order to do this.
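That "angles wrt target" calculation needs nothing from the camera at all - just the head angles and a target direction. A minimal sketch (axis convention and function names are my assumptions: aircraft body axes with x forward, y right, z down):

```python
import math

def los_vector(yaw_deg, pitch_deg):
    """Unit line-of-sight vector from head yaw/pitch in body axes
    (x forward, y right, z down)."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            -math.sin(pitch))

def angle_to_target(head_yaw, head_pitch, target_dir):
    """Angle in degrees between where the head is pointing and the
    target direction -- usable for a look-and-lock check without ever
    moving the rendered camera."""
    los = los_vector(head_yaw, head_pitch)
    dot = sum(a * b for a, b in zip(los, target_dir))
    mag = math.sqrt(sum(c * c for c in target_dir))
    # Clamp against floating-point overshoot before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))
```

If that angle falls under some capture threshold, the sensor-slave/target-bug request is what would have to be pushed to BMS - which is where SM write access comes in.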
-
Yes, I get all that. My point was that I do not see how to even start achieving it with some form of development. I searched for any BMS APIs or plug-in system to code this stuff against, but didn't find anything that looked useful as a base for "3rd party" development of such decoupling.
So how do we even start here?
-
The first thing to do is to blank the reticle from the OTW while still being able to extract it. People can already do this with the HUD, so it's just a matter of capturing the other graphic and recycling that code/process. Then do some experimenting with a second screen to see if the reticle, once decoupled from the OTW, still follows TIR inputs by default - that would certainly make life simpler, and could possibly eliminate the need for an actual alignment…but I strongly doubt it.
The trickier part is what can we do via SM - read/write vs read only. I’d start by looking over the SM map and then noodle things out…but as you’ve already guessed, it’s all going to be new ground and new learning. A lot of new learning.