4.36 might very well answer the question, even from Microprose’s perspective.
Explain better pls
…@ TheFalcon: Be an actor of your dreams!
I would be more interested in a terrain mesh that is less angular, and in photorealistic terrain textures; BMS really needs those.
I would like to add something important: it is true that the terms “modules” and “parts of a system” exist in software…
Guys … DX11 just came in and is not fully exploited yet, and you are suggesting a Vulkan implementation?! Are you nuts?!
Let the developers utilize DX11 first. And correct me if I’m wrong, but IIRC DX12 and Vulkan are quite similar APIs, so rewriting the GFX engine for either DX12 or Vulkan should take a similar amount of work. I’d opt for Vulkan because it would help run BMS on non-Windows systems.
Vulkan being available on Windows 7 and 8/8.1 is another bonus.
But are Vulkan and DX free? I mean, why would they need to be debugged? Are there no guides for integrating them?
I don’t know much about these things; I don’t even know how an engine like Unity works.
….you start suffering from floating point jitter the further you move away from the origin…
But then how did they manage this in FS2020 and its predecessors?
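As an aside, the jitter quoted above is easy to demonstrate, and large-world engines commonly mitigate it with a “floating origin”: keep world positions in double precision on the CPU and hand the GPU only small, camera-relative offsets. A minimal NumPy sketch (the numbers below are illustrative, not taken from any particular sim):

```python
import numpy as np

# Spacing between adjacent representable float32 values at a given magnitude.
# Near the origin, positions are sub-millimetre accurate; hundreds of km out,
# the grid coarsens to centimetres, which shows up as visible vertex jitter.
for metres in (1.0, 1_000.0, 100_000.0, 1_000_000.0):
    print(f"{metres:>12,.0f} m -> step {np.spacing(np.float32(metres)):.6f} m")

# The "floating origin" mitigation: world positions stay in float64 on the
# CPU; the GPU only ever sees camera-relative offsets, which remain small.
camera = np.float64(823_000.0)          # camera far from the world origin
vertex = np.float64(823_001.25)         # a vertex near the camera
relative = np.float32(vertex - camera)  # small value: full float32 precision
print(relative)  # 1.25
```

At one million metres from the origin, adjacent float32 positions are 6.25 cm apart, which is more than enough to make geometry visibly wobble.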
If we could use a Vulkan version of BMS on Linux, assuming it is compatible with all my hardware (joysticks, VR headsets, …), then to be ready for the future I would create a Linux partition just for BMS.
…Part of an engine can be swapped out and improved on to make it better…
This is what I meant: taking the parts we need and putting them into BMS.
Do you think it is possible to port BMS from DX11 to DX12 or to Vulkan?
The performance overhead is typically not that high, but there is an increased requirement for video RAM to store the additional textures to be loaded, depending on the PBR implementation. Engines such as Unity and Unreal Engine support various metallic/roughness formats where you would typically have your diffuse/albedo textures, your normal map textures, and then separate texture sets for metallic and roughness. All these textures need to be loaded into video memory and take up storage space, so some simulation developers try to pack as much information as possible into the smallest file footprint to save space. X-Plane, for instance, embeds the PBR texture info into the alpha channel of the normal maps. It is a crude solution that works to some degree, but it does not provide a 1:1 representation between the development software and the sim. Each texture set has two textures for X-Plane, and a third if you have an emissive set.
DCS embeds metallic/roughness data into a single texture but within the individual RGB channels to separate the info. DCS thus uses three texture files for each texture set.
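For anyone curious what that channel-packing trick looks like in practice, here is a minimal NumPy sketch of the general idea (the array contents are hypothetical stand-ins, not actual DCS data or its exact channel layout):

```python
import numpy as np

# Hypothetical 4x4 grayscale maps (values 0-255), stand-ins for real textures.
metallic  = np.full((4, 4), 255, dtype=np.uint8)  # fully metallic
roughness = np.full((4, 4), 128, dtype=np.uint8)  # medium roughness
occlusion = np.full((4, 4), 200, dtype=np.uint8)  # ambient occlusion

# Pack three single-channel maps into one RGB image: one file on disk,
# one texture fetch in the shader, channels split out again in the material.
packed = np.stack([metallic, roughness, occlusion], axis=-1)  # shape (4, 4, 3)

# The shader-side unpack is just channel selection:
m, r, ao = packed[..., 0], packed[..., 1], packed[..., 2]
```

The point is that the three maps are each single-channel, so they can share one RGB file instead of occupying three.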
The VRAM requirement depends not only on the number of additional files but also on their resolution. A high-definition aircraft could have multiple 4K texture sets to allow a uniform texel density over the entire object - and that is only the exterior. Internally you may need the same or even more, depending on the level of detail you are going for.
If I had to guess, adding PBR could add an additional 30% - 50% VRAM requirement over a standard albedo/diffuse & normal map texture set. This is a very rough guesstimate that will depend entirely on the PBR implementation. On a simulation that is mainly CPU dependent the performance impact could be negligible. If you are running an old video card that is already struggling with low video memory then the impact could be noticeable, but I have no idea what the current BMS underlying system looks like so I can’t really comment on that.
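The upper end of that 30-50% range is easy to sanity-check with back-of-envelope arithmetic: one extra packed texture per two-texture set is +50%, while an alpha-channel-embedding scheme like X-Plane’s adds roughly nothing, hence the range. A rough sketch, assuming uncompressed RGBA8 with a full mip chain (real engines use block compression, which shrinks the absolute numbers but not the ratio):

```python
# Rough VRAM cost of one 4K texture, uncompressed RGBA8 with a full mip chain.
def texture_mib(side=4096, bytes_per_pixel=4, mips=True):
    base = side * side * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if mips else 1) / 2**20

classic = 2 * texture_mib()  # albedo/diffuse + normal map
pbr     = 3 * texture_mib()  # + one packed metallic/roughness texture

print(f"classic set: {classic:.0f} MiB")                       # 171 MiB
print(f"PBR set:     {pbr:.0f} MiB (+{pbr / classic - 1:.0%})")  # 256 MiB (+50%)
```

Multiply that by however many 4K sets a high-detail aircraft carries and the extra VRAM adds up quickly.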
Do you think the BMS developers are able to implement PBR completely? Is it not possible to recreate BMS in the Unity or Unreal graphics engine?
I think VR support would be nice for….VR users. A large portion of us currently flying with TrackIR or other head tracking solutions will most likely keep flying the way we always have. With DX11 support I am more excited about the prospects (I am hoping) of having physically based rendering (PBR). It will require an updated renderer, but PBR together with a better lighting system will do a lot to breathe new life into Falcon.
But there is still a lot to do - I browsed through the tactical reference section again tonight and there are a lot of 3D models in need of a refresh. It would be great if members of the community could assist with modding and creating new 3D models. I know it is possible, but without an SDK or published guides it is hard to say what exactly is possible.
PBR looks good, but how exactly does it work? Specifically, how would it impact FPS in BMS?
V-sync OFF*. It would defeat the point … triple-buffering is all about not tearing, while not blocking on the v-blank signal from the monitor.
(*If you have Nvidia graphics, the control panel offers a setting for v-sync called ‘Fast’ which is their implementation of triple-buffering. I really like v-sync=Fast, (a) because it works in all games, and (b) it also steps out of the way when running in windowed/borderless mode which is essentially triple-buffered always courtesy of how the DWM works.)
So, to recap… two ways to enjoy triple-buffering:
1. Nvidia control panel: v-sync=off; Falcon in-game: v-sync=off, triple=ON
…then be sure to run in fullscreen-exclusive mode; borderless/windowed will work OK but with extra latency.
2. Nvidia control panel: v-sync=Fast; Falcon in-game: v-sync=off, triple=off
…then run in windowed, borderless, or fullscreen, as you prefer, with no significant difference in latency.
If my monitor is 60 Hz and BMS drops below 60 FPS with v-sync off, what changes between having TB (or Nvidia Fast v-sync) on or off?
This depends on your screen. If you mimic the human FOV on a “classic” screen, you will get distortion. To me, that is not the way to go.
The FOV is dictated by the screen (size and resolution) you are using … otherwise you either won’t be able to read the MFDs or will get a kind of distorted view.
In practice it would be a deep image of the cockpit, with the instruments too far away to be read.
Frame-rate limiter is something I forgot to mention … without one, your CPU and GPU will run very hot with triple-buffering enabled! Though no different than running v-sync=off (tearing).
For BMS with triple-buffering, I’m finding I like to limit max fps to around 1.3x - 1.5x my monitor’s refresh rate. So eg. for 60hz monitor, set max frame rate = 80 or 90.
The theory: you want to avoid capping fps close to your refresh rate, or any exact multiple of it … i.e. at 60fps the triple-buffer would rarely have any benefit, and at 120fps you’d begin dropping all the odd frames and showing only the even ones.
But by keeping the CPU/GPU producing frames about 33% faster than they get pulled from the triple-buffer, it keeps the TB populated with fresh, recent frames, helps smooth out any frametime spikes/hiccups, and only drops around 25% of the rendered frames so your PC doesn’t heat up your whole room… lol
Also, when setting a frame-rate limit, it may be useful to set Low Latency Mode=on (this used to be a setting called Max pre-rendered frames=1) to avoid building up a queue between the CPU and GPU (if your CPU is faster than your GPU).
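The 1.3x - 1.5x rule of thumb above can be captured in a tiny helper; this is just the post’s guidance written out as code, with a hypothetical function name and a 1.35 midpoint factor I picked for illustration:

```python
# Suggested frame-rate cap for triple-buffering, per the rule of thumb above:
# stay roughly 30-50% above the refresh rate, and away from exact multiples.
def suggest_cap(refresh_hz, factor=1.35):
    cap = round(refresh_hz * factor)
    # Nudge off exact multiples of the refresh rate (60, 120, ... at 60 Hz),
    # where triple-buffering either does nothing or drops every other frame.
    if cap % refresh_hz == 0:
        cap += refresh_hz // 10
    return cap

print(suggest_cap(60))   # 81  -> matches the "80 or 90" suggestion above
print(suggest_cap(144))  # 194
```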
Let me get this straight: should v-sync be on or off?
At the risk of turning this into yet another “what’s the correct FOV?” thread… the various FOV calculators reckon 59 degrees for my screen and measurements (distance from screen). Which is bonkers. I can only see the HUD at that FOV.
It’s simple: the human eye’s FOV is 130-135° vertically and 200-220° horizontally. To know whether what we have in BMS is correct, take a photo inside the cockpit covering 135° vertically and 220° horizontally, then compare that photo with the BMS cockpit and you will know if the FOV is correct.
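For reference, the geometric FOV those calculators compute is just the angle the screen subtends at your eye. A short sketch (the screen width and viewing distance below are assumed example numbers for a typical desk setup, not anyone’s actual measurements):

```python
import math

# Geometric horizontal FOV subtended by a screen at a given viewing distance:
#   fov = 2 * atan((screen_width / 2) / distance)
def geometric_fov_deg(screen_width_cm, distance_cm):
    return math.degrees(2 * math.atan(screen_width_cm / (2 * distance_cm)))

# Assumed example: a screen ~60 cm wide viewed from ~53 cm away.
print(f"{geometric_fov_deg(60, 53):.0f} deg")  # 59 deg
```

So a figure like 59° is geometrically correct for a monitor at arm’s length; it just feels “zoomed in” because the screen covers so little of the eye’s 200°+ natural field of view.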
And for a rig with ample perf (over say 90fps) and a 60hz fixed-refresh monitor, I recommend you try Triple Buffering – with a frame-rate cap in the NVidia or AMD control panel, set to about 30% over your refresh rate. (So, for a 60hz monitor, cap at about 80 or 85 fps.)
Why?
This is probably a daft question. If so, my apologies.
I run 4.35 on Linux and on Windows. On both, I can get 90-100 FPS @ 3440x1440@60Hz, but I tend to use v-sync to minimise tearing, which limits me to 60 FPS.
I don’t observe any performance difference between v-sync on or off, unless I put the FPS counter on.
But what I do observe is that my perception of speed is significantly lower than what I see in many people’s published videos (regardless of whether I’m at 60 or 100 FPS). And that’s the bit I don’t understand. I had assumed that FPS was everything, i.e. the lower the FPS, the lower that feeling of speed. But that doesn’t seem to be the case. Or maybe comparing my actual output with the YouTube version of someone’s flight just isn’t a fair comparison?
FPS changes the sense of fluidity, not the perception of speed, unless it drops below a certain value. Above a certain value there is no noticeable difference, e.g. above 60 FPS. The sense of fluidity also depends on the refresh rate of the monitor, in your case 60 Hz; I think it is pointless to go beyond 60 FPS with your monitor because it would not be able to show the extra frames.
You can try this: record a video and upload it to YouTube, then compare it with actually flying.
@rubbra:
what is the “correct” FOV
Now that’s a good question.
Not sure what you mean… many sims have VR, and APIs exist for VR implementation. Anything DCS can do, we can do… it just takes us more time because it’s not our day job and we have far fewer working hands.
Good news then: it is possible! I can’t wait to see BMS in 3D with VR.
@jhook:
Now, half dome projection is something entirely different. Doable with current technology. Expensive (5 projectors @ 4K). Far more realistic IMO. Once the assets are created and implemented, this option would create the most realistic pit build. VR and pit builds would not be a great (realistic) option IMO.
Do you think it is possible to build a small full dome? I’ve always imagined a small, narrow dome that doesn’t take up much space and even extends a little below the cockpit. To make it like that, though, maybe it can’t be completely spherical; I don’t know if that’s possible. The only flaw is that there would be no 3D effect, and the cockpit wouldn’t be lit the same way it is inside the sim (maybe, I’m not sure). Then there’s performance: are we sure it wouldn’t be the same as or worse than VR? Having to render a scene with a half-sphere FOV doesn’t seem like a small thing to me.
Forget VR, that is yesterday’s news
https://varjo.com/products/xr-3/
https://varjo.com/blog/case-dassault-aviation-developing-immersive-pilot-training/
What was that? We can’t even get a decent experience in VR and they are already thinking about creating a more immersive one?
I used to get sick in the golden days of computer games (Doom 1, Wolfenstein, etc) and resorted to taking Ginger pills before I would play. I did ONE fishing excursion my whole life around that time, went from perfectly calm to 10’ seas as soon as we reached our spot, so bad it took a 1/2 hour to tack in the last 200 cyds, spent the entire trip unable to move from the bench in the crew quarters (other than crawling to the rail more times than I can count), calling myself a pussy and trying to get up. Totally incapacitated… Found out 20 years later that instructing someone new (and not smooth with inputs) on a race track gives similar results… so VR is just not in the cards for me. Just bought 3 32" Samsung G5s, done…
I think VR could be a cure for seasickness sufferers. With VR you can train until you stop feeling nauseous.
Good to know; what were your 3D glasses and 3D monitor?