New to Falcon BMS
-
I tend to disagree with “anything over 25 is good”
It’s good enough for visuals–but frame rate also determines game updates (To be clear up front: This is an assumption based on my observations, I have NOT dug through the old code to verify this, read on for explanation). Every frame is a cycle of the game loop, which means everything that happens in the 3D, AI decisions, campaign, comm updates, weapons in the air, other entities out there, etc… everything gets updated every frame (Or at specific intervals, depending on how it’s coded, but my observations lean toward every frame for most things in the game–both explained later). So it’s not JUST about not having any tearing or visual glitches and lag, it’s also about the underlying object management in the code. For instance:
Let’s say the game is coded to update EVERYTHING every frame: An aircraft flying somewhere in 3D gets a position update every frame, so it looks at the amount of time that has passed since the last frame, multiplies that by the velocity (speed and direction), and uses the product to update the position. Granted, 1/30th of a second (30 fps) vs 1/60th of a second (60 fps) isn’t MUCH difference, but at roughly M1 (~340 m/s) that’s about an 11m step per frame vs a 6m step, roughly a 5m difference per frame. For two aircraft closing on each other for ACM at high speeds, that can be a difference of 10-20m EVERY FRAME. Over time, the object ends up in roughly the same 3D space/position because it’s just math (speed * time elapsed). BUT, assuming the entire game loop executes every frame, this means the AI now has half as many chances to execute a maneuver, react to the player, employ a weapon, fire flares, etc… Negative impact on realism, and erratic AI behavior. Every decision the game makes is limited by the number of frames. This was very common in most games of any type until the mid-late 2000s, because there just wasn’t THAT much going on to really bog down a decent CPU. RAM and HDD space often prevented a lot of huge environments, and GPUs couldn’t keep up anyway, so the CPU never got pushed to its full potential. Therefore, every frame could run through the entire update process with minimal impact on performance.
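To make the arithmetic concrete, here is a minimal sketch of that per-frame position update. The names (GameObject, update) are illustrative assumptions, not anything from the actual BMS code.

```python
# Hypothetical per-frame position update: distance = speed * time elapsed.

class GameObject:
    def __init__(self, position, velocity):
        self.position = position  # metres along the flight path
        self.velocity = velocity  # metres per second

    def update(self, dt):
        # Time since the last frame, multiplied by velocity, moves the object.
        self.position += self.velocity * dt

# At roughly M1 (~340 m/s), compare the per-frame step at 30 vs 60 fps.
jet = GameObject(position=0.0, velocity=340.0)
step_30 = jet.velocity * (1 / 30)   # ~11.3 m covered per frame at 30 fps
step_60 = jet.velocity * (1 / 60)   # ~5.7 m covered per frame at 60 fps
print(round(step_30 - step_60, 1))  # ~5.7 m difference every frame
```

Either way the jet ends up in the same place over a full second; what changes is how coarse each individual step is.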
Most modern games now run updates based on elapsed time (combined with some more event-driven functionality) instead of every frame, with the exception of position updates; because position updates make pretty graphics, and pretty graphics make happy gamers. But drawing to a DX pipeline also uses a lot of CPU, much more so in older versions of DX because they cannot leverage multiple cores during the render stage. This is a big reason why most modern games sloughed off a lot of AI-type updates to only every few frames, or in some cases every few seconds. This practice wasn’t widely adopted until the last 10 years or so, because better graphics, RAM, and storage allowed for larger, more dynamic environments. That means there is a pretty good chance the Falcon code still executes a full game loop cycle every frame (think original Falcon with not-so-great graphics… not all that intense on the CPU or GPU, so it was fine to execute like this when it was originally designed), which is why super beefy systems still get a ton of frame rate drop regardless of GPU and VRAM, and despite having 16GB of system RAM available with a huge CPU (watch it in Resource Monitor: only 1 core ever REALLY gets taxed… thanks a ton, old DirectX). In the same scenario as above, using a timed method, the position gets updated every frame using the time elapsed * velocity equation, but all other functions are set on a timer. Every frame (game loop cycle), the code tells each function that needs to happen to check if enough time has passed; if it has, execute said function; if it hasn’t, move on to the next function. This means one quick “if” test for each function is required each frame, but it potentially saves a ton of CPU because the rest of the function may not need to be executed. Realistically speaking, AI only needs to make most decisions once every second, or maybe every 0.5 seconds for some better or “smoother” combat.
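The timer idea above can be sketched in a few lines. This is a toy model, not BMS code; the names (TimedTask, tick) are made up for illustration.

```python
# Elapsed-time scheduling: the function body only runs when its interval
# has passed, but the cheap "has enough time passed?" test runs every frame.

class TimedTask:
    def __init__(self, interval, fn):
        self.interval = interval  # seconds between executions
        self.elapsed = 0.0
        self.fn = fn

    def tick(self, dt):
        self.elapsed += dt
        if self.elapsed >= self.interval:   # the one quick "if" per frame
            self.elapsed -= self.interval
            self.fn()                       # the expensive work, run rarely

ai_calls = []
ai = TimedTask(interval=0.5, fn=lambda: ai_calls.append(1))

dt = 1 / 64                 # 64 fps frame time (power of two keeps sums exact)
for frame in range(128):    # simulate 2 seconds of game loop
    # position updates would still happen here every frame...
    ai.tick(dt)             # ...but the AI body only runs every 0.5 s

print(len(ai_calls))        # 4 AI updates in 2 seconds instead of 128
```

The per-frame cost drops from "full AI pass" to one comparison per task, which is exactly the saving described above.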
But does it need to examine the ACM environment 30-100x per second while in a turning fight? Not really. Real time is great, and with unlimited CPU power it would always be ideal, but for a “simulated realistic environment”, 1x per second is generally acceptable, and 2x per second is pretty good. The same concept applies to everything else in the game: do ground units need to check position and verify orders EVERY FRAME? No. In most cases, for this environment, 1x every 5-10 seconds would probably be more than acceptable. ATC making decisions for aircraft, the campaign engine updating unit orders, logistics, etc etc… none of which needs to be updated more than 1x every few seconds at the most, or even minutes in some cases. I’m almost positive a lot of the campaign stuff is a little more time-elapsed based instead of per-frame based, or more accurately “game time based”, which means a lot of it happens at certain times on the game clock instead of by calculating how much time has passed, so some of this can be extracted from the FPS conversation. Some big exceptions to this would be launching intercepts or reacting to dynamic, player-driven events happening in the campaign.
So, what does this all mean for Falcon? Or more importantly, what does a higher FPS mean for Falcon? Well, let’s look at time-sensitive functions, such as CCRP bombing. There is a window built in to allow a release to happen; this MUST be the case because there is no way to ensure the game loop will update to a PERFECT position for the release characteristics to be met. Dropping a bomb at 350 knots means roughly 3m elapse every frame at 60fps. If the window to allow a release to occur is less than 3.3m (3 on the back side of a “perfect position” + 10% overhead), there is a 50% chance you would fly through the release window on every attempt and never be able to pickle a weapon, a chance which increases non-linearly as that window gets smaller (90% miss rate at <2m). At 20fps, roughly 9m elapse every frame. This means the window now has to be around 10m to maximize release potential. The flip side to having large windows like that, however, is that the weapon will launch, fall (we’re talking unguided), and impact on a path offset by up to the distance from the “perfect solution” at which the code saw all conditions being met and dropped the weapon. This means with an 8m window (ideal size for 25fps), you could potentially drop a weapon up to 8m long of the target marked by the CCRP. So, the code has to be written so as to find a balance: a window large enough that lower-FPS systems can still function, but small enough that the weapon can still “hit” the intended target.
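The numbers in that example fall straight out of speed divided by frame rate. A quick sketch (using the post’s own figures; the function names are just for illustration):

```python
# How far the jet travels between two consecutive frames, and whether a
# given CCRP release window can ever guarantee a release at that rate.

KNOTS_TO_MS = 0.514444  # 1 knot = 0.514444 m/s

def metres_per_frame(speed_knots, fps):
    """Distance covered between two consecutive frames."""
    return speed_knots * KNOTS_TO_MS / fps

def can_guarantee_release(window_m, speed_knots, fps):
    # If the window is narrower than one frame's travel, the jet can step
    # straight over the release point without ever being inside the window.
    return window_m >= metres_per_frame(speed_knots, fps)

print(round(metres_per_frame(350, 60), 1))   # ~3.0 m per frame at 60 fps
print(round(metres_per_frame(350, 20), 1))   # ~9.0 m per frame at 20 fps
print(can_guarantee_release(3.3, 350, 60))   # True
print(can_guarantee_release(3.3, 350, 20))   # False: needs ~a 10 m window
```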
If we concede that anything over 25fps is “good enough”, and the coders adhere to that, then we are saying EVERYTHING in the game can never be more than 25fps accurate. In the same example from above, dropping a bomb at 350 knots, that means the bombs can never be more accurate than about a 7m radius, though randomness and chance can make it appear so. Fortunately for us, this works out in a sim because RL ordnance delivery tends to have a little wiggle room in accuracy. But at 900 knots of closure speed for 2 aircraft in a merge, that’s a 15.4m change in position every frame at 30fps. Most of the aircraft in the game are nowhere near 15m in length, which means the AI will either fly through the window for the gun 80% of the time, or we have to make the hit box extremely large to account for lower FPS; the same dance as before, trying to find a balance. Again, it’s fortunate for us that in a game like this we can add a little wiggle room for when the AI decides to shoot, and it has the same effect as a pilot firing just a little early to hit a moving target, or a little early/late due to varying pilot skill. But in a PvP environment, this could really cause some issues. Thus, in MY OPINION only, higher frame rates DO make a significant difference in overall game play, but I agree with you that from a strictly visual perspective, 25 is adequate and 50 is not much different than 100.
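The same arithmetic applied to the guns example, assuming 900 knots of closure and a typical fighter length of roughly 15 m (both figures from the post):

```python
# Per-frame change in separation for two jets merging at 900 knots closure.

KNOTS_TO_MS = 0.514444
closure_ms = 900 * KNOTS_TO_MS     # ~463 m/s of closure

for fps in (60, 30, 25):
    step = closure_ms / fps        # metres the geometry changes per frame
    print(fps, round(step, 1))
# 60 fps -> ~7.7 m, 30 fps -> ~15.4 m, 25 fps -> ~18.5 m per frame
```

At 25-30 fps the geometry jumps by more than a whole aircraft length between frames, which is why the firing window (or hit box) has to be padded to compensate.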
-
Ohh boy! Here we go…
Wait, I’m grabbing the popcorn!
-
50-60 is more than enough for flight simulation; in fact, it’s perfect. Beyond 60 it’s just an ego thing, which I get, but you should never complain about 60 FPS. It’s as good as gold!
-
In a future version, the F-18 will have its own dedicated (and realistic) FLCS. Still no dedicated avionics though.
Do you know if this is going to be a dynamic selection based on DAT files, or hard-coded? The former would open the door to flight computer models for all aircraft; the latter means code specific to the F-18.
-
Ohh boy! Here we go…
Wait, I’m grabbing the popcorn!
Why you gotta hate? Most people have no idea that FPS affects more than just the picture on the screen. Pardon me for trying to help people understand some of what happens beneath the hood.
-
Waaaay back, the coders said the exact opposite. This was before the DX9 era.
They said anything above 30 caused more trouble.
The reasoning was that the code doesn’t have time to make the calculations: it runs at x speed and awaits the results at x speed, while your system runs at 2x or 3x, and this causes trouble.
I’m not saying this is the case now, just how it was explained back then.
sent from my mi5 using Tapatalk
-
Waaaay back, the coders said the exact opposite. This was before the DX9 era.
They said anything above 30 caused more trouble.
The reasoning was that the code doesn’t have time to make the calculations: it runs at x speed and awaits the results at x speed, while your system runs at 2x or 3x, and this causes trouble.
I’m not saying this is the case now, just how it was explained back then.
I thought this was a reference to the accelerated game speed? Not native FPS.
-
Nope… This also. BUT this was back then. Maybe now it’s not the case.
-
Nope… This also. BUT this was back then. Maybe now it’s not the case.
This doesn’t make any sense to me then. You’re saying someone told you that any frame rate over 30 fps negatively impacted the game’s ability to function? Your frame rate can never exceed the rate at which the CPU completes the loop. If it gets better than 30 fps, that means it completed all the required calculations and exited to the draw call at the end of the loop. There is some multi-threading built in, but unless it was very poorly implemented, it would still need to report complete before you could exit back to the main loop and call the draw; the number of times the draw is called is FPS. Maybe something was lost in translation somewhere, but that seems not quite right to me.
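To illustrate the point about the draw call bounding FPS, here is a toy loop, purely my own sketch and not how Falcon is actually structured: the draw happens once per iteration, so the frame rate can never exceed what the update work allows.

```python
# Toy game loop: FPS is simply how many times per second the loop
# (updates + draw) manages to complete.
import time

def run_loop(duration_s, update_cost_s):
    """Simulate a loop whose update step burns a fixed amount of wall time."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        time.sleep(update_cost_s)  # stand-in for AI/physics/campaign updates
        frames += 1                # draw call at the end of every iteration
    return frames / duration_s     # frames drawn per second

# With ~20 ms of update work per frame, FPS tops out near 50, never above.
print(run_loop(0.5, 0.020) <= 50.0)
```

If the loop finishes faster than some target rate, that just means all the work got done with time to spare; a higher FPS can’t starve the calculations, because the draw only happens after they complete.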
-
Ho Hum Dee Dum……:D
C9
-
I agree completely, C9. Mortesil, I don’t believe multithreading was implemented in Falcon before the DX9 era.
-
I agree completely, C9. Mortesil, I don’t believe multithreading was implemented in Falcon before the DX9 era.
I don’t believe the two have anything to do with each other. DX did not support multi-threading prior to DX12. So if the two were added concurrently, they have nothing to do with each other. But I have read that multi-threading has been incorporated since the beginning, in some way or another.
EDIT: I almost forgot to add my reference… silly me: https://msdn.microsoft.com/en-us/library/windows/desktop/dn859354(v=vs.85).aspx
Straight from the horse’s mouth.
-
Do you know if this is going to be a dynamic selection based on DAT files, or hard-coded? The former would open the door to flight computer models for all aircraft; the latter means code specific to the F-18.
Performance data are in .dat files … the FLCS is hardcoded.
-
Performance data are in .dat files … the FLCS is hardcoded.
I know that’s how it is now, but that doesn’t mean it couldn’t change in the future. Sad to hear this though, so much potential is wasted when things get hard coded.
-
I tend to disagree with “anything over 25 is good”
It’s good enough for visuals–but frame rate also determines game updates (To be clear up front: This is an assumption based on my observations, I have NOT dug through the old code to verify this, read on for explanation). Every frame is a cycle of the game loop, which means everything that happens in the 3D, AI decisions, campaign, comm updates, weapons in the air, other entities out there, etc… everything gets updated every frame (Or at specific intervals, depending on how it’s coded, but my observations lean toward every frame for most things in the game–both explained later). So it’s not JUST about not having any tearing or visual glitches and lag, it’s also about the underlying object management in the code.
Very true. And it is well known in BMS that when FPS becomes critical, one might have issues in the flight dynamics calculation, leading to loss of control of the a/c. But 100 FPS is not needed.
I have run BMS since 2009 with an average frame rate between 25 FPS (Campaign + TGP + weather + FLIR) and 60 FPS (clean TE) … I never had an issue because of it.
-
Mortesil, I never said DX has anything to do with multithreading. When I say “before the DX9 era”, I’m talking about Falcon’s pre-DX9 era, thus before 4.32.
-
Very true. And it is well known in BMS that when FPS becomes critical, one might have issues in the flight dynamics calculation, leading to loss of control of the a/c. But 100 FPS is not needed.
I have run BMS since 2009 with an average frame rate between 25 FPS (Campaign + TGP + weather + FLIR) and 60 FPS (clean TE) … I never had an issue because of it.
I wasn’t trying to imply that low FPS would cause issues per se. I think I really just meant to get people to realize that FPS is more than just how fast the images get drawn. It’s way more than just a good GFX card with a lot of video RAM. The examples about CCRP were just to illustrate where issues could occur that you wouldn’t normally associate with something like FPS.
-
What is the main factor for FPS in BMS 4.32/4.33? High CPU clock speed on a single core? Increased number of cores? VRAM speed? GPU speed?
My old Q9550 with 8GB DDR2 and a GTX 760 2GB runs BMS 4.32 (under XP) with FPS ranging from 25-50 over FLOT during the first day of the Rolling Fire campaign.
(1920x1200). Using TGP will really cause the FPS to dip, but even without TGP, at times I find myself below 20 FPS. Any idea what the main driver is, given that my rig can still run newer games quite well in 1080p? Of course, none of those have such old internals as Falcon 4.0, and none of them mimic a full-scale war either.
-
The extra 120 pixel lines matter more than the 1080.
Adding pixel lines on the Y axis is a killer for GPUs.
Also your CPU. I had a similar one, and when I went to a newer 2-core it was better.
Edit: Oh, and your DDR2 RAM.
Also your HDDs are presumably on SATA2. So all of those hold it back.
With the same VGA, when I went from the Q6600 to the 2-core and now a 4-core system, things were better.
The 2-core was on the same mobo.