New to Falcon BMS
-
…but everything is always based on the F-16 avionics.
In a future version, the F-18 will have its own dedicated (and realistic) FLCS. Still no dedicated avionics, though.
-
-
I'm getting 50 fps without changing anything in the graphics settings!! Is that normal???
i7-4790k @4.4ghz 1.19v | ASUS SABERTOOTH Z97 MARK 2 / USB 3.1 | 16GB HyperX Savage @2133mhz CL11 1.6V | EVGA GeForce GTX 970 SC ACX 2.0 4GB | Samsung 850 EVO 250GB | WD Caviar Black 1TB | WD Caviar Blue 500GB | Thermaltake Smart M850W Modular 80+ Bronze | MS Sidewinder Precision 2 | TrackIR 5 | LG IPS LED 23MP55HQ 23" - 1920x1080@60hz - Noganet Stormer NG-CP767
-
You have a very similar rig to mine. Check your Vsync settings and/or Nvidia settings. If you have DSR on, it may bog things down, idk.
You should easily run max settings and get well over 60 fps, maybe even 100+.
-
Everything is default in Nvidia; Vsync is on in game.
-
-
I never touch the Nvidia panel. Isn't there another way that doesn't involve lowering the graphics?
-
In-game, turn your Vsync off!!
C9
-
already done that, still 55 fps.
I run DCS at 180 fps at high settings so imagine my frustration here.
-
Driver issue? My rig: i7-4790K @ 4.0GHz | MSI Gaming Z97 Mobo | 16 GB Ripjaws ram | NVIDIA GTX 970 4GB | ACER H236HL 23" IPS Monitor
I just checked the fps on mine. I get over 100 fps in orbit (outside) view and just over 60 in the cockpit, which works out nicely with my monitor. But my game is modded a bit, so you might expect better results.
edit: FYI I have multisampling set to quality level 2, not 3, in the in-game settings
-
already done that, still 55 fps.
I run DCS at 180 fps at high settings so imagine my frustration here.
What resolution are you running at??
I doubt you're gonna achieve what DCS gets, lol. Falcon is ancient compared to DCS's graphics engine.
Have you messed around with the Shaders in Configuration?
C9
-
already done that, still 55 fps.
I run DCS at 180 fps at high settings so imagine my frustration here.
55 is more than needed. 180 or 600 fps, what's the point? More than 25 is good… more than 50 is perfect! So stop watching your FPS counter and enjoy the flight.
Start flying… you can come back to optimization later.
-
55 is more than needed.
Agree. Nothing magical is going to happen above 55 FPS.
I think it has something to do with the “Peter” Pan Syndrome……
C9
-
I get 22 max with my ancient rig. High-res airbases kill my system, but I am using the AS pit. I still have enjoyable flights. Dee-Jay is right: turn off the counter and have fun…
-
I tend to disagree with “anything over 25 is good”
It's good enough for visuals, but frame rate also determines game updates. (To be clear up front: this is an assumption based on my observations; I have NOT dug through the old code to verify it. Read on for the explanation.) Every frame is a cycle of the game loop, which means everything that happens in the 3D world, AI decisions, campaign, comm updates, weapons in the air, other entities out there, etc., gets updated every frame (or at specific intervals, depending on how it's coded, but my observations lean toward every frame for most things in the game; both cases are explained later). So it's not JUST about avoiding tearing, visual glitches, and lag; it's also about the underlying object management in the code. For instance:
Let's say the game is coded to update EVERYTHING every frame: an aircraft flying somewhere in 3D gets a position update every frame, so the game looks at the amount of time that has passed since the last frame, multiplies that by the velocity vector (direction and speed), and uses the product to update the position. Granted, 1/30th of a second (30 fps) vs 1/60th of a second (60 fps) isn't MUCH difference, but at roughly M1 that's about a 5m difference per frame. For two aircraft closing on each other for ACM at high speeds, that can be a difference of 10-20m EVERY FRAME. Over time, the object ends up in roughly the same 3D space/position because it's just math (speed * time elapsed). BUT, assuming the entire game loop executes every frame, the AI at 30 fps has half as many chances per second to execute a maneuver, react to the player, employ a weapon, fire flares, etc. That's a negative impact on realism, and it produces erratic AI behavior. Every decision the game makes is limited by the number of frames. This was very common in most games of any type until the mid-to-late 2000s because there just wasn't THAT much going on to really bog down a decent CPU. RAM and HDD space often ruled out huge environments, and GPUs couldn't keep up anyway, so the CPU never got pushed to its full potential. Therefore, every frame could run through the entire update process with minimal impact on performance.
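The per-frame update described above can be sketched roughly like this (a minimal illustration of the idea, not actual Falcon code; the function names and the Mach-1 figure are my assumptions):

```python
# Minimal sketch of a per-frame position update, as described above.
# Not actual Falcon code; names and the Mach-1 figure are assumptions.

MACH1_MPS = 340.0  # ~Mach 1 at sea level, in metres per second

def update_position(pos, velocity, dt):
    """Advance a position by velocity * time elapsed since the last frame."""
    return tuple(p + v * dt for p, v in zip(pos, velocity))

# Distance covered in a single frame at ~M1:
step_30 = MACH1_MPS / 30   # ~11.3 m per frame at 30 fps
step_60 = MACH1_MPS / 60   # ~5.7 m per frame at 60 fps
print(round(step_30 - step_60, 1))  # ~5.7 m: the "roughly 5m" difference
```

The object still ends up in the right place either way; what changes with fps is how many of these update cycles (and therefore AI decision points) happen per second.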
Most modern games now run updates based on elapsed time (combined with some more event-driven functionality) instead of every frame, with the exception of position updates, because position updates make pretty graphics, and pretty graphics make happy gamers. But drawing to a DX pipeline also uses a lot of CPU, much more so in older versions of DX because they cannot leverage multiple cores during the render stage. This is a big reason why most modern games sloughed off a lot of AI-type updates to only every few frames, or in some cases every few seconds. This practice wasn't widely adopted until the last 10 years or so, because better graphics, RAM, and storage allowed for larger, more dynamic environments. That means there is a pretty good chance the Falcon code still executes a full game loop cycle every frame (think original Falcon with not-so-great graphics; it wasn't all that intense on the CPU or GPU, so it was fine to execute like this when it was originally designed). That is why super beefy systems still get a ton of frame rate drop, regardless of GPU and VRAM, and despite having 16GB of system RAM and a huge CPU (watch it in Resource Monitor: only one core ever REALLY gets taxed… thanks a ton, old DirectX). In the same scenario as above, using a timed method, the position gets updated every frame using the time elapsed * vector speed equation, but all other functions are set on a timer. Every frame (game loop cycle), the code asks each function whether enough time has passed; if it has, the function executes, and if it hasn't, the loop moves on to the next function. This means one quick "if" test per function is required each frame, but it potentially saves a ton of CPU because the rest of the function may not need to be executed. Realistically speaking, AI only needs to make most decisions once every second, or maybe every 0.5 seconds for somewhat better or "smoother" combat.
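The timed-update scheme described here can be sketched like this (a hypothetical illustration of the technique, not BMS code; the 0.5s AI interval is just the example figure from the text):

```python
# Sketch of timer-based updates: every frame pays one cheap "if" test per
# subsystem, and the expensive work only runs when its interval has elapsed.
# Hypothetical illustration, not actual game code.

class TimedTask:
    def __init__(self, interval_s, fn):
        self.interval_s = interval_s  # seconds between executions
        self.fn = fn
        self.last_run = 0.0

    def maybe_run(self, now):
        if now - self.last_run >= self.interval_s:  # the quick "if" test
            self.fn()
            self.last_run = now

ai_decisions = []
ai = TimedTask(0.5, lambda: ai_decisions.append("decide"))

# Simulate one second at 64 fps (a power of two keeps the float sums exact):
dt = 1 / 64
now = 0.0
for frame in range(64):
    now += dt
    # ...position update would run here every frame...
    ai.maybe_run(now)

print(len(ai_decisions))  # 2 -> the AI decided twice, not 64 times
```

The frame rate no longer changes how often the AI thinks; it only changes how smoothly the positions between decisions are drawn.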
But does the AI need to examine the ACM environment 30-100x per second while in a turning fight? Not really. Real time is great, and with unlimited CPU power it would always be ideal, but for a "simulated realistic environment", 1x per second is generally acceptable and 2x per second is pretty good. The same concept applies to everything else in the game: do ground units need to check position and verify orders EVERY FRAME? No. In most cases, for this environment, 1x every 5-10 seconds would probably be more than acceptable. ATC making decisions for aircraft, the campaign engine updating unit orders, logistics, etc.: none of these need to be updated more than 1x every few seconds at most, or even minutes in some cases. I'm almost positive a lot of the campaign stuff is a little more time-elapsed based instead of per-frame based, or more accurately "game time based", which means a lot of it happens at certain times on the game clock instead of calculating how much time has passed, so some of this can be excluded from the FPS conversation. Some big exceptions would be launching intercepts or reacting to dynamic player-driven events happening in the campaign.
So, what does this all mean for Falcon? Or more importantly, what does a higher FPS mean for Falcon? Well, let's look at time-sensitive functions, such as CCRP bombing. There is a window built in to allow a release to happen; this MUST be the case because there is no way to ensure the game loop will update to a PERFECT position for the release characteristics to be met. Dropping a bomb at 350 knots means roughly 3m elapse every frame at 60 fps. If the window to allow a release is less than 3.3m (3m on the back side of a "perfect position" + 10% overhead), there is a 50% chance you would fly through the release window on every attempt and never be able to pickle a weapon, and that chance increases non-linearly as the window gets smaller (90% miss rate at <2m). At 20 fps, roughly 9m elapse every frame, which means the window now has to be around 10m to maximize release potential. The flip side of having large windows like that, however, is that the weapon will launch, fall (we're talking unguided), and impact on a path offset by up to the distance from the "perfect solution" at which the code saw all conditions met and dropped the weapon. This means with an 8m window (the ideal size for 25 fps), you could potentially drop a weapon up to 8m long of the target marked by the CCRP. So, the code has to be written to strike a balance: a window large enough that lower-FPS systems can still function, but small enough that the weapon can still "hit" the intended target.
If we accept that anything over 25 fps is "good enough", and the coders adhere to that, then we are saying EVERYTHING in the game can never be more than 25fps-accurate. In the same example as above, dropping a bomb at 350 knots, that means the bombs can never be more accurate than about a 7m radius, though randomness and chance can make them appear so. Fortunately for us, this works out in a sim because RL ordnance delivery tends to have a little wiggle room in accuracy. But at 900 knots of closure for two aircraft in a merge, that's a 15.4m change in position every frame at 30 fps. Most of the aircraft in the game are nowhere near 15m in length, which means the AI will either fly through the gun window 80% of the time, or we have to make the hit box extremely large to account for lower FPS; it's the same dance as before, trying to find a balance. Again, it's fortunate that in a game like this we can add a little wiggle room to when the AI decides to shoot, and it has the same effect as a pilot firing just a little early to hit a moving target, or a little early/late due to varying pilot skill. But in a PvP environment, this could really cause some issues. Thus, in MY OPINION only, higher frame rates DO make a significant difference in overall game play, but I agree that from a strictly visual perspective, 25 is adequate and 50 is not much different from 100.
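For what it's worth, the per-frame distances quoted in these examples check out, assuming the standard knots-to-m/s conversion:

```python
# Quick check of the per-frame travel distances used in the examples above.
KNOT_MPS = 0.514444  # standard conversion: knots -> metres per second

def metres_per_frame(speed_knots, fps):
    """Distance travelled between two consecutive frames."""
    return speed_knots * KNOT_MPS / fps

# CCRP example at 350 knots:
print(round(metres_per_frame(350, 60), 1))  # 3.0 m per frame at 60 fps
print(round(metres_per_frame(350, 20), 1))  # 9.0 m per frame at 20 fps
print(round(metres_per_frame(350, 25), 1))  # 7.2 m per frame at 25 fps

# Merge example: 900 knots of closure at 30 fps:
print(round(metres_per_frame(900, 30), 1))  # 15.4 m per frame
```

The required release or hit window therefore scales directly with speed and inversely with frame rate, which is the balance problem described above.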
-
Ohh boy! Here we go…
Wait, I'm grabbing the popcorn! [emoji897]
-
50-60 is more than enough for flight simulation; in fact, it's perfect. Beyond 60 it's just an ego thing, which I get, but you should never complain about 60 FPS. It's as good as gold!
-
In a future version, the F-18 will have its own dedicated (and realistic) FLCS. Still no dedicated avionics, though.
Do you know if this is going to be a dynamic selection based on DAT files, or hard-coded? The former would open the door to flight-computer models for all aircraft, while the latter just codes one specifically for the F-18.
-
Ohh boy! Here we go…
Wait, I'm grabbing the popcorn! [emoji897]
Why you gotta hate? Most people have no idea that FPS affects more than just the picture on the screen. Pardon me for trying to help people understand some of what happens beneath the hood.
-
Way back in the past, coders said the exact opposite; this was in the era before DX9.
They said anything above 30 caused more trouble.
The reasoning was something like: the code doesn't have time to make its calculations, because it runs at x speed and expects the results at x speed, while your system runs at 2x or 3x, and this causes trouble.
I'm not saying this is the case now, just how it was described back then.
Sent from my mi5 using Tapatalk