Intel i7 14700K + nvidia 4070ti super 4K?
-
@m1tp2king The i9-14900K's single-core speed is only about 7% faster than the i7-13700K's, it is a heat- and power-hungry monster, you will never use the majority of those cores, and you won't be able to upgrade the CPU in the future without a new motherboard, so it's not even future-proofing. So yes, IMHO it seems like massive overspending for BMS at 4K. Also, 96 GB of RAM seems like an even worse waste of money for BMS, since it does nothing to help you at 4K and BMS can't use that much anyway. I'm wondering why you didn't pick the 4090, with double the video memory of the 4070 Ti, which is the one thing that might help you a bit more at 4K. Or for VR. Unless you are using this configuration for other things like AI or image processing, it's way over the top; you could spend much less money for pretty similar results in BMS. A better pick might be an i7-13700K, 64 GB of RAM and a 4070 Ti. Or wait until Q3 or Q4 for the LGA 1851 socket and have CPU upgrade possibilities on that board in the future. Please correct me if I am wrong; I am considering upgrading for BMS as well.
-
The other thing to think about is the eventual 4.38 release. Like others have mentioned, I would redirect some of the RAM spend into the GPU, e.g. a 4080 Super.
-
@Icarus said in Intel i7 14700K + nvidia 4070ti super 4K?:
single-core speed is only about 7% faster than the i7-13700K's,
Is it even just about the core speed? I presume you mean "clock" speed? Doesn't IPC (instructions per cycle) also have a factor to do with it? It's been a few years and I could be mis-remembering, but the battle between AMD and Intel had AMD down on clock speed but up on IPC, and thus able to do productivity tasks faster than an Intel chip with a higher clock speed. Does that mean AMD can do calculations faster, and thus an AMD CPU would be less of a potential bottleneck in a system build?
-
@Atlas The i9-14900K is only about 7% faster than the i7-13700K. Intel made few gains in the 14000 series over the 13000 series, likely due to heat, and the i9-14900K gets pretty damn hot. The 14000 series is the end of the line for that architecture. I don't follow the AMD stuff, but yes, AMD mounted a big challenge to Intel.
-
@Icarus so you're telling me to avoid the i9 and go with the i7 and an NVIDIA 4080 Super.
That works perfectly fine for me. Looking to spend around £2800. Is that a good price for the UK?
-
@Icarus
I think this was around the 12000 series, where AMD was a clear winner in productivity tasks. I haven't kept up with the "battle" since then. But my question is: does IPC matter, and at which point does clock speed negate any IPC advantage?
@m1tp2king said in Intel i7 14700K + nvidia 4070ti super 4K?:
@Icarus so you're telling me to avoid the i9 and go with the i7 and an NVIDIA 4080 Super.
I don't think he's telling you one thing or another, just saying that you probably won't see a difference between the two in terms of performance. A simpler way of putting it is that a single-core CPU running at, say, 4GHz will be out-performed by a quad-core CPU running at 4GHz, simply because the PC can use the other cores for background processes. However, this does not mean that a 2GHz quad-core will out-perform the 4GHz single-core, and it also does not mean that an 8-core 4GHz CPU will perform 2x better than a 4-core 4GHz CPU.
In most cases, single-core speed is still important because not many programs are multi-threaded, and even the ones that are aren't necessarily multi-threaded properly. So although tasks can be assigned to other cores, if one core struggles with its task, the other cores will still end up waiting on it before any more computing can be done.
Then there’s the issue of power consumption and heat — where you might need a really beefy cooler which may not work if you’re looking at a small form factor build… etc…
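The "cores waiting on the slowest task" point above is basically Amdahl's law: speedup is capped by the fraction of work that can't be parallelised. A minimal sketch (the 50% parallel fraction is an illustrative assumption, not a BMS measurement):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup when only part of a workload can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If only half of each frame's work is multi-threaded, extra cores
# stop helping very quickly:
print(round(amdahl_speedup(0.5, 4), 2))   # 1.6
print(round(amdahl_speedup(0.5, 8), 2))   # 1.78
print(round(amdahl_speedup(0.5, 16), 2))  # 1.88
```

Which is why doubling core count on a mostly single-threaded sim buys almost nothing, while single-core speed still moves the needle.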
That works perfectly fine for me. Looking to spend around £2800. Is that a good price for the uk?
Depends on what you’re getting for that price. Again, what’s up with the 96GB RAM?? Was that a typo?
A 4080 Super is around £1,100, an i7 14700K is £400, 64GB DDR5 5600 is around £220, so where is that other £1,100 going?
-
@Atlas they're making 48 GB sticks now
-
@airtex2019
Still, what is up with that? Are programs really using that much RAM nowadays? Since when was 32GB or even 64GB not enough?
As for his build, bumping up from 64GB to 96GB would be another £70-£100, so that still leaves £1,000 unaccounted for. Unless it's a full system including monitors and such?
-
@Atlas no, it's pretty crazy for an individual desktop system, unless you're doing video editing, big-data analysis, training ML models, that sort of thing.
But people doing those things know exactly how much RAM they need (answer: as much as they can get!) and won't ever ask.
-
Capacity of that RAM means nothing without a speed rating. What is the CAS latency and what are the memory timings?
Best RAM is the fastest, lowest latency sticks you can afford in the capacity you require which are also present on the Memory QVL (Qualified Vendors List) of the motherboard you are planning to use them in.
I can't imagine many workflows that would make use of 96GB of memory (it hasn't been stated above why the RAM is so high), but speed should never be sacrificed for higher-capacity memory.
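CAS latency and transfer rate combine into an absolute latency figure: roughly 2000 × CL / (transfer rate in MT/s) nanoseconds. A quick sketch (the kits below are just common example ratings):

```python
def true_latency_ns(cas_latency: int, transfer_rate_mts: int) -> float:
    """Absolute first-word latency in nanoseconds: 2000 * CL / MT/s."""
    return 2000 * cas_latency / transfer_rate_mts

# DDR4-3200 CL16 and DDR5-6000 CL30 both land at 10 ns --
# a higher-clocked kit is not automatically lower latency.
print(true_latency_ns(16, 3200))  # 10.0
print(true_latency_ns(30, 6000))  # 10.0
print(true_latency_ns(36, 5600))  # ~12.9
```

This is why a kit's speed rating and its CL have to be read together, as the post above says.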
-
I don't bother counting FPS, but I'm running a 13600K and an NVIDIA 2060 and it's butter smooth at 4K in every campaign I've run thus far. You may be engaging in overkill! As is 96GB of RAM. I run 32GB on this machine and have yet to encounter any issues with… well… anything really, and it's my work machine…
-
@m1tp2king I’m pointing you in a direction for you to do your homework before you buy and see if you find the same.
-
@Atlas If IPC is high due to multiple cores and virtual cores, it won't help with most games or sims. It obviously doesn't do much for the i9-14900k if it's only around 7% faster than the i7-13700k.
-
@airtex2019 said in Intel i7 14700K + nvidia 4070ti super 4K?:
but people doing those things, know exactly how much ram they need (answer: as much as they can get!) and won’t ever ask.
Or would say so right off the bat. Or would say nothing, as they would know that at that size it's irrelevant.
Besides, for those of us in the know, we can all confirm we can just download more RAM!!
@SemlerPDX said in Intel i7 14700K + nvidia 4070ti super 4K?:
Capacity of that RAM means nothing without a speed rating. What is the CAS latency and what are the memory timings?
Even so, you quickly approach a point of diminishing returns. I’ve always gone with the idea of getting mid-range speeds or slightly above, but bleeding-edge speed of RAM does not justify the extra cost and that money can be better spent elsewhere for more tangible results.
@Icarus said in Intel i7 14700K + nvidia 4070ti super 4K?:
@Atlas If IPC is high due to multiple cores and virtual cores, it won't help with most games or sims. It obviously doesn't do much for the i9-14900k if it's only around 7% faster than the i7-13700k.
Not sure what you mean?? Again, you are comparing Intel vs. Intel, plus I am not sure about the IPC difference between 14 vs 13. I was comparing AMD vs Intel with AMD having higher IPC while Intel having higher clock speeds, yet AMD was trading blows with Intel on gaming while absolutely dominating Intel on productivity. This was before those X3D chips.
-
@Atlas I was comparing Intel to Intel, since I was comparing the i7-13700k and the i9-14900k. All I am saying is that the difference in gaming/simming between those two chips is so minimal it's not really worth the much bigger price, and it's not a necessary upgrade for 4K gaming/simming. I am not talking about AMD at all.
-
@Icarus
Yes, of course, I realise that. I was just adding that clock speed is not the be-all and end-all, as AMD had shown that lower clock speed but higher IPC beats Intel's higher clock speed but lower IPC in productivity tasks. Now, if we think of simulation calculations as "productivity tasks," there could be a case made that higher IPC is also a factor to look at.
-
@Atlas Agreed, but between the two chips I was comparing there was little difference in gaming/sim benchmarks, so in that case neither clock speed nor IPC made any difference, except a savings of about $250 and lower electricity bills.
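The IPC vs clock trade-off discussed above reduces to: single-thread throughput ≈ IPC × clock. A toy comparison (the IPC and clock figures are made up purely to illustrate, not taken from any real CPU):

```python
def single_thread_perf(ipc: float, clock_ghz: float) -> float:
    """Rough single-thread throughput in billions of instructions per second."""
    return ipc * clock_ghz

# A hypothetical 5.8 GHz chip with IPC 4 vs a 5.0 GHz chip with IPC 5:
print(single_thread_perf(4.0, 5.8))  # 23.2
print(single_thread_perf(5.0, 5.0))  # 25.0 -- lower clock can still win on IPC
```

So neither number alone decides the bottleneck question; only their product (and real benchmarks) does.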
-
As usual chiming in a bit late to the party, but:
First let's ask what this system is going to do: what is its supposed workload going to be like?
- 96GB of RAM is needed only if you're going to run lots of VMs at the same time, or work on huge datasets for tasks like AI training, video editing, or maybe CAD modeling.
- A 14th Gen i7 is not a bad choice, but again, it's good at tasks that can be heavily parallelized, like code compilation or 3D renders using engines that do tiled rendering.
From a purely BMS + typical home/office tasks point of view, I think the best option is the AMD R7 7800X3D. 32GB of low-latency DDR5-6000 RAM should be enough, though if you want to be on the safe side, 64GB is not a bad option either. For the GPU, if you really need something strong right now, then the RTX 4080 Super is the way to go, or, if you're brave, the RX 7900 XTX. But if it's just for flat-panel 4K, then I'd get something like a second-hand 3080 or RX 6800, preferably sub-$400. It should handle BMS at that resolution just fine. Once the RTX 50x0 and RX 8x00 are out in late 2024/early 2025, you can reassess your options. IMHO, from a BMS perspective, the RTX 40x0 and RX 7x00 series of GPUs are in a bit of an awkward position: a bit overkill for flat-screen 4K, but not fast enough to comfortably handle 4.38 VR in every situation.
-
@Xeno said in Intel i7 14700K + nvidia 4070ti super 4K?:
IMHO, from a BMS perspective, the RTX 40x0 and RX 7x00 series of GPUs are in a bit of an awkward position: a bit overkill for flat-screen 4K, but not fast enough to comfortably handle 4.38 VR in every situation.
Not even 4090? Is this possible?
-
@dumba
Yep, the 4090 is definitely the top dog here, but its ~$2k price doesn't seem worth it over the $1-1.2k for a 4080 Super.
As noted at the end of my post, I'm not a big advocate of buying current-gen GPUs right now.
From what it seems, the next gen is going to bring a substantial uplift in both raw performance and FPS/$. By my rough calculation from I-Hawk's posts about the expected 4.38 hardware requirements, you'd need something 15-20% stronger than a 4090 to run future BMS in VR maxed out at a solid 60fps or better. It might be just me, but if I were going to spend two grand on a GPU, I'd want it to handle everything I can throw at it with ease for the next few years.