
Posted (edited)

Yeah, IL-2 wants:
#1 cache
#2 maximum single-threaded performance.

Like the majority of games, core count does next to nothing.

 

The 7800X3D will be the best IL-2 CPU when it's released.

 

I have a 5600X and it's fast enough for IL-2 on a flat screen, but a little too slow for VR.
A 5600X is about 100% faster than your current CPU:

[CPU benchmark comparison screenshot]

 

Edited by RossMarBow
WheelwrightPL
Posted
On 1/25/2023 at 9:54 PM, RossMarBow said:

Yeah, if you're CPU-limited, a faster GPU is actually going to slow down your FPS.

You are only limited by the CPU at low resolutions like 1080p, but when you run at 4K and at high image quality, you are limited by the GPU. This is a general rule that applies to most games, and I don't believe IL-2 is somehow an exception. Sure, it does some physics calculations, but it is not like BeamNG, which simulates highly detailed car physics 2000 times per second with 10 or more cars at the same time.
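That general rule can be sketched as a toy model (every number here is invented for illustration, nothing is measured from IL-2): a frame takes roughly as long as its slower stage, CPU cost is mostly resolution-independent, and GPU cost scales with pixel count.

```python
# Toy bottleneck model: hypothetical costs, not measured IL-2 data.
CPU_MS = 8.0             # assumed fixed per-frame CPU cost (ms)
GPU_MS_PER_MPIXEL = 2.0  # assumed GPU cost per million pixels (ms)

def bottleneck(width: int, height: int) -> str:
    """Return which side limits the frame rate at a given resolution."""
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1e6
    return "CPU" if CPU_MS >= gpu_ms else "GPU"

print(bottleneck(1920, 1080))  # CPU (GPU needs only ~4.1 ms vs 8 ms of CPU)
print(bottleneck(3840, 2160))  # GPU (~16.6 ms of GPU swamps the 8 ms CPU)
```

With these made-up numbers the crossover sits between 1080p and 4K, which is exactly the pattern the rule above describes; real games will have different constants.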

Posted (edited)
On 1/27/2023 at 4:47 PM, WheelwrightPL said:

You are only limited by the CPU at low resolutions like 1080p, but when you run at 4K and at high image quality, you are limited by the GPU. This is a general rule that applies to most games, and I don't believe IL-2 is somehow an exception. Sure, it does some physics calculations, but it is not like BeamNG, which simulates highly detailed car physics 2000 times per second with 10 or more cars at the same time.


[1080p benchmark graph showing negative scaling]

[1080p benchmark graph showing negative scaling]

 

Since you don't know:

IL-2 is one of the most CPU-demanding games.
If a game shows negative scaling in graphs like the ones above, IL-2 will negatively scale for sure.

Negative scaling from increasing resolution is not purely about GPU load; you are wrong there.
People have already done a lot of work on this.
Check out some of the material around the Vander bench.
The Vander bench is CPU-limited on all VR systems and most flat-screen systems.
It isn't that complex a scene either: two flights of planes and one train.
Also check out posts about VR performance.
I have seen the performance figures from two 4090 users and they are CPU-limited.
Or see some of my posts on performance.

SYN_Vander BENCHMARK v6 to measure IL-2 performance in monitor & VR - Hardware, Software and Controllers - IL-2 Sturmovik Forum (il2sturmovik.com)

 

There's also a spreadsheet on how much each graphics setting increases CPU load.

 

TL;DR
Increasing graphics resolution and detail also increases CPU load.

Picard is 100% CPU-limited for sure, and buying a faster GPU would be a bad idea.
Being CPU-limited is far worse than being GPU-limited: it feels much worse because input latency and smoothness are ruined.
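To sketch that with invented numbers (nothing here is a real IL-2 measurement): if a "graphics" setting also adds CPU work (draw calls, object updates), fps can drop even though the GPU never becomes the bottleneck.

```python
# Invented per-frame costs, just to illustrate the shape of the effect.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when a frame costs whichever stage is slower."""
    return 1000.0 / max(cpu_ms, gpu_ms)

base = fps(cpu_ms=10.0, gpu_ms=6.0)       # CPU-limited: 100 fps
detailed = fps(cpu_ms=12.5, gpu_ms=9.0)   # higher detail adds CPU work too
print(base, detailed)  # 100.0 80.0: fps falls while still CPU-limited
```

The GPU stays below the CPU cost in both cases, yet the frame rate still drops, which is the "negative scaling without a GPU bottleneck" pattern described above.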

 

Please miss me with generic statements about gaming performance you heard one time at the pub.
I actually know what I'm talking about.

Edited by RossMarBow
WheelwrightPL
Posted (edited)
4 hours ago, RossMarBow said:


[quoted 1080p benchmark graphs]

Why do you continue to only show 1080p graphs? They prove my point, not yours. You seem to be trolling.

Edited by WheelwrightPL
Posted (edited)
7 hours ago, RossMarBow said:

Vander bench is CPU limited on all VR systems and most flat screen systems

 

Just to be precise here:

 

SYN_Vander bench has 5 tests. Their purpose is to bottleneck the CPU (CPU test), the GPU (4K test) or both (VRTest1 bottlenecks mostly the CPU, VRTest2 both, VRTest3 the GPU).

7 hours ago, RossMarBow said:

Vanderbench isn't that complex of a scene; it's two flights of planes and one train.

There are a number of trucks & cars as well. Regarding planes, there are about 4-5 in one direction and about 6 in the other.

It was created by @SYN_Vander for this purpose, and I think it has served well for that so far.

7 hours ago, RossMarBow said:

Theirs also a spreadsheet on how much each graphics setting increases CPU load

I think here you refer to this one:

https://forum.il2sturmovik.com/topic/57586-performance-impact-of-every-graphics-option/

 

But be aware that this study was done before they changed the renderer (forward to deferred), the cloud tech and the sky tech. So things could be different now.

Edited by chiliwili69
Posted

We're not talking about a low-tier 12th or 13th gen CPU vs a high-tier 12th or 13th gen CPU... this is about a 4770 (a ~10-year-old piece of kit) vs pretty much any desktop 12th or 13th gen.

 

Even if we only focus on single thread CPU performance, that old CPU alone will be vastly outclassed by a low-mid/mid tier 12th or 13th gen one - just look at Passmark's single thread performance rating in that screenshot I posted further up.

 

And that's before we get to other factors like support for more modern OSes, DDR3 vs DDR4/5, L1/2/3 cache size, etc. Plus there might be some more "trivial" problems demanding other upgrades, like the existing system's PSU lacking certain new features, power, or simply the correct cables/plugs to supply more modern components.

 

I've always tried to keep my systems "afloat" for as long as I could by throwing in a more modern GPU - remember when that was a pretty cost-effective way of doing this? *sigh*. But even back then I would run into the point where upgrading the GPU wouldn't have made much of a difference, or certain basic components of the system were too far out of date and it was time to build a new rig. And these days, when a GPU upgrade isn't €/$200, 300 or 400 but more like 600 to 1000+, building a new system might actually be "cheaper" and more effective - if you happen to already have a modern and fast GPU at your disposal, like the OP does with his 3060 Ti.

 

Like I said already: I wouldn't dream of spending €900 on a 4070 Ti to pair it with an i7-4770. I'd rather spend that money to build a new system to put my existing 3060 Ti into.

 

 

 

S.

 

Posted (edited)
18 hours ago, 1Sascha said:

Like I said already: I wouldn't dream of spending €900 on a 4070 Ti to pair it with an i7-4770. I'd rather spend that money to build a new system to put my existing 3060 Ti into.
Yes.

TL;DR

The CPU is going to hold back the 3060 Ti a lot.
A faster GPU won't help at all.

 

 

Edited by RossMarBow
Posted

We will see. I'll buy one anyway when the price drops dramatically.

When the card is, for example, around 500 euro.

Or I may indeed build a complete new system and put my 3060 Ti in it.

Time will tell.

 

Robin

Posted
3 hours ago, RossMarBow said:

Yes

TL;DR

The CPU is going to be holding back the 3060 ti a lot.
A faster GPU won't help at all.

 

 

I agree. I owned a 3060 Ti before getting a 3090 Ti and I can say it is a very good card. I had paired it with an Intel i5-10600K CPU and had a very decent VR experience with my G2. Lowering CPU-demanding settings such as shadows, mirrors and distant landscape detail, with the Ultra graphics preset, medium clouds, 2x MSAA and 2660x2600 resolution, I had very acceptable VR play.

Posted (edited)

Just one more tidbit of useless info before I go:

I've already posted my 2070 S vs 3070 Time Spy results further up, but since I have these shots lying around anyway and I'd forgotten to also post the 1060 3GB results, here they are:

 

[Time Spy result screenshot]

^ i5-12600K + 1060 3GB

 

[Time Spy result screenshot]

^ i5-12600K + RTX 3070

 

The only difference in config between those two was the GPU (1060 3GB vs RTX 3070). I think it basically illustrates the same point of mixing old and new components but CPU and GPU are swapped in this example (current CPU, old GPU). Note that in this case, the age gap is "only" i5-12600K vs 1060 3GB and thus quite a lot smaller than 4070 Ti vs i7-4770.

 

 

S.

 

 

Edited by 1Sascha
Posted
On 1/29/2023 at 2:16 AM, RossMarBow said:


I actually know what I'm talking about.

Then why don't you talk about what you actually know?

Posted (edited)
17 hours ago, 1Sascha said:

The only difference in config between those two was the GPU (1060 3GB vs RTX 3070). I think it basically illustrates the same point of mixing old and new components, but CPU and GPU are swapped in this example (current CPU, old GPU).
Time Spy doesn't predict IL-2 performance very well.

A lower-end GPU is actually a better compromise than a lower-end CPU.

If you want to maximise fps/$ or stick to a budget, it's actually smartest to get a 13600K/5800X3D/7600X with a slower GPU

than to get a slower/older CPU with a faster GPU.
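To put rough, purely hypothetical numbers on that: if a frame costs whichever of the CPU or GPU stage is slower, the fast-CPU build comes out ahead in a CPU-heavy sim. The per-frame costs below are invented, not benchmarks of any real parts.

```python
# Hypothetical per-frame costs for two same-budget builds; not benchmarks.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when a frame costs whichever stage is slower."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu_slow_gpu = fps(cpu_ms=6.0, gpu_ms=9.0)   # ~111 fps, GPU-limited
slow_cpu_fast_gpu = fps(cpu_ms=14.0, gpu_ms=5.0)  # ~71 fps, CPU-limited
print(round(fast_cpu_slow_gpu), round(slow_cpu_fast_gpu))
```

The second build also suffers the latency/smoothness downside of being CPU-limited, on top of the lower average fps.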

Edited by RossMarBow
