SYN_Vander BENCHMARK v6 to measure IL-2 performance in monitor & VR


chiliwili69

dburne
Posted (edited)

From reports I am seeing (outside of this thread), 11th Gen Intel is not faring all that well currently.

 

I'm also not really sure that every fps counts in VR.

Bottom line: you are either going to be gaming with reprojection or without, depending on how much graphics detail you run or are willing to give up.

For flight sims I settle for reprojection forced on (in my case a Reverb G2 at 45 fps) so I can have high graphics detail, and the game looks very good.

So I am always at 45 fps.

 

I could lower graphics and try to maintain 90 fps without reprojection, but for me it just does not look that great if I do.

But really that is it: I shoot for either 45 fps or 90 fps. Anything else is inconsequential.
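The 45-or-90 choice above follows from how reprojection works: if the GPU cannot hold the headset's native refresh, reprojection locks presentation to half of it. A minimal sketch of that rule (function name and thresholds are illustrative, based on this post's Reverb G2 example):

```python
def target_fps(native_hz: float, sustainable_fps: float) -> float:
    """Pick a frame-rate target for a reprojecting headset.

    If the GPU can hold the native refresh, run without reprojection;
    otherwise reprojection locks presentation to half the refresh
    (e.g. 90 Hz -> 45 fps on a Reverb G2).
    """
    if sustainable_fps >= native_hz:
        return native_hz      # no reprojection needed
    return native_hz / 2      # forced reprojection at half rate

print(target_fps(90, 70))   # heavy flight-sim settings -> 45.0
print(target_fps(90, 95))   # light VR-native game -> 90
```

Anything the GPU delivers between those two targets is wasted, which is why intermediate fps values are "inconsequential" here.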

 

For other games that are made for VR, like Half-Life: Alyx or Medal of Honor: Above and Beyond, I shoot for no reprojection and can usually maintain it: either 90 fps on my Reverb G2, or even 120 fps (120 Hz) on my Index.

Edited by dburne

1 hour ago, chiliwili69 said:

 

As you wish (playing at 1440p I would not be too concerned whether you get 160 or 180 fps, but in VR with the latest headsets every fps counts).

 

In any case, I think the current SYN_VANDER benchmark is far from a dark art. All the tests we have done are pretty consistent with each other, and we have learned several important things from them:

- The Zen 3 CPUs were top performers, beating the previous Intel line. But Zen 2 was not good.

- The new Intel 11th gen is as great as Zen 3.

- Overclocking on the Intel line is important.

- RAM speed plays a factor.

- The 6700XT, 6800XT & 6900XT have an issue with IL-2, both on monitor and in VR.

 

At least it is something we can use to build up experience and gather IL-2-specific performance numbers, which are the ones that matter for our game (in my case the only one). Otherwise we are just lost in an abyss of obscurantism and subjectivity.

 

I didn't single out your benchmark; I was stating that any benchmarking of IL-2 is a dark art considering its age. At 2D 4K with most details maxed (other than pointless levels of MSAA) I get over 100 FPS average on my 3080 rig, and the 6700XT rig gets 75 FPS average. Both are more than enough to keep my 60 Hz FreeSync monitor inside its VRR range, and it is impossible to tell them apart while gaming. At lower resolutions most recent mid-tier and up GPUs give more than excellent FPS, and the difference between AMD and Nvidia tier for tier is ~20% at best; if we are arguing over 100 vs 120 FPS then it's kind of moot.

 

As for VR, dburne makes a very good point: if a G2 powered by a 3090 has to compromise on settings to stay out of reprojection in IL-2, then the problem is not the GPU.


RufusK
Posted (edited)
On 5/21/2021 at 7:45 PM, ICDP said:

For reference my 6700XT got: Avg 106, Min 89 and Max 147. I was also able to increase my CPU test score by 6% on the 3080 rig by overclocking. If this test were truly not GPU bound, these anomalies should not happen. When I use the 4K settings below, the performance delta between the 3080 and 6700XT PCs is right where it should be, with the 3080 roughly 40-45% ahead at 4K.

 

Humour me and do a test using the following settings to see where your 6900XT lies. If you don't have a 4K monitor, just enable AMD Virtual Super Resolution. Make sure to enable it along with max building draw distance as well. At least it will give you something to compare against.

 

[attached screenshot: 4K settings]

 

 

System 1:  RTX 3080 FE, 5900X CPU. 32GB DDR 3600 CL16.

System 2:  6700 XT, 5600X CPU, 32GB DDR 3400 CL14.

 

[attached screenshot: benchmark results]

 

 

I ran the benchmark with the settings suggested by @ICDP.

Frames: 5475 - Avg: 91.250 - Min: 80 - Max: 132
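These Fraps-style result lines are internally consistent: over a 60-second run, the reported average is just total frames divided by 60. A quick sanity check of that (the parser is a sketch written for the exact line format used in this thread):

```python
import re

def parse_result(line: str) -> dict:
    """Parse a 'Frames: N - Avg: X - Min: Y - Max: Z' benchmark line."""
    m = re.search(
        r"Frames:\s*(\d+).*?Avg:\s*([\d.]+).*?Min:\s*(\d+).*?Max:\s*(\d+)", line
    )
    frames, avg, lo, hi = m.groups()
    return {"frames": int(frames), "avg": float(avg),
            "min": int(lo), "max": int(hi)}

r = parse_result("Frames: 5475 - Avg: 91.250 - Min: 80 - Max: 132")
# For a 60 s run the average should equal frames / 60:
assert abs(r["frames"] / 60 - r["avg"]) < 0.01
print(r["avg"])  # 91.25
```

Handy when collating many posted results into one table, since a mistyped frame count or average shows up immediately.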

 

I have a 4K monitor (although I don't use it for the game as it has a low refresh rate). Could it be that MSAA, or at least MSAA x8, doesn't play well with the AMD 6000 series?

 

Edited by RufusK

chiliwili69
7 hours ago, RufusK said:

I ran the benchmark with the settings suggested by @ICDP.

Frames: 5475 - Avg: 91.250 - Min: 80 - Max: 132

 

I have a 4K monitor (although I don't use it for the game as it has a low refresh rate). Could it be that MSAA, or at least MSAA x8, doesn't play well with the AMD 6000 series?

 

This confirms that your system is being limited by the 6900XT issue (a general issue for the latest AMD cards with IL-2).

From the tests you have done:

 

1.- Using the OP CPU test settings (to load just the CPU): 1080p, Clouds Low, no MSAA:

Avg:100.3 Min:83 Max:143

 

2.- Using ICDP's suggested settings (to load both CPU and GPU): 4K, Clouds Extreme, no MSAA:

Avg:91.2 Min:80 Max:132

 

3.- Using the OP GPU test settings (to load just the GPU): 4K, Clouds Extreme, MSAA x8:

Avg: 77.1 Min:66 Max:90

 

The first one already shows the issue, since a 5800X should deliver around 120-130 fps in the CPU test, and 1080p is not a demanding load for a 6900XT.

In the OP CPU test, other people with a 5600X/5800X and a 1080 or 1080Ti were achieving 120-130 fps.

 

From there, the second and third tests just load the GPU more and more, delivering fewer and fewer fps. So the issue with AMD cards is not MSAA related.

 

 


Posted (edited)
8 hours ago, RufusK said:

 

I ran the benchmark with the settings suggested by @ICDP.

Frames: 5475 - Avg: 91.250 - Min: 80 - Max: 132

 

I have a 4K monitor (although I don't use it for the game as it has a low refresh rate). Could it be that MSAA, or at least MSAA x8, doesn't play well with the AMD 6000 series?

 

 

I assume these are at 4K?

 

As I stated a few posts up, there is clearly an issue on AMD in IL-2, and tier for tier Nvidia has a ~20% advantage. Your results are down ~10% on what I would expect, but if you want to go hunting for the cause, good luck :)

 

Your 6900XT is 10% down on minimums compared to my 3080 results, 15% down on average and 7% faster on max FPS.  Against the 6700XT you are 30% faster on minimums, 21% faster on average and 50% faster on max FPS.
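For transparency, the "X% down / X% faster" figures traded in this thread are simple ratios. A sketch reproducing two of them from the Avg/Min numbers quoted a few posts up (Avg 106, Min 89 for the reference rig); the function name is made up for illustration:

```python
def delta_pct(a: float, b: float) -> float:
    """Percent difference of a relative to b (positive means a is faster)."""
    return (a / b - 1) * 100

# RufusK's 6900XT run (Min 80, Avg 91.25) vs the quoted reference (Min 89, Avg 106):
print(round(delta_pct(80, 89), 1))      # -10.1 -> "10% down on minimums"
print(round(delta_pct(91.25, 106), 1))  # -13.9 -> "~15% down on average"
```

Worth keeping in mind that these ratios say nothing about *why* the gap exists; as noted later in the thread, the two systems also differ in CPU.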

 

IL-2 BoX is clearly one of those games where Nvidia GPUs get a bonus, but because it is an old engine, even a mid-tier AMD GPU delivers more than playable FPS anyway. I can't see AMD devoting resources to an 8-year-old sim engine, unfortunately. Maybe the devs will keep working at it.

Edited by ICDP

DBCOOPER011
On 5/21/2021 at 1:17 PM, RufusK said:

Because @chiliwili69 noted a possible improvement with AMD GPUs, I reran the benchmark with the RX 6900XT. My hardware is the same as the previous test, quoted below. I have updated drivers and the game is now version 4.601.

 

CPU test
Frames 6019 - Avg. 100.3 - Min. 83 - Max. 143 -> about a 5% improvement

 

GPU test
Frames 4560 - Avg. 77.1 - Min. 66 - Max. 90 -> very little change

 

 

 

 

 

I have the same motherboard and CPU as you (X570 TUF/5800X). You should be getting a lot better performance in the CPU test. I just ran the test and got the results below. I am using the curve optimizer function in the BIOS with a +100 auto overclock. It peaks at 4950 MHz and my cores are all set to a negative voltage curve at: 10,20,15,5,20,20,20,20. Yours will be different though.


A thread with a tool for testing your cores is linked below. I'm getting a little better performance with my RAM at 3800 MHz / 1900 MHz uncore, but I had to take two sticks of memory out to get it. The max I could get with four was 3733, and all are single-rank DDR4.

 

Frames: 7789 - Time: 60000ms - Avg: 129.817 - Min: 113 - Max: 172
Frames: 7562 - Time: 60000ms - Avg: 126.033 - Min: 109 - Max: 174
Frames: 7750 - Time: 60000ms - Avg: 129.167 - Min: 112 - Max: 173

 

https://www.overclock.net/threads/corecycler-tool-for-testing-curve-optimizer-settings.1777398/

[attached screenshot: RAM timings]


chiliwili69
23 hours ago, ICDP said:

Your 6900XT is 10% down on minimums compared to my 3080 results, 15% down on average and 7% faster on max FPS.  Against the 6700XT you are 30% faster on minimums, 21% faster on average and 50% faster on max FPS.

 

This comparison is a bit misleading, since RufusK is using a 5800X CPU (with the 6900XT) and your system 1 is using a 5900X CPU (with the 3080). And with your suggested settings, the CPU could also be acting as a bottleneck. For a fair comparison, the systems should be identical except for the GPU.

13 hours ago, DBCOOPER011 said:

You should be getting a lot better performance in the CPU test.

 

The problem here is not CPU related. It is the 6900XT with IL-2: it is somehow constraining the CPU test at 1080p. Definitely something to be investigated by the dev team.

It was reported in the pinned ATI Radeon thread in the Bug Report forum: https://forum.il2sturmovik.com/topic/61739-ati-radeon-troubles-after-4006-update/?do=findComment&comment=1070743

 


Jaws2002
20 hours ago, chiliwili69 said:

Definitely something to be investigated by the dev team.

Yeah. The problem is that every time I have asked the developers about the crippling bug in IL-2 with AMD cards, they never answer. They always skip the AMD questions, as if they are not even there.


Posted (edited)

Maybe they should do an Xbox Series X and PS5 version of the sim; that would force the devs to optimise for RDNA2.  :)

 

I know I keep repeating myself, but IL-2 is an older engine and it already works fine on any half-decent AMD GPU. At 1080p, anything from an RX 480 up will give playable framerates at medium/high settings. If a 10% margin in minimum FPS is make or break in IL-2 (or any game, really), then that person is better served by dropping graphical settings than by buying a new GPU.

Edited by ICDP

RufusK

Yes, I think that is right. A 5800X and 6900XT are more than capable of playing IL-2. It is certainly an interesting question why that combination doesn't perform as well at 1080p as a 5800X with an RTX 3080, and why the 6000 series doesn't perform as well at 4K as one would expect based on other games and benchmarks, but at a certain level those are academic questions. It doesn't affect my enjoyment of the game.

 

I would say, however, that if the 3080 had more reasonable availability, I might pick one up and get rid of the 6900. In this time of GPU shortages, it just happened that when I was building my system a 6900 was available at a price close to MSRP, so I bought it.


Ironically, I would be inclined the other way. The fact that I got my 3080 for MSRP while getting a 6800XT was impossible was why I bit the bullet on it. I had a 6800 (non-XT) back with OCUK for RMA at the time, so when the 3080 popped up I jumped at it, even though I knew the 10GB would be an issue in some scenarios. There is/was a thread on another forum where people claim 10GB is enough, and when I pointed out that it tanks a 3080 in DCS with the Normandy map, I was told "yeah, but that doesn't count because it's not a AAA game".

 

I have played a grand total of two games with RT, Cyberpunk and Watch Dogs: Legion, and I personally felt Cyberpunk was crap. So while RT and even DLSS are nice to have, they are not must-haves. On the other hand, the 3080 ran into massive VRAM issues in the DCS Normandy map on my Pimax 8KX.


Daisy_Blossom

I'm looking into upgrades for my now apparently aging system, and figured I'd benchmark before buying anything so there's more data available to everyone:

 

Motherboard: Gigabyte Z370 Aorus gaming 7
CPU:               8700K (delidded) 
CPU Freq:       5.1 Ghz
L3 cache:       12 MB
Cores:             6
Threads:         12
RAM type:      DDR4
RAM size:       32GB (4x8GB)
NB Freq:         4000 MHz
RAM Freq:      3200 MHz
RAM timings:  14-14-14-34-560
Ram type: G.Skill F4-3200C14D-16GTZR (x2)
GPU:      EVGA 1080ti SC gaming (11G-P4-6393-KR)
HMD: Valve Index
Cooling: Full custom water loop CPU + Mobo + GPU

 

GPU drivers: 466.47

IL2 Version 4.601, Benchmark V6

 

CPU Test 1080p:

Frames: 5790 - Time: 60000ms - Avg: 96.500 - Min: 80 - Max: 126

 

VR Test 1: (90 Hz @ 106% super-sampling)

Frames: 4378 - Time: 60000ms - Avg: 72.967 - Min: 52 - Max: 91
 

 

GPU Passmark (3d graphics mark): 17247


chiliwili69
11 hours ago, Daisy_Blossom said:

figured I'd benchmark before I buy anything so there's more data available to everyone

 

Thank you for your tests. For the upgrade, are you going to AMD or to Intel?


Daisy_Blossom
9 hours ago, chiliwili69 said:

 

Thank you for your tests. For the upgrade, are you going to AMD or to Intel?

 

AMD without a doubt. I built my current computer specifically for VR and this game. At that time Intel was the clear winner and I am super happy with how this rig has performed over the last 4ish years. The data is pretty clear at present: Ryzen + Nvidia is the best combo for this game in VR in 2021 (assuming you can find anything in stock). Who knows what will happen in another few months or years, though.


chiliwili69
14 hours ago, Daisy_Blossom said:

The data is pretty clear at present: Ryzen + Nvidia is the best combo for this game in VR in 2021

 

This was fully true until the new 11th gen of Intel CPUs appeared a couple of months ago. There are still only a few tests of the 11th gen, but they are also doing quite well in the VR tests.

 

So current AMD Zen 3 or Intel 11th gen are safe bets for IL-2, either on monitor or in VR.

In the GPU arena, for high resolutions or VR, it is much better to go with Nvidia.


WallterScott
On 6/1/2021 at 9:45 AM, chiliwili69 said:

 

This was fully true until the new 11th gen of CPUs appeared a couple of months ago.

Why, then, such poor results in the CPU test?


chiliwili69
7 hours ago, WallterScott said:

Why, then, such poor results in the CPU test?

 

The 5600X & 5800X are in the range of 115-130 fps in the CPU test.

 

For the 11th gen we only have a couple of tests, and they are around 117 fps. But in VR they perform as well as Zen 3.

 

We will need more tests of the 11th gen with higher RAM speeds to see its full potential.

 

But the 5900X & 5950X are definitely the best in the CPU tests.


Nadelbaum
Posted (edited)

Here's data from a bit older system:

 

Motherboard: ASUS Prime Z390-A
 CPU:                 i7-9700K
 CPU Freq:        4.9 GHz
 L3 cache:          12 MB

 Cores:               8
 Threads:           8
 RAM size:        32GB (4x8GB)
 RAM Freq:        3200 MHz
 NB Freq:          4300 MHz
 RAM timings:  16-18-18-38-560
 RAM type:      G.Skill RipjawsV DDR4 3200

 GPU:                 2070 Asus ROG Strix O8G Gaming

 GPU driver: 466.63
 Il-2 version: 4.602b

 

CPU test:

Frames: 5152 - Time: 60000ms - Avg: 85.867 - Min: 76 - Max: 114

 

GPU test:

Frames: 4104 - Time: 60000ms - Avg: 68.400 - Min: 52 - Max: 82

Edited by Nadelbaum
Corrected GPU driver version

Bird*dog

Motherboard: ASRock B450 Gaming-ITX/ac
CPU:               R7 3700X
CPU Freq:      4.3 Ghz (PBO Enabled)
L3 cache:       2x16 MB
Cores:             8
Threads:         16
RAM type:      DDR4
RAM size:       16GB (2x8GB)
Uncore Freq:  1600 MHz
RAM Freq:      3200 MHz
RAM timings:  16-18-18-18-36
Ram type: Patriot Viper Steel 8 GB DDR4-3200 CL16 (x2)
GPU:      EVGA 2070 Super 8 GB Black Gaming (08G-P4-3071-KR)

HMD:     HP Reverb G2
IL2 Version 4.602, Benchmark V6

 

CPU Test 1080p:

Frames: 4575 - Time: 60000ms - Avg: 76.250 - Min: 66 - Max: 103

 

VR Test 1: (90 Hz @ 50% super-sampling)

Frames: 3083 - Time: 60000ms - Avg: 51.383 - Min: 43 - Max: 64

 

GPU Passmark (3d graphics mark with OC): 21019


chiliwili69
22 hours ago, Nadelbaum said:

Here's data from a bit older system

 

Thank you for your tests. Everything is well aligned with the other tests.

11 hours ago, Bird*dog said:

Motherboard: ASRock B450 Gaming-ITX/ac
CPU:               R7 3700X

Thanks for your test. All numbers are as expected.

 

If one day you want to upgrade your PC, I would first try upgrading just the CPU to a 5600X, which is well in stock at normal prices. Since some months ago, the Ryzen 5xxx series is also supported on B450 motherboards. You may get a good fps increase wherever your 2070S is not limiting the fps.


Nadelbaum

Updated results with the new 3080Ti card installed; nothing else changed.
 

Motherboard: ASUS Prime Z390-A
 CPU:                 i7-9700K
 CPU Freq:        4.9 GHz
 L3 cache:          12 MB

 Cores:               8
 Threads:           8
 RAM size:        32GB (4x8GB)
 RAM Freq:        3200 MHz
 NB Freq:          4300 MHz
 RAM timings:  16-18-18-38-560
 RAM type:      G.Skill RipjawsV DDR4 3200

 GPU:                 3080Ti MSI Suprim X

 GPU driver: 466.63
 Il-2 version: 4.602b

 

CPU test:

Frames: 5152 - Time: 60000ms - Avg: 85.867 - Min: 76 - Max: 114

 

GPU test:

Frames: 7515 - Time: 60000ms - Avg: 125.250 - Min: 108 - Max: 166

 

 

Unfortunately my native-4K monitor supports only 120 Hz (with full color range), so I guess I'm monitor limited?


chiliwili69
13 hours ago, Nadelbaum said:

GPU:                 3080Ti MSI Suprim X

 

Thank you for this first test of a 3080Ti. That's a very nice boost. You achieved less than other 3080 cards because in this test you might be CPU limited.

 

If your monitor is 120 Hz, then anything above 120 fps is not displayed by the monitor.
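The "monitor limited" point above is just a clamp: on a fixed-refresh display, frames rendered above the panel rate are never shown. A trivial sketch, using the numbers from this exchange as illustration:

```python
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames actually shown on a fixed-refresh display (no VRR)."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(125.25, 120))   # 3080Ti GPU-test average, capped to 120
print(displayed_fps(85.867, 120))   # CPU-test average, under the cap: 85.867
```

With G-Sync/VRR the relationship inverts below the cap: the panel refreshes when each frame arrives instead of dropping or repeating frames.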


Nadelbaum
Posted (edited)

Thanks @chiliwili69. I'm not too tech savvy, so I don't really have a clue what limits what. In any case, what I do know is that the monitor is capable of 144 Hz (native 4K), but with limited color range. I'm sticking to 120 Hz to benefit from the full color range (and HDR). I also have G-Sync enabled in the Nvidia settings (V-sync off, and naturally the rest of the settings in IL-2 as in the benchmark instructions). I'm happy to run other tests if it helps, but would need specific instructions on what to try out.

 

In any case the next upgrade will be the CPU, but there is no plan for it yet. With the 3080Ti (and current CPU) I'm basically able to max out everything in IL-2 and get an average of 70-80 FPS in the scripted campaigns. I'm extremely happy. I don't know about MP on congested servers; that is still unknown territory for me.

Edited by Nadelbaum
Typos ...

chiliwili69
10 hours ago, Nadelbaum said:

I also have G-Sync enabled

 

With G-Sync you are fine then. Running at 120 Hz and 4K, your 3080Ti will do the job well (assuming no MSAA is used).

Depending on the scene and the settings you use, your CPU may bottleneck your system (i.e. less than 120 fps), but in that case G-Sync comes into action and the monitor simply waits to refresh until each frame is ready, so you should not notice it.

 

I don't know if your monitor supports other refresh rates (60 or 90 Hz); you can try them as well. Personally I don't notice any difference between 90, 120 and 144 Hz, but every human is different.


Nadelbaum
2 minutes ago, chiliwili69 said:

 

With G-Sync you are then fine. Running at 120Hz and 4K your 3080Ti will do the job well (assuming no MSAA is used)

Would you be so kind as to elaborate on what you mean by "no MSAA is used"? The FPS I mentioned is with MSAA x8. Previously with my 2070 I had to settle for FXAA x4, but now I can easily run MSAA x8.


chiliwili69
1 minute ago, Nadelbaum said:

Would you be so kind as to elaborate on what you mean by "no MSAA is used"? The FPS I mentioned is with MSAA x8. Previously with my 2070 I had to settle for FXAA x4, but now I can easily run MSAA x8

 

The SYN_VANDER GPU test at 4K tries to load the GPU as much as possible; that's why it uses MSAA x8 and Extreme clouds, with everything else relatively low or off except the preset.

The reason is to load just the GPU (and not the CPU) and have the GPU as the limiting factor throughout the test. But a problem appears with an unbalanced system (a top GPU like a 3090/3080Ti with a moderate-to-low CPU), because then the CPU constrains the test as well. But this is just for benchmarking purposes, to compare the performance of different GPUs in IL-2.

 

In real gameplay at 4K you will not need MSAA x8; probably just MSAA x2 or nothing.

MSAA loads the GPU considerably (look at this), but in your case, getting 70-80 fps in gameplay, your GPU load will be no more than about 60%, so I think you can use MSAA x2 or more with no problems.


Nadelbaum

It is clear now, many thanks for taking the time to clarify. Much appreciated! 

 

I also want to correct my previous comment: in normal simming I use Rowdyb00t's clouds, and in that case my clouds setting is in fact High instead of Extreme, as suggested for this particular mod. For the benchmark I naturally used Extreme, as instructed.


xeotion

Motherboard: Asus X570 Tuf Gaming Pro
CPU:               5800x
CPU Freq:       4.8 Ghz
L3 cache:       32 MB
Cores:             8
Threads:         16
RAM type:      DDR4
RAM size:       32GB (2x16GB)

Uncore Freq: 1900 Mhz
RAM Freq:      3800 MHz
RAM timings:  14-15-14-28-304-1T
Ram type: G.Skill F4-3600C14D-32GTZR
GPU:      EVGA RTX3080 FTW3 Ultra (10G-P5-3897-KR) 
Cooling: Noctua NH-D15

 

GPU drivers: 

IL2 Version 4.602b, Benchmark V6

 

CPU Test 1080p:

Frames: 8057 - Time: 60000ms - Avg: 134.283 - Min: 120 - Max: 185

 

GPU Test 4k:

Frames: 9231 - Time: 60000ms - Avg: 153.85 - Min: 118 - Max: 183

 

 

GPU Passmark (3d graphics mark): 31924

Edited by xeotion

DBCOOPER011
22 hours ago, xeotion said:

Motherboard: Asus X570 Tuf Gaming Pro
CPU:               5800x
CPU Freq:       4.8 Ghz
L3 cache:       32 MB
Cores:             8
Threads:         16
RAM type:      DDR4
RAM size:       32GB (2x16GB)

Uncore Freq: 1900 Mhz
RAM Freq:      3800 MHz
RAM timings:  14-15-14-28-304-1T
Ram type: G.Skill F4-3600C14D-32GTZR
GPU:      EVGA RTX3080 FTW3 Ultra (10G-P5-3897-KR) 
Cooling: Noctua NH-D15

 

GPU drivers: 

IL2 Version 4.602b, Benchmark V6

 

CPU Test 1080p:

Frames: 8057 - Time: 60000ms - Avg: 134.283 - Min: 120 - Max: 185

 

GPU Test 4k:

Frames: 9231 - Time: 60000ms - Avg: 153.85 - Min: 118 - Max: 183

 

 

GPU Passmark (3d graphics mark): 28117.9

That's a seriously nice score you have. I have almost the same setup as you, but with a 3090, and cannot reach what you got. What OC settings are you using for your 5800X/3080? Thanks.


xeotion
2 hours ago, DBCOOPER011 said:

That's a seriously nice score you have. I have almost the same setup as you but with a 3090, but cannot reach what you got. What OC settings are you using for your 5800X/3080? Thanks

I'm impressed with the results myself! Below is the summary of settings I'm running on my CPU and GPU.

 

5800X Settings:

  • PBO Enabled with mostly Auto selected (except below)
  • +200 MHz
  • Curve Optimizer: -15

 

3080 Settings:

  • Memory Clock: +650 MHz
  • GPU Clock: +125 MHz
  • Power Target: 113%
  • GPU Temp: 87C

 

Hopefully that's helpful! I have also spent a fair amount of time on my RAM settings which is almost certainly contributing. Below are my PassMark scores for another comparison:
[attached screenshot: PassMark scores]


DBCOOPER011
Posted (edited)

Thanks for the information, much appreciated! I just ran a PassMark run with my daily driver settings and got the results below. My GPU is a Gigabyte 3090 Vision (2-pin) with a 370 W power limit, and your GPU PassMark score just blows mine out of the park. It appears the power limit on mine is the limiting factor. I'm somewhat of an amateur memory overclocker; my RAM settings are below, the best I could get with four sticks. What settings are you using?


I'm less than a day away in the 3090 FTW3 Ultra queue with EVGA. This pretty much makes up my mind to get it and sell my Vision for what I bought it for...

 

5800X Settings:


+100 MHz (appears to be clock stretching above this)
Curve optimizer: 10,20,15,5,20,20,20

 

3090 Settings:


Memory clock: +300
GPU Clock +125
Power Target: 105%
GPU Temp: 60C

[attached screenshot: PassMark bench]

[attached screenshot: 3800 MHz RAM timings]

Edited by DBCOOPER011

WallterScott

I don't understand it. For some reason, the Benchmark6 mission does not run on version 4.602b.

Other missions are working. What could be the reason?


dburne
2 hours ago, WallterScott said:

I don't understand anything. For some reason, the Benchmark6 mission does not run on version 4.602 b

Other missions are working. What could be the reason?

 

Probably an update broke it.

 


Heya. I’ve skimmed through the thread. Really helpful info. 

I have a 3080TI arriving today for my Index.

 

(Upgrade from my 1080GTX)

 

I will do a bench to contribute some data. 
 

However, I’m still a little vague on what settings I should be using for actual playing.

 

High preset, 90 Hz, 106% or 216% SS? Motion smoothing off? What AA?

 

I only play multiplayer so spotting is important.

 

Any advice very welcome.  


chiliwili69
On 6/30/2021 at 5:54 PM, xeotion said:

Frames: 8057 - Time: 60000ms - Avg: 134.283 - Min: 120 - Max: 185

Thanks for this test. Your performance is in the league of the 5950X/5900X!

On 7/2/2021 at 4:48 PM, WallterScott said:

I don't understand anything. For some reason, the Benchmark6 mission does not run on version 4.602 b

 

I have just tested the benchmark and it runs OK on 4.602b. Try deleting the mission and downloading it again from the OP. Or just delete the .msnbin file.

On 7/3/2021 at 8:54 AM, Mewt said:

I have a 3080TI arriving today for my Index.

 

(Upgrade from my 1080GTX)

 

That's a nice upgrade. You can run the tests before and after the change.

On 7/3/2021 at 8:54 AM, Mewt said:

However, I’m still a little vague on what settings I should be using for actual playing.

 

High preset, 90 Hz, 106% or 216% SS? Motion smoothing off? What AA?

 

I only play multiplayer so spotting is important.

 

Well, this thread is not exactly for recommending the best settings for spotting or ID or whatever; it is just for measuring hardware/software.

 

There are some pinned threads in the VR section (Fenris' one is the way to go), but in your case I would offer a few suggestions:

 

Index: I always use the 80 Hz mode to have more headroom on both the CPU side and the GPU side. This is a very nice mode of the Index, and I don't notice the difference between 80 Hz and 90 Hz.

 

Since your GPU is powerful (and I suppose your CPU will also be good), you will not need Motion Smoothing, so better to leave it OFF.

 

For SS, your 3080Ti will handle 150% SS quite well; beyond that there is not much visual gain. But SS can negatively affect spotting, since it creates a softer dot.
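For context on what an SS percentage costs: SteamVR's resolution slider scales the total pixel count, so the per-axis resolution grows with the square root of the percentage. A rough sketch; the base resolution here is the Index panel (1440x1600 per eye), used purely for illustration, since SteamVR's actual 100% render target is larger to account for lens-distortion overhead:

```python
import math

def render_target(base_w: int, base_h: int, ss_percent: float):
    """Per-eye render target for a SteamVR supersampling percentage.

    The percentage scales total pixels, so each axis scales
    by sqrt(ss_percent / 100).
    """
    s = math.sqrt(ss_percent / 100)
    return round(base_w * s), round(base_h * s)

# Illustrative only, using the Index panel as the base:
print(render_target(1440, 1600, 100))  # (1440, 1600)
print(render_target(1440, 1600, 150))  # (1764, 1960)
```

So 150% is about 1.22x the linear resolution, not 1.5x, which is why the visual gain flattens out well before the GPU cost does.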

 

For AA, you can use MSAA x2, which I think is enough to smooth some edges.

 

The High preset is fine for me, with High shadows and High clouds. Mirrors OFF.

 

Some settings load the CPU and others load the GPU (clouds and AA).

 

I would recommend the fpsVR tool, which is perfect for visualizing frametimes while you play.


Thanks @chiliwili69


I’m familiar with those threads and fpsVR; just wanted your take on it. 
 

I’ll send some data. 
 

Cheers. 


RAAF492SQNOz_Steve
On 7/2/2021 at 1:29 PM, DBCOOPER011 said:

Thanks for the information, much appreciated! I just ran a PassMark run with my daily driver settings and got the results below. My GPU is a Gigabyte 3090 Vision (2-pin) with a 370 W power limit, and your GPU PassMark score just blows mine out of the park. It appears the power limit on mine is the limiting factor. I'm somewhat of an amateur memory overclocker; my RAM settings are below, the best I could get with four sticks. What settings are you using?


I'm less than a day away in the 3090 FTW3 Ultra queue with EVGA. This pretty much makes up my mind to get it and sell my Vision for what I bought it for...

 

5800X Settings:


+100 MHz (appears to be clock stretching above this)
Curve optimizer: 10,20,15,5,20,20,20

 

3090 Settings:


Memory clock: +300
GPU Clock +125
Power Target: 105%
GPU Temp: 60C

[attached screenshot: PassMark bench]

[attached screenshot: 3800 MHz RAM timings]

Be aware that if you are not running some of the 3D tests on a 4K monitor, your PassMark scores can be penalised (in my case, by 38% for one of the tests).

 

I wonder if that can skew the GPU 3D scores somewhat. My results with a 3K monitor are below. I am also running a power-limited MSI RTX 3090 GPU.

 

Great Memory Mark score you have got there, by the way. I have to run at a 2000 MHz clock to keep up!

 

For bench scores, if I had a 980 Pro SSD running at PCIe 4 I would really be cooking! It would not make any difference for games in the real world, though.

 

CPU undervolt of 15 for these tests.

 

[attached screenshot: PassMark results]


DBCOOPER011
Posted (edited)
13 hours ago, RAAF492SQNOz_Steve said:

Be aware that if you are not running some of the 3D tests on a 4K monitor, your PassMark scores can be penalised (in my case, by 38% for one of the tests).

 

I wonder if that can skew the GPU 3D scores somewhat. My results with a 3K monitor are below. I am also running a power-limited MSI RTX 3090 GPU.

 

Great Memory Mark score you have got there, by the way. I have to run at a 2000 MHz clock to keep up!

 

For bench scores, if I had a 980 Pro SSD running at PCIe 4 I would really be cooking! It would not make any difference for games in the real world, though.

 

CPU undervolt 15 for these tests.

 

[attached screenshot: PassMark results]

 

Thanks for the information; your setup looks very nicely tuned! I tried running my memory above 3800 MHz but kept getting WHEA errors. I don't think the memory controller on my 5800X can handle those higher frequencies.


I actually sold my 3090 Vision a few days back to an individual for a good price, and just received a 3090 FTW3 Ultra from EVGA yesterday. I've been tinkering with it for a bit and it seems to be a good card. I just set it up with my 4K monitor, ran PassMark, and got a slightly better score than I had with the 3090 Vision. I think I'm doing something wrong with PassMark, as I'm getting much better performance in 3DMark Time Spy/Port Royal but not so much in PassMark...

[attached screenshot: EVGA 3090 PassMark]

[attached screenshot: EVGA 3090 Port Royal]

Edited by DBCOOPER011

Voyager

Looking through the results people have posted, is there any difference between the 3080 Ti and the 3080? I notice the 3090 is up to 20% faster than the 3080, but the Ti numbers seem about the same as the plain 3080?

