chiliwili69

Intel 9th-generation CPUs and Nvidia RTX 20xx GPUs: will they increase VR performance?

Recommended Posts

From now until Christmas we will be exposed to many reports, YouTuber reviews and statistics.

 

But will these new CPUs and GPUs help increase the VR performance of IL-2 in our Rifts, Vives and WMRs? (Don't expect any big bump in VR headsets for PCs over the next year.)

 

9th gen CPUs:

Don't be impressed by the number of cores of these new CPUs. They could help in other games/apps, but the reality is that IL-2 is very, very dependent on the single-thread performance of the CPU (this is because IL-2 uses 4 threads, but one of them is the heavy one and bottlenecks the whole game).

So, when looking for a CPU upgrade, look at the CPU's single-thread performance, which is reported in all CPU benchmarks. This single-thread performance also depends directly on the frequency at which you run the CPU (more GHz, more instructions per unit of time), so it is important to get a CPU which is able to run at a high frequency.
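If you want to see this on your own rig, here is a minimal sketch of mine (assuming Python 3 with the third-party psutil package installed; the sample interval and duration are arbitrary) that prints per-core load while you fly, so you can watch one core sit near 100% while the others stay light:

# Rough per-core load logger. Run it while flying IL-2 and watch for one
# core pinned near 100% while the rest stay low (the single-thread bottleneck).
# Note: Windows may migrate the heavy thread between cores, so watch the
# overall pattern rather than one fixed core number.
import psutil

SAMPLE_SECONDS = 1.0   # arbitrary sample interval
SAMPLES = 60           # roughly one minute of logging

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=SAMPLE_SECONDS, percpu=True)
    print("per-core load %:", per_core, "| hottest core:", max(per_core))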

According to the tests we have done with IL-2 in VR, the i7-8700K and i5-8600K at 5.0 or 5.1 GHz deliver the best performance:

https://docs.google.com/spreadsheets/d/1gJmnz_nVxI6_dG_UYNCCpZVK2-f8NBy-y1gia77Hu_k

 

This new 9th gen is not going to go much further in CPU frequency; 5.0, 5.1 or 5.2 GHz will most likely be the frequencies at which you will be able to run those new CPUs with a good cooler. (Of course extreme overclockers can go to extreme frequencies, but we are talking about what the rest of us will do with those chips.)

 

20XX series GPUs:

Anyone with a 1080Ti can check that the GPU load is always below 70% when playing IL-2 in VR with a Rift and SS at 200%. So it is never bottlenecking IL-2 in VR.

This means that if tomorrow (or in September) I buy a 2080Ti card, my new GPU load might sit at 40% or 50%, but my fps will remain the same.

These new GPUs could be good for monitor users (2K, 4K, double or triple monitors, 144 Hz, etc.), where the GPU could be the bottleneck.

So, before upgrading your GPU for better VR performance, be sure it is the bottleneck of your system.
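If you are not sure, one quick way to check (a sketch only, assuming an Nvidia card with the standard nvidia-smi tool on your PATH and Python 3) is to log GPU utilization once per second while you fly; if it never gets close to 100%, a faster GPU will not give you more fps:

# Log GPU load once per second via nvidia-smi while IL-2 is running.
# If utilization never approaches 100%, the GPU is not the bottleneck.
# (Assumes a single GPU; with more than one, parse each output line separately.)
import subprocess
import time

for _ in range(60):  # about one minute of samples
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    gpu_util, vram_used = [v.strip() for v in out.stdout.strip().split(",")]
    print(f"GPU load: {gpu_util}%   VRAM used: {vram_used} MiB")
    time.sleep(1)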

 

Most of us (VR flyers) have been tweaking our settings to achieve the best quality/performance experience, whether flying single-player or online.

We will optimistically tend to believe that these new CPUs/GPUs will help us either increase performance or raise the settings for better visual quality.

Overall, we have a vague and subjective idea of how every setting affects performance, and a better (but still vague) idea of how they affect visual quality.

 

Despite the many tests we have done (like how horizon draw distance influences performance: https://forum.il2sturmovik.com/topic/35266-horizon-draw-distance-performance-hit-40km-100km-150km/ ), I still don't know exactly how each settings option loads the CPU and the GPU.

 

For example, the 4K textures option: does it load the CPU or the GPU, or both? And by how much?

And so on for every single option of the settings.

 

A question for Jason:

Jason and the dev team could tell us (they know the IL-2 VR code) whether each option loads the CPU or the GPU. Of course, we could infer that by doing many tests for every option, but it would be a big effort.

Also, Jason could consider including a short (1 or 2 minute) performance flight inside IL-2 (independent of BOS, BOM, BOK or BOBP ownership) just to measure the performance of that flight in VR and on a monitor. Then I would no longer need to maintain the performance test, and it would greatly simplify (no need to use FRAPS) the testing and tweaking done by the dev team and the community, allowing everyone to gather more objective data and share results.
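Until something like that exists, here is a minimal sketch of the kind of number-crunching I mean (my own example, nothing official: it assumes a CSV log with a header row and one per-frame time in milliseconds in the second column; if your capture tool logs cumulative timestamps instead, take the differences between consecutive rows first):

# Summarize a frame-time log into average fps and 1% lows.
# Assumes a CSV like "frametimes.csv" with a header row and the per-frame
# time in milliseconds in the second column (adjust for your capture tool).
import csv

def summarize(path="frametimes.csv"):
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)                                   # skip the header row
        times_ms = [float(r[1]) for r in rows if r]
    times_ms.sort()                                  # slowest frames at the end
    avg_fps = 1000.0 / (sum(times_ms) / len(times_ms))
    worst = times_ms[int(len(times_ms) * 0.99):]     # slowest 1% of frames
    low_fps = 1000.0 / (sum(worst) / len(worst))
    print(f"average fps: {avg_fps:.1f}   1% low fps: {low_fps:.1f}")

summarize()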

Edited by chiliwili69


I already have the funds set aside for a new build.

I will absolutely go with a new 2080 Ti with VirtualLink and a new 9700K build as soon as each is available.

 

I am quite confident I will see a nice performance increase while playing all my games in VR.

 

 

 

8 minutes ago, dburne said:

I already have the funds set aside for a new build.

I will absolutely go with a new 2080 Ti with VirtualLink and a new 9700K build as soon as each is available.

 

I am quite confident I will see a nice performance increase while playing all my games in VR.

 

 

 

 

 

Let's recall that the current generation offers minimal improvements over the previous ones. Let's wait for benchmarks on the 2080 Ti to know for sure, but Intel will hardly offer anything that outpaces its own prior generations, as it is still using the 14 nm process node.

 

Why not hold on to all of that cash and wait for next year's 7 nm GPUs and 10 nm Intel? (Or hell, even a 7 nm AMD Threadripper if you really want to go with a monster.)


I think the situation around GPU usage could be different with high-resolution and wide-FOV headsets that are coming. When the Pimax 8KX comes out, it will require a higher render resolution than the 4K it displays. Dual 4K + SS at 90 fps will bring most GPUs to their knees.


21 minutes ago, dburne said:

I will absolutely go with a new 2080 Ti with VirtualLink and a new 9700K build as soon as each is available.

I am quite confident I will see a nice performance increase while playing all my games in VR.

 

Seems they will be available pretty soon.

Once you have it, you know I will kindly ask you to run the test, right?

 

Just out of curiosity, what VR game is loading your GPU to 100% with your Rift?

Just now, chiliwili69 said:

 

Seems they will be available pretty soon.

Once you have it, you know I will kindly ask you to run the test, right?

 

Just out of curiosity, what VR game is loading your GPU to 100% with your Rift?

 

I really have no interest in benchmarks; I am all about how it performs for me in gameplay.

No idea on your question. I do know I got a nice improvement when I went to the 1080 Ti (even coming from a 1080), and certainly expect an improvement with an upcoming 2080 Ti.

5 minutes ago, coconut said:

GPU usage could be different with high-resolution and wide-FOV headsets that are coming

 

Right, but it will take a while until Pimax8K is on the market as a product. Optimistically Q1-2019 or later.

I went from 1070 to 1080Ti early this year only because I preordered Pimax8K.

 

Apart from that no other VR device on the one-year horizon.

1 minute ago, chiliwili69 said:

 

Right, but it will take a while until Pimax8K is on the market as a product. Optimistically Q1-2019 or later.

I went from 1070 to 1080Ti early this year only because I preordered Pimax8K.

 

Apart from that no other VR device on the one-year horizon.

 

That has been officially announced...

;)

3 minutes ago, dburne said:

certainly expect an improvement with an upcoming 2080 Ti.

This is exactly what I mean here. With a Rift in IL-2 you will not get a performance increase going from a 1080Ti to a 2080Ti.

In fact, you can check that by doing a gradual upgrade: first move to the 9700K keeping your 1080Ti and test it, then upgrade to the 2080Ti.

3 minutes ago, chiliwili69 said:

This is exactly what I mean here. With a Rift in IL-2 you will not get a performance increase going from a 1080Ti to a 2080Ti.

In fact, you can check that by doing a gradual upgrade: first move to the 9700K keeping your 1080Ti and test it, then upgrade to the 2080Ti.

 

Say what??

 

If that is what you believe then certainly you should not even consider getting a 2080 card.

I will be in line for one. As long as it is 2080 Ti.

 

Edited by dburne

21 minutes ago, dburne said:
23 minutes ago, chiliwili69 said:

Apart from that no other VR device on the one-year horizon.

 

That has been officially announced..

 

Yes, this is just speculation, but I will try to bring some reasoning.

First, the current PC VR devices (Vive, Vive Pro, WMR and Rift) are already quite good in terms of resolution and FOV for most mass-market VR games (IL-2 still is not).

The main reason PC VR has still not entered every home is just one: money!!

The cost of the device itself and, most importantly, of the required PC:

https://forum.il2sturmovik.com/topic/33300-why-you-are-still-not-in-vr/

So, I don't think the three major players (Oculus, HTC and WMR) are going to produce more expensive devices (with much larger FOV and resolution), since their priority is to enter the mass-market videogame space.

In fact, I can bet you that Oculus will not release its next PC VR device within one year's time. Do you take my bet?

Edited by chiliwili69

36 minutes ago, dburne said:

I will be in line for one. As long as it is 2080 Ti.

There is no guarantee the 2080Ti will provide any substantial performance boost; it is a vastly different architecture which does not necessarily focus on raw compute power, but adds elements to the chip that support ray tracing and make near real-time ray tracing possible. It may be twice as fast or more than the Pascal architecture in some cases, while in others it may provide no substantial difference.

I'm personally looking forward to the 2080Ti due to DCS limitations (BoS can't properly utilize my Ryzen and my GTX 1080Ti, so I don't even bother to expect any improvements here), but it all depends on the improvements it provides and the price.

On that last point, some leaks indicate prices between $1,000 and a mind-boggling $1,200 for the 2080 Ti.

Source: https://videocardz.com/77505/zotac-geforce-rtx-2080-ti-amp-edition-to-cost-1199-usd

1 hour ago, chiliwili69 said:

 

Yes, this is just speculation, but I will try to bring some reasoning.

First, the current PC VR devices (Vive, Vive Pro, WMR and Rift) are already quite good in terms of resolution and FOV for most mass-market VR games (IL-2 still is not).

The main reason PC VR has still not entered every home is just one: money!!

The cost of the device itself and, most importantly, of the required PC:

https://forum.il2sturmovik.com/topic/33300-why-you-are-still-not-in-vr/

So, I don't think the three major players (Oculus, HTC and WMR) are going to produce more expensive devices (with much larger FOV and resolution), since their priority is to enter the mass-market videogame space.

In fact, I can bet you that Oculus will not release its next PC VR device within one year's time. Do you take my bet?

 

Nope, not a betting man. But I have been holding to the same line of thought: CV2 likely in 2020 at the earliest.

That was until I saw Nvidia already implementing the new VirtualLink port in some of their 20-series cards. That leads me to think there might be a better possibility of a 2019 release of an Oculus CV2, but I am kind of straddling the line there with my thoughts. I am sure Nvidia is way more versed in what is coming down the pike. And I somehow don't think they are doing this at this time just for StarVR (who have announced a commercial product utilizing this new port).

But Oculus ain't saying at this time; I will be most interested in seeing what they say at OC5 next month.

 

51 minutes ago, =362nd_FS=Hiromachi said:

There is no guarantee the 2080Ti will provide any substantial performance boost; it is a vastly different architecture which does not necessarily focus on raw compute power, but adds elements to the chip that support ray tracing and make near real-time ray tracing possible. It may be twice as fast or more than the Pascal architecture in some cases, while in others it may provide no substantial difference.

I'm personally looking forward to the 2080Ti due to DCS limitations (BoS can't properly utilize my Ryzen and my GTX 1080Ti, so I don't even bother to expect any improvements here), but it all depends on the improvements it provides and the price.

On that last point, some leaks indicate prices between $1,000 and a mind-boggling $1,200 for the 2080 Ti.

Source: https://videocardz.com/77505/zotac-geforce-rtx-2080-ti-amp-edition-to-cost-1199-usd

 

I have seen early reports (who knows if accurate) of a reported 50% performance boost.

Until we see real end user reports though I am not counting on anything. I will still grab that 2080 Ti though, as soon as EVGA has an offering with their ICX cooling technology.

Yeah I like being an early adopter on some things. After all someone's gotta do it. ;)

 

Dang, I am going to be on the golf course during Nvidia's livestream announcement today, bummer.

Edited by dburne

10 hours ago, dburne said:

 

Nope, not a betting man. But I have been holding to the same line of thought: CV2 likely in 2020 at the earliest.

That was until I saw Nvidia already implementing the new VirtualLink port in some of their 20-series cards. That leads me to think there might be a better possibility of a 2019 release of an Oculus CV2, but I am kind of straddling the line there with my thoughts. I am sure Nvidia is way more versed in what is coming down the pike. And I somehow don't think they are doing this at this time just for StarVR (who have announced a commercial product utilizing this new port).

But Oculus ain't saying at this time; I will be most interested in seeing what they say at OC5 next month.

 

 

I have seen early reports (who knows if accurate) of a reported 50% performance boost.

Until we see real end user reports though I am not counting on anything. I will still grab that 2080 Ti though, as soon as EVGA has an offering with their ICX cooling technology.

Yeah I like being an early adopter on some things. After all someone's gotta do it. ;)

 

Dang, I am going to be on the golf course during Nvidia's livestream announcement today, bummer.

Must be nice!


Regarding IL-2, in my opinion a CPU upgrade is the priority, given that my GTX 1080 isn't bottlenecking the game.

 

Increasing the SuperSampling is detrimental to spotting targets due to the engine scaling them in the backend. Aircraft quickly become too small to be displayed on a single pixel of the screen and thus vanish or flicker. Couple that with clouds, and you get too many surprises.

 

This refers to Multiplayer. 
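To put a very rough number on the "smaller than a pixel" point (back-of-the-envelope only; the ~1080 horizontal pixels and ~90° per-eye FOV are approximate Rift CV1 figures, and supersampling changes the render target, not the panel itself):

# Estimate the distance at which a fighter's wingspan shrinks below one
# headset panel pixel. Numbers are approximate Rift CV1 figures.
import math

PANEL_PIXELS_HORIZONTAL = 1080   # per eye, approximate
FOV_DEGREES = 90.0               # horizontal FOV per eye, approximate
WINGSPAN_M = 10.0                # typical WWII fighter wingspan

pixel_angle_rad = math.radians(FOV_DEGREES / PANEL_PIXELS_HORIZONTAL)
distance_m = WINGSPAN_M / pixel_angle_rad
print(f"one panel pixel spans ~{FOV_DEGREES / PANEL_PIXELS_HORIZONTAL:.3f} deg; "
      f"a {WINGSPAN_M:.0f} m wingspan falls below one pixel at ~{distance_m / 1000:.1f} km")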

 

 

For our friends in Singleplayer, those who fly with icons, these new flagship GPUs look quite tempting indeed, I assume 😁

On 8/21/2018 at 7:35 AM, SCG_Fenris_Wolf said:

Increasing the SuperSampling is detrimental to spotting targets due to the engine scaling them in the backend.

 

 

Sorry, do you mind explaining what that means?  (The effect is clear but not the cause.)

 

Thanks,

Ceowulf<><  

On 8/20/2018 at 3:15 PM, dburne said:

I have seen early reports (who knows if accurate) of a reported 50% performance boost.

So far, it looks like it is only faster when using their new RTX antialiasing instead of traditional methods. RTX has (AFAIK) to be supported by the game directly.

 

I'm waiting to see how these parts do in P3D. There, you can easily bottleneck any GPU.

On 8/20/2018 at 3:15 PM, dburne said:

I have seen early reports (who knows if accurate) of a reported 50% performance boost.

 

Left: 1080 Ti (35-39 FPS) with TAA

Right: 2080 Ti (47-76 FPS) with Nvidia NGX DLSS

-> Without Nvidia NGX DLSS and TAA, performance would be much higher still on a 4K monitor, where antialiasing is not as important.
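Taking those quoted slide numbers at face value (just a sanity-check calculation on the figures above, not a measurement):

# Speedup implied by the quoted slide figures:
# 1080 Ti with TAA vs 2080 Ti with DLSS (marketing numbers, not benchmarks).
pascal_taa_fps = (35, 39)
turing_dlss_fps = (47, 76)

low = turing_dlss_fps[0] / pascal_taa_fps[0]
high = turing_dlss_fps[1] / pascal_taa_fps[1]
print(f"implied speedup: {low:.2f}x to {high:.2f}x "
      f"(+{(low - 1) * 100:.0f}% to +{(high - 1) * 100:.0f}%)")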

 

 

 

(Attached: Nvidia "Turing vs Pascal" Editor's Day comparison slides.)

Edited by Livai


Yep, that big performance boost is due to a new form of AA that developers must add support for in order for it to be leveraged. The actual boost in existing games has yet to be determined.


Apropos of the 9th-generation CPUs, here's what you can expect from Intel. They changed the license for their CPU microcode (the stuff you had to update because of Spectre etc.). What's new? Imagine this:

 

You will not, and will not allow any third party to […] (v) publish or provide any Software benchmark or comparison test results.

 

Expect to take performance hits with every fix, now that CPU microcode is officially a hackable item.

2 hours ago, =362nd_FS=Hiromachi said:

Apparently two more cores are something super extra, since the new i9-9900K with VAT will cost over 800 euros in this preorder.

https://videocardz.com/newz/intel-core-i9-9900k-and-i7-9700k-available-for-preorder

These are great days we're living, bros. 

 

 

Do you think Intel or Nvidia cares whether or not you can afford the prices they command, the same as Ferrari, Rolls-Royce, Porsche, Mustang or Corvette? As long as we have customers who are willing to pay moon prices, or to pre-order without waiting for benchmarks to see how much they get for their money, we can already see how this ends.

 

BTW, when it comes to prices I always compare the die size in mm² of CPUs and GPUs, because that is what you actually pay for. Compare how big the dies used to be and how much less we paid for them. A good example: GTX 580 vs GTX 680 = 520 mm² vs 294 mm² = $500 vs $500 = same price. Now we get 50% more performance for 85% more money, and there is no end in sight.
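Putting that GTX 580 vs GTX 680 example into numbers (only the figures quoted above; the $/mm² and the final ratio are my own arithmetic):

# Price per mm^2 of die for the two cards quoted above.
cards = {
    "GTX 580": {"die_mm2": 520, "price_usd": 500},
    "GTX 680": {"die_mm2": 294, "price_usd": 500},
}
for name, c in cards.items():
    print(f"{name}: {c['price_usd'] / c['die_mm2']:.2f} $/mm^2")

# "50% more performance for 85% more money" means paying roughly
# 1.85 / 1.50 ~= 1.23x more per unit of performance.
print(f"cost per unit of performance: {1.85 / 1.50:.2f}x the old price")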

 

(Attached: Skylake CPU package and die photos.)

 

 

 

 

Edited by Livai

3 minutes ago, BeastyBaiter said:

I'm pretty happy with my 8700k but it will be interesting to see what AMD does with 7nm while Intel is still stuck on 14nm++++++++++.:P

Intel is dead in the server space in the coming years and they know it. What is even worse, with AMD going to 7 nm, Intel will lose on performance per watt and eventually performance per core as well. Right now they are enjoying their last days of squeezing every cent from customers, because with the arrival of AMD's Rome CPU they will have no competitive products anymore.

While you may think that tacking two more cores onto the 8700 and charging $800 is a big increase, consider that Intel is currently forcing clients on the server side to Purley. That one is basically the same ol' as before, maybe a 6-8% performance increase, but the price goes up from ~$13,000 to ~$20,000! The current EPYC is ~14% slower at 1/4 of the cost. And by Q1 2019, when Purley emerges, Rome will be out soon afterwards (Q2 2019) as well.

Intel's financials are solid only because of this enormous price increase. The delay of the upcoming CPUs, however, made several partners look to AMD when it comes to the lucrative multi-socket market, as Rome is expected to beat the upcoming Cascade Lake by about 50% at lower cost. This makes the 15-20% market-share loss projected by Brian Krzanich a rather low estimate. They know they cannot raise prices anymore, and the only way to keep those margins is to cede market share.

In short, over the next 3 to 4 years Intel will not be competitive anymore, apart from some fringe income from making CPUs for game consoles. We can expect AMD to beat Intel in performance per watt and per core next year. It will require Intel to make a fresh start, like after the Pentium 4 hit the wall. At least they have the money to survive that period and eventually get back on track. Right now, all they can do is spread FUD.


I am also pretty happy with my 8700K overclocked and my 1080 Ti.

This setup runs IL-2 pretty smoothly with ultra settings (all ultra except mirrors).

It runs great with the Rift and was also great with the Vive Pro.

I will only upgrade when new-gen VR devices come out.

1 hour ago, =362nd_FS=Hiromachi said:

https://videocardz.com/77739/intel-core-i7-9700k-overclocked-to-5-5-ghz 

So it seems these CPUs can be overclocked a bit higher than the average i7-8700K; however, if the comments are correct (I can't read them, but it seems others managed to), the voltage required to reach 5.5 GHz was around 1.536 V.

I really don't care what the voltage is as long as you can cool it; with the new soldered chips it might be possible. Definitely interesting.


1.5v+ is the kind of thing that kills chips after 6-12 months of use. It doesn't matter if you can cool it or not, that isn't a safe voltage from what I've read.

1 minute ago, BeastyBaiter said:

1.5v+ is the kind of thing that kills chips after 6-12 months of use. It doesn't matter if you can cool it or not, that isn't a safe voltage from what I've read.

 

Absolutely. I certainly would never pump that kind of voltage into that chip. Especially if I intended to keep the chip for a while, which would be the case for me.

Edited by dburne

1 hour ago, 15th_JonRedcorn said:

I really don't care what the voltage is as long as you can cool it; with the new soldered chips it might be possible. Definitely interesting.

Considering that the last soldered generation was Sandy Bridge, and that I overclocked mine to 5.1-5.15 GHz (though for daily use it was better to keep it at 5.0 GHz), I can easily relate to what the guys above say. My i7-2600K was running at 5.0 GHz at around 1.395 V, which I consider a pretty good chip, and I could keep it cool. But if I pushed it to the limit it could easily jump to 1.45+ V, which would soon result in degradation.

For daily usage voltage spikes are not a problem, as my 2700X spikes to 1.5 V by design. But voltage spikes are not the same as running a chip 24/7 at such a voltage. So I don't think it's smart to aim for such clocks if you wish to keep the chip for some time.


I think Intel is being outright misleading with the i9-9900K. With eight cores pumping at these frequencies, you are seriously beyond the "95 W" TDP, by a factor of two at least! If you want to run at these frequencies, do not go past 6 cores, by any means. The total power budget increases so rapidly that a decent OC becomes very, very tedious.
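A crude way to see why the 95 W figure cannot hold at those clocks (a rough dynamic-power scaling model only, P roughly proportional to cores x frequency x voltage squared, with illustrative baseline numbers that are my own assumption, not measurements):

# Very rough dynamic-power scaling: P ~ cores * f * V^2.
# Baseline and overclock figures below are illustrative assumptions.
def scaled_power(base_w, base_cores, base_ghz, base_v, cores, ghz, volts):
    return base_w * (cores / base_cores) * (ghz / base_ghz) * (volts / base_v) ** 2

# Baseline: a 6-core part at its rated operating point.
base = dict(base_w=95.0, base_cores=6, base_ghz=4.3, base_v=1.20)
# Overclock: 8 cores at an all-core 5.0 GHz with a higher voltage.
estimate = scaled_power(cores=8, ghz=5.0, volts=1.35, **base)
print(f"estimated package power: ~{estimate:.0f} W vs the 95 W on the box")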

 

An eight-core CPU with integrated graphics. Would you actually use that still-broken part of the silicon? And only 16 PCIe lanes for discrete graphics. Sure, you'd buy such a CPU to have SLI run on 8 lanes each. And the rest hangs off the meagre DMI link, yes, your awesome USB 3.1 as well. So much for fast peripherals. It is just ridiculous.

5 hours ago, chiliwili69 said:

These are the first OC results of the new i7-9700K and i9-9900K.

https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9900k-i7-9700k-i5-9600k-review/22

 


About that 9700K: 5.2 GHz at 1.25 V sounds nice and cool! Compare that to my 7700K running at a comfortable 4.8 at the same voltage. It would probably translate to +8-12 fps for perhaps $500-600 including a motherboard? Considering the fps reductions in the latest updates, I might need it ;D

Edited by SvAF/F16_radek

4 hours ago, SvAF/F16_radek said:


About that 9700K: 5.2 GHz at 1.25 V sounds nice and cool! Compare that to my 7700K running at a comfortable 4.8 at the same voltage. It would probably translate to +8-12 fps for perhaps $500-600 including a motherboard? Considering the fps reductions in the latest updates, I might need it ;D

 

Looking at the FCAT results, I'd suggest IL-2 will prefer a 9700K with a 280 mm AIO plus the lowest-latency RAM you can get your hands on. The extra HT on the 9900K will just add heat for IL-2.

Mind you, there's a slightly larger cache on the 9900K, which might make some kind of difference.

At the end of the day, though, IL-2 is very CPU-hungry on anything above Balanced settings, and a rock-solid 90 fps may be out of the question. Personally I've given up on it and accept 45 fps ASW during some sections of flight, although I aim for mostly 90. That's on an 8086K @ 5 GHz, 3333 CAS-16 RAM, and an RTX 2080, so it's not a slow rig.
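The frame-time budgets make the problem concrete (simple arithmetic, nothing IL-2 specific): the heavy thread has to finish its work inside this window every frame, or the headset drops you to ASW.

# Per-frame time budget at each refresh target.
for fps in (90, 45):
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")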


Same here: with the migoto mod installed and the prop off, 45 fps is not that bad. However, as each new patch and add-on more often costs a few fps than gives them back, eventually (and perhaps already with the release of Bodenplatte) I fear sub-45 dips during combat will only be solved by keeping up to date with current hardware. This is in no way expressing frustration; a game that keeps up with hardware improvements is a good thing. But there's no comparison between performance just after the VR implementation and what we have now post-Kuban: mirrors, shadows, new particles and so on.


I am looking forward to finding out for myself how my new system will perform in VR, but that is still a little ways away.

Still waiting on my EVGA RTX 2080 Ti FTW3 Ultra (pre-order), i9-9900K (pre-order), and the EVGA Z390 Dark MB (not released yet).

 

I will say with my current rig I am getting pretty good performance in Kuban flying a PWCG campaign. 

 

 
