Wulfen Posted October 4, 2018 (edited) 7 minutes ago, dburne said: Yeah I have an EVGA RTX 2080 Ti FTW3 Ultra on pre-order and am very anxious to get it. Going to sell my 1080 Ti FTW3 as soon as I have it installed and am comfortable with it. Last time I played a 2d game was Jan 14th 2017. That's a good 1080 Ti that should sell well, and the 2080 Ti is a serious card. But I just couldn't justify the roughly 1600-euro price tag for roughly a 25% increase in muscle over the 2080, which costs nearly half as much. These prices are absolute madness. Are you in the U.S.? The dollar price would probably be a fair bit lower, even with taxes etc. Edited October 4, 2018 by Wulfen
dburne Posted October 4, 2018 1 minute ago, Wulfen said: That's a good 1080 Ti that should sell well, and the 2080 Ti is a serious card. But I just couldn't justify the roughly 1600-euro price tag for roughly a 25% increase in muscle over the 2080, which costs nearly half as much. These prices are absolute madness. Yep I agree, it is madness for sure. I already had funds set aside and allocated for a complete new system build coming up soon, so I was just like, wth, might as well go all out.
Wulfen Posted October 4, 2018 Just now, dburne said: Yep I agree, it is madness for sure. I already had funds set aside and allocated for a complete new system build coming up soon, so I was just like, wth, might as well go all out. Keep some funds aside for the CV2. I think us VR nuts will be compelled to splurge on that; the CV1 has more than paid for itself and is still bombproof.
dburne Posted October 4, 2018 2 minutes ago, Wulfen said: Keep some funds aside for the CV2. I think us VR nuts will be compelled to splurge on that; the CV1 has more than paid for itself and is still bombproof. I will always have funds for a CV2! Yep, my CV1 has been performing flawlessly for me, almost on a daily basis.
chiliwili69 Posted October 5, 2018 11 hours ago, BlackMambaMowTin said: I'm just wondering why Oculus Home or Steam VR are stealing cycles from the core that's busy doing the IL-2 sim calculations In the past we measured the influence of running Oculus Home on and off for IL-2 VR, and there was no measurable difference.
Wulfen Posted October 5, 2018 (edited) Pulled the trigger on a Zotac Gaming 2080 AMP last night. At ~840 euro it's ~80 euro more than an EVGA Black 1080 Ti and cheaper than a lot of other 1080 Tis. With prices so close, I think the 2080 is the better buy for future-proofing: it has better performance with more potential, and better resale value down the road. The 20 series seems to perform better in VR than the 10 series, and it should be a decent step up from the 1070 I sold. Edited October 5, 2018 by Wulfen
BlackMambaMowTin Posted October 5, 2018 I think I should have upgraded to an i7 8700K instead of going from a 1080 Ti to a 2080 Ti for VR. It would have been cheaper and had more impact. I didn't realize that CPU performance was such a big issue in VR. He's getting great framerates with an i7 8700K and a 1080 Ti on a Pimax.
Alonzo Posted October 6, 2018 2 hours ago, BlackMambaMowTin said: I think I should have upgraded to an i7 8700K instead of going from a 1080 ti to a 2080 ti for VR. It would have been cheaper and had more impact. I didn't realize that CPU performance was such a big issue in VR. He's getting great framerates with an i7 8700K and 1080 ti on a pimax. His settings are as follows:
- 8700K @ 5.0 GHz (AVX offset unspecified)
- Balanced preset, shadows low, mirrors off, distant landscape normal, horizon draw distance 70 km, landscape filter off, grass normal, clouds medium, AA off, sharpen on
- 80% supersample in SteamVR, PiTool render quality 1.0
A couple of things he said in the video: "Low shadows are still great." "Distant landscape detail at Normal, the lowest setting, is still looking great." My personal preference is for at least Medium shadows and High clouds, and 4x landscape detail (otherwise I can't see airfields from very far away). I'd be very interested in someone who's played a lot of IL-2 trying the headset and telling us a bit more about the improved FOV and detail, for example for spotting/IDing at a distance.
VBF-12_Stick-95 Posted October 6, 2018 (edited) I tried to find 3DMark VRMark Blue Room user benchmarks for hardware as close as possible to SweViver's, to see what the potential performance increase from a 2080 Ti might be. Here's what I found. Actual VRMark Blue Room user benchmarks:
- i7-8700K @ 4.7, MSI 2080 Ti: score 5249, avg fps 114.43, target fps 109
- i7-8700K @ 5.0, MSI 1080 Ti: score 3600, avg fps 78.48, target fps 109
I couldn't find anyone running an 8700K at 5.0 with the 2080 Ti, though it may be out there. This 2080 Ti comparison reflects an increase of ~46%. If you scale the lower frame rates he mentioned running into under certain circumstances in IL-2, e.g. ~50 to 60 fps on the deck, the same increase from the 2080 Ti would give 73-88 fps. Much of the fps he was seeing was more in the 70-90 range; with the 2080 Ti this might be more like 102-131 fps. This "extra" fps may allow for higher in-game or SteamVR/PiTool settings while still maintaining 80 to 90 fps. Hopefully he'll have his 2080 Ti soon and share the new IL-2 results. EDIT: corrected % increase and related fps estimates. Edited October 6, 2018 by VBF-12_Stick-95
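For anyone redoing the estimate above with their own numbers, the arithmetic is just the ratio of the two benchmark scores applied to observed fps. A small sketch (scores and fps ranges are the ones quoted in this post; variable names are mine, and the projection assumes the fps is fully GPU-limited, which later posts in this thread dispute):

```python
# Blue Room scores quoted above (same CPU class, different GPU).
score_2080ti = 5249  # i7-8700K @ 4.7 + MSI 2080 Ti
score_1080ti = 3600  # i7-8700K @ 5.0 + MSI 1080 Ti

scale = score_2080ti / score_1080ti  # ~1.46, i.e. the ~46% increase
print(f"scaling factor: {scale:.2f}")

# Apply it to the 70-90 fps range he mostly saw in IL-2:
low, high = 70, 90
print(f"projected: {low * scale:.0f}-{high * scale:.0f} fps")  # 102-131 fps
```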
BlackMambaMowTin Posted October 7, 2018 On 10/5/2018 at 9:24 PM, Alonzo said: His settings are as follows: 8700k @ 5.0 ghz (AVX offset unspecified) … What I found interesting in his video is that his CPU is spending only 7 to 8 ms rendering a frame. Mine is spending 12 to 13 ms. Do you know how long your CPU takes to render a frame?
SCG_Fenris_Wolf Posted October 7, 2018 On 10/6/2018 at 8:24 AM, Alonzo said: His settings are as follows: 8700k @ 5.0 ghz (AVX offset unspecified) … To be honest, his settings are bad and I wouldn't go with his recommendations.
- He does not specify the AVX offset.
- He ignores the problematic cloud rendering on the Balanced preset, which gets even worse with clouds at Medium.
- He ignores the lack of shadows on Low, which are important for spotting.
- His statement "low shadows are still great" is wrong, because a lot of shadows are completely missing - how can you have an opinion on them if they aren't there?
- Distant landscape at that setting makes it hard to orient without the HUD, whose performance impact he also doesn't mention at all.
He makes good videos and I have been subscribed to him for VR news for quite a while - but clearly he hasn't dived far enough into IL-2 yet to know what he's doing here. I'm glad he promotes the game a bit through his videos though; the reach can work wonders sometimes.
VBF-12_Stick-95 Posted October 7, 2018 (edited) 34 minutes ago, SCG_Fenris_Wolf said: To be honest, his settings are bad and I wouldn't go with his recommendations. He admits to only flying IL-2 for short periods, like an hour. Maybe you should send him your recommended settings to try with the Pimax. All of us want to know what the Pimax is capable of producing for IL-2, both visually and in fps, with decent settings. Edited October 7, 2018 by VBF-12_Stick-95
Alonzo Posted October 8, 2018 23 hours ago, SCG_Fenris_Wolf said: To be honest, his settings are bad and I wouldn't go with his recommendations. I completely agree, but I got into trouble in this thread by suggesting he's a bit of a Pimax shill, so I decided to just document his suggestions here for others to make up their own minds. In answer to "how is his CPU only spending 7 ms per frame?", I think it's because he has a bunch of stuff turned off or low (which Fenris details above), which means the CPU issues far fewer draw calls but results in a substandard game experience. Personally I cannot stand Clouds on anything less than High -- they move very weirdly and are very distracting on Medium. Similarly, Shadows on Low means I get a pixelated, crappy shadow moving across my canopy, and it's very distracting. It's great to see Pimax working on their software, though. In another thread it looks like they are continuing to find ways to improve performance a few percentage points at a time, and those kinds of fixes will add up over time.
chiliwili69 Posted October 8, 2018 It is a nice video, one of the most complete VR performance reviews in a video. He has done a good job, with 4000 views in just 4 days! Nice for IL-2 VR! Apart from the details of shadows at Low and clouds at Medium (he admits he is not a heavy user; like most IL-2 users, I don't consider myself a heavy user either), it gives a good feel for how a good rig performs (8700K with 1080 Ti). The recording software and the fpsVR HUD are also eating some frames. He also talks about the 3 pre-rendered frames setting in the NVIDIA panel. I thought it had no influence, but I will need to study that. But the important thing here is not the settings he is using; it is that his GPU is always at 60-72% load using normal FOV and 80% SS. That's quite great. Basically it means he will not improve his fps with the 2080 Ti (at the same SS), because the 1080 Ti is not fully loaded to begin with. And that means the clear bottleneck is, as usual, the CPU!! So we will not need to spend our budget on the expensive 2080 Ti; just go for the best CPU (in terms of OC) on the market. I will probably go for the new Intel 9th gen as soon as it is available (Oct 19th).
BlackMambaMowTin Posted October 8, 2018 2 minutes ago, chiliwili69 said: It is a nice video, one of the most complete VR performance reviews in a video. … I notice that his CPU was taking 7 ms to render. Mine is taking 12 to 13 (i7 6700K @ 4.3 GHz; a dud that won't OC any higher). Can you tell me how long your CPU takes to render a frame? If his CPU is only taking 7 ms to render a frame, then a more powerful GPU should help. Also, a better GPU might allow him to push the supersampling a bit. About pre-rendered frames: I've heard others claim it should be set to application-controlled, otherwise it causes input lag and stutter.
chiliwili69 Posted October 8, 2018 17 minutes ago, BlackMambaMowTin said: If his CPU is only taking 7 ms to render a frame then a more powerful GPU should help WHY? In his video there is not a single frame drop due to the GPU (it is never at 100%). 20 minutes ago, BlackMambaMowTin said: Also a better GPU might allow him to push the super sampling a bit Yes, this is true, but the image gain might not be worth the price of the card.
SCG_Fenris_Wolf Posted October 9, 2018 (edited) You have collected 20+ performance runs on different systems via the test tracks on these forums and a largely standardized method, yet this one non-standardized individual run is somehow more important to people than any other single dataset, because it's on YouTube. And it doesn't even tell us anything new - we knew about the CPU bottleneck all along. The video should not matter, as it lacks the same premises and basis for comparison. I thought only the youngest generation was simple enough to be this swayed by YT; sadly, now I see that the older ones are as well. Don't get me wrong, I'll like it if IL-2 gets more exposure. And I like the YouTuber; he has made interesting vids in the past. But you shouldn't draw conclusions from a flawed, individual, incomparable set of data. Edited October 9, 2018 by SCG_Fenris_Wolf
BlackMambaMowTin Posted October 9, 2018 14 hours ago, chiliwili69 said: WHY? In his video there is not a single frame drop due to the GPU (it is never at 100%). Yes, this is true, but the image gain might not be worth the price of the card. But is he CPU bound? His CPU render time is around 7 ms. I notice that it's when my CPU takes 10 ms or more to render that I start having problems. I'm still unclear about CPU and GPU render times in the performance HUD: my CPU render times are about 12 ms while my GPU render times are below 1 ms.
chiliwili69 Posted October 9, 2018 1 hour ago, BlackMambaMowTin said: But is he CPU bound? His CPU render time is around 7 ms If you look at the video, you will see that when he is achieving 90 fps, his CPU render time is below 10 ms. But in some scenes, with more complex planes or smoke or a large village, he is below 90 fps, and that is because the CPU time is more than 11 ms. So even at his settings he is also sometimes bound by the CPU. 15 hours ago, SCG_Fenris_Wolf said: any other single dataset In our datasets there is no test of the Pimax. That's the key point of his review. 15 hours ago, SCG_Fenris_Wolf said: Without even telling anything new This video is telling me quite a bit. My expectations were a bit low for the performance of a 1080 Ti with the Pimax 5K+ at normal FOV, but he just showed that the bottleneck is not the GPU, just the CPU as usual. He is not conscious of that himself, but it doesn't matter. 15 hours ago, SCG_Fenris_Wolf said: I thought only the youngest generation was simple enough to be this swayed by YT; sadly, now I see that the older ones are as well ?? Hey, YouTube is just one more channel for getting info, but it should not be the only one. I always try to get as much input as possible on a topic; how much credit I give it is another matter. At the end of the day my own subjective impression is the one that really matters to me, so I hope to receive the Pimax before the end of the year and test it myself. But meanwhile I know that I can upgrade my PC by changing CPU/RAM, but not the 1080 Ti.
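The 90 fps / 11 ms relationship being used in these posts is just the frame-time budget: at a 90 Hz refresh, each frame must be finished in 1000/90 ≈ 11.1 ms, and whichever stage first exceeds that is the bottleneck. A rough sketch of that reasoning (my simplification; real frame-timing HUDs report more stages than just CPU and GPU):

```python
def frame_budget_ms(refresh_hz=90):
    # At 90 Hz a frame must be finished in ~11.1 ms to hold the full framerate.
    return 1000.0 / refresh_hz

def limiting_stage(cpu_ms, gpu_ms, refresh_hz=90):
    # Classify one frame: within budget, CPU-bound, or GPU-bound.
    budget = frame_budget_ms(refresh_hz)
    if cpu_ms <= budget and gpu_ms <= budget:
        return "within budget"
    return "CPU-bound" if cpu_ms >= gpu_ms else "GPU-bound"

print(f"{frame_budget_ms():.1f} ms budget at 90 Hz")
print(limiting_stage(cpu_ms=7, gpu_ms=9))    # the video's smooth scenes
print(limiting_stage(cpu_ms=12, gpu_ms=9))   # a heavy scene over a village
```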
BlackMambaMowTin Posted October 9, 2018 49 minutes ago, chiliwili69 said: But meanwhile I know now that I can upgrade my PC by changing CPU/RAM but not the 1080Ti. I was really hoping the 2080 Ti would let me play IL-2 perfectly smoothly at a constant 90 fps. I doubt that even the coming i9 9900K will make that possible. It seems IL-2 in VR needs an 8 GHz processor.
JonRedcorn Posted October 10, 2018 21 hours ago, BlackMambaMowTin said: I was really hoping the 2080 ti would allow me to play IL-2 perfectly smoothly at a constant 90 fps. I doubt that even the coming i9 9900K will make that possible. It seems IL-2 in VR needs a 8 GHz processor. It needs some real architectural changes in new CPUs; we need more IPC. Intel has been using basically the same arch for 10+ years, and there are no gains to be had until we switch to 7 nm/10 nm. The 9900K will do little toward getting this game running well: it has more cores, which are useless here, at the same speed as previous gens. There's really nothing to see there besides the $500 price tag. You can spend all the money you want, but until we get some revolutionary changes in chip design it will be nothing but wasted money.
Alonzo Posted October 10, 2018 To be fair, most modern chips ought to be enough to run even a demanding sim like IL-2 or DCS at a full 90 fps, but those game engines are not designed for modern multicore CPUs. If you watch Battlefield 1, it spreads its load evenly across all the threads on your CPU and scales extremely well with more cores and higher frequency. Sadly IL-2 does not do this, so the only thing we can do is throw frequency and IPC at the problem. It's been clear for a decade that multicore is the future; there's no more Moore's Law for single-thread improvements. I still hope that as part of an optimization pass for Bodenplatte the engine will get some tweaks. Maybe not radical improvements, but just offloading a few things from the main processing thread and tweaking a hot spot here and there might allow a much more favourable VR experience. I suspect that VR will get accidental improvements from developers improving the main game, rather than special love and attention, because it *is* still a niche market.
Ehret Posted October 10, 2018 1 hour ago, 15th_JonRedcorn said: It needs some real architectural changes in new CPUs; we need more IPC. Intel has been using basically the same arch for 10+ years, and there are no gains to be had until we switch to 7 nm/10 nm. There is a reason the arch is the same - it's about the best that can possibly be built. It has simply hit diminishing returns, and Intel has tried others - the iAPX 432, i960, i860 and Itanium - all of which failed to gain wider acceptance. A lower manufacturing pitch will not improve the situation much, if at all. The problem lies elsewhere: we lost Dennard scaling over a decade ago.
A_radek Posted October 11, 2018 (edited) As I understand it - and someone correct me if I'm wrong on anything here, as I'm far from a coder - multithreading a game is very time-consuming because games have so many dependencies. You can't just throw physics on one core and graphics on another, since graphics naturally needs physics to finish before it knows what to render, so you would still end up with only one core at a time working. The alternative is splitting each game system over several cores/threads. However, splitting, say, gameplay logic or physics multiplies the complexity of each, and with complexity naturally come slower performance, higher memory consumption and hard-to-find bugs. In the end, dividing a task between just a few cores is not always faster than using one, and not all developers can afford the investment of an engine that scales to more than 4 cores. DICE (the Battlefield 1 devs) did a fantastic job. I've worked for studios like that: they employ hundreds, with titles taking years of work. Those guys can have a team of coders working for months on copy protection alone, while netcode in IL-2, going by the dev diaries, sounds like a task that gets picked up now and then by one guy at a time. 16 hours ago, Alonzo said: I still hope that as part of an optimization pass for Bodenplatte the engine will get some tweaks. Maybe not radical improvements but just offloading a few things from the main processing thread, tweaking a hot spot here and there, might allow a much more favourable VR experience. I hope so too. When I first heard they were doing Bodenplatte rather than the programming-intensive Pacific theatre (carriers, torpedoes and whatnot), I got my hopes up that it would free some time for that lead engineer to work on the core engine. Then came mention of physics changes for the 262... Edited October 11, 2018 by SvAF/F16_radek
SAS_Storebror Posted October 11, 2018 (edited) 43 minutes ago, SvAF/F16_radek said: You can't just throw physics on one core and graphics on another as naturally graphics needs physics to finish before knowing what to render No Sir. You can have physics in a parallel thread. Think of it like V-Sync and triple buffering. You end up having 3 sets of physics data:
- Set A is what your renderer uses to work with.
- Set B is what your physics uses to work with.
- Set C is the back buffer for both.
- Flag is an atomic flag telling the renderer that there's new physics data for him.
For instance, at frame "x" the situation may be like this:
Set A <- Render
Set B <- Physics
Set C
Flag = false
Now let's say physics finishes updating; the situation after the physics update is this:
Set A <- Render
Set B
Set C <- Physics
Flag = true
See what happened? Physics is now working on Set C, and Set B holds the last fully updated set of physics data. Now let's say the renderer is the next to finish his task and wants a fresh set of physics data. He would first check whether physics has finished an update since the last time he finished himself, and if so, swap sets with the back buffer:
Set A
Set B <- Render
Set C <- Physics
Flag = false
You get the picture: the only thing that needs to be synced in this scenario is an atomic flag that tells the renderer when physics has completed an update plus set swap. Mike Edited October 11, 2018 by SAS_Storebror
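The three-set hand-off Mike describes can be sketched in a few lines. This is purely illustrative Python of mine, not code from any actual engine, and it uses a lock around the swap-plus-flag step since plain Python has no atomic flag type:

```python
import threading

class TripleBuffer:
    """Sketch of the three-set scheme: the renderer and the physics thread
    each own one snapshot, a third acts as the back buffer, and a flag
    (guarded by a lock here) marks when fresh physics data is waiting."""
    def __init__(self, initial):
        self.render_set = dict(initial)   # Set A: renderer reads this
        self.physics_set = dict(initial)  # Set B: physics writes this
        self.back_set = dict(initial)     # Set C: hand-off buffer
        self.fresh = False                # the "atomic flag"
        self._lock = threading.Lock()

    def physics_publish(self):
        # Physics finished an update: swap its set into the back buffer.
        with self._lock:
            self.physics_set, self.back_set = self.back_set, self.physics_set
            self.fresh = True

    def render_acquire(self):
        # Renderer checks the flag; if set, grab the newest snapshot.
        with self._lock:
            if self.fresh:
                self.render_set, self.back_set = self.back_set, self.render_set
                self.fresh = False
        return self.render_set

buf = TripleBuffer({"t": 0})
buf.physics_set["t"] = 1      # physics simulates a step...
buf.physics_publish()         # ...and publishes it
print(buf.render_acquire())   # renderer now sees {'t': 1}
```

Neither thread ever blocks on the other's work; the only contention is the brief swap, which matches the point that render and physics can run in parallel with minimal synchronization.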
A_radek Posted October 11, 2018 You obviously have a much better understanding of this than I do. But from your explanation, I take it you have physics and graphics working simultaneously by guesstimating the next frame's player input?
Ehret Posted October 11, 2018 (edited) It's not clear what you've got there with those sets... It looks like software pipelining and a semaphore? Both concepts have uses and their own perils: a semaphore can result in ugly locks, and pipelining introduces latency. If parallelizing were such an easy thing to implement universally, then today we would be running Transputers, not superscalar out-of-order processors like the modern x86. Edited October 11, 2018 by Ehret
SAS_Storebror Posted October 11, 2018 Guys, I'm not trying to say that parallelization of existing applications is a cakewalk. All I've been trying to get across is that physics and renderers can indeed be operated in parallel with minimal locking (and latency), as is done in many other game engines. No doubt that in the bigger picture of a full-blown game there's much more to this. No need to lose your poise over it either. Mike
dburne Posted October 11, 2018 19 hours ago, Alonzo said: To be fair, most modern chips ought to be enough to run even a demanding sim like IL-2 or DCS at a full 90 fps, but those game engines are not designed for modern multicore CPUs. … Agree fully. I have started to collect the components for my upcoming new build. I have an EVGA RTX 2080 Ti FTW3 Ultra on pre-order, and I also managed to snag an i9 9900K on Amazon pre-order yesterday. I want this new build to last as long as the one I currently have - 5 years. Hopefully there will be some improvements in the games I play to take more advantage of this technology during that time; we will see.
TWHYata_PL Posted October 11, 2018 (edited) On 10/9/2018 at 1:08 AM, chiliwili69 said: He also talks about the 3 pre-rendered frames in the NVIDIA panel. I thought it has no influence but I will need to study that This is actually a good find by him. I tested this option in my Vive on the Kuban map, and after 1-2 minutes of flying I got more consistent fps near ground level - like 10-12 fps more than with the default setting. Edited October 11, 2018 by TWHYata_PL
dburne Posted October 11, 2018 (edited) I tried changing the pre-rendered frames to three and did not see any improvement at all, so I went back to one. I am sure it is also somewhat system-dependent. Edited October 11, 2018 by dburne
TWHYata_PL Posted October 11, 2018 (edited) 8 hours ago, dburne said: I am sure it also is somewhat system dependent Might be CPU/memory related... but did you fly longer than 2 minutes during your test? As I said, it gives an fps benefit after a while - I guess once the full map is loaded into memory or so. With this option I get the same fps near ground level as at high altitude; before, it was always fewer fps (500 meters and below). Edited October 11, 2018 by TWHYata_PL
dburne Posted October 11, 2018 14 minutes ago, TWHYata_PL said: Might be CPU/memory related... but did you fly longer than 2 minutes during your test? Oh yes, I always fly longer than 2 minutes - typically a full mission.
Alonzo Posted October 12, 2018 I upgraded from the 1070 to a 2080. Here are a few findings:
- On Balanced graphics with 1.0 SS and no AA, performance improved from 82.4 fps average to 85.3 fps average on Chili's Spitfire-bombers track. This might seem like the 2080 isn't giving much, but...
- Applying in-game 4x AA on the 2080, it matches the 1070's performance at 82.0 fps, and I can then apply 1.2 supersample (OTT number, this is 1.44x the total pixels) and still be at 82.2 fps overall. So the 2080 lets me push higher settings, especially GPU-bound ones such as AA and SS.
- Using the Migoto mod, even with all its shader fixes disabled, DROPS performance significantly - in my testing from ~82.8 fps down to ~75 fps. It also seems to be quite taxing on the GPU, with much less headroom available (96% usage with Migoto vs 80% without).
- My best FPS settings have been with Migoto disabled, Balanced overall setting, 4x AA, 1.2 OTT supersample. This gives ~83.7 fps on the demo track at about 80% CPU usage. I played a bunch of Wings of Liberty multiplayer like this and it was very smooth, and I could even go to High graphics without issue.
This is pretty much what I expected, although I had hoped for more. The 2080 is a decent upgrade over a 1070, but it only really allows you to push GPU-based settings such as AA and SS; for everything else we're still highly CPU-dependent. If you have a 1080 Ti, don't bother with a 2080 or a 2080 Ti - it will make very little difference to your IL-2 experience. If you are considering building a rig for IL-2 VR, wait for the Intel 9000 series chips and get ready to overclock everything, including your memory.
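A note on the "1.2 supersample ... 1.44x the total pixels" arithmetic: the OTT/Oculus pixel-density number is a per-axis multiplier, so the total pixel count grows with its square. A quick sketch (the 1344 x 1600 default per-eye render target for the Rift CV1 is my assumption here, not a figure from this thread):

```python
def total_pixel_factor(per_axis_ss):
    # Pixel density multiplies width AND height, so pixels grow with the square.
    return per_axis_ss ** 2

print(f"{total_pixel_factor(1.2):.2f}x total pixels")  # 1.44x, as stated above

# Assumed Rift CV1 default per-eye render target at density 1.0:
w, h = 1344, 1600
print(f"per-eye target at 1.2: {round(w * 1.2)} x {round(h * 1.2)}")
```

This is why supersampling gets expensive fast: a seemingly modest 1.4 per-axis setting would nearly double the pixels the GPU has to shade.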
dburne Posted October 12, 2018 Thanks for sharing. That is what I have been thinking; when I get my 2080 Ti FTW3 Ultra it should allow me to increase my settings.
Wulfen Posted October 12, 2018 (edited) 2 hours ago, dburne said: Thanks for sharing. That is what I have been thinking; when I get my 2080 Ti FTW3 Ultra it should allow me to increase my settings. Just got my Zotac 2080 AMP today and picked up VRMark on sale on Steam. With some overclocking on top of the OC Scanner run, I added another 55 MHz on the core, giving a steady ~2025-2050 MHz, plus a very decent jump of 1225 MHz on the memory, taking it past 8200 MHz. Running VRMark, the Blue Room test gave me a result of just over 3900, with temps at ~60 C and the fans running 60-70% and quiet at that. The 1080 Ti and Titan Xp put out 3025 and 3257 respectively in previous benchmarks, with the 2080 Ti at 4586. Looks like the 2080 beats the 1080 Ti and Titan Xp hands down in the VR results. https://hothardware.com/reviews/nvidia-geforce-rtx-performance-and-overclocking?page=4 Edited October 12, 2018 by Wulfen
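For reference, the relative gaps in those Blue Room numbers work out as below (scores are the ones quoted in this post and the linked article; the labels are mine):

```python
# VRMark Blue Room scores quoted above; percentages relative to the 1080 Ti.
scores = {
    "RTX 2080 (this card, OC)": 3900,
    "GTX 1080 Ti": 3025,
    "Titan Xp": 3257,
    "RTX 2080 Ti": 4586,
}
baseline = scores["GTX 1080 Ti"]
for card, score in scores.items():
    print(f"{card}: {score} ({score / baseline - 1:+.0%} vs 1080 Ti)")
```

So the overclocked 2080 here lands roughly 29% above the 1080 Ti in this synthetic VR test, with the 2080 Ti about 52% above it - keeping in mind that, as discussed earlier in the thread, IL-2 itself is usually CPU-limited and won't show gains this large.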
Alonzo Posted October 12, 2018 45 minutes ago, Wulfen said: Looks like the 2080 beats the 1080 Ti and Titan Xp hands down in the VR results. Absolutely, in the optimized-for-VR benchmarks. IL-2 is a bit of a "VR added on" game though -- have you benched IL-2 on your system and compared results to your previous card? I'd be very interested in your 2080 results, whether they match mine, etc.
Wulfen Posted October 12, 2018 (edited) 1 hour ago, Alonzo said: Absolutely, in the optimized-for-VR benchmarks. IL-2 is a bit of a "VR added on" game though -- have you benched IL-2 on your system and compared results to your previous card? I'd be very interested in your 2080 results, whether they match mine, etc. Well, my core clocks are steady at 2055 MHz as it throttles down from a high of 2085, with the memory at 8224 MHz. On water, 2100 on the core would be a given, I'd say. GPU usage hovers around 50-60% in Kuban on Ultra, with the CPU at ~40-50% on a 4790K OC'd to 4.6 GHz. GPU temps are 50-51 C with the fan hitting 70% and fairly quiet (manual setting). Fps jumps around as usual, but I had it in the high 60s to 70 over the forest areas. It needs optimization, as discussed before; the GPU isn't being maxed out for the majority of the time, from what I can see from a quick test flight. I think we need Vulkan, DLSS, and of course the CV2. Edited October 12, 2018 by Wulfen
dburne Posted October 12, 2018 52 minutes ago, Wulfen said: Well, my core clocks are steady at 2055 MHz as it throttles down from a high of 2085, with the memory at 8224 MHz. … You would likely see more GPU utilization if you increased some settings, like SS.
Wulfen Posted October 12, 2018 (edited) 9 minutes ago, dburne said: You would likely see more GPU utilization if you increased some settings, like SS. I have the resolution set to 150% (2016 x 2400) in the SteamVR settings for BoX. My CPU goes slightly over 50% at times, while the GPU is maybe hitting ~50% usage, but the fps sits around the 45-50 mark. The software seems unable to max out the GPU. I could try the Oculus Debug Tool or go for 200% in SteamVR. Edited October 12, 2018 by Wulfen