

Posted (edited)

I'm currently using an RTX 2070 Super driving a Rift-S, which works fine in IL-2 with some eye-candy switched off. Since I don't think GPU prices over here in the EU will go much lower than they currently are - and I don't want to wait for the 40xx series and Nvidia's potentially insane pricing for those things - I'm thinking about an upgrade.

 

I have my eye on an RTX 3070 which, on paper, should give me a nice uplift over the 2070 Super, but: since I'm pretty sure I'll want to upgrade to a higher-res VR set in the near future (G2 or better, I guess), I can't help but wonder if the 3070 would be enough to drive those things - and, with that in mind, whether I shouldn't go for a 3080 instead. I'd like to avoid the additional cost (and having to upgrade my PSU), but if the 3070 can't cut it at higher VR resolutions, the upgrade might prove a bit pointless in the future. Still: I could get a 3070 at just under €600 at the moment, while a 3080 would be around €800, plus another €100 or so for a more powerful PSU.

 

So: is anyone running (or has anyone used) a 3070 with a G2 or higher, and would you care to share your experience?

 

Thanks!

 

 

S.

Edited by 1Sascha
SCG_Fenris_Wolf
Posted

3070 undercuts what the G2 can achieve for IL-2. You'll want to begin with a 3080 and go upwards from there.

 

The 3070 just doesn't fill enough pixels to maintain 90 fps at the G2's required resolution of 2600x2600, upscaled to 3100x3100 by NIS/FidelityFX.
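For a rough sense of the numbers involved, here is a back-of-the-envelope sketch using the figures above plus the Rift-S's standard specs (1280x1440 per eye at 80 Hz) - illustrative only, since real cost per pixel depends heavily on shader load:

```python
# Pixel throughput the G2 demands at the settings above (rough sketch).
render = 2600          # per-eye render resolution (pixels per axis)
eyes, fps = 2, 90      # stereo rendering at the G2's 90 Hz

gpix_per_s = render * render * eyes * fps / 1e9
print(f"G2 @ 90 fps: ~{gpix_per_s:.2f} Gpix/s shaded")        # ~1.22 Gpix/s

# Rift-S for comparison: 1280x1440 per eye at 80 Hz.
rift_s = 1280 * 1440 * 2 * 80 / 1e9
print(f"G2 is ~{gpix_per_s / rift_s:.1f}x the Rift-S load")   # ~4.1x
```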

 

You can run the G2 below its native resolution, of course, but why get a high-res headset if the GPU can't feed it and you get either pixelation or blurriness, depending on your sharpening filter? That defeats the purpose.

  • Thanks 1
Posted (edited)

Hello 1Sascha, this is Jim.

 

I currently fly IL-2 with a G2, an Nvidia RTX 3060 Ti and 32 GB RAM - quite close to a 3070, I'd say. I started with a Q2 and now I fly only with the G2. So I'll tell you about my experience...

 

If you play MP and want to be reasonably effective and not be shot down every minute, you must not run your headset at too high a resolution, so you can spot enemy planes far enough away before it's too late. I currently fly at a 2600x2600 final resolution with 80% FSR upscaling using OpenXR Toolkit and OpenComposite; that setting gives me about 90 fps most of the time, with a maximum drop of about 15 fps in very demanding scenery with many planes all around. Of course, I lowered some eye-candy settings: clouds to medium, draw distance to 40 km, no mirrors, no shadows. To be honest, I tried higher resolutions - about 3100x3100 - and didn't see any great difference in image quality.
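To spell out what that upscaling setting means in pixels - a quick sketch, assuming the toolkit's percentage scales each axis (my reading of it, so treat the exact rounding as approximate):

```python
# What "2600x2600 final resolution with 80% FSR upscaling" works out to,
# assuming the percentage applies per axis.
target, scale = 2600, 0.80

render = round(target * scale)   # resolution actually rendered per axis
saved = 1 - scale ** 2           # fraction of pixels not shaded per frame
print(f"render {render}x{render}, FSR-upscale to {target}x{target}")  # 2080x2080
print(f"~{saved:.0%} fewer pixels shaded than rendering natively")    # ~36%
```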

It's up to you what you really want. If you want to fly in an eye-candy environment keeping 90 fps all the time, then I agree with SCG_Fenris_Wolf that the 3070 card is not enough.

For some reason the G2, even when not running at full resolution, is better than the Q2 (probably due to the Q2's video compression?), with better clarity. I also see better clarity using OpenComposite + OpenXR Toolkit than with SteamVR and the FSR upscaler.

Edited by dgiatr
  • Thanks 1
Posted
4 hours ago, dgiatr said:

If you play MP and want to be reasonably effective and not be shot down every minute, you must not run your headset at too high a resolution, so you can spot enemy planes far enough away before it's too late.

 

First of all: Thanks for taking the time to share your experience, much appreciated! :)

 

I gotta say that I don't use IL-2 in MP that much - I don't like its stock icons and I don't like scanning for pixels in "full real" only to get bounced again and again - plus my MP WW2 fix of choice is an ancient MMO sim that is far from supporting VR. :) I mostly fly IL-2 offline for the "I can't believe I'm actually sitting in a plane - oh wait... I'm not!" feeling VR gives me.

 

Gotta say that my first experience with IL-2 in VR was on a friend's PC back in May. He had just upgraded to an RX 6900 XT to drive his G2 (he was the one who sold me the 2070 Super I currently use, because it wasn't enough for the G2), and while the picture was quite a bit sharper than on my Rift-S, it still left a bit to be desired with regard to sharpness. The cockpits looked a lot better at the higher res, but there was still that weird blurriness when looking at planes at medium to long range. And that was with pretty ambitious GFX settings enabled, since the 6900 isn't exactly a slouch.

 

I'm not really expecting to be able to crank everything to the max with a new card and, down the road, a higher-res VR set... I'm more interested in general smoothness and a less pixel-y experience. And since you're running your G2 on the slightly slower 3060 Ti, I'm kinda thinking I'll hold off on the crazy just a little and go with the 3070. Some benchmark results I've seen suggest there isn't *that* much in it between the 3060 Ti and the 3070, but I think that at EU prices the 3070 is the slightly better deal when it comes to bang-per-buck.

 

6 hours ago, SCG_Fenris_Wolf said:

3070 undercuts what the G2 can achieve for IL-2. You'll want to begin with a 3080 and go upwards from there.

 

I hear ya, and after all I said up there, I'm still very, very tempted. Oh well, maybe I'll just set a price alert for the 3080 and hope for the best - but I really doubt we'll see any more significant price drops on the 30xx cards, so I think I'll eventually chicken out, go 3070 and then remember your comment once I get a new VR set...

 

 

 

S.

Posted

I have probably close to a month of testing and tuning the G2 to get it to function well and look good, which is typically one or the other. I realized that my 3060 WAS capable of running the G2 - it simply wasn't enough. I could get 60-65 FPS, but it was clarity that I needed. I couldn't identify shit until they were right on top of me, which made it super frustrating coming from TrackIR with 280 FPS on a 34" flat screen.

 

So I upgraded to a 3090, and the G2 is taxing even on that thing, but it can deliver a solid 90 FPS WITH exceptional clarity. But again, it took a month of testing and tuning, adjusting Nvidia Control Panel settings and everything in between. It was not an easy process and was very frustrating at times.

 

So, to answer your question: if you want to be able to run the G2 at full resolution, a 3080 should be the minimum, and it should work wonders over what you have now.

 

  • Thanks 1
  • Upvote 1
SCG_Fenris_Wolf
Posted (edited)

By that logic, @dgiatr - if you could call it logic - you could play on a 1080p screen with instant snap-views looking back 180°, and spot stuff at a distance. No more head-cranking, instant overview.

 

The OP said he'd want to run the G2 on a higher resolution, and I quote: "if the 3070 can't cut it at higher VR resolutions, the upgrade might prove a bit pointless in the future". 

 

 

So guys, we could also just recommend him an OLED Vive Pro with Samsung's GearVR lenses. That way he'd spot best: infinite contrast ratio, a low pixel count in a PenTile matrix, contrast properly popping out, clarity from Varjo Aero-like lenses. Why do I need to answer with a proper move in chess if I can just tilt the board and change the frame?

 

:joy:

Edited by SCG_Fenris_Wolf
  • Upvote 1
Posted

I recently upgraded from the Rift-S to the G2 when it went on sale at the beginning of July. I had been running the Rift-S on the 2060 6GB recycled from my last PC. It was workable in IL-2 with a lot of things toned down.

 

Enter the G2. At the base 90 fps setting, at the higher res, it really crushed the 2060, so even more lowering of settings ensued! Taking it to 60 fps helped - but not by much.

 

I was finally able to complete the PC build I started at the start of the pandemic, grabbing the planned 3080 for it a couple of weeks ago when one I liked (Strix 12GB) went on sale closer to normal price. The night-and-day performance difference over the 2060 goes without saying! I'm able to run at the WMR 90 fps setting at max res which, even with the 3080, yields a steady 75 fps in IL-2 with a lot of stuff turned back up high. Still tweaking it, but everything is much improved. Acuity is incredible. Better spotting at distance of everything on the ground and in the air.

 

FYI, I'm also running OpenComposite, XRNS and OpenXR Toolkit with IL-2.
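For anyone curious what the OpenComposite part of that setup amounts to: per game, it's essentially swapping the game's openvr_api.dll for OpenComposite's build, so the game's OpenVR calls go straight to OpenXR/WMR instead of through SteamVR. Below is a hedged sketch of that swap - both paths are assumptions for illustration, and XRNS and OpenXR Toolkit have their own installers on top:

```python
# Hedged sketch: per-game OpenComposite install by replacing openvr_api.dll.
# Both paths below are assumptions - point them at your actual game folder
# and your downloaded OpenComposite DLL.
from pathlib import Path
import shutil

game_bin = Path(r"C:\Games\IL-2 Sturmovik Great Battles\bin\game")  # assumed
game_dll = game_bin / "openvr_api.dll"
oc_dll = Path(r"C:\Downloads\OpenComposite\openvr_api.dll")         # assumed

backup = game_bin / "openvr_api.dll.steamvr-bak"
if game_dll.exists() and not backup.exists():
    shutil.copy2(game_dll, backup)      # keep the SteamVR original to revert
shutil.copy2(oc_dll, game_dll)          # route the game's OpenVR calls to OpenXR
print("Swapped. Restore the .steamvr-bak file to go back to SteamVR.")
```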

  • Like 1
No.23_Starling
Posted

Is anyone able to hop on Discord and explain how to set up and configure OpenComposite, XRNS and OpenXR Toolkit? Sounds like you're getting much better FPS than me with the same GPU. I suspect it's SteamVR that's limiting it.

Posted
15 minutes ago, Drano said:

I recently upgraded from the Rift-S to the G2 when it went on sale at the beginning of July. I had been running the Rift-S on the 2060 6GB recycled from my last PC. It was workable in IL-2 with a lot of things toned down.

 

Enter the G2. At the base 90 fps setting, at the higher res, it really crushed the 2060, so even more lowering of settings ensued! Taking it to 60 fps helped - but not by much.

 

I was finally able to complete the PC build I started at the start of the pandemic, grabbing the planned 3080 for it a couple of weeks ago when one I liked (Strix 12GB) went on sale closer to normal price. The night-and-day performance difference over the 2060 goes without saying! I'm able to run at the WMR 90 fps setting at max res which, even with the 3080, yields a steady 75 fps in IL-2 with a lot of stuff turned back up high. Still tweaking it, but everything is much improved. Acuity is incredible. Better spotting at distance of everything on the ground and in the air.

 

FYI, I'm also running OpenComposite, XRNS and OpenXR Toolkit with IL-2.

I'm interested in the settings you used with your old GPU, because mine is not a rocket either.

I have an NVIDIA Quadro P4000 and an HP G2, and I run the same programs you are using.

I have done a lot of tests, reaching a 45 fps minimum, but with less-than-excellent graphics.

I thought maybe you could suggest some settings to improve performance and, principally, quality.

If you want, I can post my settings so you can check them.

 

Thank you!

Posted

Well, I'd add that I had the 2060 OC'd a bit as well to keep up - whatever Afterburner found with its auto-OC feature.

 

I don't have my settings saved from when I ran the 2060, sorry. The best I could manage still produced a lot of flappy wings on other planes - the 2060 had a hard time. That's no longer an issue.

 

The setup for OpenComposite, etc. is in other thread(s) here - that's where I found out about it. There are also Discord servers for OC, XRNS and OpenXR Toolkit with lots of info for each.

 

I'd post the links but I'm on my phone just now. 

Posted
On 7/8/2022 at 18:36, Drano said:

Enter the G2. At the base 90 fps setting, at the higher res, it really crushed the 2060, so even more lowering of settings ensued! Taking it to 60 fps helped - but not by much.

I tried downgrading my G2 headset's refresh rate from 90 Hz to 60 Hz in WMR, but my FPS dropped dramatically, so I went back to 90 Hz. Then I made some small tweaks to the game and OpenXR Toolkit settings. I can now see an average of 48-50 FPS in career missions with, I think, the best quality I can achieve with my poor hardware.
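For context on those refresh-rate experiments, the frame-time budgets work out as follows (simple arithmetic, nothing headset-specific):

```python
# Frame-time budget at each WMR refresh setting: 1000 ms / Hz.
for hz in (90, 60):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")  # 11.1 ms / 16.7 ms
```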

If anyone is in a similar situation and is interested in the settings I have adopted, I can publish them.

Sorry about the font size - it was due to a copy-paste...

Posted (edited)

Well, since our summer is still pretty warm and I'm not doing a lot of VR at the moment (sweaty face and all that), I've decided to do this:

 

Just ordered a Gigabyte RTX 3070 Eagle OC, since it was the cheapest 3070 I could find (€579), and as a little bonus, Gigabyte are running a cash-back promotion at the moment, dropping the price by another €20.

 

I figure that with crypto on the decline and the 40xx launch sort of looming, it'd be better to sell my 2070 Super now instead of later this year (or next year) when the 40s do come out. I've been checking winning bids on eBay for days, and right now I'm guessing €250 to €300 is a reasonable expectation for the 2070 Super. I don't have the box for it, since the buddy who sold me the card didn't keep it, but even then, I think €250 is more than realistic.

 

I don't expect the 4070 to come out this year. All the HW buffs seem to agree that it'll be a 4080 or even a 4090 that launches first, and that the 4060, 4070, etc. could well get postponed until early 2023. Once that thing does come out and I feel I need more power, I'll just sell the 3070 - probably still at a decent price (I do keep my GPU boxes) - and buy the 4070, if Nvidia don't get too greedy with their 40xx pricing. *Then* I'll start looking for a G2 or an equivalent set to replace my Rift-S.

 

For now, I'm quite sure the 3070 will give me a very decent uplift on the Rift-S and in non-VR gaming, and when the time comes, the 4070 will probably be all I'll need for a G2 - if previous performance jumps from one gen to the next are any indication. I'm expecting the 4070 to be at least as powerful as a 3080, perhaps even a 3080 Ti, which should be plenty for me, even on a G2.

 

 

S.

Edited by 1Sascha
SCG_Fenris_Wolf
Posted (edited)

That is correct, good choice.

 

AFAIK only the 4090 will launch this year. For the 4070, 4080 and the rest, we'll have to wait for Q1 and Q2 2023.

 

Most interesting may be the new VR headsets about to be released, the Pico Neo 4 and Quest 2 Pro (Cambria), especially due to their Aero-style pancake lenses.

Edited by SCG_Fenris_Wolf
Posted
15 hours ago, SCG_Fenris_Wolf said:

Most interesting may be the new VR headsets about to be released, the Pico Neo 4 and Quest 2 Pro (Cambria), especially due to their Aero-style pancake lenses.

Yeah, but let's see what sort of prices those will launch at.

 

Since I have some early-adopter folks in my old squad, I'm fairly certain I'll be able to snag a lightly used G2 for a decent price once the newer sets come out (and if those launch at prices I'd consider unacceptable).

 

 

S.

Posted

Tadaaa!

[Photo: Gigabyte RTX 3070, boxed]

Posted

Congratulations.

 

Soon it will be time for screenshot art.

Posted
2 hours ago, AngleOff66 said:

Soon it will be time for screenshot art.

Well... I already did an up-to-date "3DMark before" pic for the eBay sale of the 2070 Super, and I hope I'll be able to do the swap before the weekend - RL commitments permitting.

[Screenshot: 2070 Super Time Spy result]

 

 

Gotta admit I don't expect a huge boost in Time Spy, but I do believe I'll now be able to turn on some more eye-candy in IL-2 VR and still hit my 80 FPS target. Too bad it's still so freakin' hot over here, so VR sessions really aren't as tempting at the moment as they should be...

 

 

S.

Posted (edited)

Man you shoulda already had that thing swapped!! Lol.

Edited by dburne
Posted
22 minutes ago, dburne said:

Man you shoulda already had that thing swapped!! Lol.

Trust me, I wish I could... dealing with all sorts of geriatric medical hardware getting delivered for my mom here instead...

 

S.

  • Like 1
Posted
2 hours ago, 1Sascha said:

Trust me, I wish I could... dealing with all sorts of geriatric medical hardware getting delivered for my mom here instead...

 

S.

 

:good:

Great reason!! I have been there myself before.

Posted (edited)

Oh well....

 

Afterburner Auto OC is running now as I'm typing this... let's see what I can get out of this thing. Power draw seems comparable to the 2070 Super so far, but I've yet to run any benchmarks.

 

Oops... it just finished. Looks like Afterburner wants to go +84 MHz on the GPU and +200 MHz on the memory.

 

S.

[Photo: RTX 3070 installed]

Edited by 1Sascha
Posted
16 minutes ago, 1Sascha said:

Afterburner Auto OC

 

It's much more beneficial and efficient to undervolt it to a state where it can maintain a steady overclocked boost speed instead. Auto OC will just try to get as much boost out of an aggressive voltage curve which doesn't help much in the end.

  • Thanks 1
Posted (edited)
16 minutes ago, Firdimigdi said:

It's much more beneficial and efficient to undervolt it to a state where it can maintain a steady overclocked boost speed instead. Auto OC will just try to get as much boost out of an aggressive voltage curve which doesn't help much in the end.

If I knew what I was doing, I'd surely try that...

 

The card seems to be power-limited at factory settings (I didn't play around with the power limits, since I don't want to overtax my 550W PSU), but here's the result with the Auto-OC. I forgot to get an out-of-the-box baseline result, but I can always revert the settings and do that, I suppose.

 

 

[Screenshot: 3DMark result with Auto-OC]

 

 

Looks like a 30% boost in GPU performance over the 2070S, which isn't too shabby IMO, considering what I paid for the card and what I'll "earn back" by selling the 2070S.

Power draw of the 3070 seems completely acceptable, considering I've been running this rig problem-free for a couple of months now with the 2070S, which drew only a little less (~215 W in 3DMark, IIRC).

 

And at least I'm now well above the "gaming laptop" category, where this PC sat with the 1060 3GB I had to use initially.

 

[Screenshot: 3DMark score comparison]

 

 

S.

Edited by 1Sascha
Posted (edited)
4 minutes ago, 1Sascha said:

since I don't want to overtax my 550W PSU

 

That's the other benefit of undervolting: the card will use considerably less energy and put less demand on your PSU (and it will run cooler as a result). Do a quick search online - there's a plethora of video and written tutorials on undervolting, and honestly it's not that hard to do.
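As a rough illustration of why this works: dynamic power scales roughly with frequency x voltage squared, so even a modest voltage drop pays off disproportionately. A toy calculation follows - the stock point is an assumed ~1.05 V boost voltage, the 0.90 V / 1890 MHz figures appear later in this thread, and this models core power only, not the whole board:

```python
# Toy model: dynamic power ~ frequency * voltage^2 (GPU core power only).
def rel_power(freq_mhz: float, volts: float,
              ref_freq: float = 1905.0, ref_volts: float = 1.05) -> float:
    """Power relative to an assumed stock boost point (~1905 MHz @ ~1.05 V)."""
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

uv = rel_power(1890, 0.90)   # undervolt point reported later in the thread
print(f"~{1 - uv:.0%} less core power at almost the same clock")  # ~27% less
```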

 

Edited by Firdimigdi
  • Thanks 1
Posted (edited)

Just for §hits and giggles, here's the same system with the 1060 3GB in it:

 

 

[Screenshot: 3DMark with the GTX 1060 3GB]

 

.... and what PassMark's benchmark had to say:

 

[Screenshot: PassMark with the GTX 1060 3GB]

Edited by 1Sascha
Posted (edited)

Been running my 3090 FTW3 Ultra at +130 core clock and +400 memory for games. I have been very happy with it. I don't mind the fan noise though (I run them at 100%), as I use earbuds and don't hear it when gaming.

Obviously I go back to default for just desktop use.

Edited by dburne
Posted
1 hour ago, dburne said:

Been running my 3090 FTW3 Ultra at +130 core clock and +400 memory for games. I have been very happy with it. I don't mind the fan noise though (I run them at 100%), as I use earbuds and don't hear it when gaming.

Oh yeah... I had already used a custom fan curve for my 2070S at your suggestion, and I'm doing the same for the 3070 now. I'm gaming with a headset, so I don't really care about noise, and at 1000 RPM or so for desktop stuff/browsing, the fans aren't really audible.

 

That three-fan cooler seems more efficient than the two-fan one on the older card. Or perhaps the 3070 is generally a cooler-running card? Not sure, but my GPU temps are definitely lower. The southbridge still gets pretty toasty (~60°C while gaming at the current high ambient temps), despite the rear blow-through design of the Gigabyte card.

 

S.

Posted

I undervolted my RTX 3070 for a saving of 90 watts and a drop in GPU temp of 8-10°C, at a loss of only 2 FPS. The performance difference is nothing to me, but the power and heat savings are really excellent. (My room gets hot enough without my PC pounding out unnecessary heat... and pumping up my power bill!)

 

I followed a YT tutorial that worked fine for me (but, as Firdimigdi says, there are heaps of them): https://youtu.be/eqwKkGkILzs?t=138

 

  • Thanks 1
Posted (edited)
On 8/11/2022 at 7:58 PM, Firdimigdi said:

It's much more beneficial and efficient to undervolt it to a state where it can maintain a steady overclocked boost speed instead. Auto OC will just try to get as much boost out of an aggressive voltage curve which doesn't help much in the end.

Oookay... I just followed a 3-minute tutorial video, and it really seems quite simple. I did crash Heaven and Explorer when I went a bit too high on the frequency (I didn't use the continuous max frequency but the peak values that only popped up for a second or so between scenes). 1935 MHz @ 0.9V was too much, so I set it to 1890 MHz (which was the continuous max), and that seems to work fine. HWMonitor still reports a 1905 max, but that could've just been a short peak or something.

 

Temps are down and power draw is down (max never above 190 W in Heaven, slightly above 200 W in 3DMark), so that looks like a power saving of ~25 W. (Afterburner always seems to report slightly higher power draw than HWMonitor for some reason, so I'm only comparing HWMonitor values before and after.)

I also gave the memory a cautious +600 MHz OC (the tutorial guy said +800 would be fine on any 3070), then let Heaven run for a bit and tried a Time Spy run after that. The system seems to run stable with lower GPU temps and less power draw, and I even got a slight boost in my Time Spy score.

I think I'll run some sort of stress test, or just keep Heaven running for an hour or so to be sure, but yeah... this undervolting stuff really seems to work. Thanks for the tip, guys!

 

Before with Auto-OC:

[Screenshot: 3DMark result]

 

After, reverted to stock and undervolted to 0.9V:

 

[Screenshot: 3DMark result, undervolted]

 

 

EDIT: PassMark sees it a bit differently, giving me slightly lower scores for CPU, 2D and 3D performance, but I'm pretty sure I'll be able to live with that. :) Especially since power draw in this benchmark was more than 40 W lower than with the Auto-OC/without undervolting.

 

EDIT2: It seems that PassMark is a bit touchy when I run it after a 3DMark run - I get a weird after-image of the first 3D test (jet fighters circling) that persists through the rest of the tests and has to be terminated in Task Manager afterwards. That seems to be what produced the lower score... I re-ran the benchmark right after booting the PC, and here's what happened on a clean run, with slightly altered settings.

 

Upped the voltage to 0.95V, increased the GPU frequency to 1905 MHz and raised the GDDR speed to 7800 MHz.
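For what that memory bump is worth in bandwidth terms, here's a quick sketch using the 3070's known 256-bit GDDR6 bus, assuming the displayed memory clock is half the effective data rate (so stock reads as ~7000 MHz for 14 Gbps):

```python
# Memory bandwidth before and after the memory OC on an RTX 3070.
bus_bytes = 256 // 8                   # 256-bit GDDR6 bus, in bytes
stock, oc = 7001, 7800                 # displayed memory clock in MHz

def bandwidth_gb_s(mhz: int) -> float:
    return mhz * 2 * bus_bytes / 1000  # x2 for DDR; MB/s -> GB/s

print(f"stock: {bandwidth_gb_s(stock):.0f} GB/s")   # ~448 GB/s
print(f"OC:    {bandwidth_gb_s(oc):.0f} GB/s")      # ~499 GB/s
print(f"gain:  {oc / stock - 1:.1%}")               # ~+11%
```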

 

Clean runs - before with Afterburner Auto OC:

[Screenshot: PassMark with Auto-OC]

 

... and after, undervolted and with slightly raised clockspeeds:

 

[Screenshot: PassMark, undervolted]

 

 

 

Basically the same level of performance (although still slightly lower, unlike in Time Spy, where the rating was slightly higher), but with significantly lower temps and power draw.

 

S.

Edited by 1Sascha
Posted

Time Spy result with 0.95V and slightly raised clockspeeds:

 

[Screenshot: Time Spy result]

 

S.
