=RS=Funkie Posted April 18, 2020 I've seen a couple of times now that the graphics programmer is going to switch the engine from a forward renderer to deferred. I've seen around the VR subreddits that forward rendering is the preferred method due to its performance. Does that mean when it switches over, we're going to see a performance hit?
coconut Posted April 18, 2020 Not necessarily. Part of the reason they are doing it is to shift some of the computation from the CPU to the GPU. So depending on your hardware and your supersampling value, you might lose or gain frames.
=RS=Funkie (Author) Posted April 18, 2020 Interesting! Thanks coconut. My 2080 Ti is even more interested in this now.
Alonzo Posted April 18, 2020 I think it's very hard to say what the effect might be. In the run-up to the recent damage model patch there were comments that the damage modeling code had improved performance, but then the patch notes at release said the opposite -- the new modeling was heavier (especially for servers) than before. I'm not meaning to criticise, just relating an anecdote. But to me, it means we'd need the new rendering tech in our hands and to test it up, down and sideways before we know the performance implications.
Chamidorix Posted June 27, 2020 (edited) Deferred rendering is absolutely atrocious for VR performance, for three reasons:

1. The biggest is that it destroys the value of MSAA. Deferred shading means the lighting shaders run in a separate screen-space pass, after MSAA has done its anti-aliasing work on the geometry. So the lighting pass comes in and recolors everything that was anti-aliased, diluting if not completely destroying the anti-aliasing of edges. In VR, with its far lower perceived pixel density versus a monitor, this greatly exacerbates shimmering.

2. As mentioned, the CPU stops handling the lighting staging in the pipeline and instead just fires off simple API batches to the GPU. In VR you are pretty much always GPU bottlenecked rather than CPU bottlenecked (even with an overclocked, watercooled 2080 Ti like I have), so adding more GPU work is NOT a good thing.

3. It becomes far easier to throw pixel shaders at the game over time during development, since doing so no longer requires pipeline refactoring. As much as devs have good intentions, deferred rendering always seems to lead to worse and worse per-pixel shading performance over patches. Just look at DCS World. Back in 2.5.0, the game ran beautifully in VR (faster than current IL2). Then they started moving to deferred shading in 2.5.1, totally converted all pixel shaders to deferred in 2.5.6, and performance has been absolutely abysmal in VR ever since. The game is shimmery and you can't do anything about it, since MSAA doesn't work on most edges and the shading cost is far too high to effectively supersample even on top-end hardware.

The only way to have a deferred renderer and combat shimmering effectively is a well-implemented TAA -- see, for example, a lot of Oculus exclusives. But even then you get all sorts of complaints about the fundamental blurriness that is hard to solve with TAA.
And it is HARD to make a good TAA implementation; the heuristics to reduce pixel jitter are highly game-specific and not easily developed. There is a reason pretty much every graphically performant but beautiful VR game (see Half-Life: Alyx, Boneworks, Saints and Sinners) uses forward rendering with MSAA. Edited June 27, 2020 by Chamidorix
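[Editor's note: for readers less familiar with the pipelines being argued about, here is a back-of-envelope sketch in standalone Python -- not code from either game, and with made-up illustrative numbers -- of why forward shading pays per MSAA sub-sample per light, while deferred shading pays roughly once per screen pixel per light. That difference is both deferred rendering's performance appeal and the reason its lighting pass lands on top of already-resolved, anti-aliased pixels.]

```python
# Back-of-envelope sketch (not engine code): count per-frame
# lighting-shader invocations for forward vs deferred rendering.
# All numbers below are illustrative assumptions, not measurements.

def forward_cost(pixels, lights, msaa_samples):
    # Forward: lighting runs inside the geometry pass, so every MSAA
    # sub-sample is shaded (and therefore anti-aliased) correctly --
    # but the cost scales with sub-samples * lights.
    return pixels * msaa_samples * lights

def deferred_cost(pixels, lights):
    # Deferred: the geometry pass writes a G-buffer once per pixel,
    # then a screen-space pass shades once per pixel per light.
    # Lighting never sees the MSAA sub-samples -- which is why it is
    # cheaper, and also why it recolors edges MSAA already smoothed.
    return pixels + pixels * lights

# Illustrative case: one 2880x1600 VR eye buffer, 4x MSAA, 8 lights
px = 2880 * 1600
fwd = forward_cost(px, lights=8, msaa_samples=4)
dfr = deferred_cost(px, lights=8)
print(f"forward: {fwd:,} invocations, deferred: {dfr:,}")
```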
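[Editor's note: the TAA resolve Chamidorix describes can be sketched in the same spirit -- a hypothetical, minimal exponential history blend in standalone Python. The hard, game-specific part he mentions is deciding when the accumulated history is no longer valid, which is reduced to a single boolean stand-in here.]

```python
# Hypothetical minimal TAA resolve (standalone Python, not engine code):
# blend the current frame into an exponentially weighted history.

def taa_resolve(current, history, alpha=0.1, reject_history=False):
    # Small alpha = strong temporal smoothing: less shimmer, but more
    # blur and ghosting. Real implementations replace this boolean with
    # per-pixel, game-specific heuristics for rejecting stale history.
    if reject_history:
        return current
    return alpha * current + (1.0 - alpha) * history

# A pixel on a shimmering edge that flickers between 0.0 and 1.0 every
# frame settles near the stable average instead of flickering:
history = 0.0
for frame in range(100):
    history = taa_resolve(float(frame % 2), history)
```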
Alonzo Posted June 27, 2020 4 hours ago, Chamidorix said: 2. As mentioned, the CPU stops handling the lighting staging in the pipeline and instead just fires off simple API batches to the GPU. In VR you are pretty much always GPU bottlenecked rather than CPU bottlenecked (even with an overclocked, watercooled 2080 Ti like I have), so adding more GPU work is NOT a good thing. Have you actually played IL2 in VR, before and after the deferred rendering patch? Before the patch, the CPU was a bottleneck. After the patch, our GPUs are free to actually stretch their legs. This is measurable. Most VR pilots really liked the deferred rendering update and said so, pretty loudly, in the patch thread. This is your first post. Are you a DCS refugee giving us an opinion on deferred rendering based on problems with that other game? (If you are a DCS refugee, welcome! This game's great, you'll like it!)
DD_Arthur Posted June 27, 2020 5 hours ago, Chamidorix said: Just see DCS World. Back in 2.5.0, the game ran beautifully in VR (faster than current IL2). Then they started moving to deferred shading in 2.5.1, and totally converted all pixel shaders to deferred in 2.5.6, and performance has just been absolutely abysmal in VR. Sorry, but this is simply not true. I have both DCS and GBS and enjoy them both in VR. So far GBS has always run better in VR than DCS, and current DCS performance is certainly not 'absolutely abysmal' either.
dburne Posted June 28, 2020 1 hour ago, DD_Arthur said: Sorry, but this is simply not true. I have both DCS and GBS and enjoy them both in VR. So far GBS has always run better in VR than DCS, and current DCS performance is certainly not 'absolutely abysmal' either. I have and use both in VR and would agree with that as well.