chiliwili69 Posted January 7, 2021 (edited) I would just like to know whether anyone on this forum uses an AMD 6800 XT or 6900 XT graphics card and how well IL-2 runs on it. Thanks. Edited January 8, 2021 by chiliwili69
Jaws2002 Posted January 14, 2021 (edited) I'm curious as well. From what I've heard, the new AMD cards are doing really well in DCS, better than the 3080. With the rumors that the 3080 Ti is postponed indefinitely, I'm having a second look at my options. Edited January 14, 2021 by Jaws2002
mattebubben Posted January 15, 2021 While I don't have a 6800 XT or 6900 XT, I do have a reference 6800 (non-XT) card, and since there have not been any other 6000-series responses, I figure that might still be useful info. I have not been able to play much since I got it (two weeks ago) due to monitor and health issues, but it has worked perfectly so far. I upgraded to 1440p at the same time, and maxed out at 1440p the 6800 (non-XT) runs great in both IL-2 and DCS, so the 6800 XT and 6900 XT should be even better.
Voyager Posted January 15, 2021 7 hours ago, mattebubben said: While I don't have a 6800 XT or 6900 XT, I do have a reference 6800 (non-XT) card... maxed out at 1440p the 6800 (non-XT) runs great in both IL-2 and DCS, so the 6800 XT and 6900 XT should be even better. Do you think you could run Vander's benchmark on it, so we can get some hard numbers? SYN_Vander BENCHMARK v6 to measure IL-2 performance in monitor & VR - Hardware, Software and Controllers - IL-2 Sturmovik Forum: https://forum.il2sturmovik.com/topic/66924-syn_vander-benchmark-v6-to-measure-il-2-performance-in-monitor-vr/
skline00 Posted January 16, 2021 Also have an RX 6800 coupled with a 5900X. Extremely fast!
Alonzo Posted January 16, 2021 I'm very interested in this too. I am running out of VRAM in the jets simulator in multiplayer, in VR with a 10GB 3080. So even if the 6800 XT or 6900 XT is only similar in raw performance, the extra VRAM might make them a lot better. IL-2 runs great on the 10GB 3080; I think the programmers here actually know a thing or two about freeing memory once they're done with it!
PA_Willy Posted January 16, 2021 The problem with these AMD cards is the absence of DLSS. Nvidia is better for high resolutions with DLSS enabled; AMD has nothing similar. I fly at 4K (3840x2160). That's the reason I'm waiting and reading opinions before buying. Our sim doesn't have ray tracing, but DLSS can be important.
Voyager Posted January 16, 2021 15 hours ago, skline00 said: Also have an RX 6800 coupled with a 5900X. Extremely fast! @mattebubben Have you run the Vander benchmarks on it? The one person who did run them measured only about 60% of the speed of a 3080. That's why we're asking people to measure their rigs directly: the one rig that has actually been tested was, objectively, slow. It may be an anomaly, but until *someone* else tests their GPU, we won't know.
mattebubben Posted January 16, 2021 (edited) Not yet, and I don't know when I will be able to. I'm currently mostly unable to spend time at my PC for medical reasons (https://en.wikipedia.org/wiki/Labyrinthitis), so I have only been able to test it very briefly, and meeting the requirements of that benchmark feels like far more time and effort than I can manage at the moment. From my limited testing it does seem like there might be some performance issues in IL-2, though: the brief tests I did when I first got the GPU showed better FPS in DCS than in IL-2, and that does not feel right. (It had no problem running either, though.) Edited January 16, 2021 by mattebubben
Ribbon Posted January 17, 2021 8 hours ago, PA_Willy said: The problem with these AMD cards is the absence of DLSS. Nvidia is better for high resolutions with DLSS enabled; AMD has nothing similar... I think IL-2 doesn't support DLSS. In Cyberpunk, DLSS saves the day: it brings a big, smooth performance boost while keeping image quality. They say DLSS isn't hard or too much work to implement; I don't know what the IL-2 devs are waiting for!
Voyager Posted January 17, 2021 10 minutes ago, =VARP=Ribbon said: I think IL-2 doesn't support DLSS. In Cyberpunk, DLSS saves the day... I don't know what the IL-2 devs are waiting for! As I understand it, they would need both to collaborate with Nvidia and to ensure that it does not impact spotting. Given the heat that spotting has generated, I imagine the latter would be the more worrying. Also, it wouldn't surprise me if they needed to redo it each time they added a major new art asset to the game. As I understand it, DLSS amounts to generating an algorithm that reconstructs high-resolution imagery from lower-resolution rendering, and I imagine the AI that generates that algorithm would need to see the art assets to learn them.
Ribbon Posted January 17, 2021 Just now, Voyager said: As I understand it, they would need both to collaborate with Nvidia and to ensure that it does not impact spotting... Ahh, shame!
Voyager Posted January 17, 2021 4 minutes ago, =VARP=Ribbon said: Ahh, shame! They do have dynamic resolution in the options, however. I dialed mine down to 0.8, and it gives a healthy boost to minimum and average framerates. I haven't tried it in a mission yet (I spent the last couple of weeks on computer upgrades and memory overclocking), so we'll see how well it works. Hopefully I'll get higher resolution during the long parts of flights and higher frame rates during the dogfights. We shall see.
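For a rough sense of why 0.8 helps: assuming the factor scales both axes (the usual way resolution scaling works; I haven't confirmed IL-2's exact implementation), the rendered pixel count drops with the square of the slider value. A minimal sketch:

```python
# Effective render pixel count vs. the dynamic resolution factor,
# assuming the factor applies to both axes (unconfirmed for IL-2).
base_w, base_h = 2560, 1440  # example monitor resolution

for scale in (1.0, 0.9, 0.8, 0.7):
    w, h = int(base_w * scale), int(base_h * scale)
    share = (w * h) / (base_w * base_h)
    print(f"scale {scale:.1f}: {w}x{h} = {share:.0%} of the pixels")
# scale 0.8 renders only ~64% of the pixels, which is where the
# minimum/average framerate boost comes from.
```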
Scott_Steiner Posted January 17, 2021 Only a few popular games support DLSS; not many older or niche games do. Besides, I think you would destroy your ability to spot enemies at a distance by running DLSS: do you really want to compress visuals and make them blend together and sit out of place at the pixel level? Be careful what you are actually asking for. I too have a 6800 non-XT, reference model; it was what I could actually buy, and it was not overpriced like the AIB non-reference cards that are now getting outrageous tariffs. The rest of my system is older, but I would be happy to run some tests to help show how a card like this performs in IL-2.
Alonzo Posted January 17, 2021 2 hours ago, Voyager said: As I understand it, they would need both to collaborate with Nvidia and to ensure that it does not impact spotting... Nvidia claim that DLSS 2.0 works across games, without needing per-game (or per-asset) training. How good it actually is remains to be seen; they always release a driver alongside new AAA games, and it may be that the driver contains DLSS-specific tweaks per game. But you also need the game developer to integrate it as an option. I would also be quite worried about the impact on spotting. 1 hour ago, Voyager said: They do have dynamic resolution in the options, however. I dialed mine down to 0.8, and it gives a healthy boost to minimum and average framerates... Dynamic resolution just makes the game look like pixelated garbage; personally I don't like it. I think it's overly aggressive. But it *is* the kind of feature that, if it worked well and was subtle, would help keep framerates consistent.
SakerVVS Posted January 17, 2021 17 hours ago, mattebubben said: Not yet, and I don't know when I will be able to. I'm currently mostly unable to spend time at my PC for medical reasons (https://en.wikipedia.org/wiki/Labyrinthitis), so I have only been able to test it very briefly. I hope you get better soon. I get these symptoms from allergies sometimes, and I know how miserable it can be to feel sick even lying down with your eyes closed!
thermoregulator Posted January 18, 2021 The 6800 XT for my secondary system has been delayed, but I should get it tomorrow, so I will try to benchmark it with a 5600X.
SCG_Fenris_Wolf Posted January 18, 2021 (edited) You could also benchmark it against your primary system, @thermoregulator. Then we'd see its performance against the RTX 3000 series. All you need to remember is to enable Resizable BAR / SAM. Edited January 18, 2021 by SCG_Fenris_Wolf
thermoregulator Posted January 18, 2021 2 hours ago, SCG_Fenris_Wolf said: You could also benchmark it against your primary system, @thermoregulator. Then we'd see its performance against the RTX 3000 series. All you need to remember is to enable Resizable BAR / SAM. Will try. I just hope my Windows install is running in UEFI mode; I think it is.
Scott_Steiner Posted January 19, 2021 (edited) So I did Vander's benchmark.
Specs:
Motherboard: Alienware 07JNH0
CPU: Intel Core i7-3930K
CPU freq: 3.8 GHz (max turbo, not OC'd)
L3 cache: 12 MB
Cores: 6
Threads: 12
RAM type: DDR3
RAM size: 32GB (4x8GB)
RAM freq: 1600 MHz
RAM timings: 11-11-11-28-208
GPU: AMD RX 6800 (reference)
Results:
1920x1080 Frames: 2636 - Time: 60000ms - Avg: 43.933 - Min: 37 - Max: 58
3840x1600 (not quite 4K) Frames: 2455 - Time: 60000ms - Avg: 40.917 - Min: 33 - Max: 54
It's obvious that I am highly CPU-limited, very much because IL-2's engine was built back when single-threaded performance from quad-core CPUs was pretty much the only thing. I am guessing that if I put my old 2500K quad-core back in with this card, I would get slightly better framerates in a fully loaded scene than with my current i7-3930K hex-core, though the difference would probably be trivial. Now for some data that may be useful to others. Running the benchmark at 1920x1080 and at 3840x1600 resulted in about the same GPU and CPU usage according to MSI Afterburner: about 50% on the GPU side and about 30% on the CPU side (due to the poor core and thread optimization). VRAM usage is about 8GB (3080 and 3070 owners, beware of what the future holds for you). RAM usage was about 17GB. So really, the GPU has to be throttled down so much to match the CPU's pace that it is only working about half-time. On an empty map in the quick mission builder I can get it up to 100% usage, but only with Ultra settings and 4x MSAA; that runs at over 100fps. If I back it down to 2x MSAA, it will go past 120fps and GPU usage will not be pegged at 100% all the time. So take that for what it is. Edited January 19, 2021 by Scott_Steiner Forgot to include computer specs
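For anyone collecting results, here's a quick way to pull the numbers out of those result lines for averaging or plotting. A minimal Python sketch, assuming the FRAPS-style line format shown in this thread:

```python
import re

# FRAPS-style benchmark result line, format assumed from the posts above
LINE = re.compile(
    r"Frames:\s*(\d+)\s*-\s*Time:\s*(\d+)ms\s*-\s*Avg:\s*([\d.]+)"
    r"\s*-\s*Min:\s*(\d+)\s*-\s*Max:\s*(\d+)"
)

def parse(line: str) -> dict:
    """Extract frames, time, and avg/min/max FPS from one result line."""
    m = LINE.search(line)
    if not m:
        raise ValueError(f"not a benchmark line: {line!r}")
    frames, time_ms, avg, lo, hi = m.groups()
    out = {"frames": int(frames), "time_ms": int(time_ms),
           "avg": float(avg), "min": int(lo), "max": int(hi)}
    # Sanity check: average FPS should equal frames / seconds
    assert abs(out["frames"] / (out["time_ms"] / 1000) - out["avg"]) < 0.1
    return out

print(parse("Frames: 2636 - Time: 60000ms - Avg: 43.933 - Min: 37 - Max: 58"))
# {'frames': 2636, 'time_ms': 60000, 'avg': 43.933, 'min': 37, 'max': 58}
```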
chiliwili69 (Author) Posted January 23, 2021 On 1/19/2021 at 2:17 AM, Scott_Steiner said: 3840x1600 (not quite 4K) Frames: 2455 - Time: 60000ms - Avg: 40.917 - Min: 33 - Max: 54 Thank you for testing that. As you said, the limiting factor in the GPU test is the CPU, so this GPU test cannot be used to give any useful data about the potential of this GPU. The CPU % load shown in Afterburner is just an average of the % load across all your cores. IL-2's heavy thread (the one that bottlenecks a core) jumps from core to core as Windows manages thermals. So in general, overall CPU % load tells you little when playing IL-2.
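If you want to see the real bottleneck, watch the busiest core rather than the average (Afterburner's per-core graphs show this too). A minimal sketch using Python's psutil, just as an illustration:

```python
import psutil  # pip install psutil

# Print per-core load once a second; watch for one core pinned near 100%
# while the average stays low. That pinned core is the saturated main
# thread, even though it hops between physical cores over time.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busiest = max(per_core)
    avg = sum(per_core) / len(per_core)
    print(f"avg {avg:5.1f}%  busiest core {busiest:5.1f}%  all: {per_core}")
```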
skline00 Posted January 23, 2021 This is the 1920x1080 score for my 5900X/RX 6800 rig.
2021-01-22 21:41:13 - Il-2 Frames: 5956 - Time: 60000ms - Avg: 99.267 - Min: 62 - Max: 137
Specs of the rig:
MB: MSI X570 Unify
CPU: 5900X
GPU: Gigabyte RX 6800, stock
RAM: 32GB (2x16GB) G.Skill DDR4-3600 CL16-19-19-39
Cooling: custom watercooling (CPU: Optimus waterblock; waiting on an Alphacool waterblock for the GPU; 480mm + 360mm rads; EK D5/140mm pump-reservoir combo)
SSDs: OS: Sabrent 1TB NVMe 4.0 x4; data: Sabrent 2TB NVMe 3.0
Case: Fractal Define 7XL
PSU: Corsair 850
chiliwili69 (Author) Posted January 23, 2021 1 hour ago, skline00 said: This is the 1920x1080 score for my 5900X/RX 6800 rig. 2021-01-22 21:41:13 - Il-2 Frames: 5956 - Time: 60000ms - Avg: 99.267 - Min: 62 - Max: 137 Thank you for this. This is interesting, since 5900X chips usually deliver around 130fps in this test, so perhaps the 6800 is limiting it somehow. I will add the result to the table, but mark it in color as a reminder that the GPU could be the culprit.
Alonzo Posted January 23, 2021 3 hours ago, skline00 said: RAM: 32GB (2x16GB) G.Skill DDR4-3600 CL16-19-19-39 You might get some more juice out of the CPU by tweaking the RAM here. If you can push it to 3800 and possibly tighten those secondary timings, it might help. I'm not an AMD expert (yet! I have the parts on order...) but others here have repeatedly said that fast RAM = fast Ryzen 5000.
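The usual reasoning, as I understand it (the 1:1 rule below is the commonly cited Ryzen behaviour, not something measured in this thread): the Infinity Fabric clock runs 1:1 with the memory clock, so faster RAM also speeds up the fabric, up to the chip's FCLK ceiling.

```python
def fclk_for(ddr_rate: int) -> float:
    """Infinity Fabric clock (MHz) needed for 1:1 operation.
    DDR4 is double data rate, so memory clock = transfer rate / 2."""
    return ddr_rate / 2

for rate in (3200, 3600, 3800, 4000):
    print(f"DDR4-{rate}: FCLK {fclk_for(rate):.0f} MHz for 1:1")
# DDR4-3800 needs FCLK 1900 MHz, roughly the practical ceiling on many
# Ryzen 5000 samples; past that the fabric drops to 2:1 and latency rises.
```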
skline00 Posted January 23, 2021 chiliwili69, I re-ran the test with the incorrect settings fixed: 1920x1080, stock 5900X/RX 6800, RAM at 3600. 2021-01-23 13:55:12 - Il-2 Frames: 6124 - Time: 60000ms - Avg: 102.067 - Min: 85 - Max: 144
chiliwili69 (Author) Posted January 23, 2021 55 minutes ago, skline00 said: Frames: 6124 - Time: 60000ms - Avg: 102.067 - Min: 85 - Max: 144 OK, thanks. Still low for your CPU. I see you have other GPUs, so you could try putting the 2080 Ti or 1080 Ti in with the 5900X. Then you will see how far you can go with the 5900X in the CPU test.
skline00 Posted January 24, 2021 If I have time I'll try, but right now I'm having fun using it.
354thFG_Drewm3i-VR Posted July 16, 2021 (edited) Bump: did the driver issues with the 6900 XT ever get resolved? I can buy a beastly system for almost $1k less than a similarly specced RTX 3080 Ti system... trying to run a Reverb G2 in IL-2 and DCS, where the AMD cards are supposedly better. Edited July 16, 2021 by =AW=drewm3i-VR
RAAF492SQNOz_Steve Posted July 16, 2021 1 hour ago, =AW=drewm3i-VR said: Bump: did the driver issues with the 6900 XT ever get resolved? I can buy a beastly system for almost $1k less than a similarly specced RTX 3080 Ti system... I would (strongly) recommend going for the 3080 Ti if you are using your PC with an HP Reverb G2. I would get a combo that uses a Ryzen 5000-series CPU, e.g. a 5800X. Generally speaking, the AMD GPUs struggle at 4K resolution compared to a 3080 Ti, and an HP G2 runs at a significantly higher resolution than 4K.
354thFG_Drewm3i-VR Posted July 16, 2021 (edited) 56 minutes ago, RAAF492SQNOz_Steve said: I would (strongly) recommend going for the 3080 Ti if you are using your PC with an HP Reverb G2... Yep, getting a 5800X paired with 32GB of 3600MHz CL16 RAM. Why would the 6900 XT struggle with 4K? It has more memory and benchmarks about the same, and it is supposedly better than the 3080 Ti in DCS. It's also more power efficient. Just wondering if it still performs poorly in IL-2...? Edited July 16, 2021 by =AW=drewm3i-VR
RAAF492SQNOz_Steve Posted July 16, 2021 37 minutes ago, =AW=drewm3i-VR said: Yep, getting a 5800X paired with 32GB of 3600MHz CL16 RAM. Why would the 6900 XT struggle with 4K?... I would consider that the 3080 Ti has the advantage, and a superficial look at UserBenchmark results appears to support this. I do not have a 6900 XT, so my comments are generic. The higher-resolution performance advantage of the 3080 Ti may be related to the fact that it has considerably faster GDDR6X memory fitted. IL-2 does not need more than about 8GB of VRAM when using the HP Reverb G2, so the extra VRAM fitted to the 6900 XT does not improve IL-2 performance.
354thFG_Drewm3i-VR Posted July 16, 2021 They seem neck and neck in optimized games... I have no doubt the 3080 Ti is slightly better, but if the AMD drivers got fixed it should be close now. Maybe a 3070 Ti or 3080 would still be a better option, though. I just don't want to pay $3.5k for a computer just for VR when I already have an expensive gaming laptop.
LLv34_Flanker Posted July 16, 2021 S! I tend to look at Guru3D benchmarks for performance; they have a more varied game selection than UserBenchmark. And AMD has a feature being released similar to DLSS. Look here: AMD FidelityFX Super Resolution | AMD It's not proprietary like Nvidia's DLSS.
chiliwili69 (Author) Posted July 16, 2021 4 hours ago, =AW=drewm3i-VR said: Just wondering if it still performs poorly in IL-2...? The latest SYN_VANDER test of a 6900 XT card was done by @RufusK on 21 May with version 4.601 of IL-2, and it was still performing poorly. The test was this. The issue was reported to the dev team here. If the dev team needs a 6800 XT or 6900 XT to check this out and solve it, I think we could altruistically agree to collect some funding from the IL-2 community and give that gift to the dev team... I would contribute 20€ even though I have a 3080 right now.
RAAF492SQNOz_Steve Posted July 16, 2021 (edited) 2 hours ago, =AW=drewm3i-VR said: They seem neck and neck in optimized games... These test results actually reinforce my message. Below 4K, get the 6900 XT. At 4K, the RTX 3080 Ti is starting to kick the 6900 XT's butt, and this performance differential is only going to get (much) worse as you run at the higher resolutions that work best for the HP G2. AMD made some design compromises for this generation of GPUs (a narrower memory bus and cheaper memory), and it shows at higher resolutions regardless of IL-2's driver optimisation issues. Yes, it is a lot of money, but the price of 3080 Tis is starting to come down, at least here in Australia; they have dropped about 25% from their retail peak. Hopefully the same will occur soon in your country. Edited July 16, 2021 by RAAF492SQNOz_Steve Yet another typo and re-added screen capture
354thFG_Drewm3i-VR Posted July 16, 2021 (edited) 11 hours ago, RAAF492SQNOz_Steve said: These test results actually reinforce my message. Below 4K, get the 6900 XT. At 4K, the RTX 3080 Ti is starting to kick the 6900 XT's butt... Not trying to stir the pot, but 6.37% faster is kicking the butt of the 6900 XT, while drawing much more power? The 6900 XT is also about 5-10% better in DCS, and I am only buying this PC for DCS, IL-2 and maybe MSFS. I have no dog in this fight, but I can't justify spending $2k on a GPU alone. FWIW, I've always had Intel/Nvidia systems, but the tactics of those two have turned me off them. Perhaps a better question, then, would be 3070 Ti/3080 vs. 6900 XT, because I really don't want to drop $2k on a GPU that I will likely upgrade in a year when the 4000-series cards come out. 11 hours ago, LLv34_Flanker said: S! I tend to look at Guru3D benchmarks for performance... And AMD has a feature being released similar to DLSS: AMD FidelityFX Super Resolution... FSR looks great. 11 hours ago, chiliwili69 said: The latest SYN_VANDER test of a 6900 XT card was done by @RufusK on 21 May with version 4.601 of IL-2, and it was still performing poorly... This is a shame, because these are very good cards in terms of performance and also value. Edited July 16, 2021 by =AW=drewm3i-VR
Voyager Posted July 17, 2021 21 hours ago, =AW=drewm3i-VR said: Yep, getting a 5800X paired with 32GB of 3600MHz CL16 RAM. Why would the 6900 XT struggle with 4K?... From what I'm seeing on the ED forums, it's better in monitor mode but worse in VR. As I understand it, AMD is using a very fast cache to compensate for using slower RAM on a narrower bus. However, the cache seems to have ended up sized a little too small for 4K, so 4K drives many more cache misses than lower resolutions, and this gets worse the higher the resolution goes. Given that this cache is also the big difference between RDNA1 and RDNA2, the issue with IL-2 may be more fundamental than a driver incompatibility. We've already seen in CPU testing that IL-2 is extremely sensitive to memory latency; that may be an issue here too. I also suspect that the Nvidia design isn't as efficient at lower resolutions. Basically, the Nvidia parts have many more, but apparently weaker, shaders than the AMD cards do. I suspect that at lower resolutions there just aren't enough pixels to fill the card efficiently, so at 1440p and 1080p more chunks of the card end up twiddling their thumbs. That could be why, in most applications, the 3090 isn't faster than the 3080, despite having 20% more GPU in the mix. That is, until us lunatics start trying to jam 19 Mpixels through it, and you actually start seeing a 20% performance difference showing up in the numbers. But it really takes that "turn the dial to 11 and break the knob" sort of thing to see it. The next year will be interesting to see how things pan out.
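To put rough numbers on that last bit (the G2 figure below assumes the commonly quoted ~3160x3092-per-eye SteamVR render target at 100%; the panels themselves are 2160x2160 per eye):

```python
# Per-frame pixel counts, to put "19 Mpixels" in context.
targets = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
    "Reverb G2 (2 eyes, 100% SS)": 2 * 3160 * 3092,  # assumed render target
}
for name, px in targets.items():
    print(f"{name:30s} {px / 1e6:5.1f} Mpix  ({px / (3840 * 2160):.2f}x 4K)")
# The G2 works out to ~19.5 Mpix per frame, well over twice a 4K monitor,
# which is exactly the regime where an undersized cache starts to miss.
```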
354thFG_Drewm3i-VR Posted July 17, 2021 50 minutes ago, Voyager said: From what I'm seeing on the ED forums, it's better in monitor mode but worse in VR. As I understand it, AMD is using a very fast cache to compensate for using slower RAM on a narrower bus... Very good post. Thanks for the technical explanation. I guess I'll look for a 3080...