Jaws2002 (Author) Posted December 16, 2020 EVGA is a good company. I had a GTX 285, a GTX 590 and a GTX 780 from them. The last two burned out on me and that's why I switched to Gigabyte, but this generation I would buy an EVGA.
Panzerlang Posted December 16, 2020 From what I read about the 30-series cards flaking out on boost being down to cheap-ass caps, Asus were the only ones confirmed to be using the required number of high-grade ones ("military grade", they claim). Reviewers referenced pics of the exposed back of the GPU.
Alonzo Posted December 16, 2020 2 minutes ago, J3Hetzer said: From what I read about the 30-series cards flaking out on boost being down to cheap-ass caps, Asus were the only ones confirmed to be using the required number of high-grade ones ("military grade", they claim). Reviewers referenced pics of the exposed back of the GPU. This was debunked a while ago; it was a driver issue. 8 minutes ago, Gambit21 said: Evga or Asus Whichever is in stock.
Gambit21 Posted December 16, 2020 4 minutes ago, Alonzo said: Whichever is in stock. Yep, that's what I mean.
Drum Posted December 16, 2020 (edited) Patience, guys... Edited December 16, 2020 by Drum
Ribbon Posted December 18, 2020 My first EVGA card - got the one with the black lips. Running IL-2 at max settings in VR on the Rift S, and Cyberpunk smoothly on ultra settings with ray tracing on a 2K monitor. Still waiting on the Reverb G2 (should be here in a week or two according to Bestware)!
Ribbon Posted December 18, 2020 Forgot to say that I've tried DLSS in Cyberpunk 2077 and it makes a big performance improvement while keeping image quality. We need this in IL-2!
SYN_Vander Posted December 18, 2020 47 minutes ago, =VARP=Ribbon said: Forgot to say that I've tried DLSS in Cyberpunk 2077 and it makes a big performance improvement while keeping image quality. We need this in IL-2! LOL, first we need the video cards! I've been waiting on my 3080 for two months, I think; I'm now number 93 in the queue.
Ribbon Posted December 18, 2020 24 minutes ago, SYN_Vander said: LOL, first we need the video cards! I've been waiting on my 3080 for two months, I think; I'm now number 93 in the queue. Forgot to mention that the DLSS test I did in Cyberpunk was on an RTX 2070S. I think any RTX GPU supports DLSS! Being 93rd in the queue could be very good news if they receive at least 100 GPUs in the next batch...
Bernard_IV Posted December 18, 2020 Nice, that cooler is huge. The XC3 model's cooler is too small IMO; I've got to run my fans quite high to keep up. I'm going to trade this card for a Ti model or a 3090 when I get the chance.
SCG_Fenris_Wolf Posted December 18, 2020 Well, if you need an RTX 3080 FE earlier, you can private message me. No sale/purchase/anything talk on the forums though; I'll let you know what I mean via PM.
Ribbon Posted December 19, 2020 16 hours ago, Bernard_IV said: Nice, that cooler is huge. The XC3 model's cooler is too small IMO; I've got to run my fans quite high to keep up. I'm going to trade this card for a Ti model or a 3090 when I get the chance. EVGA's Step-Up program is a really nice feature in those circumstances. I still get high temps at full load (up to 83°C), which makes me a bit uncomfortable, but I've read somewhere that I need to adjust the fan curve or undervolt it... going to investigate that a bit further!
Bernard_IV Posted December 19, 2020 The 3080 Ti Hybrid would be awesome to have. I bet it is dead quiet.
Alonzo Posted December 19, 2020 12 hours ago, =VARP=Ribbon said: I still get high temps at full load (up to 83°C), which makes me a bit uncomfortable, but I've read somewhere that I need to adjust the fan curve or undervolt it... going to investigate that a bit further! I've undervolted my 3080 FTW3 and it's made quite a difference. Max load power draw dropped from 380W to ~300W, so that's literally 80 watts less heat the cooler needs to dissipate.
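For anyone who wants to verify that kind of power saving on their own card, here is a minimal sketch using the pynvml bindings (the nvidia-ml-py package). It only reads the board power and temperature sensors while a benchmark runs - the undervolt itself is still applied in Precision X1 or Afterburner, and the sampling loop is just an example.

```python
# Minimal sketch: sample GPU board power and temperature once a second while a
# benchmark runs, so power draw before and after an undervolt can be compared.
# Assumes the nvidia-ml-py package (imported as pynvml) and a single NVIDIA GPU.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(60):  # one sample per second for a minute
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"power: {power_w:6.1f} W   temp: {temp_c:3d} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```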
SCG_Fenris_Wolf Posted December 19, 2020 (edited) My RTX 3080 FE pulled even with my MSI RTX 3080 Trio X Gaming across the board - 3DMark's Port Royal, Time Spy Extreme, and the IL-2 V6 test. The 3080 FE runs at a power target of 115%, core +105 MHz, no fan curve edited. It boosts to 2040 MHz and holds 1985 MHz indefinitely. It stays at 73°C, which is not hot - it never temperature throttled; I monitored it. Note however - never OC the VRAM on 3000-series cards! I have to repeat this again and again until people realise it; I still see this mistake made ever so often. It's self-correcting RAM: if you raise its frequency it quickly starts throwing errors and throttles. This has been verified again and again by people all over the internet (and by me as well in IL-2, with both the Trio X Gaming and the FE). The max-OC'd RTX 3080 Founders Edition was even with an OC'd Trio X Gaming, down to ±1 fps on average. The result? As @Alonzo said: buy whatever is in stock. 99% aren't binned (the 1% is the Kingpin). Edited December 19, 2020 by SCG_Fenris_Wolf
dburne Posted December 19, 2020 Running my 3090 FTW3 Ultra at +143 core, +500 VRAM, 107% power target, full voltage. That is my gaming profile for the card. It gives me a constant 2115 MHz and rarely breaks 60°C. I do blast the fans on the card, though, since gaming in VR with earbuds isolates the noise. My game profile runs the fans at 94%.
SCG_Fenris_Wolf Posted December 19, 2020 (edited) That's a 3090; I was talking about 3080s. Addendum: But just to be sure, note that the VRAM OC will throttle you as soon as it starts throwing errors, on the 3090 as well - as opposed to the 3080s, though, I couldn't test that yet; my 3090 comes Monday. You should see its performance plateau before getting there. If you found some headroom that's fine, but does it give better fps? Best to bench it with the VRAM OC'd and with your usual OC (but stock VRAM), compare, and take the healthier option. Edited December 19, 2020 by SCG_Fenris_Wolf
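A rough sketch of that A/B comparison, with made-up fps samples standing in for real benchmark logs: run the same benchmark with and without the VRAM offset and compare the averages.

```python
# Sketch of the suggested A/B test: same benchmark, same core OC, run once with
# stock VRAM and once with the VRAM offset, then compare average fps.
# The sample numbers below are placeholders, not real results.
from statistics import mean

fps_core_oc_only = [88.1, 90.3, 89.7, 91.0, 90.5]   # core OC, stock VRAM clock
fps_with_vram_oc = [87.9, 89.8, 90.1, 88.6, 90.2]   # core OC + VRAM offset

delta = mean(fps_with_vram_oc) - mean(fps_core_oc_only)
print(f"VRAM OC changes average fps by {delta:+.2f}")
# If the delta is within run-to-run noise (or negative), the error correction is
# likely eating the extra memory clock - keep the VRAM at stock.
```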
Bernard_IV Posted December 19, 2020 41 minutes ago, dburne said: Running my 3090 FTW3 Ultra at +143 core, +500 VRAM, 107% power target, full voltage. That is my gaming profile for the card. It gives me a constant 2115 MHz and rarely breaks 60°C. I do blast the fans on the card, though, since gaming in VR with earbuds isolates the noise. My game profile runs the fans at 94%. I think you got a really nice chip. That one is a keeper.
Alonzo Posted December 19, 2020 I've got the 3080, but really want the 3080 Ti for the extra VRAM. I can EVGA Step-Up, but only if the new card comes out for Step-Up before Feb 26th. Any idea how much I'd lose just selling the card after buying a 3080 Ti? Or I guess I could step up to a 3090, but that seems like it would waste more cash. Or just sell the 3080 in the new year and buy a 16GB 6900 XT with a good cooler?
dburne Posted December 19, 2020 (edited) 1 hour ago, Alonzo said: I've got the 3080, but really want the 3080 Ti for the extra VRAM. I can EVGA Step-Up, but only if the new card comes out for Step-Up before Feb 26th. Any idea how much I'd lose just selling the card after buying a 3080 Ti? Or I guess I could step up to a 3090, but that seems like it would waste more cash. Or just sell the 3080 in the new year and buy a 16GB 6900 XT with a good cooler? Yeah, I would probably not recommend spending that kind of money on the 3090 with a 3080 Ti maybe on the horizon. If the card shortage keeps up, you will probably be able to sell that 3080 for a good price. I'm sure the 6900 would be good too, though they do seem to pair better with AMD boards. Edited December 19, 2020 by dburne
dburne Posted December 20, 2020 3 hours ago, SCG_Fenris_Wolf said: That's a 3090; I was talking about 3080s. Addendum: But just to be sure, note that the VRAM OC will throttle you as soon as it starts throwing errors, on the 3090 as well - as opposed to the 3080s, though, I couldn't test that yet; my 3090 comes Monday. You should see its performance plateau before getting there. If you found some headroom that's fine, but does it give better fps? Best to bench it with the VRAM OC'd and with your usual OC (but stock VRAM), compare, and take the healthier option. Perhaps I will check that out in the next few days.
Alonzo Posted December 20, 2020 2 hours ago, dburne said: Yeah, I would probably not recommend spending that kind of money on the 3090 with a 3080 Ti maybe on the horizon. If the card shortage keeps up, you will probably be able to sell that 3080 for a good price. I'm sure the 6900 would be good too, though they do seem to pair better with AMD boards. I could step up to the 3090, that was my thinking, but it'd be CAD$1000 to do so, which is fairly silly. To be honest I'm considering a switch to AMD: my 5.1 GHz 8086K is doing alright in IL-2, but it gets trashed in the other sim (and even in IL-2 I think I could use more CPU power for multiplayer; Fenris seems to achieve a locked 90 fps in MP with his high-end AMD rig). And again in that other sim, apparently the AMD cards bench a little higher than the 3080, and I'd immediately get the 16GB of VRAM. Ray tracing has been kind of nice for Cyberpunk's accurate reflections, but the rasterized shadows and lighting look very similar to their ray-traced counterparts and run a ton faster.
Ribbon Posted December 20, 2020 7 hours ago, Alonzo said: I've undervolted my 3080 FTW3 and it's made quite a difference. Max load power draw dropped from 380W to ~300W, so that's literally 80 watts less heat the cooler needs to dissipate. I adjusted the fan curve (100% at 70°C) and now it tops out at 78°C, but I'll go with undervolting too.
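For what it's worth, a fan curve like that boils down to linear interpolation between a handful of (temperature, fan %) points. A small illustrative sketch follows - the points are examples, not anyone's actual curve, and the real curve is of course set in Precision X1 or Afterburner rather than in code:

```python
# Illustrative fan curve: map GPU temperature to a fan duty cycle by linear
# interpolation between a few (temp in C, fan %) points. The points are example
# values shaped like the curve described above (100% at 70 C).
CURVE = [(30, 30), (50, 45), (60, 65), (70, 100)]

def fan_percent(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    if temp_c >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two neighbouring points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(65))  # 82.5 with the example points
```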
SCG_Fenris_Wolf Posted December 20, 2020 (edited) @Alonzo, the AMD cards bench lower than the 3080 in high-res (say >9 million pixels) VR environments. The 6900 XT barely matches the 3080 here. Initially I wanted to get a 6900 XT, but I have stepped away from it due to this. Don't ask me why; it might be the considerably slower VRAM, or that IL-2/DX11(?) doesn't manage the Infinity Cache properly? Just guessing. Edited December 20, 2020 by SCG_Fenris_Wolf
Bernard_IV Posted December 22, 2020 The 10GB of VRAM bothers me a bit, though it has not been a problem yet. Part of it is that I compromised a little and didn't get exactly what I wanted. A 3080 Ti FTW Hybrid is what I want.
Alonzo Posted December 22, 2020 On 12/19/2020 at 10:20 PM, SCG_Fenris_Wolf said: Don't ask me why; it might be the considerably slower VRAM, or that IL-2/DX11(?) doesn't manage the Infinity Cache properly? Just guessing. Infinity Cache is automatic and managed by the card, so it's not an IL-2 / programming issue, but the cache is part of the reason the card gets slower at higher resolutions. AMD said their hit rate was really good, above 50% at 1440p, which increases the effective bandwidth of the memory. As you increase resolution the hit rate goes down, and so does the "effective bandwidth".
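As a back-of-the-envelope illustration of why the hit rate matters, effective bandwidth can be thought of as a weighted mix of cache bandwidth and DRAM bandwidth. The numbers in the sketch below are illustrative assumptions, not AMD specs - only the general shape (hit rate drops as resolution rises, so effective bandwidth drops too) is the point:

```python
# Back-of-the-envelope sketch of "effective bandwidth" with an on-die cache:
# hits are served from the cache, misses go to DRAM. All figures below are
# illustrative round numbers, not vendor specifications.
def effective_bandwidth(hit_rate: float, cache_bw: float, dram_bw: float) -> float:
    """Weighted mix of cache and DRAM bandwidth by cache hit rate."""
    return hit_rate * cache_bw + (1.0 - hit_rate) * dram_bw

DRAM_BW = 512.0    # GB/s, e.g. a 256-bit bus of 16 Gbps GDDR6
CACHE_BW = 1900.0  # GB/s, assumed on-die cache bandwidth

for res, hit in [("1440p", 0.55), ("4K", 0.40), ("high-res VR", 0.30)]:
    eff = effective_bandwidth(hit, CACHE_BW, DRAM_BW)
    print(f"{res:12s} hit rate {hit:.0%} -> ~{eff:.0f} GB/s effective")
```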
Capt_Hook Posted December 22, 2020 My turn at the EVGA till is about to come up for the 3090 Ultra FTW, but I think I'm going to pass and wait for the 3080 to come up, or the 3080 Ti to be announced. I can afford it, but that extra $1000 is a lot to spend for a pretty marginal improvement over the 3080. I'm probably 3-4 weeks from my spot coming up for the 3080 FTW. The 2080 Ti is still OK, but I really want that extra 30% in the Reverb G2. That extra grand would buy a great replacement for the Warthog throttle, with a lot left over. I'm tempted to upgrade the 3900X to the 5900X, but I'll wait until AM5 and new motherboards. I hope it comes out at $1000, but I'd easily pay $1200 for a 3080 Ti with 16 GB and a 10% higher clock.
Bernard_IV Posted December 22, 2020 The clock will likely be the same or similar.
LukeFF (1CGS) Posted December 23, 2020 (edited) I received my MSI RTX 3080 Gaming X Trio late last night, and wow, it is everything I thought it would be and more. https://www.userbenchmark.com/UserRun/37331333 Edited December 23, 2020 by LukeFF
Jaws2002 (Author) Posted December 23, 2020 (edited) For those of you in the Toronto area looking for a 3080, Canada Computers in Ajax has three EVGA 3080 FTW3 cards in stock now! https://www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=181376 ...And it's gone. Edited December 23, 2020 by Jaws2002
Capt_Hook Posted December 24, 2020 On 12/22/2020 at 3:20 PM, Bernard_IV said: The clock will likely be the same or similar. You don't think they'll push the thermals just a bit harder and juice it another 5-10% or so? Just a memory size upgrade without any clock bump at all would be very unusual for a Ti release, and it would possibly make it a harder sell unless they discontinue the base 3080.
Alonzo Posted December 24, 2020 1 hour ago, Capt_Hook said: You don't think they'll push the thermals just a bit harder and juice it another 5-10% or so? Just a memory size upgrade without any clock bump at all would be very unusual for a Ti release, and it would possibly make it a harder sell unless they discontinue the base 3080. The rumor mill says the 3080 Ti is the full GA102 die, i.e. the same as the 3090, so a few more shader cores than the 3080. But it will have 20GB of 19Gbps VRAM on a 320-bit bus, vs the 3090's 24GB at 19.5Gbps on a 384-bit bus. This gives the 3080 Ti 760GB/s of memory bandwidth vs the 3090's roughly 936GB/s. Out of the box, the 3080 Ti's performance will be a few percent less than the 3090's and a few percent more than the 3080's. The reason to buy it is the 20GB VRAM upgrade, not the performance bump, and because VRAM sells graphics cards (people just look at the number on the box).
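Those bandwidth figures fall straight out of bus width times per-pin data rate; a quick sanity check using the rumored specs quoted above:

```python
# Quick check of the figures above: bandwidth in GB/s = (bus width in bits / 8)
# * per-pin data rate in Gbps. Specs are the rumored/quoted ones from the post.
def mem_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return (bus_bits / 8) * gbps_per_pin

print(mem_bandwidth_gbs(320, 19.0))   # rumored 3080 Ti: 760.0 GB/s
print(mem_bandwidth_gbs(384, 19.5))   # 3090:            936.0 GB/s
```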
E69_Qpassa_VR Posted December 24, 2020 On 12/19/2020 at 10:17 PM, Alonzo said: I've undervolted my 3080 FTW3 and it's made quite a difference. Max load power draw dropped from 380W to ~300W, so that's literally 80 watts less heat the cooler needs to dissipate. Could you share the frequency curve? Thanks a lot!
Alonzo Posted December 24, 2020 32 minutes ago, E69_Qpassa_VR said: Could you share the frequency curve? Thanks a lot! This works for me, but every card will be different. This is for the EVGA 3080 FTW3 Ultra, where the default voltage is 1.04-1.08 volts. I was able to get it working as low as 875mV, but it seems more stable for me with a bit more juice. If you get game crashes, you should increase the voltage or decrease the frequency.
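For anyone curious what the curve edit amounts to conceptually: pick a voltage, raise the clock at that point to your target, and flatten everything above it so the card never asks for more voltage. The sketch below is only illustrative - the voltage/frequency points are made up, not the FTW3's actual curve, and the real editing happens in Precision X1's or Afterburner's curve editor:

```python
# Conceptual sketch of an undervolt: keep the low end of the stock voltage/
# frequency curve, then clamp every point at or above the chosen voltage to one
# target clock, so the GPU stops requesting higher voltages. The points are
# made-up examples, not a real RTX 3080 curve.
STOCK_CURVE = [  # (millivolts, MHz) - illustrative values only
    (800, 1700), (850, 1790), (900, 1870), (950, 1930), (1000, 1980), (1080, 2040),
]

def undervolt(curve, ceiling_mv=900, target_mhz=1890):
    flattened = []
    for mv, mhz in curve:
        if mv < ceiling_mv:
            flattened.append((mv, mhz))          # low end stays as-is
        else:
            flattened.append((mv, target_mhz))   # flat: same clock, no gain from more voltage
    return flattened

for point in undervolt(STOCK_CURVE):
    print(point)
```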
Bernard_IV Posted December 28, 2020 On 12/24/2020 at 9:49 AM, Capt_Hook said: You don't think they'll push the thermals just a bit harder and juice it another 5-10% or so? Just a memory size upgrade without any clock bump at all would be very unusual for a Ti release, and it would possibly make it a harder sell unless they discontinue the base 3080. You can just do that on your own. My XC3 goes to around 2070MHz with a mediocre cooler and the fans blazing away. There's no way you get over 2200MHz with anything else; I saw the Kingpin cards maxing out at 2125MHz or so, and that is on water cooling. This one is just for the VRAM - some may need it and others may not.
CCCPBera Posted January 1, 2021 (edited) Hey people, I'm thinking about buying a 3080 and I have two models available: Gainward/Phoenix and Gigabyte. Which one do you recommend? Edited January 2, 2021 by CCCPBera (typo)
Jaws2002 (Author) Posted January 2, 2021 1 hour ago, CCCPBera said: Hey people, I'm thinking about buying a 3080 and I have two models available: Gainward/Phoenix and Gigabyte. Which one do you recommend? If the specs are similar, I'd take the Gigabyte. I have an overclocked 1080 Ti and it's been flawless for nearly four years.
SCG_Fenris_Wolf Posted January 2, 2021 (edited) Gigabyte has trouble controlling RGB on the new GPUs; RGB Fusion 2.0 does not work properly. If you like rainbow colours that you can't turn off, by all means go Gigabyte :)) I got a 3070 for a family member on a fresh system, and I've never seen such an intrusive, overblown, and broken piece of crapware as RGB Fusion 2.0. Even MSI has their Dragon Center fixed now :p Edited January 2, 2021 by SCG_Fenris_Wolf
Jaws2002 (Author) Posted January 2, 2021 6 hours ago, SCG_Fenris_Wolf said: Gigabyte has trouble controlling RGB on the new GPUs; RGB Fusion 2.0 does not work properly. If you like rainbow colours that you can't turn off, by all means go Gigabyte :)) I got a 3070 for a family member on a fresh system, and I've never seen such an intrusive, overblown, and broken piece of crapware as RGB Fusion 2.0. Even MSI has their Dragon Center fixed now :p Oh, I agree with that. RGB Fusion is a bloody mess. If I let it run at startup, my CH Control Manager doesn't work.