
Nvidia teases a 21-day countdown to the unveiling or release of the RTX 3000 series


Recommended Posts

Posted

EVGA is a good company. I had a GTX 285, a GTX 590 and a GTX 780 from them. The last two burned out on me, and that's why I switched to Gigabyte, but this generation I would buy an EVGA.

 

Posted

From what I read about the 30-series cards flaking out on boost, being down to cheap-ass caps, Asus were the only ones confirmed to be using the required number of high-grade ones ("military grade" they claim). Pics of the open back of the GPU were referenced by reviewers.

Posted
2 minutes ago, J3Hetzer said:

From what I read about the 30-series cards flaking out on boost, being down to cheap-ass caps, Asus were the only ones confirmed to be using the required number of high-grade ones ("military grade" they claim). Pics of the open back of the GPU were referenced by reviewers.

 

This was debunked a while ago, it was a driver issue.

 

8 minutes ago, Gambit21 said:

Evga or Asus

 

Whichever is in stock.

  • Upvote 1
Posted
4 minutes ago, Alonzo said:

 

Whichever is in stock.

 

Yep

That’s what I mean.

Posted (edited)

Patience guys...

Edited by Drum
Posted

My first EVGA card, got the one with the black lips!

Running IL-2 at max settings in VR on a Rift S, and Cyberpunk smoothly on ultra settings with ray tracing on a 2K monitor.

Still waiting on the Reverb G2 (should be here in a week or two, according to Bestware)!

 

[Attached image: IMG_20201218_123003.jpg]

  • Like 1
Posted

Forgot to say that I've tried DLSS in Cyberpunk 2077 and it makes a big performance improvement while keeping image quality.

We need this in IL-2!

Posted
47 minutes ago, =VARP=Ribbon said:

Forgot to say that I've tried DLSS in Cyberpunk 2077 and it makes a big performance improvement while keeping image quality.

We need this in IL-2!


LOL, first we need the video cards! I have been waiting on my 3080 for two months, I think; I'm now number 93 in the queue.

Posted
24 minutes ago, SYN_Vander said:


LOL, first we need the video cards! I have been waiting on my 3080 for two months, I think; I'm now number 93 in the queue.

Forgot to mention that the DLSS test I did in Cyberpunk was on an RTX 2070 Super.

I think any RTX GPU supports DLSS!

 

Being 93rd in the queue could be very good news if they receive at least 100 GPUs in the next batch!

 

Posted

Nice, that cooler is huge. The XC3 model's cooler is too small IMO. I've got my fans up too high to keep up. I'm going to trade this card for a Ti model or a 3090 when I get the chance.

SCG_Fenris_Wolf
Posted

Well if you need an RTX 3080 FE earlier, you can private message me. No sale/purchase/anything talks on the forums though, I'll let you know what I mean via PM.

Posted
16 hours ago, Bernard_IV said:

Nice, that cooler is huge. The XC3 model's cooler is too small IMO. I've got my fans up too high to keep up. I'm going to trade this card for a Ti model or a 3090 when I get the chance.

The Step-Up program from EVGA is a really nice feature in those circumstances.

I still get high temps at full load (up to 83°C), which makes me a bit uncomfortable, but I've read that I need to adjust the fan curves or undervolt the card... going to investigate that a bit further!

Posted

The 3080 Ti Hybrid would be awesome to have. I bet it is dead quiet.

Posted
12 hours ago, =VARP=Ribbon said:

I still get high temps at full load (up to 83°C), which makes me a bit uncomfortable, but I've read that I need to adjust the fan curves or undervolt the card... going to investigate that a bit further!

 

I've undervolted my 3080 FTW3 and it's made quite a difference. Max load power draw dropped from 380W to ~300W, so there's literally 80 watts less heat the cooler needs to dissipate.

  • Thanks 2
SCG_Fenris_Wolf
Posted (edited)

My RTX 3080 FE pulled even with my MSI RTX 3080 Trio X Gaming across the board - 3DMark's Port Royal, Timespy Extreme, IL-2 V6 test.

 

3080 FE goes to a power target of 115%, Core +105MHz, no fan curve edited. Boosts to 2040MHz, holds 1985MHz indefinitely. Stays at 73°C which is not hot - it never temperature throttled, I monitored it.

 

Note, however: never OC the VRAM on series-3000 cards! I have to repeat this again and again until people realise it; I still see this mistake made every so often. It's self-correcting RAM: if you raise its frequency, it quickly starts throwing errors and throttles. This has been verified again and again by people all over the internet (and by me as well in IL-2, with both the Trio X Gaming and the FE).

 

The max-OC'd RTX 3080 Founders Edition was even with an OC'd Trio X Gaming, down to a range of ±1 fps on average.

 

The result? As @Alonzo said:

Buy whatever is in stock. 99% aren't binned.

(The 1% is the Kingpin).

Edited by SCG_Fenris_Wolf
  • Like 1
  • Upvote 1
Posted

Running my 3090 FTW3 Ultra at +143 core, +500 VRAM, 107% power target, full voltage. That is my gaming profile for the card.

Gives me a constant 2115MHz, and it rarely breaks 60°C.

I blast the fans on the card, though, since gaming in VR with earbuds isolates the noise. My game profile runs the fans at 94%.

  • Like 1
SCG_Fenris_Wolf
Posted (edited)

That's a 3090. 

 

I was talking about 3080s. :)

 

 

Addendum: but just to be sure, note that a VRAM OC will throttle you as soon as it starts throwing errors on the 3090 as well; unlike the 3080s, though, I couldn't test that yet, as my 3090 arrives Monday. You should see its performance plateau before it gets there. If you found some headroom, that's fine, but does it actually give better fps? Best to bench it with the VRAM OC'd and with the usual OC (but stock VRAM), compare, and take the healthy alternative.
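The bench-and-compare step described above can be sketched as a tiny script. All the fps numbers in it are invented purely for illustration, not measurements from any card:

```python
# Hypothetical benchmark comparison: run the same scene with a core-only
# OC and again with the VRAM also OC'd, then keep whichever is healthier.
# All fps samples below are made up for illustration.

def average_fps(samples):
    """Mean fps over several benchmark passes."""
    return sum(samples) / len(samples)

core_only_oc = [141.2, 140.8, 141.5]   # core OC, VRAM at stock
vram_oc      = [139.9, 138.7, 140.1]   # VRAM OC'd: error correction kicks in

core_avg = average_fps(core_only_oc)
vram_avg = average_fps(vram_oc)

# If the VRAM OC doesn't clearly win, it's just silently correcting errors.
if vram_avg <= core_avg:
    print("VRAM OC gains nothing - run stock VRAM")
```

The point is the comparison itself: if the VRAM OC's average isn't clearly higher, the "overhead" you found is being eaten by error correction.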

Edited by SCG_Fenris_Wolf
Posted
41 minutes ago, dburne said:

Running my 3090 FTW3 Ultra at +143 core, +500 VRAM, 107% power target, full voltage. That is my gaming profile for the card.

Gives me a constant 2115MHz, and it rarely breaks 60°C.

I blast the fans on the card, though, since gaming in VR with earbuds isolates the noise. My game profile runs the fans at 94%.

I think you got a really nice chip.  That one is a keeper.

  • Like 1
Posted

I've got the 3080, but really want the 3080ti for the extra VRAM. I can EVGA step-up, but only if the new card comes out for step-up before Feb 26th. Any idea how much I'd lose just selling the card after buying a 3080ti ? Or I guess I could step up to a 3090 but that seems like it would waste more cash.

 

Or just sell the 3080 in the new year and buy a 16GB 6900XT with a good cooler?

Posted (edited)
1 hour ago, Alonzo said:

I've got the 3080, but really want the 3080ti for the extra VRAM. I can EVGA step-up, but only if the new card comes out for step-up before Feb 26th. Any idea how much I'd lose just selling the card after buying a 3080ti ? Or I guess I could step up to a 3090 but that seems like it would waste more cash.

 

Or just sell the 3080 in the new year and buy a 16GB 6900XT with a good cooler?

 

Yeah I would probably not recommend spending that kind of money on the 3090 with a 3080 Ti maybe on the horizon.

If the card shortage keeps up you probably will be able to sell that 3080 for a good price.

I'm sure the 6900 XT would be good also, though they seem to pair better with AMD boards.

Edited by dburne
Posted
3 hours ago, SCG_Fenris_Wolf said:

That's a 3090. 

 

I was talking about 3080s. :)

 

 

Addendum: but just to be sure, note that a VRAM OC will throttle you as soon as it starts throwing errors on the 3090 as well; unlike the 3080s, though, I couldn't test that yet, as my 3090 arrives Monday. You should see its performance plateau before it gets there. If you found some headroom, that's fine, but does it actually give better fps? Best to bench it with the VRAM OC'd and with the usual OC (but stock VRAM), compare, and take the healthy alternative.

 

Perhaps I will check that out in the next few days.

Posted
2 hours ago, dburne said:

Yeah I would probably not recommend spending that kind of money on the 3090 with a 3080 Ti maybe on the horizon.

If the card shortage keeps up you probably will be able to sell that 3080 for a good price.

I'm sure the 6900 XT would be good also, though they seem to pair better with AMD boards.

 

I could step up to the 3090, that was my thinking. But it'd be CAD$1000 to do so, which is fairly silly. To be honest, I'm considering a switch to AMD: my 5.1GHz 8086K is doing alright in IL-2, but it gets trashed in the other sim (and even in IL-2 I think I could use more CPU power for multiplayer; Fenris seems to achieve a locked 90 fps in MP with his high-end AMD rig). And again, in that other sim the AMD cards apparently bench a little higher than the 3080, and I'd immediately get the 16GB of VRAM.

 

Ray tracing has been kind of nice for Cyberpunk's accurate reflections, but the rasterized shadows and lighting look very similar to their ray-traced counterparts and run a ton faster.

Posted
7 hours ago, Alonzo said:

 

I've undervolted my 3080 FTW3 and it's made quite a difference. Max load power draw dropped from 380W to ~300W, so there's literally 80 watts less heat the cooler needs to dissipate.

I adjusted the fan curves (100% at 70°C) and now it tops out at 78°C, but I'll go with undervolting too.
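For anyone wondering what "adjusting the fan curve" amounts to, here's a minimal sketch of how tuning tools interpolate fan speed between the points you set. The (temperature, fan %) points below are hypothetical, not any tool's actual defaults:

```python
# A fan curve like "ramp to 100% at 70C" is just a list of
# (temperature C, fan duty %) points with linear interpolation between them.
# These points are illustrative, not anyone's real settings.
CURVE = [(30, 30), (50, 50), (60, 75), (70, 100)]

def fan_percent(temp_c):
    """Linearly interpolate fan duty cycle between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Making the curve steeper near the top (as in "100% at 70°C") trades noise for a lower peak temperature.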

SCG_Fenris_Wolf
Posted (edited)

@Alonzo, the AMD cards bench lower than the 3080 in high-res (say, >9 million pixels) VR environments. The 6900 XT barely matches the 3080 there. Initially I wanted to get a 6900 XT, but I've stepped away from it due to this.

 

Don't ask me why; it might be the considerably slower VRAM, or that IL-2/DX11(?) doesn't manage the Infinity Cache properly? Just guessing.

Edited by SCG_Fenris_Wolf
  • Like 1
Posted

The 10GB of VRAM bothers me a bit, though it has not been a problem yet. Part of it is that I compromised a little and didn't get exactly what I wanted. A 3080 Ti FTW Hybrid model is what I want.

Posted
On 12/19/2020 at 10:20 PM, SCG_Fenris_Wolf said:

Don't ask me why; it might be the considerably slower VRAM, or that IL-2/DX11(?) doesn't manage the Infinity Cache properly? Just guessing.

 

The Infinity Cache is automatic and managed by the card, so it's not an IL-2/programming issue, but the cache is part of the reason the card gets slower at higher resolutions. AMD said their hit rate was really good, above 50% at 1440p, which increases the effective bandwidth of the memory. As you increase the resolution, the hit rate goes down, and so the "effective bandwidth" does too.
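A back-of-envelope way to see that effect: treat effective bandwidth as a hit-rate-weighted mix of cache speed and VRAM speed. The ~2TB/s cache figure and both hit rates below are illustrative assumptions, not AMD's published numbers:

```python
# Effective bandwidth with a cache in front of VRAM: hits are served at
# cache speed, misses at VRAM speed. Cache bandwidth and hit rates here
# are rough illustrative assumptions.
def effective_bandwidth(vram_gbs, cache_gbs, hit_rate):
    return hit_rate * cache_gbs + (1 - hit_rate) * vram_gbs

VRAM_BW = 512    # GB/s, the 6900 XT's raw GDDR6 bandwidth
CACHE_BW = 2000  # GB/s, assumed Infinity Cache bandwidth

at_1440p = effective_bandwidth(VRAM_BW, CACHE_BW, 0.55)  # good hit rate
at_4k    = effective_bandwidth(VRAM_BW, CACHE_BW, 0.35)  # hit rate drops
```

With those assumptions the effective bandwidth falls by roughly 300GB/s going from the 1440p hit rate to the 4K one, which matches the "gets slower at higher resolutions" observation.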

Posted

My turn at the EVGA till is about to come up for the 3090 Ultra FTW, but I think I'm going to pass and wait for the 3080 to come up, or for the 3080 Ti to be announced. I can afford it, but that extra $1000 is a lot to spend for a pretty marginal improvement over the 3080. I'm probably 3-4 weeks from my spot coming up for the 3080 FTW. The 2080 Ti is still OK, but I really want that extra 30% in the Reverb G2. That extra grand would buy a great replacement for the Warthog throttle, with a lot left over. I'm tempted to upgrade the 3900X to a 5900X, but I'll wait for AM5 and new motherboards.

 

I hope it comes out at $1000, but I'd easily pay $1200 for a 3080 Ti with 16GB and a 10% higher clock.

Posted

The clock will likely be the same or similar.

Posted
On 12/22/2020 at 3:20 PM, Bernard_IV said:

The clock will likely be the same or similar.

You don't think they'll push the thermals just a bit harder and juice it another 5-10%? A memory-size upgrade without any clock bump at all would be very unusual for a Ti release, and would possibly make it a harder sell unless they discontinue the base 3080.

Posted
1 hour ago, Capt_Hook said:

You don't think they'll push the thermals just a bit harder and juice it another 5-10%? A memory-size upgrade without any clock bump at all would be very unusual for a Ti release, and would possibly make it a harder sell unless they discontinue the base 3080.

 

Rumor mill says the 3080ti is the full GA102 die, i.e. the same as the 3090, so a few more shader cores than the 3080. But it will have 20GB of 19Gbps VRAM on a 320-bit bus, vs the 3090's 24GB at 19.5Gbps on a 384-bit bus. This gives the 3080ti 760GB/s memory bandwidth vs the 3090's 935.8GB/s.
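Those bandwidth figures follow directly from bus width times per-pin data rate:

```python
# Memory bandwidth in GB/s = (bus width in bits x Gbps per pin) / 8 bits-per-byte.
def memory_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

rtx_3080ti = memory_bandwidth_gbs(320, 19.0)   # 320-bit bus at 19 Gbps
rtx_3090   = memory_bandwidth_gbs(384, 19.5)   # 384-bit bus at 19.5 Gbps
```

This gives 760.0GB/s for the rumored 3080 Ti configuration and 936.0GB/s for the 3090; the small gap from the 935.8GB/s quoted above is just rounding of the per-pin rate.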

 

Performance of the 3080ti, out of the box, will be a few percent less than the 3090, and a few percent more than the 3080. The reason to buy it is for the 20GB VRAM upgrade, not the performance bump, and because VRAM sells graphics cards (people just look at the number on the box).

E69_Qpassa_VR
Posted
On 12/19/2020 at 10:17 PM, Alonzo said:

 

I've undervolted my 3080 FTW3 and it's made quite a difference. Max load power draw dropped from 380W to ~300W, so there's literally 80 watts less heat the cooler needs to dissipate.

Could you share the frequency/voltage curve? Thanks a lot!

Posted
32 minutes ago, E69_Qpassa_VR said:

Could you share the frequency/voltage curve? Thanks a lot!

 

This works for me, but every card will be different. This is for the EVGA 3080 FTW3 Ultra, where the default voltage is 1.04-1.08 volts. I was able to get it working as low as 875mV, but it seems more stable for me with a bit more juice. If you get game crashes, you should increase the voltage or decrease the frequency.

 

[Attached image: 3080-undervolt.png]
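As a rough sanity check on why undervolting helps: dynamic power scales roughly with V²·f, so dropping from ~1.05V toward 0.9V at a similar clock should shed a large chunk of board power. This is a crude model with illustrative numbers, not a measurement:

```python
# Crude dynamic-power model: P scales with V^2 x f. Holding the clock
# roughly constant and dropping core voltage predicts a big power cut.
# Input wattage and voltages are illustrative, not measured values.
def scaled_power(p_watts, v_old, v_new, f_old=1.0, f_new=1.0):
    return p_watts * (v_new / v_old) ** 2 * (f_new / f_old)

est = scaled_power(380, 1.05, 0.90)  # undervolt from ~1.05 V to 0.90 V
```

The model predicts roughly 280W from a 380W baseline, which is in the same ballpark as the 380W-to-~300W drop reported earlier in the thread (real cards also have static power the model ignores).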

 

  • Like 1
Posted
On 12/24/2020 at 9:49 AM, Capt_Hook said:

You don't think they'll push the thermals just a bit harder and juice it another 5-10% or so?  Just a memory size upgrade without any clock bump at all would be very unusual for a Ti release, and possibly make it a harder sell unless they discontinue the base 3080.  

You can just do that on your own. My XC3 goes to around 2070MHz with a mediocre cooler and the fans blazing away. No way you get over 2200MHz with anything else. I saw the Kingpin cards maxing out at 2125MHz or so, and that is on water cooling. The Ti would really just be for the VRAM; some may need it and others may not.

Posted (edited)

Hey people, I'm thinking about buying a 3080 and I have two models available: the Gainward Phoenix and a Gigabyte. Which one do you recommend?

Edited by CCCPBera
typo
Posted
1 hour ago, CCCPBera said:

Hey people, I'm thinking about buying a 3080 and I have two models available: the Gainward Phoenix and a Gigabyte. Which one do you recommend?

 

If the specs are similar, I'd take the Gigabyte. I have an overclocked 1080ti and it's been flawless for nearly four years.

SCG_Fenris_Wolf
Posted (edited)

Gigabyte have trouble controlling RGB on the new GPUs; RGB Fusion 2.0 does not work properly. If you like rainbow colours that you can't turn off, by all means, go Gigabyte. :)

 

I got a 3070 for a family member on a fresh system, and I've never seen such an intrusive, overblown, and broken piece of crapware as RGB Fusion 2.0. Even MSI have fixed their Dragon Center now. :P

 

 

 

 

 

Edited by SCG_Fenris_Wolf
Posted
6 hours ago, SCG_Fenris_Wolf said:

Gigabyte have trouble controlling RGB on the new GPUs; RGB Fusion 2.0 does not work properly. If you like rainbow colours that you can't turn off, by all means, go Gigabyte. :)

 

I got a 3070 for a family member on a fresh system, and I've never seen such an intrusive, overblown, and broken piece of crapware as RGB Fusion 2.0. Even MSI have fixed their Dragon Center now. :P

 

Oh, I agree with that. RGB Fusion is a bloody mess. If I let it run at startup, my CH Control Manager doesn't work. :blink:
