
Nvidia teases a 21-day countdown to the unveiling or release of the RTX 3000



S! 

 

And now news is coming out that custom 3080 models from Zotac etc. suffer from black screens, CTDs and other anomalies. FE versions are unaffected. One guess is that when the OC goes over 2000 MHz, the card crashes. And of course they are out of stock everywhere, with prices climbing.


Has anybody been able to get their hands on a 3080 or 3090?

My supplier has everything out of stock, with unknown delivery times.

I had registered for the 3090 at Nvidia and got the message that I could now order the card, with a link.

The link brought me to an Nvidia 3090 Out of Stock web page.

Just ridiculous.


Sometimes it pays not to be an early adopter. I'm lucky I'm not in dire need of replacing a video card right now. The choice between buying an overpriced, soon-to-be-eclipsed 20-series nVidia card and a new, unstable 3080 isn't a choice I'd want to make.


Some new leaks hint that AMD's Big Navi may be a decent card, even for 4K gaming.

Some people in the comments are trying to guess the possible performance based on the released information, and many think RTX 3080 performance is within AMD's reach. These cards will also have lower power requirements, thanks to the more advanced TSMC 7nm+ node. Add to that 16 GB of GDDR6 and, if priced right, these cards could bring back that good old Radeon.

https://videocardz.com/newz/amd-navi-21-to-feature-80-cus-navi-22-40-cus-and-navi-23-32-cus

"

This GPU would allegedly feature up to 80 Compute Units (5120 Stream Processors if each CU has 64 cores). The variant listed in macOS 11 appears to have a boost clock of 2050 MHz for the Navi 21A variant and up to 2200 MHz for the Navi 21B variant.

With a boost clock of 2200 MHz, the Navi 21B would have a shader performance of 22.5 TFLOPs. For comparison, with NVIDIA's new dual-SM architecture, the GeForce RTX 3080 graphics card has a maximum FP32 throughput of 29.8 TFLOPs.
"
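The leak's arithmetic can be sanity-checked: peak FP32 throughput is simply shader count × 2 ops per cycle (one fused multiply-add) × boost clock. A minimal sketch, using the figures from the quoted leak and Nvidia's published 3080 specs (the helper name is mine):

```python
# Peak FP32 throughput: shaders * 2 ops (one fused multiply-add) * clock.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    """Theoretical peak FP32 TFLOPs for a GPU."""
    return shaders * 2 * boost_ghz / 1000.0

# Navi 21B per the leak: 80 CUs * 64 cores = 5120 shaders at 2200 MHz
navi21b = fp32_tflops(80 * 64, 2.2)    # ~22.5 TFLOPs
# RTX 3080: 8704 CUDA cores (the dual-FP32 SMs already counted) at 1710 MHz
rtx3080 = fp32_tflops(8704, 1.71)      # ~29.8 TFLOPs

print(f"Navi 21B: {navi21b:.1f} TFLOPs, RTX 3080: {rtx3080:.1f} TFLOPs")
```

Of course, peak TFLOPs rarely translate one-to-one into gaming performance across different architectures.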

Edited by Jaws2002

Interesting. The key item is that the NVidia design has an extra fan (one of the two fans on the GPU) taking air out of the case. In the end the case is a closed system, and the internal temperature (of everything inside: CPU, GPU, RAM, etc.) is the result of the heat balance of the case, i.e. the hot air leaving the case minus the cool air entering it at steady-state conditions:

 

HEAT = FLOW * (Tout- Tin)

 

Where:

HEAT: the TOTAL heat generated by the mobo, CPU, GPU, etc. (the same for both designs)

FLOW: the total air flow going out of the case (equal to the total air flow entering the case)

Tin: the temperature of your room

Tout: the temperature of the air leaving the box (an average of the temps inside the box)

DeltaT: Tout - Tin

 

FLOW_Nvidia * (DeltaT_Nvidia) = FLOW_other * (DeltaT_others)

 

So, for example, if the Nvidia design increases the FLOW by 30% (I don't know the actual number), it will reduce DeltaT by the same factor (1/1.3, i.e. about a 23% reduction), which means a lower temperature inside the case.

 

All the triple fans in the non-Nvidia designs are just moving the air around INSIDE the case.

 

For open-bench PCs (no case) I assume both designs would perform equally, or perhaps the triple-fan design slightly better, since it has 3 fans versus 2.
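The trade-off described above can be sketched numerically. A minimal illustration (the function name is mine; the 30% flow increase is just the example figure from the post):

```python
# If HEAT is fixed, FLOW * DeltaT is constant, so raising the total case
# flow lowers DeltaT by the inverse factor.
def delta_t_factor(flow_increase: float) -> float:
    """Factor DeltaT drops to when total flow rises by `flow_increase` (0.3 = +30%)."""
    return 1.0 / (1.0 + flow_increase)

# +30% flow -> DeltaT falls to ~77% of its old value (about a 23% reduction)
print(f"DeltaT factor at +30% flow: {delta_t_factor(0.3):.3f}")
```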

Edited by chiliwili69
On 9/27/2020 at 1:10 AM, Jaws2002 said:

Some people in the comments are trying to guess the possible performance based on the released information, and many think RTX 3080 performance is within AMD's reach.

 

That is indeed very interesting. I will hold back my investment in an RTX 3070 and wait for the final benchmark comparisons of Big Navi/3070/3080. I'm in no rush.

2 hours ago, messsucher said:

Definitely will wait and see what AMD have to say

If you only use the GPU for gaming then AMD is an interesting (cheaper) alternative to Nvidia, but I also use Nvidia's CUDA and machine-learning features in other, non-gaming software.

2 minutes ago, simfan2015 said:

If you only use the GPU for gaming then AMD is an interesting (cheaper) alternative to Nvidia, but I also use Nvidia's CUDA and machine-learning features in other, non-gaming software.

 

Sadly I am not such a user, but if you are like me, old school, someone who does not tolerate a single piece of spyware on their computer and considers the computer their private property, then you use Linux and certainly wait for AMD, because AMD has trouble-free, excellent drivers, which are open source too.

7 hours ago, chiliwili69 said:

Interesting. The key item is that the NVidia design has an extra fan (one of the two fans on the GPU) taking air out of the case. In the end the case is a closed system, and the internal temperature (of everything inside: CPU, GPU, RAM, etc.) is the result of the heat balance of the case, i.e. the hot air leaving the case minus the cool air entering it at steady-state conditions.

 

Yes. I watched the video, and it's kind of complex. In all cases it's true that the NVidia FE design exhausted more warm air compared to the axial fan designs, but whether that translated to more heat stuck inside the case (and where) depended on many factors. The Gamers Nexus video even compared vertical and horizontal GPU mounts.

 

But the real question is: does it matter that the FE cards exhaust more heat? Mostly, not really. If your case has terrible air flow, sure, maybe it matters. But for the majority of sensible case designs with enough case fans and air flow, it doesn't really matter.


What is certain is that all these huge cards with axial-flow cooling designs do create a lot of dead areas with poor airflow.

Nvidia really did well with this cooler design. The blower fan at the back of the card has a huge exhaust, and the front fan moves air through to the area behind the card. That air is warmer than ambient, but still cooler than a working CPU or RAM module. The blow-through fan will also help keep the M.2 SSDs under the graphics card cool.

Edited by Jaws2002
4 hours ago, messsucher said:

then you use Linux and certainly wait for AMD

AFAIK Linux does not allow for proper gaming, nor for full use of modern graphics cards (for consumer software, not scientific)!?

An Nvidia 3080/90 for gaming? => Windows!

41 minutes ago, simfan2015 said:

AFAIK Linux does not allow for proper gaming, nor for full use of modern graphics cards (for consumer software, not scientific)!?

An Nvidia 3080/90 for gaming? => Windows!

 

Things have changed greatly over the last two years. First of all, graphics card support is now great. Some AAA games are released natively for Linux, but most importantly there is Valve's Proton, which lets you play loads of games on Linux. I play this BoX with Proton and it works perfectly. Then there are various emulators, which give you "HD graphics" in old 3D console games.

 

Compared to Windows, Linux does not have that many games, but there are hundreds, or rather thousands, of games that work on Linux, many of them great, so you can enjoy Linux as a gaming machine the way you can enjoy a PS4 as a console with a limited game library.

 

And there is less fiddling with Linux nowadays. I only have to install Linux, and that is pretty much all I need to do. Everything tends to just work; I never have to download a single driver. The biggest issues now are poor VR support (only the Valve Index supports Linux) and tricky head-tracking setups. So for simulators that use either of those, Linux is a bad choice.

 

Check out the link below. It gives pretty accurate information, and it is made by gamers.

 

https://www.protondb.com/

Edited by messsucher
16 hours ago, Alonzo said:

But for the majority of sensible case designs with enough case fans and air flow, it doesn't really matter.

 

This is the key point. What really counts as "enough case fans"?

In the Jayz video he shows a case with three fans in the front (intake, I believe) and just one fan in the rear (exhaust). Air also leaves through all the case openings.

He is using CPU air cooling, so the CPU fans are just inside the case. (With AIO liquid cooling the radiator fans help pull air out of the case, but that is not the setup here.)

I think this is a quite common fan configuration.

 

So, in that particular configuration, the FE design works better than axial GPUs because it helps increase the FLOW variable in the equation mentioned above.

 

Probably the FE design would be equivalent to using an axial GPU plus an extra powerful fan in the rear.
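To put rough numbers on HEAT = FLOW * (Tout - Tin): a back-of-envelope sketch with assumed figures (the 500 W of total heat and 100 CFM of net case exhaust are illustrative, not measured values):

```python
# Steady-state case exhaust temperature rise: HEAT = m_dot * cp * DeltaT,
# where m_dot is the mass flow of air through the case.
RHO_AIR = 1.2             # kg/m^3, air density at ~20 C
CP_AIR = 1005.0           # J/(kg*K), specific heat of air
CFM_TO_M3S = 0.000471947  # cubic feet per minute -> m^3/s

def case_delta_t(heat_w: float, flow_cfm: float) -> float:
    """Tout - Tin in kelvin for heat_w watts carried out by flow_cfm of air."""
    m_dot = flow_cfm * CFM_TO_M3S * RHO_AIR   # kg/s of air leaving the case
    return heat_w / (m_dot * CP_AIR)

print(f"{case_delta_t(500, 100):.1f} K")   # roughly a 9 K rise over room temperature
```

Anything the FE card adds to the net exhaust flow pushes that DeltaT down, which is the whole argument in miniature.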


Looking at tests and reviews for the NVIDIA RTX 3080 card, I came across these; maybe wait a little until the early-morning fog clears up?

 

https://www.slashgear.com/nvidia-geforce-rtx-3080-instability-traced-to-capacitors-27640107/

https://www.tomshardware.com/news/msi-stealthily-revamps-geforce-rtx-3080-design-amid-stability-concerns

26 minutes ago, SCG_Fenris_Wolf said:

New Nvidia drivers came out, fixed the crashes, didn't downclock the boosts.

 

It wasn't the capacitors, and they fixed it on a software level. Embarrassing; half the internet got riled up and thought it was cap quality issues (POSCAPs/SP-Caps vs MLCCs).

 

Perhaps a combination of the two...

https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx

17 minutes ago, SCG_Fenris_Wolf said:

New Nvidia drivers came out, fixed the crashes, didn't downclock the boosts.

 

It wasn't the capacitors, and they fixed it on a software level. Embarrassing; half the internet got riled up and thought it was cap quality issues (POSCAPs/SP-Caps vs MLCCs).

 

But again, if it was just a driver issue, it speaks to how rushed this whole launch has been.

 

NVidia is explicitly trying to generate as many sales as possible before the AMD cards launch. From a corporation-making-money perspective that's the right thing to do, especially as many people only upgrade every other generation, so this is a once-in-four-years sale that NVidia can make to 1000-series owners. But gamers get rushed cards, low inventory, botting/scalping, and unstable drivers. Board partners got very little testing time, and several of them have had their reputation dragged through the muck due to a driver issue that may have nothing to do with their board quality.

 

I still think there's more going on here than "NVidia wants to cash in on a 4-year upgrade cycle". I don't think Big Navi is going to beat the 3080, but I think it's going to be within spitting distance, use 30% less power and have 60% more VRAM. For anyone who's not actually 4K gaming (and let's remember the Steam hardware survey stats, most people are actually still on 1080p, let alone 1440p) an AMD card that's not-quite-as-good as the 3080 but is a bit cheaper and a lot less power hungry is a very good looking option.

2 hours ago, SCG_Fenris_Wolf said:

New Nvidia drivers came out, fixed the crashes, didn't downclock the boosts.

 

It wasn't the capacitors, and they fixed it on a software level. Embarrassing; half the internet got riled up and thought it was cap quality issues (POSCAPs/SP-Caps vs MLCCs).

Do you think all these companies would stop production and change capacitors if it was just a driver issue?

And how do you know the auto boost was not tamed down? 

 

 

Edited by Jaws2002
4 hours ago, Jaws2002 said:

Do you think all these companies would stop production and change capacitors if it was just a driver issue?

And how do you know the auto boost was not tamed down? 

 

 

 

I was not able to overclock past 2000 MHz before the driver update. It now overclocks to 2100 with no problem.

On 9/28/2020 at 1:05 AM, sevenless said:

 

That is indeed very interesting. I will hold back my investment in an RTX 3070 and wait for the final benchmark comparisons of Big Navi/3070/3080. I'm in no rush.

 

Same

6 hours ago, sevenless said:

In case you win the lottery...

 

 

There is one interesting comment in this video: that SLI is more or less replaced by the PCIe link. He mentions that some games use dual GPUs, but through PCIe.

PCIe 4.0 has a bandwidth of 32 GB/s with 16 lanes, far less than NVLink, which sits at 600 GB/s. But how much do games actually need?
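Those per-link figures fall straight out of the per-lane transfer rate and the 128b/130b line code used by PCIe 3.0/4.0; a quick sketch (per direction, for an x16 link; the helper name is mine):

```python
# Usable bandwidth = transfer rate (GT/s) * 128b/130b encoding efficiency
#                    / 8 bits per byte * number of lanes
def pcie_gbps(gt_per_s: float, lanes: int) -> float:
    """Approximate usable GB/s per direction for a PCIe 3.0/4.0 link (128b/130b)."""
    return gt_per_s * (128 / 130) / 8 * lanes

pcie3_x16 = pcie_gbps(8.0, 16)    # ~15.8 GB/s
pcie4_x16 = pcie_gbps(16.0, 16)   # ~31.5 GB/s
print(f"PCIe 3.0 x16: {pcie3_x16:.1f} GB/s, PCIe 4.0 x16: {pcie4_x16:.1f} GB/s")
```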

6 minutes ago, IckyATLAS said:

But how much do games actually need?

 

Hard to answer as a layman, but I have seen a YT video from him comparing PCIe 3.0 and 4.0 showing no difference. So my conclusion would be that the 16 GB/s of PCIe 3.0 is sufficient.

 

 

Edited by sevenless
11 minutes ago, IckyATLAS said:

But how much do games actually need?

 

SLI is dead, though. It was always hit or miss whether games supported it, and the 3080 doesn't have an SLI bridge. GN Steve certainly proclaimed it 'dead' in the 3080 launch video. Isn't it dead?

 

(And even though it seems like it should be an awesome slam-dunk solution for doubling your VR framerate, I've never heard of a VR game that supports it... which makes me sad; I'd totally buy 2x 3080, one for each eye.)

On 9/29/2020 at 5:16 PM, Jaws2002 said:

Do you think all these companies would stop production and change capacitors if it was just a driver issue?

 

Do you think all these companies don't know what they did? Good capacitors cost just a few pence more, but if you use the cheaper ones in production you have already made a profit before selling anything. It could be that the quality of the cheaper capacitors varies, making some cards run badly. When problems appear, tame them down with a driver that limits the card's boost = problem solved.

Edited by Livai
Just now, Jaws2002 said:

Evga said they did swap the capacitors on their production cards, because some early samples they sent to reviewers had crashes.

 

In the early days I replaced capacitors on graphics cards myself for better overclocking results. How much can you expect? Well, just 50 MHz more. That doesn't sound like much, but back then even 50 MHz was well worth the work.

 

-> It could be the PCB itself. Nvidia demands a 12-layer design using Controlled Depth Drilling (CDD), also known as back drilling. As far as we know, the many power-hungry components packed close together require clearly separated signals, and that is probably where the problem lies: "clearly separated signals" for high-frequency operation. If the quality varies here, then we have a problem. If it is not the PCB, then it is the chip quality from Nvidia.

 

I wonder why Nvidia would damage themselves like this. Do they fear AMD and its cards?

