
Nvidia teases a 21-day countdown to the unveiling or release of the RTX 3000



9 hours ago, SCG_Fenris_Wolf said:

Gigabyte have trouble controlling RGB on the new GPUs; RGB Fusion 2.0 does not work properly. If you like rainbow colours that you can't turn off, by all means, go Gigabyte :))

 

Case with no windows, easy fix. 😉  I do get a bit of RGB-envy from time to time but overall I like my little black box on the floor.


I've never had a problem with RGB Fusion here. I use it to control both the motherboard and RGB strip lights, and it never causes any problems.

15 hours ago, LukeFF said:

I've never had a problem with RGB Fusion here. I use it to control both the motherboard and RGB strip lights, and it never causes any problems.

<<I write this on a mobile phone, expect random capitalisation, missing spaces, incorrect spelling>>

 

That's why I said "on the new GPUs". There's an issue with 3070s; Gigabyte support was of no help.

Here, Fusion hard-locks the system whenever it is started.

The system is an ASRock Z270 K6 Gaming, i7 7700K, Gigabyte RTX 3070 OC, 6x 5V RGB fans addressed by BIOS, and a Razer mouse, keyboard, and headphones addressed by Razer Synapse. 

There is no way to address the Gigabyte card; it's not recognised by any other software either.

 

Both an RTX 3080 FE and an older Palit card were properly addressable in the same system with the same software layout, the FE via EVGA Precision X1 and the Palit via Palit's own software. It's just Fusion that is messing up.

 

When I went to Reddit to check others' experiences with Gigabyte, I encountered toxicity and cancer, and comments about mods deleting stuff. People complaining about Fusion, others blaming the software packages of other RGB devices, clearly ignoring that the issue only occurs with Gigabyte, then arguing. Signs of a frustrated community.

 

Just as I said, I'd recommend that anyone stay away from Gigabyte GPUs with RGB for now, unless you like rainbows you can't turn off.


Guys, apologies in advance if this is somehow covered, but I can't find a proper answer to this - does it matter which 3090 card I buy? E.g. Asus/EVGA/Nvidia Founders Edition? I do understand that some are 'pre' overclocked (and hence better, if the CPU is not a limiting factor), and from what I'm reading EVGA sounds good quality-wise.

For what it's worth, I have an Asus motherboard and a 9900K overclocked to 5 GHz (hopefully it may go to 5.1 or 5.2). Use case is VR; I have a Reverb G2.

Any tips appreciated, apologies if this is a noob question!

35 minutes ago, DD_Llama_Thumper said:

Guys, apologies in advance if this is somehow covered, but I can't find a proper answer to this - does it matter which 3090 card I buy? E.g. Asus/EVGA/Nvidia Founders Edition? I do understand that some are 'pre' overclocked (and hence better, if the CPU is not a limiting factor), and from what I'm reading EVGA sounds good quality-wise.

For what it's worth, I have an Asus motherboard and a 9900K overclocked to 5 GHz (hopefully it may go to 5.1 or 5.2). Use case is VR; I have a Reverb G2.

Any tips appreciated, apologies if this is a noob question!

Some are factory overclocked. They are better only out of the box; you can overclock the FE to the same speed. I matched my manually OCed Trio X Gaming with my OCed Founders Edition (both 3080). And no, it wasn't much louder - and in-case temperature was lower on the FE.

 

Again, the same may be true for 3090s. I have a 3090 FE as well, and the main issue with OCing was that my PSU was too weak. 750W just wasn't enough for quite a few games that spike the load: a 2D loading screen, then a transient jump from 10% to 114% power. OCP triggered, so I had to order a new PSU.
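As a back-of-the-envelope sketch of that arithmetic: only the 750 W supply and the 114% power limit come from the post above; the 350 W board power and the ~250 W for the rest of the system are my own illustrative assumptions, not measurements.

```python
# Back-of-the-envelope PSU headroom check for a GPU at a raised power limit.
# The 350 W and 250 W figures below are illustrative assumptions.

def psu_headroom_w(psu_w: float, gpu_w: float, rest_w: float,
                   power_limit: float) -> float:
    """Watts left over when the GPU runs at its raised power limit."""
    return psu_w - (gpu_w * power_limit + rest_w)

# Assumed: a 350 W card at the 114% power limit mentioned above,
# ~250 W for CPU, board, drives and fans, on the 750 W supply.
headroom = psu_headroom_w(750, 350, 250, 1.14)
print(f"{headroom:.0f} W nominal headroom")  # prints "101 W nominal headroom"

# Millisecond-scale transients can exceed this sustained figure by a wide
# margin, which is how OCP can still trip even though average draw is fine.
```

The point of the sketch is that the nominal number can look safe while a brief transient still trips the supply's over-current protection.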

 

Reviewers like Igor'sLAB have indicated the cards aren't binned, btw, if I remember correctly.

So get the cheapest one you can, and make sure you have a return option and a silent case, as the 3090s chirp a lot.

4 hours ago, DD_Llama_Thumper said:

Guys, apologies in advance if this is somehow covered, but I can't find a proper answer to this - does it matter which 3090 card I buy? E.g. Asus/EVGA/Nvidia Founders Edition? I do understand that some are 'pre' overclocked (and hence better, if the CPU is not a limiting factor), and from what I'm reading EVGA sounds good quality-wise.

For what it's worth, I have an Asus motherboard and a 9900K overclocked to 5 GHz (hopefully it may go to 5.1 or 5.2). Use case is VR; I have a Reverb G2.

Any tips appreciated, apologies if this is a noob question!

I would not bother trying to push your 9900K over 5 GHz, as it is unlikely to boost your fps. At the native resolution the HP G2 runs (i.e. greater than 4K), fps is pretty much dependent on GPU processing power. I have the same CPU and can redline (run at 100%) a 2080 Ti using the G2, while the CPU is running at 50% or less when running IL2.

 

I recently acquired fpsVR, as recommended by other forum members, and it is a great tool for identifying where the bottlenecks are.

 

If you are prepared to tune your 3090, as SCG_Fenris_Wolf has suggested, it may not matter much which brand you choose, but I would recommend reading several reviews of each card you are considering. Some manufacturers tune their 3090s for max speed (and use 25% more power) to gain 1-2 fps, while others aim for quiet or cool running; as I am not much of an OC'er, the base-tune approach is of interest to me.

 

My very generalised interpretation, based on online reviews only, as I do not have a 3090, is:

EVGA - tuned for max power usage and does not get much fps improvement as a result; if I had one of these cards I would have to undervolt it before my case or power supply melted :(

Asus - tuned to run cool

MSI - tuned to run quiet

 

I personally would avoid buying a 3090 from a brand I have never heard of before.

 

Happy hunting, but as several other 3090 owners have noted on this forum, you will still need to make some compromises in your G2 or IL2 settings even with this powerhouse of a card.

 

6 hours ago, RAAF492SQNOz_Steve said:

I would not bother trying to push your 9900K over 5 GHz, as it is unlikely to boost your fps. At the native resolution the HP G2 runs (i.e. greater than 4K), fps is pretty much dependent on GPU processing power. I have the same CPU and can redline (run at 100%) a 2080 Ti using the G2, while the CPU is running at 50% or less when running IL2.

I recently acquired fpsVR, as recommended by other forum members, and it is a great tool for identifying where the bottlenecks are.

If you are prepared to tune your 3090, as SCG_Fenris_Wolf has suggested, it may not matter much which brand you choose, but I would recommend reading several reviews of each card you are considering. Some manufacturers tune their 3090s for max speed (and use 25% more power) to gain 1-2 fps, while others aim for quiet or cool running; as I am not much of an OC'er, the base-tune approach is of interest to me.

My very generalised interpretation, based on online reviews only, as I do not have a 3090, is:

EVGA - tuned for max power usage and does not get much fps improvement as a result; if I had one of these cards I would have to undervolt it before my case or power supply melted :(

Asus - tuned to run cool

MSI - tuned to run quiet

I personally would avoid buying a 3090 from a brand I have never heard of before.

Happy hunting, but as several other 3090 owners have noted on this forum, you will still need to make some compromises in your G2 or IL2 settings even with this powerhouse of a card.

 

Nay, the 5.0 GHz+ overclock, along with the fastest RAM you can get, will give you more FPS, and steadier FPS at that, in VR. The game uses a single thread to calculate all of the plane movement and other physics, and it needs more speed than it can get; it is not optimized to utilize the wealth of cores modern CPUs have. If you have an Intel CPU, turn off hyperthreading and run that one core as fast as you can get it to go. Set the AVX offset to 0. I tried a bunch of speeds and cooling methods to get to 5.1 or 5.2 GHz, and it matters. I went from a 2800 CL15 RAM kit to a 3200 CL14 kit, and that also made a difference. My system won't let me run the RAM any faster, otherwise I would.

57 minutes ago, Bernard_IV said:

Nay, the 5.0 GHz+ overclock, along with the fastest RAM you can get, will give you more FPS, and steadier FPS at that, in VR. The game uses a single thread to calculate all of the plane movement and other physics, and it needs more speed than it can get; it is not optimized to utilize the wealth of cores modern CPUs have. If you have an Intel CPU, turn off hyperthreading and run that one core as fast as you can get it to go. Set the AVX offset to 0. I tried a bunch of speeds and cooling methods to get to 5.1 or 5.2 GHz, and it matters. I went from a 2800 CL15 RAM kit to a 3200 CL14 kit, and that also made a difference. My system won't let me run the RAM any faster, otherwise I would.

Thanks for the response. What type of difference did overclocking the CPU and adding cl14 RAM make to your fps with IL2?

1 hour ago, RAAF492SQNOz_Steve said:

Thanks for the response. What type of difference did overclocking the CPU and adding cl14 RAM make to your fps with IL2?


My CPU went from 4.3 GHz to 5.0 GHz: smoother co-ops in VR.

Something I wasn't expecting: after re-doing the CPU's thermal paste (20°C lower temps), I more often hold 90 FPS in VR at lower altitudes.


I didn't collect any quantitative proof with FPS counters or whatever, but you get way fewer frame drops when you are diving on planes over a city, and in various other situations like furballs. With the slower speeds you just start dropping frames in those situations. It still isn't perfect, but it's much better the more speed you can get.

23 hours ago, RAAF492SQNOz_Steve said:

I would not bother trying to push your 9900K over 5 GHz, as it is unlikely to boost your fps. At the native resolution the HP G2 runs (i.e. greater than 4K), fps is pretty much dependent on GPU processing power. I have the same CPU and can redline (run at 100%) a 2080 Ti using the G2, while the CPU is running at 50% or less when running IL2.

 

I recently acquired fpsVR, as recommended by other forum members, and it is a great tool for identifying where the bottlenecks are.

 

This has been said before, but IL2 likes every little bit of frequency you can give it. Citing "my CPU runs at 50% or less" doesn't tell us anything: IL2 has a couple of 'hot' threads, and overall CPU utilization can be low while you are still hitting a CPU bottleneck. If you take a look at fpsVR it will show you any frame time spikes, which are worse in multiplayer or with lots of AI, and even a single spike can drop you from 90 down to reprojected 45.
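A toy illustration of that frame-budget point: the 90 Hz refresh rate is the G2's, but the frame times below are invented examples, not measured data.

```python
# At 90 Hz the renderer has ~11.1 ms per frame; any frame over budget
# becomes a dropped or reprojected frame. Frame times are invented.

REFRESH_HZ = 90
BUDGET_MS = 1000 / REFRESH_HZ  # ~11.11 ms per frame

frame_times_ms = [9.8, 10.5, 12.3, 9.9, 10.8]  # one spike over budget
over_budget = [t for t in frame_times_ms if t > BUDGET_MS]

print(f"budget {BUDGET_MS:.2f} ms, frames over budget: {len(over_budget)}")
# Average CPU utilisation can look low while one hot thread causes a
# spike like the 12.3 ms frame -- and one spike is enough to stutter.
```

This is why a "CPU at 50%" reading can coexist with a CPU bottleneck: the budget is per-frame, not an average.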

 

So yes, going from 5.0 to 5.1 GHz on a 9900K might not get you much, but it will help in marginal situations. I trust Fenris' advice on this, and he's seen significant improvements switching to a new AMD chip, thanks to its strong IPC and low memory latency.

 

8 hours ago, Bernard_IV said:

I didn't collect any quantitative proof with FPS counters or whatever, but you get way fewer frame drops when you are diving on planes over a city, and in various other situations like furballs. With the slower speeds you just start dropping frames in those situations. It still isn't perfect, but it's much better the more speed you can get.

 

The benchmark thread has plenty of quantitative proof; here's a recent summary that Chili posted. Basically, a 5600X with an average memory setup is equal to a 5.2 GHz 9900K, a 5600X with good memory settings is equal to a 5.4 GHz 10900K, and the rest of the chips (5800X, 5900X, and 5950X) crap on the Intel stuff from a great height.

 

 


Those look great. As soon as the DDR5-based stuff comes out in a year or so I'm going to upgrade. It looks like it will finally be worth it. We'll see if the Intel chips can catch up.

10 hours ago, Alonzo said:

<snip> The benchmark thread has plenty of quantitative proof; here's a recent summary that Chili posted. Basically, a 5600X with an average memory setup is equal to a 5.2 GHz 9900K, a 5600X with good memory settings is equal to a 5.4 GHz 10900K, and the rest of the chips (5800X, 5900X, and 5950X) crap on the Intel stuff from a great height. <end of snip>

 


Is it as clear cut as that for Zen 3 superiority in all situations? 

 

When running an HP Reverb G2 we are talking about resolutions above 4K, and as YouTube presentations like the one below show, things change. The Ryzen chips are clearly superior at resolutions below 4K and for business apps, but the pendulum swings back in quite a dramatic fashion (given how much the i7 was trailing) for 4K gaming. I wonder if this swing continues at G2 resolutions? Here in Oz you can get an i7 10700KF for $520, but a Ryzen 5800X will set you back $730+ and stock is scarce. If you want to run an HP Reverb G2, is the Ryzen the best option?

 

P.S. The summary of gaming performance is close to the end (at about the 9:30 mark) if you want to save some time.

 

Disclaimer: I own neither chip but am pondering my options along with a GPU upgrade.

 

 


The pendulum doesn't swing back at 4K. Both processors perform similarly there because the GPU becomes the limiting factor.

For IL2, I consider us always CPU-limited, because I'd always like to add more units on the ground and in the air. I mean, I can set my graphical settings to meet the limitations of my GPU; even Balanced looked OK to me (except for the clouds). But when it comes to the CPU, any headroom I get I can easily spend on more planes and units, and it would still feel like too little.

6 hours ago, coconut said:

The pendulum doesn’t swing back at 4K. Both processors perform similarly there because the GPU becomes the limiting factor.

 

Yeah, I think a lot of the issue is that VR doesn't scale like pancake and it doesn't have tech like variable refresh rate monitors. In pancake you can turn off vsync and get improved performance right up to hundreds of FPS, and with a 144hz monitor some of that is even useful performance improvement. With VR, going above the refresh rate of the headset doesn't give you a better experience, it gives headroom to allow for game situations, PC stutters, or whatever that would otherwise drag you below the HMD refresh rate and cause a poor experience. And the severity of "poor experience" depends on the perception of the player -- for me, any frame drop is very obvious and I hate it.

 

We can use pancake benchmarks to draw some conclusions about a VR experience, but not many. In pancake, there's always a bottleneck. In VR you actually want no bottleneck (more precisely, you want the refresh rate of the HMD to be the bottleneck). The consequences of running out of headroom are not like pancake gaming, where you lose a few frames from your average FPS and can't even see it thanks to a variable refresh rate monitor -- the consequences are dropped frames and a crummy experience.
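To make that concrete, here is a contrived pair of frame-time traces with identical average FPS but very different VR experiences; every number is invented purely for illustration.

```python
# Two frame-time traces with identical average FPS. In a pancake
# benchmark they score the same; in VR the spiky one drops frames.
# All numbers are invented for illustration.

steady = [11.0] * 10        # every frame inside a ~11.1 ms budget at 90 Hz
spiky = [9.0] * 9 + [29.0]  # same mean frame time, one large stutter

for name, trace in (("steady", steady), ("spiky", spiky)):
    avg_fps = 1000 / (sum(trace) / len(trace))
    print(f"{name}: {avg_fps:.1f} FPS average, worst frame {max(trace)} ms")
# Both report ~90.9 FPS average, but only the steady trace would feel
# smooth in an HMD -- the 29 ms frame is a visible dropped frame.
```

This is the sense in which an average-FPS benchmark can hide exactly the thing that matters for VR smoothness.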

 

10700KF for $520 vs 5800X for $730 - is it worth it? Difficult to answer. You get some percentage more headroom from the 5800X, possibly as much as 30% or 40%. If that headroom allows you to do something smoothly in VR that you otherwise can't, and smoothness in that scenario is critical to you, then yes, it's worth the money. Fenris has found that his AMD chip allows him to fly multiplayer in VR with a G2 at a solid 90 FPS. He very much dislikes frame drops, so for him the AMD chip is worth it. For someone else, playing single player with light AI missions, or for whom a frame drop is not a big deal, it's probably not worth the money.

  • 3 weeks later...

So sick of Nvidia and their antics. Really... they can't come close to keeping up with demand for the 30 series cards, then they introduce more versions they can't supply, they play games over whether potential additional options like a 3080 Ti are real or not, and now they may be reintroducing a past-generation card for who knows what purpose. They suck, but they know they can basically do whatever they want, because there is not enough competition to keep them truly honest.

We who need a new card are left settling for older cards, getting gouged by resellers, or living with what we have until something changes.


They planned this shortage. I remember, back in August, everyone on tech blogs was surprised to find out that Nvidia stopped production of most 2000 series cards, long before they had a replacement.

Many realized then that this was done to create an artificial shortage, so they could pump prices again.

Now all their cards got a bump in price...

Stevie Wonder could see this coming.

 

2 minutes ago, Jaws2002 said:

They planned this shortage. I remember, back in August, everyone on tech blogs was surprised to find out that Nvidia stopped production of most 2000 series cards, long before they had a replacement.

Many realized then that this was done to create an artificial shortage, so they could pump prices again.

Now all their cards got a bump in price...

Stevie Wonder could see this coming.

 

 

Maybe now that AMD is getting closer to giving them a run in the performance department, that might change before too long.

Nvidia has been sitting on top a little too long. It started with the excessive pricing on the 20-series cards.


The problem with AMD is that the 7nm TSMC node is now the centerpiece of a lot of hardware, and they have limited production capacity.

18 minutes ago, dburne said:

It started with the excessive pricing on the 20-series cards.

 

That's why so many of us stayed with the 1000 series cards. The price of the 2000 series was an insult.

26 minutes ago, Jaws2002 said:

The problem with AMD is the 7nm TSMC node is now the center piece of a lot of hardware, and they have limited production capability.

 

That's why so many of us stayed with the 1000 series cards. The price of the 2000 series was an insult.


This. I lolled so hard when I first saw the prices of the 20-series. Now I'm lolling again, because it'll be a cold day in hell before I pay the upcoming inflated 30-series prices. Hopefully NV will be punished by thousands of others with a similar FY attitude. Sick of the obscene price-gouging antics of these DBs.


 

56 minutes ago, Jaws2002 said:

That's why so many of us stayed with the 1000 series cards. The price of the 2000 series was an insult.


That is certainly true in my case. I built a nice beefy machine a year ago but opted to put my aged GTX 1080 in it, because the higher-end 2000 series cards were so ridiculously priced. I decided to wait for the new generation, and now, due to Nvidia's BS, I can't get what I want AND when I can get one the price has been jacked up. Monopolies, or anything close to them, result in irresponsible behavior and put the customer last in their considerations. That's what we're seeing from Nvidia.
 

And sadly, AMD can't supply the products that could be reasonable alternatives. I was close to deciding to go red with my GPU buy, even though I have always bought green and AMD is not quite as good as Nvidia. But they have failed miserably to take advantage of their opportunities, so as far as I'm concerned they suck too!

