
Nvidia teases a 21-day countdown to the unveiling or release of the RTX 3000


Recommended Posts

Posted (edited)
2 minutes ago, chiliwili69 said:

 

We should be cautious about what performance increase we would get in IL-2, a non-ray-tracing game.

NVIDIA was mixing RT and non-RT performance in those tables, so I'm not sure what the effective gain is for non-RT games.

 

So I will stick to my 1080Ti until we have solid proof of sufficient performance gain in IL-2 VR. 

Remember that NVIDIA is very, very clever with its marketing presentations, and at making you believe things that are not really true.

 

Yeah, true. I will wait for real-life benchmarks before I bite. However, the 3070 is the most likely candidate to replace my trusty old SLI 980 combo.

Edited by sevenless
Posted
On 9/3/2020 at 8:02 PM, Jaws2002 said:

tldrdoto – Please clarify if the slide saying RTX 3070 is equal or faster than 2080 Ti is referring to traditional rasterization or DLSS/RT workloads?

Very important if you could clear it up, since no traditional rasterization benchmarks were shown, only RT/DLSS supporting games.

[Justin Walker] We’re talking about both. Games that only support

 

What a short answer from Nvidia. "We're talking about both" doesn't seem to be a very scientific answer.

 

For sure, in the new 30X0 series the RT games will see a much higher performance increase than the non-RT games. But they don't compare that.

I would like to see a chart comparing Pascal, Turing and Ampere with no Ray Tracing.

Posted

We'll have to wait for post-release benchmarks, from people not on Nvidia's payroll, to get the truth about these cards.

 

Posted

I will need to test the new AMD Zen 3 processor together with the Reverb G2 headset on my current system before I decide whether to upgrade from my 2070 Super card, or not.

 

If I can get a consistent 90 fps in IL2, or thereabouts, with this setup then I won't be buying a 30xx card just for the sake of it.

Posted
2 hours ago, Vortice said:

I will need to test the new AMD Zen 3 processor together with the Reverb G2 headset on my current system before I decide whether to upgrade from my 2070 Super card, or not.

 

If I can get a consistent 90 fps in IL2, or thereabouts, with this setup then I won't be buying a 30xx card just for the sake of it.

You might be around there or close with the graphics turned down, clouds on low and no anti-aliasing. Full high-settings graphics on the Reverb is tough. The 3070 looks like a tremendous bargain if it is in the 2080 Ti ballpark. I'm waiting on a few benchmarks to see if the 3090 is impressive enough to drop that much coin on; otherwise the 3080 will suit my needs.

Posted
2 hours ago, Vortice said:

I will need to test the new AMD Zen 3 processor together with the Reverb G2 headset on my current system before I decide whether to upgrade from my 2070 Super card, or not.

 

If I can get a consistent 90 fps in IL2, or thereabouts, with this setup then I won't be buying a 30xx card just for the sake of it.

 

8 minutes ago, Bernard_IV said:

You might be around there or close with the graphics turned down, clouds on low and no anti-aliasing. Full high-settings graphics on the Reverb is tough. The 3070 looks like a tremendous bargain if it is in the 2080 Ti ballpark. I'm waiting on a few benchmarks to see if the 3090 is impressive enough to drop that much coin on; otherwise the 3080 will suit my needs.

 

I have a 2080 (regular) which is in the ballpark of the 2070 Super, and I run my Valve Index at a total resolution a little higher than the Reverb's native res. I've decided to accept "shadows off" as a compromise for getting good frame rates, so I think you could be good with your card. I'd be more skeptical of the AMD processor's ability in VR -- make sure you use the tools that show whether it's a CPU or GPU bottleneck, and if CPU then find things to reduce CPU load (going from High to Balanced can reduce CPU load, for example).

 

But yes, the 3070 looks like it's a great value card. Probably the "go to" recommendation for VR players now.

 

I was benching my 2080 XC Ultra yesterday and it looked highly temperature limited, so I'm seriously considering a 3080 hybrid water cooler.

Posted
5 hours ago, Alonzo said:

I was benching my 2080 XC Ultra yesterday and it looked highly temperature limited, so I'm seriously considering a 3080 hybrid water cooler.

 

I have an EVGA 2070 Super fitted with an EKB full cover water block in a custom loop with two 360 radiators and the card never goes above 60 Celsius whatever happens, so temperatures are not a limiting factor for me in this case.

 

If the new CPU can give me the extra grunt I require to run IL2 decently in VR, then the 2070 should be able to give me all the eye candy to go with it, I hope.

 

But this is all just speculation, and possibly wishful thinking, at the present moment, so until I get the new CPU and the G2 and try them out together I cannot be certain.

Posted (edited)
6 hours ago, Alonzo said:

I'm seriously considering a 3080 hybrid water cooler.

 

I would like a hybrid-cooled card, but I already have a 360 mm AIO for the CPU and I only have room for a radiator at the bottom of my case. Mounting an AIO with the pump/GPU block as the highest point in the loop is asking for all kinds of problems, because any air and bubbles in the loop will collect in the pump.

I may just get an air-cooled card, or maybe get a kit and make a custom loop for both the GPU and the CPU.

Edited by Jaws2002
Posted

Even doing 3D rendering I’ve never needed water cooling. Seems like an expensive, unnecessary complication/risk.

ZiggyZiggyStar
Posted
On 8/17/2020 at 12:27 PM, RAAF492SQNOz_Steve said:

Us Aussies will think that $2,600 would be a bargain if they price the RTX 3090 here in Oz with the same mark-up that HP Australia tried to get for the HP Reverb G2 at $1,400 AUD.

I think Canada is a bit under $900 for the Reverb G2, and given that the Aussie dollar is close to parity with the Canadian dollar, HP Australia were really going for it in the maximum-profit-margin stakes :cray:

With that kind of inflated price they would be asking around $4,000 AUD :tease: for the RTX 3090 in down-under land.

 

In the end HP Australia had to back off on the price gouging to a mere $1100 AUD and ran the line that the original price was a mistake.

I had been talking to HP customer service during this time, and sorry HP Aust, it was not a mistake: they had been pondering the price for several weeks before trotting out the $1,400 one.

 

Fingers crossed, for all concerned, that the new GPUs come in at the low end of pricing expectations. :drinks:


Yes, looks like HP Australia must have been trying to pull a fast one. I have preordered the G2 here in NZ for just under $NZ 1100.

Posted (edited)

It is not clear to me how the pure rasterization performance will compare to the 2080 Ti.

Whatever the GPU is, the maximum clock speed per CPU core remains a major performance criterion for IL-2. The scheduling implemented in Windows 10 on the CPU side will also help, but I do not know by how much for IL-2.

I would consider two parameters (outside of RTX/DLSS) that make more sense for IL-2: the number of CUDA cores and the new memory management.

1) CUDA cores (the arithmetic is restated in the short sketch after point 2)

The 2080 Ti has 4352 CUDA cores.

The 3080 has 8704 CUDA cores, which is 100% more, or x2.

The 3090 has 10496 CUDA cores, which is 141% more, or x2.4.

 

2) The new memory management over PCIe 4.0, which avoids the round trip through the CPU.

That could vastly improve the loading of scenes on large and complex maps with a high number of objects.

It is not clear to me, however, whether this needs specific implementation in IL-2 by the devs.
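To restate the core-count arithmetic from point 1 as a minimal sketch (spec-sheet counts only; it says nothing about how much work each Ampere core actually does per clock):

```python
# Raw CUDA-core ratios versus the 2080 Ti, using only the published spec-sheet
# counts. This is pure counting, not a performance prediction.
CUDA_CORES = {"2080 Ti": 4352, "3080": 8704, "3090": 10496}

baseline = CUDA_CORES["2080 Ti"]
for card, cores in CUDA_CORES.items():
    ratio = cores / baseline
    print(f"{card}: {cores} cores, x{ratio:.1f} ({(ratio - 1) * 100:.0f}% more)")
# 3080: x2.0 (100% more), 3090: x2.4 (141% more)
```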

 

Potentially we could expect at least a x1.3-1.5 improvement in FPS at 4K with a top CPU running 5 GHz per core.

I say 4K because at lower resolutions the CPU has to process much more data as the frame rate increases, and in IL-2 it is the truly critical component.

At 4K or above the GPU has to handle an enormous number of pixels, so it becomes the critical component.

Where the perfect equilibrium between CPU and GPU lies is difficult to say.
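For a feel of why the bottleneck moves with resolution, here is a trivial pixel-counting sketch; nothing IL-2-specific, and the 90 fps figure is just an illustrative target:

```python
# Rough pixel-throughput arithmetic: the GPU's per-second pixel load grows with
# resolution, while per-frame CPU work (AI, physics, draw-call setup) mostly
# scales with frame rate instead.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def pixels_per_second(width, height, fps):
    return width * height * fps

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {pixels_per_second(w, h, 90) / 1e6:.0f} Mpixels/s at 90 fps")
# 4K pushes roughly 4x the pixels of 1080p per frame, so the GPU saturates first
# there, while at lower resolutions the higher frame rates pile the work onto the CPU.
```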

Besides that, if the devs could at least implement DLSS 2.0, that would allow excellent anti-aliasing with little or no hit to FPS.

I can imagine that ray tracing would be too difficult to implement in the IL-2 graphics engine, but it could bring spectacular visual effects on water, snow, ice and atmospheric objects like clouds.

Edited by IckyATLAS
Posted

As luck would have it, the company I work for has been tossing $100 gift cards at us to keep us working through this pandemic. So the $700 price range is reachable for a 3080 card when they come out. I haven't read through this entire thread, but at one point I thought I'd read there was a new power connector for these new cards. From my last build two years ago, I already have a 750 W power supply. Would that need to be updated too? And what about a BIOS update? Those have always made me nervous.

 

I also want to say how much I'm awed by those of you who can buy the latest, greatest, fastest and outrageously expensive computer parts and then overclock the Hell out of them to get 3 or 4 more FPS. You have bigger ballz than I ever had. :)

Posted
1 hour ago, Rjel said:

I also want to say how much I'm awed by those of you who can buy the latest, greatest, fastest and outrageously expensive computer parts and then overclock the Hell out of them to get 3 or 4 more FPS. You have bigger ballz than I ever had. :)

I jump the gun all the time. My ballz are useful and have made 3 kidz.

However, I never considered it from that perspective. It's more a kind of stupidity when you have the money,

and a question of how one prioritizes in this life; on the latter you might proudly hold high ratings.

Posted
7 hours ago, Gambit21 said:

I’ve never needed water cooling.

I do understand that watercooling is interesting in some cases, but IMHO only FULL ... no fans anymore.

What is the advantage of hybrid ... you might end up with problems with both cooling solutions !?

If maximum OC is the idea then I do understand watercooling, but only for the 3090 ... because there is not a faster one ... yet anyway and e.g. the Kingpin is then an option (if you have +2000 USD laying around).

Watercooling does LOOK COOL though !!! 

[Doing away with fans completely would be marvellous ... I already had an expensive GPU card where the fan was defective and the only/better way was to simply ... buy a new card].

 

Posted
2 hours ago, Rjel said:

I haven't read through this entire thread but at one point I thought I'd read there was a new power connector for these new cards. From my last build from two years ago, I already have a 750w power supply. Would that need to be updated too? And what about a bios update? Those always have made me nervous.

 

The new 12-pin power connector is only implemented on the 3090 and it's not a big deal. Whoever makes the card will provide a 12-pin to 2x8-pin adapter, so you can connect it to a pair of 8-pin cables coming out of your 12-volt rail. No biggie.

I don't think updating the BIOS is required. On my board it is no longer the scary job it used to be. I download the BIOS from the vendor's website, format a small USB memory stick as FAT32, then copy the BIOS onto the stick. Shut down the computer, clear the CMOS (there's a handy button in the I/O section of the board on the back of the case), insert the USB stick into the dedicated Q-Flash USB port on the back of the PC and restart the computer. Then in the BIOS I just select Q-Flash and pick the BIOS to install.

If something goes wrong, I have a dual BIOS on the motherboard and I can just manually switch to the other one.

No need to know DOS or other voodoo like before.

And about overclocking: again, it is somewhat risky, but when you compare the price of a 3090 vs a 3080, for example, it makes a lot more sense to get a 3080 and overclock it a little than to pay twice the price for a 3090.

 

29 minutes ago, simfan2015 said:

I do understand that watercooling is interesting in some cases, but IMHO only FULL ... no fans anymore.

What is the advantage of hybrid ... you might end up with problems with both cooling solutions !?

If maximum OC is the idea then I do understand watercooling, but only for the 3090 ... because there is not a faster one ... yet anyway and e.g. the Kingpin is then an option (if you have +2000 USD laying around).

Watercooling does LOOK COOL though !!! 

[Doing away with fans completely would be marvellous ... I already had an expensive GPU card where the fan was defective and the only/better way was to simply ... buy a new card].

 

 

You always have fans, even with water cooling; what do you think cools the water in the radiators? The hybrid cards use water to cool the GPU and maybe the memory, because those generate the most heat and cooling them with air requires big, noisy fans; the small fan is there just to cool the rest of the PCB and, in some cases, the VRAM. It is a quiet way to get a lot of extra performance out of the card. Some of these hybrid cards, like the Kingpin, come with significantly beefier power delivery and a custom BIOS, so they will be able to overclock much better than a vanilla card that's water cooled.

And again, the 3090 is twice as expensive as the 3080, so it makes sense to overclock and water cool the 3080.

 Doing away with fans is not exactly feasible, because power passes through all components and all generate heat, so some case fans are always required, but you can use less noisy ones.

 

Posted (edited)
2 hours ago, Jaws2002 said:

And again, the 3090 is twice as expensive as the 3080, so it makes sense to overclock and water cool the 3080.

Yes I understand the price difference.

But taking into account the price of a water-cooled and thus overclocked 3080 ... is it also much, *much faster* (at least, say, 10-20 percent) than the stock 3080? Does it then have 24 GB of VRAM?

Is this not more like car-tuning ... very expensive but indeed nice to look at and brag about ! 

I OC my GPUs too, but have to admit I know exactly 0 about performance benefits between watercooling and air-cooling (except less noise) ... but did read about some (supposed?) drawbacks !?

2 hours ago, Jaws2002 said:

Doing away with fans is not exactly feasible

Indeed, most probably not for cards like the 3080 and 3090 !

However I thought there were indeed fan-less solutions so thank you for pointing this out !

Edited by simfan2015
Posted (edited)

The advantages of water cooling are more obvious under heavy load. My air-cooled graphics card is the noisiest part of my rig when I play games or use the computer for heavier tasks. It completely drowns out the hum of the all-in-one (three fans and a 360 mm radiator) I use for the CPU.

Water is also significantly better at heat transfer, and you have a good amount of water that has to warm up before the CPU/GPU hits its thermal limits. That means longer maximum-boost bursts.
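To put a rough number on that thermal-mass point, a back-of-the-envelope sketch; the litre of coolant, the 300 W load and the 20 °C allowable rise are purely illustrative assumptions:

```python
# Back-of-the-envelope: how long ~1 litre of coolant can soak up a 300 W load
# before warming by 20 °C, ignoring what the radiators dissipate in the meantime
# (so the real thermal buffer is even longer). All inputs are illustrative.
WATER_SPECIFIC_HEAT = 4186.0  # joules per kilogram per kelvin

def buffer_seconds(mass_kg, delta_t_kelvin, load_watts):
    energy_joules = mass_kg * WATER_SPECIFIC_HEAT * delta_t_kelvin
    return energy_joules / load_watts

print(f"{buffer_seconds(1.0, 20.0, 300.0) / 60:.1f} minutes of pure thermal buffer")
# ~4.7 minutes, versus the few grams of metal an air cooler has to work with.
```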

There are obvious drawbacks: the extra failure point in the pump, the price, more complicated installation, the risk of damage to components in case of a leak, and having to buy a new water block every time you replace the graphics card.

 

Edited by Jaws2002
Posted

We need to wait for reviews to know whether water cooling is “worth it” for these new chips. For the last generation, they were heat and power limited, so if you could cool it better and use an unlocked bios, you could get more out of the chip.

 

But for other generations, there was little to gain from liquid cooling the chip. So again, we need to wait for tests by an outfit like Gamers Nexus, who actually know what they’re doing.

 

Why buy a hybrid card? For the same reason you might consider a liquid cooler for your CPU: they do a good job and are much less hassle than a custom loop. But if a hybrid card is $200 more than a regular one, the "worth it" factor comes into play.

Posted (edited)

Yeah, while my EVGA RTX 2080 Ti FTW3 is a pretty cool-running card (never seen it get above 65°C in gaming), I do have all three fans running full bore at 100% when I am gaming. And I run it overclocked to 2.15 GHz with max voltage and power. Fortunately I use earbuds with my Rift S, so they isolate all that exterior noise for me.

I will definitely be holding out for another EVGA FTW3 model among the 3090 cards.

They keep good resale value as well, although this time I am sure I will be taking a hit when I sell the 2080 Ti FTW3, due to the pricing structure of the 3080 cards.

Edited by dburne
Posted

As you already noticed, I do bitch a lot about the 3090, but I may still buy it. I want a worthy successor to my venerable 1080 Ti.

Posted
23 hours ago, IckyATLAS said:

The number of CUDA cores and the new memory management.

The 2080 Ti has 4352 CUDA cores.

The 3080 has 8704 CUDA cores, which is 100% more, or x2.

The 3090 has 10496 CUDA cores, which is 141% more, or x2.4.

This is the kind of thinking (more CUDA cores = more performance) which may no longer work here.

That increase in CUDA cores comes from duplicating the processing lines of the shader chip; in fact the 2080 Ti has more CUDA cores per line.

It remains to be seen whether that increase in CUDA cores is as important as it was before.

Posted

I think that link was there for a while. Anyway, there has to be a 3080 Ti, because the gap between the 3080 and the 3090 is too big. Whether it is going to be this year or early next year remains to be seen.

 

Posted

@chiliwili69 It's actually a bit more interesting than that. Apparently each Cuda core used to have an FP32 and an Int32 module. 

 

What they've done here is modify the Int32 part so it can now run either Int32 or FP32 instructions. So we should actually see pretty big lifts in games that are very dependent on FP32 instructions, but much less in the ones that live in Int32 land.

 

At the moment I'm probably going to get the 3090 myself, but I am going to be very interested in seeing the Dx11 and FS2020 benchmarks* of the card before I actually buy it. 

 

Depending on what is and is not essential to IL-2, and what is expected of Big Navi (2.23 GHz, 80 CUs), it is conceivable that it could be faster than the 3080, but again, that will depend on how flight sims use the CUDA cores. We could, just as easily, see a near-perfect 2x lift. Or none at all. We just don't know yet.

 

It's going to be interesting times :)

 

Harry Voyager

 

 

*From what I've seen of the FS2020 and Il-2 benchmarks, FS2020 does seem to take a pile driver to the same sort of things that Il-2 and other major flight sims in the market do, and as good as Il-2 is, I suspect the only flight sim we're going to see benchmarks of will be FS2020

Posted (edited)

For benchmarks in IL-2, I am afraid we will have to buy the card and do it ourselves. That is expensive for a 3090, but ultimately the same price level as a 2080 Ti.

For me, buying a 3090 would make sense as part of uplifting my whole rig. I am an Intel customer, so I am very unhappy about the CPU situation at the moment. The third version of Ryzen, which will shortly be announced, may this time overtake Intel in games, where they have reigned undisputed until now. Ryzen 2 just matched them, but if Ryzen 3 can beat their clock speeds then Intel may lose the battle. Rocket Lake is too far in the future, and it is not clear to me how much better it will be anyway.

 

I would appreciate it if the devs would do this and show us what the optimal CPU/GPU choice is specifically for IL-2.

They developed the code, so they should know exactly what will run their game best.

 

Edited by IckyATLAS
Mitthrawnuruodo
Posted
4 minutes ago, IckyATLAS said:

I would appreciate it if the devs would do this and show us what the optimal CPU/GPU choice is specifically for IL-2.

They developed the code, so they should know exactly what will run their game best.

 

The developers probably don't know. Rendering performance depends on the graphics driver, which is beyond their control. Furthermore, it's unlikely that they always have the latest high-end hardware to test the game.

 

Besides, the "optimal" configuration really varies once you take into account costs and different display devices. If you're strictly looking for the maximum performance, the choices are usually quite obvious.

Posted
7 hours ago, dburne said:

Augh, now it seems there will also be a 3080 Ti coming this year. Now I will have a tough choice: try to get the 3090 since it releases earlier, or wait for the 3080 Ti.

Guess I will wait and see how availability goes with the EVGA FTW3 model.

 

https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-ti.c3581

 

You might also want to at least wait for AMD's offering. It's very plausible they will be able to compete with the 3080, possibly offering a 20GB card at around that level of performance. If they do, it might kick NVidia to release the 3080ti sooner (and I firmly believe a 3080ti is going to be better price/performance than the 3090).

 

2 hours ago, Jaws2002 said:

I think that link was there for a while. Anyway, there has to be a 3080 Ti, because the gap between the 3080 and the 3090 is too big. Whether it is going to be this year or early next year remains to be seen.

 

I'm going to try to wait and see. Am currently re-overclocking my 2080 to see what I can wring out of it for VR performance while the 3080/Big Navi situation settles itself.

 

54 minutes ago, Voyager said:

@chiliwili69 It's actually a bit more interesting than that. Apparently each Cuda core used to have an FP32 and an Int32 module. 

 

What they've done here is modify the Int32 part so it can now run either Int32 or FP32 instructions. So we should actually see pretty big lifts in games that are very dependent on FP32 instructions, but much less in the ones that live in Int32 land.

 

The "Moore's Law is Dead" youtube guy did some calculations and in terms of raw TFLOPs, a 3000-series "Cuda core" is about one third weaker than a 2000-series.

Posted (edited)
54 minutes ago, IckyATLAS said:

I would have appreciated if the Devs would do this, and show us what is the optimal CPU/GPU choice specifically for IL2.

 

As for the CPU, I can tell you. Two weeks ago I switched to a new rig: an ASUS Maximus XI Hero with an Intel i9-9900K at 5.0 GHz, fully capable of doing 5.2 GHz with an ASUS 360 AIO cooler, and 32 GB of DRAM at 3,800 MHz. I came from an i7-2600K at 4.0 GHz with 16 GB. I am now able to play everything on "high density" settings in SP without any slowdowns in the game timer. My old i7 was barely capable of medium settings and I usually played at scattered density in SP.

I still have my GTX 980 SLI combo running on that board and can do everything on ultra, one notch below the highest settings, at 1920x1080. I do, however, plan to go for the RTX 3070 when it becomes available in October, to have the option of going to 1440p or higher.

So from my recent experience, it is mainly the CPU which is the decisive factor in this game. Differences between the i9-9900K, i9-9900KS and i9-10900K are most likely negligible and in the single-digit percentage ballpark, because a good 9900K chip should easily be able to do 5.2 GHz with a good 360 AIO.

Edited by sevenless
Posted
12 minutes ago, sevenless said:

 

As for the CPU, I can tell you. Two weeks ago I switched to a new rig: an ASUS Maximus XI Hero with an Intel i9-9900K at 5.0 GHz, fully capable of doing 5.2 GHz with an ASUS 360 AIO cooler, and 32 GB of DRAM at 3,800 MHz. I came from an i7-2600K at 4.0 GHz with 16 GB. I am now able to play everything on "high density" settings in SP without any slowdowns in the game timer. My old i7 was barely capable of medium settings and I usually played at scattered density in SP.

I still have my GTX 980 SLI combo running on that board and can do everything on ultra, one notch below the highest settings, at 1920x1080. I do, however, plan to go for the RTX 3070 when it becomes available in October, to have the option of going to 1440p or higher.

So from my recent experience, it is mainly the CPU which is the decisive factor in this game. Differences between the i9-9900K, i9-9900KS and i9-10900K are most likely negligible and in the single-digit percentage ballpark, because a good 9900K chip should easily be able to do 5.2 GHz with a good 360 AIO.

 

:good:

 

Yep - I am running my 9900K at 5.2 GHz and it runs IL-2 GB in VR very nicely. Only on rare occasions does my Rift S drop into ASW.

Hoping the new 3090 card will allow it to never drop into ASW. However, I will also be getting the Reverb G2 I have on pre-order, so it will be interesting to see how the rig and new GPU handle the Reverb G2.

Posted
3 hours ago, dburne said:

 

:good:

 

Yep - I am running my 9900K at 5.2 GHz and it runs IL-2 GB in VR very nicely. Only on rare occasions does my Rift S drop into ASW.

Hoping the new 3090 card will allow it to never drop into ASW. However, I will also be getting the Reverb G2 I have on pre-order, so it will be interesting to see how the rig and new GPU handle the Reverb G2.

I envy you!

Too early to say, but it sounds like a 5.2 GHz CPU, RTX 3090 and Reverb G2 with Valve lenses is finally closing the gap between VR and monitor image quality, bringing VR out of its childhood :)

I wanted to upgrade my old 6700K (in combination with the 2070 Super I bought last year) and mobo, but since the 30X0 reveal and the G2 (I'm currently on a Rift S) I think I'll wait until this time next year for a complete PC and VR headset upgrade.

 

Posted
14 hours ago, Mitthrawnuruodo said:
14 hours ago, IckyATLAS said:

I would appreciate it if the devs would do this and show us what the optimal CPU/GPU choice is specifically for IL-2.

They developed the code, so they should know exactly what will run their game best.

 

The developers probably don't know. Rendering performance depends on the graphics driver, which is beyond their control. Furthermore, it's unlikely that they always have the latest high-end hardware to test the game.

 

Besides, the "optimal" configuration really varies once you take into account costs and different display devices. If you're strictly looking for the maximum performance, the choices are usually quite obvious.

 

Yes, it will be almost impossible for the devs to test lots of different PC (and VR) hardware, but it would be easy enough to produce a basic in-game benchmark, or at least an always-playable recorded track, so we can all test the performance, share results, and determine the gain from everything. Look at this poll.

 

It would be highly beneficial for everyone upgrading CPU/GPU/RAM to know the "expected" gain of the upgrade and see whether it is worth it for them. It would also be useful to see whether your system is running at the performance level of your peers. And it would also be good to know the performance impact of every single graphics option.

This would be really welcome for all the new hardware arriving this year: new AMD/NVidia GPUs, new Intel/AMD CPUs, new VR devices...

 

@Jason_Williams or @Han, I know resources are limited and there is a lot to do, but could you please share with us just your thoughts about this potential in-game benchmark?
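For what it's worth, the number-crunching side of sharing such results would be trivial. A minimal sketch, assuming each person just dumps per-frame times in milliseconds from whatever capture tool they use (the file name and format here are made up for illustration, not an IL-2 output):

```python
# Minimal aggregation of a shared benchmark run: average FPS and 1% lows from a
# plain list of per-frame times in milliseconds (one value per line).
# "frametimes.txt" is a hypothetical example file.
def summarize(frame_times_ms):
    frame_times_ms = sorted(frame_times_ms)
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst_1pct = frame_times_ms[int(len(frame_times_ms) * 0.99):]  # slowest 1% of frames
    low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_1pct_fps

with open("frametimes.txt") as f:
    times = [float(line) for line in f if line.strip()]

avg, lows = summarize(times)
print(f"avg {avg:.1f} fps, 1% lows {lows:.1f} fps")
```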

Posted
20 hours ago, Alonzo said:

The "Moore's Law is Dead" youtube guy did some calculations and in terms of raw TFLOPs, a 3000-series "Cuda core" is about one third weaker than a 2000-series.

 

Haven't seen that one yet, but I think he is seriously misinterpreting it. 

 

If you take a look at the 2080 Ti, it lists 4352 cores, but each of those cores is a paired FP32 and an Int32 unit, and can do both in a single pass. In contrast, when the 3080 lists "8704" cores, they are actually talking about 4352 FP32 only cores paired with 4352 FP32/Int32 cores. 

 

If you have an even split of FP32 operations and Int32 operations, it takes exactly the same number of passes to complete as on a 2080 Ti. If, in contrast, you have a purely FP32-driven render, you should be able to do it in half the passes that a 2080 Ti would need.
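A toy model of that pass-counting argument, assuming ideal scheduling and the simplified paired-units picture above (real SM scheduling is messier than this):

```python
# Toy model of the pass-counting argument: idealized scheduling, nothing else
# in the graphics pipeline considered.
import math

CORES = 4352  # paired units on the 2080 Ti; "core pairs" on the 3080

def passes_turing(fp32_ops, int32_ops):
    # Each 2080 Ti core can issue one FP32 and one Int32 op per pass.
    return max(math.ceil(fp32_ops / CORES), math.ceil(int32_ops / CORES))

def passes_ampere(fp32_ops, int32_ops):
    # Each 3080 pair has one FP32-only unit and one FP32-or-Int32 unit, so per
    # pass: at most CORES Int32 ops, and at most 2*CORES ops in total.
    return max(math.ceil(int32_ops / CORES),
               math.ceil((fp32_ops + int32_ops) / (2 * CORES)))

work = 1_000_000
print(passes_turing(work // 2, work // 2), passes_ampere(work // 2, work // 2))  # even split: equal
print(passes_turing(work, 0), passes_ampere(work, 0))                            # pure FP32: about half
```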

 

That's why the type of operation we're looking at will matter, and we really aren't able to tell without either benchmarks, or access to the source code plus enough specialist knowledge to work out the ratio of floating-point to integer math it uses. I believe NVIDIA believes that most rendering is limited by floating-point operations, and feels that being able to give up integer operations in exchange for more FP32 ones will be a big advantage.

 

I'm honestly a bit bugged that it got tangled up in marketing speak, but I understand why marketing is pushing it that way.

WheelwrightPL
Posted
16 hours ago, =VARP=Ribbon said:

Too early to say, but it sounds like a 5.2 GHz CPU, RTX 3090 and Reverb G2 with Valve lenses is finally closing the gap between VR and monitor image quality, bringing VR out of its childhood :)

 

Unless you have a 4K monitor, which produces a much sharper image. Also, the G2 has the same pixel density as the first-gen Reverb, so the image isn't any sharper.

Maybe you will be able to super-sample the image on an RTX 3090 at least 2x, which will improve the (frankly atrocious) VR anti-aliasing, but that alone won't make the image sharper. In the final analysis it may be worth it to you, as it is all a personal choice based on multiple trade-offs.

Posted
3 hours ago, Voyager said:

Haven't seen that one yet, but I think he is seriously misinterpreting it.

 

I think you should watch the video. He's purely looking at the overall claimed FPS performance, then looking at the number of cores, doing the division, and saying "therefore a 3000-series 'core' is 30% weaker than a 2000-series core".
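For the curious, that division is easy to reproduce. A back-of-the-envelope sketch, assuming NVIDIA's headline "roughly 2x a 2080" claim for the 3080 and the published core counts (2944 vs 8704); these are the figures being argued about, not measurements:

```python
# Back-of-the-envelope version of the division described above: claimed speedup
# divided by the growth in advertised core count gives an effective per-"core"
# scaling factor. Inputs are the marketing claim and spec-sheet counts only.
def per_core_scaling(claimed_speedup, old_cores, new_cores):
    core_ratio = new_cores / old_cores
    return claimed_speedup / core_ratio

eff = per_core_scaling(claimed_speedup=2.0, old_cores=2944, new_cores=8704)
print(f"per-core scaling: {eff:.2f}x of a 2000-series core")  # ~0.68, i.e. roughly one third weaker
```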

 

Quote

I'm honestly a bit bugged that it got tangled up in marketing speak, but I understand why marketing is pushing it that way

 

I think he'd agree with you. His suspicion is that NVidia is making a lot of hype because they want people to get on the preorder train so they can sell a bunch of 3000 series before AMD releases their chips. He doesn't think AMD will necessarily beat the 3080, but that they will make competitive alternatives, for example a card that's within 10% of the performance of a 3080 but that has 16GB of RAM and is $150 cheaper.

 

Here's the video.

 

Posted

AMD GPUs have *always* been hyped ... B4 release !

I do hope I am wrong this time, but IMHO (big) IF AMD had a killer response to the 3xxx series, they most probably would have rained on Nvidia's parade by now!

 

Posted
11 hours ago, Alonzo said:

I think you should watch the video. He's purely looking at the overall claimed FPS performance, then looking at the number of cores, doing the division, and saying "therefore a 3000-series 'core' is 30% weaker than a 2000-series core".

If we are thinking of the same segment, he's saying it's 30% less efficient, i.e. doubling the number of cores (or was it the TFLOPS? I don't remember) only gives a 60% increase. That isn't very interesting to me (in addition to being of dubious mathematical correctness). The kind of efficiency I care about is how much it costs and how much performance I get.

 

He also makes a big deal out of NVidia's performance-doubling claims being exaggerated, but who cares... 60% is pretty impressive by itself. I'd love to see a 60% performance increase on the CPU side, for instance.

 

I'm not holding my breath. If AMD keeps doing what they've been doing, they'll give us a cheap card with lots of memory and good performance per dollar, buggy drivers, a limited feature set, and something overall not very interesting for VR.

Posted
15 hours ago, WheelwrightPL said:

 

Unless you have a 4K monitor, which produces a much sharper image. Also, the G2 has the same pixel density as the first-gen Reverb, so the image isn't any sharper.

Maybe you will be able to super-sample the image on an RTX 3090 at least 2x, which will improve the (frankly atrocious) VR anti-aliasing, but that alone won't make the image sharper. In the final analysis it may be worth it to you, as it is all a personal choice based on multiple trade-offs.

Hi WheelwrightPL!

I find anything in this game above a 2K 30" monitor unnecessary due to the in-game graphics, but yes, VR will hardly catch monitor image quality.

The point is for VR to catch up with 1080p and 1440p monitor image quality (which, despite the VR screen resolution, the lenses and PPI make hard).

As an owner of a Rift S, I still think the Reverb G2 will be a huge improvement over the Rift S in image quality, due to almost double the resolution and PPI and better lenses; taking the FoV of both headsets into account, the G2 should be around a 70% improvement over the Rift S... just my thought!

From articles I've read, the G2 leaves the Rift S and Index far behind in image quality and sharpness... I hope so!

Anyway, looking forward to reviews and impressions/comparisons from you guys who own a Rift S and have a Reverb G2/RTX 30X0 on pre-order (@dburne and others).

 

S!

 

Posted (edited)

@coconut Your assessment is spot on !

By looking at the AMD t-shirt it was quite obvious to me what he was about to say.

Being an Nvidia fanboy myself ... I can't blame him for it.

For me it all comes down to software ... you can buy the 3090 -or- get the Biggest Navi money can buy almost for free, but if the drivers are buggy it simply doesn't matter (to me).

Nothing is more frustrating than visual glitches and/or game crashes.

But maybe AMD will get it all right this time and (only) then performance per $ and performance per Watt does indeed matter.

YMMV

Edited by simfan2015
