
Big Navi leak? Possibly 2500 MHz?


Recommended Posts

Posted

You guys really get milked in Europe...Not as bad as the poor guys in Australia and NZ, but man.

It sucks.

Posted
1 hour ago, SCG_Fenris_Wolf said:

1500€ is overpriced.

 

Uhh... ya think?

 

Posted
1 hour ago, SCG_Fenris_Wolf said:

I have a 3090FE in the shopping cart right now but I can't get myself to buy it. 1500€ is overpriced.

Depends. If you do content creation or ML then it might make a case for itself. As a gaming-only card? Nah, unless you absolutely need the final few fps. Say IL-2 with a Reverb G2: the 3080 gets you 75-80 fps but the 3090 gets you 90-95? Maybe a case there, but the numbers seen so far don't show it. I'm going 6800XT with a custom block and a 5900X on X570.
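
Quick back-of-envelope on those hypothetical numbers, assuming the launch MSRPs of $699 (3080) and $1,499 (3090), which aren't quoted above:

```python
# Fps per dollar for the hypothetical VR numbers above.
# Assumed launch MSRPs: RTX 3080 $699, RTX 3090 $1,499 (street prices vary).

def fps_per_dollar(fps: float, price: float) -> float:
    """Frames per second delivered per dollar spent."""
    return fps / price

cards = {
    "RTX 3080": (77.5, 699),   # midpoint of the 75-80 fps guess
    "RTX 3090": (92.5, 1499),  # midpoint of the 90-95 fps guess
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.4f} fps/$")
# RTX 3080: 0.1109 fps/$
# RTX 3090: 0.0617 fps/$
```

Even granting the 3090 its optimistic ~19% uplift, the 3080 delivers roughly 80% more fps per dollar.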

  • Upvote 1
Posted (edited)
14 minutes ago, robbiec said:

Depends. If you do content creation or ML then it might make a case for itself. As a gaming-only card? Nah, unless you absolutely need the final few fps. Say IL-2 with a Reverb G2: the 3080 gets you 75-80 fps but the 3090 gets you 90-95? Maybe a case there, but the numbers seen so far don't show it. I'm going 6800XT with a custom block and a 5900X on X570.

 

I've done a ton of 3D work - that card/price is a waste.

Edited by Gambit21
Mitthrawnuruodo
Posted

Apparently the 3090 is quite cost-effective for ML. The 24 GB of VRAM is very useful for models that need it. "Can I fit four of them in my workstation?" seems to be a common question.

Posted (edited)

AMD posted more benchmarks of the new cards.

Videocardz compiled all the data in a better format.

 https://videocardz.com/newz/amd-discloses-more-radeon-rx-6900xt-rx-6800xt-and-rx-6800-gaming-benchmarks

 

These are done with "Smart Access Memory" enabled, so be aware of the added boost given to AMD cards by allowing the CPU to directly access the GDDR6 memory on the graphics card.

 Interesting nonetheless.

 

 

[Chart: AMD Radeon RX 6900XT / 6800XT / 6800 vs GeForce RTX 3090 / 3080 / 2080 Ti, 4K benchmarks]

 

[Chart: AMD Radeon RX 6900XT / 6800XT / 6800 vs GeForce RTX 3090 / 3080 / 2080 Ti, 2K benchmarks]

 

Edited by Jaws2002
  • Thanks 1
Posted
2 hours ago, Gambit21 said:

 

Uhh... ya think?

 


LOL. That says it all, doesn't it?

Posted
1 hour ago, Mitthrawnuruodo said:

Apparently the 3090 is quite cost-effective for ML. The 24 GB of VRAM is very useful for models that need it. "Can I fit four of them in my workstation?" seems to be a common question.

 

3D people really are that different from gamers when it comes to their PCs.

Posted

Based on the AMD numbers, their 6900XT basically makes the RTX 3090 a joke. I was never going to buy one anyway, but I did back-order a nice EVGA 3080 FTW3. But now we have the 6800XT and 6900XT. I play in VR, so that's my whole use-case for the card. Against the 6800XT, the 6900XT for +53% cost and +10% performance seems like poor value. But with VR I need every ounce of performance available.

 

It does seem like the 6800XT is the sweet spot. Could the 6900XT be "worth it" as a step up?
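
A quick sanity check of those figures, assuming the $649 (6800XT) and $999 (6900XT) launch MSRPs, which the post doesn't spell out:

```python
# Sanity check of the "+53% cost, +10% performance" claim.
# Assumed launch MSRPs: RX 6800 XT $649, RX 6900 XT $999.

base_price, base_perf = 649, 1.00   # RX 6800 XT as the baseline
step_price, step_perf = 999, 1.10   # RX 6900 XT, ~10% faster per AMD's slides

extra_cost = step_price / base_price - 1
rel_perf_per_dollar = (step_perf / step_price) / (base_perf / base_price)

print(f"extra cost: {extra_cost:+.1%}")                               # +53.9%
print(f"6900XT perf per dollar vs 6800XT: {rel_perf_per_dollar:.2f}")  # 0.71
```

So the step up buys about 71% of the baseline's performance per dollar - poor value unless, as above, you need every ounce of performance.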

 

Posted
3 minutes ago, Alonzo said:

Based on the AMD numbers, their 6900XT basically makes the RTX 3090 a joke. I was never going to buy one anyway, but I did back-order a nice EVGA 3080 FTW3. But now we have the 6800XT and 6900XT. I play in VR, so that's my whole use-case for the card. Against the 6800XT, the 6900XT for +53% cost and +10% performance seems like poor value. But with VR I need every ounce of performance available.

 

It does seem like the 6800XT is the sweet spot. Could the 6900XT be "worth it" as a step up?

 

 

I always have to weigh the “in 3-4 years I’m going to be right back here doing this again” factor and economize as best I can.

 

Meaning, that “holy crap my rig smokes!” feeling (and reality) is highly transient no matter how much I spend - so what did I pay to get there this time?

 

From what I’m seeing, at 4K it’s a tough choice between the 6800 and 6800XT, considering my own case: a new from-scratch build, right down to the case, new RAM, the works.

 

Then back to the “for $100 more I could have the XT” quandary.

 

I’m trying to build a 5 year machine here - so that pushes me closer to the XT.

With my i5 2500K it just didn’t make sense to upgrade for basically six years. The software just wasn’t pushing it. I’m hoping to be in the same situation with this new Ryzen.

  • 3 weeks later...
Posted
On 10/30/2020 at 1:25 PM, Gambit21 said:

I’m trying to build a 5 year machine here - so that pushes me closer to the XT.

 

I think both the 6800 and 6800XT are in a good place on the price/performance curve. 6800 is of course better value, but 6800XT isn't bad. 6900XT is definitely off the end of that curve and into "meh, gaming is cheaper than golf, I buy what I want" territory.

Posted
On 10/30/2020 at 9:57 AM, Jaws2002 said:

You guys really get milked in Europe...Not as bad as the poor guys in Australia and NZ, but man.

It sucks.

They can be really unhealthy and collect it all back on "free" medical care.

Just looked at some benchmarks. Really impressive improvements for AMD. It goes back and forth with the 3080 depending on the game. Nvidia gets a bit of an edge in ray tracing, which mostly nobody cares about. Happy with my 3080, but these things are impressive for sure. This fierce competition will only benefit us.

  • Upvote 1
Posted

What's interesting is that the Radeon cards seem to have more stable frametimes than the 30 series. I'm curious how that will pan out in VR. While the 30 series does seem to handle 4K+ slightly better than the 6000 series, the frametime stability may turn out to have a bigger impact in practice.

 

I also notice that the 2.5GHz OCs seem to be mostly power- and temperature-limited, so while I don't think we'll be able to reliably hit anywhere near that on the stock cards, it really does open up the possibilities for the AIB cards.

Posted

At 4K I’m not sure which way to go now dammit.

Posted
2 hours ago, Gambit21 said:

At 4K I’m not sure which way to go now dammit.

 

This side of Christmas it's gonna be "which one can I actually order?" since stock is so limited. A 6800XT with a nice AIB custom cooler does seem tempting: performance on par with the 3080 but with an extra 6GB of VRAM. TechPowerUp, by the way, says 10GB is fine for the foreseeable future, because although the consoles have 16GB of unified memory they are unlikely to use more than 8-10GB of it for video memory.

Posted (edited)
3 hours ago, Gambit21 said:

At 4K I’m not sure which way to go now dammit.

Hahaha. You are not alone. At 4K it looks like Nvidia still has the stronger cards. If you add RT performance and DLSS, it kinda tips the scale in Nvidia's favor, but I don't like going down to 10GB of video memory.

I'm sure the 6900XT will beat the 10GB 3080 in raster, but it's much more expensive and it still gets trashed in RT.

That upcoming 3080 Ti looks like it could be the perfect all-arounder.

I had my finger hovering over "buy now" this morning, but I'm glad I waited for the reviews.

I'm not in a hurry. I'm too busy. Don't have time for games these days.

I'll let the smoke clear.

Edited by Jaws2002
Posted

@Gambit21 @Jaws2002 The expectation is that nVidia is going to do a 20GB 3080 Ti, which would resolve the memory issue, but I'd expect it to be $1k USD, so we'll have to see how the 6900 XT performs. 

 

Also, the 6000 series may turn out to have a decisive advantage or disadvantage for flight simulator type applications, so I really want to see how it handles FS2020.

Posted
1 minute ago, Voyager said:

@Gambit21 @Jaws2002 The expectation is that nVidia is going to do a 20GB 3080 Ti, which would resolve the memory issue, but I'd expect it to be $1k USD, so we'll have to see how the 6900 XT performs. 

 

Also, the 6000 series may turn out to have a decisive advantage or disadvantage for flight simulator type applications, so I really want to see how it handles FS2020.

 

Looking forward to the flight sim tests.

The most I'll spend is $700 on the GPU... that's redlined.

That brings my CPU budget down to 5800X territory... which is fine.

SCG_Fenris_Wolf
Posted (edited)

I've completed my X570 / 5900X setup and I can run IL-2 on ultra / max settings in VR at 90fps 24/7. The bottleneck has been removed.

 

Here's my setup https://www.userbenchmark.com/UserRun/35564379

 

My 5900X performs at 108% because I tightened the CL timings on my RAM. My RAM is SKU F4-3600C16D-32GTZN and runs at 3600MHz, 14-15-15-15-30 at 1.45V (they are Samsung B-die). They are dual rank (2x16GB), which is important. Infinity Fabric is at 1800MHz (this is important for the CPU to communicate between its dies).

 

 

Cancelled unexpectedly by the supplier: today I was to receive a 4000MHz CL19 kit (SKU F4-4000C19D-32GTZKK) to see how it performs with the Infinity Fabric at 2000MHz, whether my 5900X can actually do that, and then with tightened timings.
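
For anyone wanting to sanity-check those numbers: true CAS latency in nanoseconds is CL * 2000 / (transfer rate in MT/s), and a 1:1 Infinity Fabric ratio needs FCLK = (MT/s) / 2. A minimal illustrative sketch:

```python
# Converting the quoted RAM settings into absolute terms.

def cas_ns(cl: int, mt_s: int) -> float:
    """True CAS latency in ns; DDR does 2 transfers per clock,
    so the clock period in ns is 2000 / (transfer rate in MT/s)."""
    return cl * 2000 / mt_s

def fclk_for_1to1(mt_s: int) -> int:
    """Infinity Fabric clock (MHz) for a 1:1 ratio with the memory clock."""
    return mt_s // 2

print(f"{cas_ns(16, 3600):.2f} ns")  # 8.89 ns - the kit's rated CL16
print(f"{cas_ns(14, 3600):.2f} ns")  # 7.78 ns - after tightening to CL14
print(fclk_for_1to1(3600))           # 1800 MHz, as stated above
print(fclk_for_1to1(4000))           # 2000 MHz - what the 4000MHz kit needs
```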

Edited by SCG_Fenris_Wolf
  • Upvote 5
Posted
1 hour ago, SCG_Fenris_Wolf said:

I've completed my X570 / 5900X setup and I can run IL-2 on ultra / max settings in VR at 90fps 24/7. The bottleneck has been removed.

 

 

 

That is pretty incredible, congrats!

  • Thanks 1
Posted (edited)
16 hours ago, Voyager said:

so I really want to see how it handles FS2020.

 

At 1440p and 1080p it's just as good as the 3080, but at 4K it's pretty bad.

 

https://www.guru3d.com/articles-pages/amd-radeon-rx-6800-xt-review,20.html

 

But then again, FS2020 has a pretty crappy game engine.

The 2080 Ti is faster than the RX 6800XT at 4K. That makes no sense at all.

Edited by Jaws2002
Posted (edited)
10 hours ago, SCG_Fenris_Wolf said:

SKU F4-3600C16D-32GTZN 

 

I have the exact same memory. FAST AND SEXY. 

 

Congrats! Looks like your computer with Zen 3 is a beast in VR.

Edited by Jaws2002
  • Upvote 1
Posted
15 hours ago, SCG_Fenris_Wolf said:

I've completed my X570 / 5900X setup and I can run IL-2 on ultra / max settings in VR at 90fps 24/7. The bottleneck has been removed.

 

So that's a 5900X CPU, an RTX 3080 GPU, a good motherboard and fast RAM. And you're powering a Reverb at 90Hz?

 

Do you have any thoughts on price/performance for the various AMD chips? Do you think 5600X and 5800X are too slow to also do the job?

 

I'd consider an upgrade, but CPU frame times on my 8086K at 5.1GHz are usually totally fine. But it's the times when that CPU isn't fast enough that you really notice the difference, I guess.

Posted (edited)
19 hours ago, SCG_Fenris_Wolf said:

My RAM is SKU F4-3600C16D-32GTZN and runs at 3600MHz, 14-15-15-15-30 at 1.45V (they are Samsung B-die). They are dual rank (2x16GB), which is important. Infinity Fabric is at 1800MHz (this is important for the CPU to communicate between its dies).

 

 

My SKU is almost exactly the same, except there's a C at the end.

 F4-3600C16D-32GTZNC

 

16-19-19-39 

I wonder if I can get those timings to tighten up... we shall see.

Edited by Gambit21
  • Upvote 1
SCG_Fenris_Wolf
Posted
On 11/20/2020 at 1:34 AM, Alonzo said:

 

So that's a 5900X CPU, an RTX 3080 GPU, a good motherboard and fast RAM. And you're powering a Reverb at 90Hz?

 

Do you have any thoughts on price/performance for the various AMD chips? Do you think 5600X and 5800X are too slow to also do the job?

 

I'd consider an upgrade, but CPU frame times on my 8086K at 5.1GHz are usually totally fine. But it's the times when that CPU isn't fast enough that you really notice the difference, I guess.

It could power the Reverb G2 if I had it.

 

Europeans don't have it yet:

Spoiler

HP is currently in damage control regarding European customers. They haven't delivered any of the pre-orders to Europe, while the Australians got a second batch already, and the (Western) Canadians even saw them in retail (while the eastern Canadians face the same situation as the Europeans). Logistical disaster.

 

thermoregulator
Posted
On 11/20/2020 at 4:15 AM, Gambit21 said:

 

My SKU is almost exactly the same, except there's a C at the end.

 F4-3600C16D-32GTZNC

 

16-19-19-39 

I wonder if I can get those timings to tighten up... we shall see.

Those kits are Hynix DJR, while Fenris' kits are B-die. I've got both. You should totally be able to raise the frequency with the DJR kits; the timings, and especially the subtimings, will be more difficult. On my 3900XT rig I got them to 3800, CL 16-16-19-39, but tRFC is around 497, voltage 1.380V. That's just off the top of my head, I'll check later. But I am by no means an experienced RAM overclocker.
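
Since tRFC in cycles depends on the memory clock, converting it to nanoseconds makes kits at different speeds directly comparable. A small illustrative sketch:

```python
# Absolute refresh cycle time: tRFC cycles * 2000 / (transfer rate in MT/s).

def trfc_ns(cycles: int, mt_s: int) -> float:
    return cycles * 2000 / mt_s

print(f"{trfc_ns(497, 3800):.1f} ns")  # ~261.6 ns - the Hynix DJR value above
print(f"{trfc_ns(370, 3800):.1f} ns")  # ~194.7 ns - a typical tuned B-die figure
```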

 

Posted
On 11/19/2020 at 8:24 AM, SCG_Fenris_Wolf said:

I've completed my X570 / 5900X setup and I can run IL-2 on ultra / max settings in VR at 90fps 24/7. The bottleneck has been removed.

 

Here's my setup https://www.userbenchmark.com/UserRun/35564379

 

My 5900X performs at 108% because I tightened the CL timings on my RAM. My RAM is SKU F4-3600C16D-32GTZN and runs at 3600MHz, 14-15-15-15-30 at 1.45V (they are Samsung B-die). They are dual rank (2x16GB), which is important. Infinity Fabric is at 1800MHz (this is important for the CPU to communicate between its dies).

 

 

Cancelled unexpectedly by the supplier: today I was to receive a 4000MHz CL19 kit (SKU F4-4000C19D-32GTZKK) to see how it performs with the Infinity Fabric at 2000MHz, whether my 5900X can actually do that, and then with tightened timings.

I'll chance my arm on an equivalent setup but with a 6800XT. PowerColor have released the Red Devil version and it's doing 2.5GHz on air, so I'm hoping a water-blocked 6800XT should be at least similar, if not capable of hitting 2.6GHz. That will put it ahead of a 3080. Fabric clock on CPU, board and GPU is the magic key here to releasing all the goodness. Good job on getting 2GHz FCLK BTW - I'll find out in the next few weeks if I can do the same, but using 4 x 8GB Samsung B-die nominally rated at 3200 CL14.

I'll then order a Reverb G2 to finish off my setup. 

  • Like 1
SCG_Fenris_Wolf
Posted (edited)

Yeah, I'll get the 6900XT in 2 weeks.

 

I play only VR, and there's no Raytracing in this realm. For Cyberpunk2077 brute force will have to suffice.

 

I got my FCLK to 1900 (anything higher isn't stable) and the memory clocks in at 3800, 16-16-16-16-32, with tRFC even at 370. 1.4V DRAM (effectively bouncing to 1.422V) is enough; I didn't know these Samsung B-die kits were so good. I did test them at 4000MHz and they work great, but the FCLK doesn't go there.

 

 

If you use an X570 board, note that most boards' DIMM slots are daisy-chained. You'll want 2x16GB, not 4x8GB, or you'll nerf your latency.

Edited by SCG_Fenris_Wolf
  • Upvote 1
Posted

My Crosshair VIII has been really good in that regard, getting sub-70ns latency easily enough with a 3900X (and 4 sticks), but failing that I'll order a set of B-die in 16GB format.

SCG_Fenris_Wolf
Posted

[Photo: PXL_20201122_204513943]

  • Thanks 1
thermoregulator
Posted
1 hour ago, SCG_Fenris_Wolf said:

Yeah, I'll get the 6900XT in 2 weeks.

 

I play only VR, and there's no Raytracing in this realm. For Cyberpunk2077 brute force will have to suffice.

 

I got my FCLK to 1900 (anything higher isn't stable) and the memory clocks in at 3800, 16-16-16-16-32, with tRFC even at 370. 1.4V DRAM (effectively bouncing to 1.422V) is enough; I didn't know these Samsung B-die kits were so good. I did test them at 4000MHz and they work great, but the FCLK doesn't go there.

 

 

If you use an X570 board, note that most boards' DIMM slots are daisy-chained. You'll want 2x16GB, not 4x8GB, or you'll nerf your latency.

I've got the same RAM settings, just tRFC at 304, on a G.Skill F4-3200C14-16GTZN, 5900X, MSI X570 Prestige Creation. Haven't tried to push it further yet.

SCG_Fenris_Wolf
Posted

Yeah, for 1fps and probably less stable frametimes, or corrupted data if it goes too far, it may not be worth it.

 

My clock now seems stable but I'll start a stress test in the morning and check in the afternoon tomorrow.

Posted
On 11/22/2020 at 2:10 PM, SCG_Fenris_Wolf said:

Yeah, I'll get the 6900XT in 2 weeks.

 

I play only VR, and there's no Raytracing in this realm. For Cyberpunk2077 brute force will have to suffice.

 

At least one person I've heard on YouTube has said they prefer 2077 with raytracing off, since when it's on everything becomes bright and sparkly and chrome. They prefer dark and gloomy. So you might still be fine with the 6900XT for 2077.
