chiliwili69 Posted October 26, 2020 1 hour ago, Jaws2002 said: Hey Chiliwili, what tracks are you using now, to test performance? Currently I am not using any benchmark. The last one was the Remagen test, which I used until 4.005; it became unplayable after some patches in June 2020. I have suggested to the dev team that they create a built-in benchmark in IL-2.
Alonzo Posted October 26, 2020 Personally I just create a benchmark track each time I need one. Remagen spawn point, some bombers, shoot them up and fly low over the city, 2 minutes. Just like Chili. It only helps me benchmark my system against itself, but that's often what I want to do: find the sweet spot between all the graphics options, supersampling, etc. I do wish we had an in-game benchmark for comparison between systems.
Alonzo Posted October 27, 2020 From the Hardware Unboxed review. So the 3070 is basically 2080ti level performance. I'd say the 3070 is a good recommendation for VR users and would do a great job driving a Rift S or Valve Index, and a reasonable job driving a Reverb G2. Although if you're buying a G2 it's probably worth the extra money to move up to a 3080. BUT! Tomorrow is AMD reveal day. They might surprise us, and the rumor is they have a card that will be competitive with the 3070 or even the 3080, and their pricing choices will probably force NVidia to respond. So things could change.
SCG_Fenris_Wolf Posted October 27, 2020 Posted October 27, 2020 (edited) Igor has published some leaks. Pure rasterized performance of the 6800XT, which is our use-case with IL-2, is allegedly 10-20% faster than the 3080FE based on Timespy 4K. That puts it as equal to the 3090FE. In RT it is as quick as a 2080ti. But that doesn't interest me much. We have no DLSS or RT in IL-2 (even though RT and DLSS could make IL-2 probably the most beautiful combat simulator for years to come period). I'll see if I will have to sell my 3080 Trio X Gaming. I had never expected team Red to be better by such a margin. We know for certain tomorrow! Edited October 27, 2020 by SCG_Fenris_Wolf 1
Jaws2002 (Author) Posted October 27, 2020 12 minutes ago, SCG_Fenris_Wolf said: Igor has published some leaks. Pure rasterized performance of the 6800XT, which is our use-case with IL-2, is allegedly 10-20% faster than the 3080 FE based on Timespy 4K. That puts it as equal to the 3090 FE. And that was the 6800XT. AMD will also release the full-size 6900XT.
SCG_Fenris_Wolf Posted October 27, 2020 Posted October 27, 2020 That might rain hellfire down upon team green... We'll see
Alonzo Posted October 27, 2020 Posted October 27, 2020 10 minutes ago, Jaws2002 said: And that was a 6800xt. Amd will also release the full size 6900xt. I'm interested in the naming. It'll help me decide which leakers to listen to next time around. My favorite guy (Tom from Moore's Law is Dead) maintains that it's not necessarily called the 6900XT, it might be the 6800XTX, and he thinks it'll be an AMD exclusive (no board partner models). He's saying they'll only call it the 6900XT if it beats the 3090, otherwise they'll stick with a 6800 moniker. 2 minutes ago, SCG_Fenris_Wolf said: That might rain hellfire down upon team green... We'll see Either way, we as consumers win. Well, all of us who didn't buy a scalped 3080 for $1800. Competition is good for all of us. And frankly I'm happy if AMD charge a bit more for their cards (since NVidia has set the price benchmarks) and make some money and continue to be a strong competitor in future. 1
SCG_Fenris_Wolf Posted October 27, 2020 Posted October 27, 2020 Igorslab is the former head of old Tom's hardware (they had to give the license back, which is run by other people now which is why it changed). An electro-engineer from Saxonia with the most industry contacts we know of. Need to speak German to see his videos. He also has a website on which he translates some things to English for the international public.
J2_Steve Posted October 27, 2020 Posted October 27, 2020 I remember when Intel and Nvidia didn't dominate. I remember when ATI released the Radeon 9700, that was a card that blew the Geforce 2 out of the water and my favourite clockers where the early Athlons. After being on the intel/Nvidia merry go round since the I7s first turned up, I've decided I'm defo going Red on my next build 2
sevenless Posted October 27, 2020 Posted October 27, 2020 5 hours ago, SCG_Fenris_Wolf said: That might rain hellfire down upon team green... We'll see Yep, seems like that could be the case. I´m looking forward to see the benchmarks.
chiliwili69 Posted October 28, 2020 Posted October 28, 2020 16 hours ago, SCG_Fenris_Wolf said: Igor has published some leaks. Pure rasterized performance of the 6800XT, which is our use-case with IL-2, is allegedly 10-20% faster than the 3080FE based on Timespy 4K. I suposse you refer to this page: https://www.igorslab.de/en/3dmark-in-ultra-hd-benchmarks-the-rx-6800xt-without-and-with-raytracing/ The interesting rasterization benchmark is not Timespy(DX12), it is Firestrike(DX11) where 6800XT is supossed to run 18% faster than 3080 FE. Let´s see if this is true then. 1
Livai Posted October 28, 2020 Posted October 28, 2020 (edited) Hey can someone tell me the recept with which I can bake a 3080/3090 myself??? Edited October 28, 2020 by Livai
Jaws2002 (Author) Posted October 28, 2020 18 minutes ago, Livai said: Hey can someone tell me the recipe with which I can bake a 3080/3090 myself???
RedKestrel Posted October 28, 2020 Posted October 28, 2020 19 minutes ago, Livai said: Hey can someone tell me the recept with which I can bake a 3080/3090 myself??? It involves some fermentation so be warned, it's time consuming. 1. First, you need your silicon starter. Put 1 cup sand and 1 cup water in a mason jar and mix well. Sit it on your countertop. Every day, pour out half of the mix and add half a cup of sand and half a cup of water to feed it. Within a few days you should start to see bubble. That's the sand starting to ferment into microchips. 2. After a week, feed it 1 cup sand and 1 cup water and let it digest for a few hours. Then mix 1 cup of the resulting mixture with finely shredded US dollar bills. You will need 700 of them for a 3080 or 1500 for a 3090. A food processor is very helpful for this but you can do it yourself if you don't mind the time. 3. Pour it into an internet connected blender and use the 'F5' button repeatedly to refresh the mixture to see if it is ready yet. This will take 6 months to a year. 4. Once the mixture is ready, bake at 400 degrees celsius until you burn out the power supply unit of your oven. 5. Remove when you can see the gamer-approved RGB lights start to flicker on the card. That's how you know its the ideal crispiness. Beware some pitfalls with this recipe! During the mixing process, you may attract some pests - these are the dreaded 'scalpers'. Set up a number of fortifications around your device to keep them from it. I suggest a defense-in-depth approach with several lines of defense, with overlapping fields of fire, artillery positions, anti-tank ditches, and barbed wire. Invite several hundred thousand friends to assist you in manning the defenses. If you're not sure how to do this, look up recipes for the Russian fortifications in the Battle of Kursk. There, you're all set! 1
Alonzo Posted October 28, 2020 Posted October 28, 2020 I'm sure anyone following this thread will already have seen it, but AMD is releasing the following cards: 6900 XT -- trades blows with the 3090, 16GB VRAM (vs 24GB), 300W total board power (vs $LOTS), $999 (vs $1400+ for the 3090) 6800 XT -- trades blows with the 3080, 16GB VRAM (vs 10GB), 300W total board power (vs 350+), $649 (vs $699 for the 3080) some other one I didn't pay attention to but it's the 3070 competitor, similar vibes We still don't know the availability of these cards or NVidia's response, but this is some serious competition from AMD. 1 1
Beazil Posted October 28, 2020 Posted October 28, 2020 Ooh! A graphics war for lower prices? Volley from team red looks good!
Gambit21 Posted October 28, 2020 Posted October 28, 2020 Hopefully a price war will ensue. This is going to be good. 2
Alonzo Posted October 28, 2020 Posted October 28, 2020 5 minutes ago, Gambit21 said: Hopefully a price war will ensue. This is going to be good. Yes, exactly. Rumor is that NVidia is prepping a cut-down 3090 with 12GB VRAM and slightly less performance, and will call it the 3080ti. That's already a good result for gamers. NVidia is really hurting from their decision to go with 8nm, the chips are hella power hungry. Apparently they played some kind of game of chicken with TSMC, didn't want to pay what TSMC wanted to charge for the 7nm process, and got stuck with Samsung's 8nm and hence hotter, more power-hungry cards. Rumor mill is NVidia is going to cut their losses on this generation and try to get a 4000-series out mid/late next year on the 7nm node.
dburne Posted October 28, 2020 Posted October 28, 2020 Well this is all certainly very interesting. Currently in a waiting cue for EVGA 3090 FTW3 Ultra Model, but may still be a few weeks away from my turn to order from the que coming up.
Jaws2002 (Author) Posted October 28, 2020 (edited) Another big Nvidia blunder was to cheap out on memory. All these new AMD cards, including the 6800, are going to have 16GB of GDDR6. This will go up against the 10GB and 8GB cards offered by Nvidia. Sure, the 3090 brings 24GB, but at a stupidly high price. Congrats to AMD for coming back in such force and humbling two big giants that got too greedy. It looks like I'll most likely have my first full-red gaming PC. I'll probably get the 6900XT. Edited October 28, 2020 by Jaws2002
RedKestrel Posted October 28, 2020 Posted October 28, 2020 I know the hunger here is understandably for the higher end cards, but has their been any information regarding pricing and performance for whatever amounts to a 3060 from Nvidia, or the equivalent from AMD?
Gambit21 Posted October 28, 2020 Posted October 28, 2020 Looking at the benchmark video posted by @sevenless above, it’s clear that I’m going to need to exercise some temperance with this build since I need everything (CPU, MB, drives, power supply etc). With a 4K monitor capped at 60fps, it does me no good to pay for a card capable of 90fps@4k if I can put those funds into RAM or whatever. On the other hand, I want to be able to crank my clouds all the way up finally, and who knows maybe VR in a few years. So while I’m beyond stoked about all of this, I’m going to spin in circles on the final decision I think. Or...(I hope) a single “no brainer” choice will emerge (like it did with the 970) In any case, I’m fairly certain that choice will be one of the AMD cards at this point.
Jaws2002 (Author) Posted October 28, 2020 3 minutes ago, Gambit21 said: With a 4K monitor capped at 60fps, it does me no good to pay for a card capable of 90fps@4K if I can put those funds into RAM or whatever. If you play MS Flight Sim, that 60fps is not that easy to achieve.
Ala13_UnopaUno_VR Posted October 28, 2020 Performance in VR is the point for me; whichever card works best with IL-2 will be the one I buy
Gambit21 Posted October 28, 2020 Posted October 28, 2020 4 minutes ago, Jaws2002 said: If you play MS Flight sim that 60fps is not that easy to achieve. Which will matter at some point - thanks.
Alonzo Posted October 28, 2020 Posted October 28, 2020 1 hour ago, Gambit21 said: Looking at the benchmark video posted by @sevenless above, it’s clear that I’m going to need to exercise some temperance with this build since I need everything (CPU, MB, drives, power supply etc). With a 4K monitor capped at 60fps, it does me no good to pay for a card capable of 90fps@4k if I can put those funds into RAM or whatever. On the other hand, I want to be able to crank my clouds all the way up finally, and who knows maybe VR in a few years. So while I’m beyond stoked about all of this, I’m going to spin in circles on the final decision I think. Or...(I hope) a single “no brainer” choice will emerge (like it did with the 970) In any case, I’m fairly certain that choice will be one of the AMD cards at this point. It's definitely worth being patient (hard, I know, I want faster pixels today please!). I think a lot of enthusiasts watching this video are going to consider an AMD card, and I think NVidia is going to have to respond with some price reductions, at least once AMD actually has inventory in the sales channels. I think the 3070 or 6800 (non-XT) are about where you should be looking for solid 4K60 performance. Someone yesterday said "the best deal on a 3070 is to save up $200 more and buy a 3080" and I think I agree with that assessment, but this AMD release could cause NVidia to drop the price on the 3070 a little. It's ~2080ti performance which has always been good for 4K -- the new marketing hype around "finally 4K ready cards" is BS, it's moving goalposts, this year's games are more demanding, but you've been able to play "today's 4K games" for several years on the top cards. 1
Ribbon Posted October 28, 2020 Posted October 28, 2020 (edited) I just saw 2 rtx3080 in stock? I'll wait end of November (my reverb g2 arrival) before i decide amd or nvidia/intel upgrade. From seen performance is almost on par while amd being 50-100$ cheaper (50$ isn't decision factor for me), by that time nvidia may adjust prices and restock. If they stay on par performance wise i'll still go for nvidia/intel due to driver stability.....unless amd prooves otherwise! Anyway good times for us "gamers"! Edited October 28, 2020 by =VARP=Ribbon
grcurmudgeon Posted October 29, 2020 Posted October 29, 2020 Bah, I have 10% off at Best Buy, but their system was hosed this morning trying to get a 3070. First their refresh wasn't working in Safari, had to switch to Chrome which cost me a few minutes. Then I waited too long on the checkout button and missed my first opportunity. I did get to checkout (which was supposed to reserve it) and got to payment, then it kicked me out again. Third time there were troubles with their sign-back-in piece, after a bit I got to payment and it told me they were all sold out. Sigh, could have used that $50 off on it. Maybe something will come back in the next day or two.
No_85_Gramps Posted October 29, 2020 Posted October 29, 2020 (edited) For those interested the 3070 is now available at Best Buy, Amazon still not available. Probably some other sellers offering them now. Edited October 29, 2020 by No_85_Gramps
Voyager Posted October 29, 2020 Posted October 29, 2020 (edited) To blow hot and cold with the same breath, we'll want to be very aware of the performance at a given resolution. It does sound like at 4K+ resolutions, the 30 series performs better. It could easily be that whether we want a 30 or a 6k depends entirely on what resolution we're running at. While at the moment, I'm leaning towards a 6900, if a 3080 TI comes out with the 384 bit bus, that might, for me, be the better option in VR. (4k is 8m pixels, but the Reverb is 9.3m pixels. 1440 Ultra wide is only 4.8m pixels) I am *really* looking forward to the November bemchmarks. Edited October 29, 2020 by Voyager
FoxbatRU Posted October 29, 2020 Posted October 29, 2020 It will be interesting to see the real prices and availability in stores. In the meantime, theoretically, AMD looks somewhat more attractive. But ... if you don't need G-sync (like me).
Voyager Posted October 29, 2020 Posted October 29, 2020 1 hour ago, FoxbatRU said: It will be interesting to see the real prices and availability in stores. In the meantime, theoretically, AMD looks somewhat more attractive. But ... if you don't need G-sync (like me). I thought they'd both openness up their standards such that both now worked with both sync systems?
Jade_Monkey Posted October 30, 2020 Posted October 30, 2020 I am over the moon with the AMD announcement because it seems like they finally caught on at the high end of the market. I think this time we all won (consumers). On the down side, I hate that neither AMD nor Nvidia show VR performance metrics on their announcements, I guess we are a really niche audience. Last week I cancelled my order through Dell (I'm not giving them a dime after pulling that bs). Just now, I got a notification from EVGA that it was my turn in their queue based system. I ended up ordering the 3080 a few minutes ago because i dont have a card right now and the AMD cards may or may not be readily available in a few weeks. However I will keep a close eye on the AMD offerings and will definitely give them a serious consideration. I'm in some bizarre state where I am incredibly pumped about AMD even though i just ordered a 3080. I really think the 6800XT is the sweetspot if its within budget. 1
Gambit21 Posted October 30, 2020 Posted October 30, 2020 1 hour ago, Jade_Monkey said: I really think the 6800XT is the sweetspot if its within budget. It's going to a question of can I justify not spending another $100 for the 6800XT vs the 6800 for arguably a load of future proofing. We'll see what happens with prices.
dburne Posted October 30, 2020 Posted October 30, 2020 10 hours ago, Jade_Monkey said: I am over the moon with the AMD announcement because it seems like they finally caught on at the high end of the market. I think this time we all won (consumers). On the down side, I hate that neither AMD nor Nvidia show VR performance metrics on their announcements, I guess we are a really niche audience. Last week I cancelled my order through Dell (I'm not giving them a dime after pulling that bs). Just now, I got a notification from EVGA that it was my turn in their queue based system. I ended up ordering the 3080 a few minutes ago because i dont have a card right now and the AMD cards may or may not be readily available in a few weeks. However I will keep a close eye on the AMD offerings and will definitely give them a serious consideration. I'm in some bizarre state where I am incredibly pumped about AMD even though i just ordered a 3080. I really think the 6800XT is the sweetspot if its within budget. Congrats, still waiting in the cue for my RTX 3090 FTW3 Ultra.
FoxbatRU Posted October 30, 2020 Posted October 30, 2020 13 часов назад, Voyager сказал: I thought they'd both openness up their standards such that both now worked with both sync systems? Nvidia certifies some (mostly new) Free-sync monitors for G-sync. But if the monitor originally had a board for G-sync, and Free-sync was not there, then video cards from AMD will not be suitable for such synchronization. If AMD could somehow work with G-sync monitors, they would eat even more Nvidia's pie. 1
Jaws2002 Posted October 30, 2020 Author Posted October 30, 2020 Nvidia bundles the new Call of Duty with their 3080 and 3090 cards. https://videocardz.com/newz/nvidia-bundles-call-of-duty-black-ops-cold-war-with-geforce-rtx-3080-and-rtx-3090-graphics-cards Competition rocks.