
Hi guys,

 

Since my GPU performs really badly in BoS, what GPU would be best for a constant 60 fps with at least 2xAA and vsync, on ultra settings at 1920x1080?

And without stuttering.

Thanks

Edited by Voidhunger


I understand Nvidia works best with BoS since it's DX9 and ATI doesn't support that. For the price, I'll bet you can't go wrong with a GTX 970. If price is no object, go for the GTX 980 Ti.

Edited by SharpeXB


Even the 970 can't run the game at a constant 60 on ultra, but it gets close enough. Mine usually runs at 55-60 and may drop to 45-47 when I'm close to Stalingrad. I also still get lag sometimes even though the FPS looks fine, which is more related to the limitations of the BoS engine. If you lower your demands I'd totally recommend the 970, though I don't know whether there are cheaper cards that can do the same.

Edited by Stab/JG26_5tuka


I understand Nvidia works best with BoS since it's DX9 and ATI doesn't support that. For the price, I'll bet you can't go wrong with a GTX 970. If price is no object, go for the GTX 980 Ti.

 

Ehm, ehm... you have an i7 3770K and 2x Titan X and BoS struggles??? Pffff :huh:

 

 

I wanted to buy a Gigabyte 980, but the card is too long for my case and other brands cost too much.

Even the 970 can't run the game at a constant 60 on ultra, but it gets close enough. Mine usually runs at 55-60 and may drop to 45-47 when I'm close to Stalingrad. I also still get lag sometimes even though the FPS looks fine, which is more related to the limitations of the BoS engine. If you lower your demands I'd totally recommend the 970, though I don't know whether there are cheaper cards that can do the same.

 

 

Do you have vsync on?


Ehm, ehm... you have an i7 3770K and 2x Titan X and BoS struggles??? Pffff :huh:

Before the Titans I had a GTX 780 Ti. That ran BoS on Ultra with 4xAA at 1920x1080 very well. But it was a more expensive card than the 970 is now, depending on where you got one. Then I had a GTX 980. That also worked very well at 1080p.

But if size is an issue, all those cards are the same 10.5" long.

I'm using the two Titans because I have a UHD monitor. That actually makes the Titans struggle.

 

The i7 3770K worked very well with BoS, and this game is heavily dependent on the CPU. But recently my overclocking gave out, and now at its stock 3.5 GHz speed large missions like Vetranens are not playable. The Campaign and smaller missions or MP work fine. The recommendation I got is to replace it with the new Skylake i7 6700K coming in early August.

Edited by SharpeXB


Vsync off, and I'm playing in "not fullscreen" mode.

 

And if you turn on vsync, what happens? Stutters, or is the FPS too low?

Before the Titans I had a GTX 780 Ti. That ran BoS on Ultra with 4xAA at 1920x1080 very well. But it was a more expensive card than the 970 is now, depending on where you got one. Then I had a GTX 980. That also worked very well at 1080p.

But if size is an issue, all those cards are the same 10.5" long.

I'm using the two Titans because I have a UHD monitor. That actually makes the Titans struggle.

The i7 3770K worked very well with BoS, and this game is heavily dependent on the CPU. But recently my overclocking gave out, and now at its stock 3.5 GHz speed large missions like Vetranens are not playable. The Campaign and smaller missions or MP work fine. The recommendation I got is to replace it with the new Skylake i7 6700K coming in early August.

 

Ahh, I didn't notice your monitor :) OK, thanks


Ahh, I didn't notice your monitor :) OK, thanks

Yeah, if you want to make two 12GB graphics cards run up to 99% usage, blasting 75°C air out the back of the PC and whirring like mad, that will do the job ;-)


I have a PC very similar to yours. In my case a Z97 board with an i5 4690K at 4.5 GHz and one GTX 970, playing at 1200p (but I'm applying 1600p via Nvidia's DSR).

 

The GTX 970 can manage all games at 1080p or above on ultra without problems. The CPU speed is the king in this case, not the GPU. I'm sure that with only one GTX 970 you can manage two screens at 1080p without problems if your CPU speed is over 4 GHz (and in your case I think it is).

 

The FPS don't mark the performance of a game; it's the frametime. You can play at 30 fps, but if your frametime is steady you will play veeery smooth. And on the other side, you can play at 80 fps, but if your frametime is high or oscillating, it's a bad experience.
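For example, a quick back-of-envelope sketch (illustrative numbers only, not measured from BoS or any real game):

```python
# Illustrative only: compare a perfectly steady 30 fps with a higher
# average FPS that hides an occasional long frame.
steady = [33.3] * 10          # frametimes in ms, perfectly even (~30 fps)
spiky = [8.0] * 9 + [60.0]    # nine fast frames plus one big spike

for name, times in [("steady", steady), ("spiky", spiky)]:
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(times):.1f} ms")

# steady: avg 30 fps, worst frame 33.3 ms -> feels smooth
# spiky:  avg 76 fps, worst frame 60.0 ms -> feels like stutter
```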

 

 

A GTX 980 to play at 1080p... that GPU is meant to handle higher screen resolutions; buying it just for 1080p is throwing money away.

 

In any case, I think that with your PC and CPU speed you can choose any new GPU; whether at 60 fps, 40 or 30, you will have a good frametime with that i5 at 4.4 GHz.

Edited by SuDoKu


Hi,

 

Just bought a GTX 980 today, and I can now play on ultra with vsync on :)

I hope this GPU will last longer than my previous two Nvidia GPUs.



Nice! Planning on messing with OC at all? If you're synced all the time, probably not worth it.

 

I understand Nvidia works best with BoS since it's DX9 and ATI doesn't support that. For the price, I'll bet you can't go wrong with a GTX 970. If price is no object, go for the GTX 980 Ti.

Of course AMD and ATI cards support DX9... it has been supported since the standard has existed.

 

 

I get excellent performance from my hardware. Yes, BoS is a little more demanding than some other games, but it's very smooth and overall looks pretty damned nice.

 

 

Sudoku is right about frametimes. I've yet to see any valid benchmarks of BoS that include frametimes.

 

 

Anyways, anything in the ballpark of a 970 should be fine for BoS at 1080p. That said, I do not recommend a 970 specifically. They appear to provide very nice performance in BoS and many other titles, yet there are still frametime anomalies in several games for many owners, and some odd SLI scaling with regard to frametimes. There are also long-term implications due to the non-simultaneous read/write across the card's segmented memory bus. Granted, the hardware can do very well; it's just something to keep in mind and definitely something to research when shopping for a new card. Most importantly, pay attention to frametimes, and minimums in particular. Average and max are both near-meaningless metrics. If you could choose between running at the same minimum constantly, or at the same minimum with a higher maximum/average, the flat line at the minimum would provide the much better experience. That's how important those metrics are.
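If you want to check this on your own machine, here's a rough sketch (in Python, my own illustration) for pulling the useful metrics out of a frametime capture. It assumes a plain text file with one frametime in milliseconds per line; the real output format depends on your capture tool, so adapt the parsing accordingly.

```python
# Rough sketch: summarize a frametime log by the metrics that matter.
# Assumes "frametimes.txt" holds one frametime in ms per line -- adjust
# the parsing to whatever your capture tool actually writes out.
import statistics

with open("frametimes.txt") as f:
    times_ms = sorted(float(line) for line in f if line.strip())

avg_ms = statistics.mean(times_ms)
worst_1pct = times_ms[int(0.99 * (len(times_ms) - 1))]  # 99th-percentile frame

print(f"average: {1000 / avg_ms:.1f} fps")
print(f"worst 1% of frames: {worst_1pct:.1f} ms (~{1000 / worst_1pct:.0f} fps)")
# A flat line near the minimum beats a higher but spikier average.
```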

Edited by e345spd


Of course AMD and ATI cards support DX9... it has been supported since the standard has existed.

I'm not much of an expert, but that's a common thing I read about RoF, BoS and DCS (which is still DX9 at the moment, pre-EDGE).

It's a matter of driver support, and Nvidia is better at this with DX9. Maybe someone more knowledgeable about this issue can tell us more.

Edited by SharpeXB


Well, there's no question about DX9 support, as it exists in a standardized, defined form. If someone is talking about a vendor-specific extension that doesn't actually exist in the definition, then that would be different, as no one else who supports the standard would have a license to use the extension. I know this happens all the time with OpenGL and less aware developers. It (basically) doesn't happen with DX9, aside from sometimes both intentional and unintentional platform targeting through shaders, not DX9 itself. That's also not an issue of support for the standardized shader model used by DX9; it's simply standard optimization for more than one architecture, as is always the case. The 'targeting' can be especially troublesome and 'unintentional' for any cross-platform/vendor optimization if you decide to, say... implement a package of shaders that someone sends you, all contained within a pre-compiled binary that you don't have any access or control over ;)

 

Anything else without frametimes and repeatable results is simply anecdotal evidence. As such, I will provide anecdotal evidence: my hardware runs all of those titles excellently at high native resolution. Crossfire works with any DX9 game that has basic functional multi-GPU support, including CloD :ph34r:. If people believe they have noted a trend in performance differences in any games that I care about, I've certainly not seen any valid evidence of it. When properly configured, I would wager that I get superior multi-GPU scaling 9 times out of 10, too.

 

 

Anyways, the real problem is that we don't have access to any decent benchmarks for the titles we actually want to play. I assume most of us here have a similar selection of titles, many of which are never in reviews. Plus, the reviews are next to useless anyway without frametimes, or at least a hint at meaningful minimums (some drops can be completely unnoticeable if fast enough).

 

Someone should start a 'sim hardware review' website that actually looks at the important things with technical understanding, not just this input accessory/game review nonsense.

Edited by e345spd


Crossfire works with any DX9 game that has basic functional multi-GPU support, including CloD :ph34r:.

 

How do you get multi-GPU working in CloD? Does it work by default, or did you do something special? I didn't see it working; I'm running vanilla CloD. Someone sent me a workaround that supposedly works for SLI, but I haven't tried it yet.

Edited by SharpeXB


The other trouble with the ATI vs Nvidia comparison currently is that the ATI card which competes with the GTX 980 Ti is the R9 295X2. That card is equipped with an external cooling radiator and fan, which is extremely difficult for most people to install, probably requiring a complete case rebuild. That's a pretty awful feature and makes the decision between the two a no-brainer.

 

Looks like the ATI 300 series got rid of the fan. That's better. Except the R9 390X gets stomped by the GTX 980 Ti.

Edited by SharpeXB


The R9 390X is about $430, the GTX 980 Ti over $650... The AMD card that competes with the 980 Ti is the R9 Fury X; the price is similar and the performance is similar. The Fury X has liquid cooling, so it kind of depends on whether you like that or not.

Edited by 13./JG51mprhead


The R9 390X is about $430, the GTX 980 Ti over $650... The AMD card that competes with the 980 Ti is the R9 Fury X; the price is similar and the performance is similar. The Fury X has liquid cooling, so it kind of depends on whether you like that or not.

Yeah, that liquid cooling is a no-go. Where that big fan is supposed to go in most PC cases, I have no idea. Plus the Fury X is substantially outperformed in most cases by the 980 Ti for the same price.

The Fury X performs on a level with the 980 Ti, which is fine. But the big fail for the Fury is the dependence on the water cooling and the big fan.

The 980 Ti is available with this feature for those that want it, but it's not mandatory.

Plus, these cards are aimed at the 4K market, which will require two cards. That makes fitting giant extra fans in your case an even bigger CF headache. It's a 5-minute swap vs a complete PC rebuild for most people. That lack of cooler options is a terrible choice by ATI.

The Fury is not equipped with HDMI 2.0 ports, which are necessary to connect an HDTV.

Apparently the 980 can be overclocked as well, but not the Fury.

Given that the price is the same, there's not a compelling reason to choose the Fury over the 980 Ti.

And for this game in particular, the lack of DX9 driver support is a problem.

This round goes to Nvidia. Better luck next time ;-)

Edited by SharpeXB


I agree that in CrossFire the water cooling will most likely be a headache for many, but then again, the cost of a Fury CF or 980 Ti SLI setup is more than most people are willing to spend on their computer in total. The Fury X should also have plenty of room for overclocking, at least according to AMD. I would say that in general it's a matter of personal preference which card you choose; if you only play BoS, Nvidia is probably the smarter choice.


I think some of you are forgetting the actual dimensions of the radiator and the card itself:

 

[Image: the R9 Fury X card with its radiator]

 

 

Anyways, they will be releasing air-cooled versions in the near future, plus the tiny Nano.

 

Also, the Fury non-X specifically is the one to look at for relative value at the high end, if you aren't set on playing with the full die.

 

The cut stream processors have a minimal effect on performance in most situations and result in a card with fantastic overall efficiency per relative minimum frametime. The shader-heavy Fury X is more aligned with DX12 in terms of hardware resource balance.

 

 

 

Overclocking is still waiting on 3rd-party tool support to really figure out. Apparently, getting software control over the voltage regulators is always rather complex on AMD cards, or so I read.

 

HDMI 2.0 is supported through a conversion attachment. Frankly, the DisplayPort version on those cards is superior to HDMI 2.0 in terms of flexibility and performance. I see no reason to waste space on HDMI 2.0 specifically; the majority will be using DisplayPort.


I run a 780 Ti at 2560x1440 @ 144Hz on ultra settings and have no issues at all; I've never seen my FPS drop below 65.

Two Titans should breeze through it without even breaking a sweat.

I think people have issues with the way they set up their drivers/software, etc.

In the Nvidia Control Panel, set everything to "application controlled".

Then set it up through the game settings.

And if you have a G-Sync monitor, enable it and turn V-Sync OFF (it eats framerates!).

And my CPU is an old 3930K which never gets above 50 degrees.

It's all to do with settings!


Two Titans should breeze through it without even breaking a sweat.

 

BoS needs the two Titan X cards to run at 3840x2160 Ultra with 2xAA at a solid 60 fps. Many other games do too. 4K is a freakin' killer... Looks nice though.


I see no reason to waste space on HDMI 2.0 specifically; the majority will be using DisplayPort.

HDMI 2.0 is required to connect a UHDTV. For monitors, DP 1.2 is a good choice as well. I actually looked for one of those HDMI/DP adapters and couldn't find one rated for 4K/60.
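The bandwidth math shows why a 4K/60 adapter is hard to find. A rough calculation (ignoring blanking intervals and link encoding overhead, so the real requirement is somewhat higher):

```python
# Back-of-envelope: raw pixel payload for 4K at 60 Hz, 8-bit RGB.
width, height, refresh, bits_per_pixel = 3840, 2160, 60, 24

gbps = width * height * refresh * bits_per_pixel / 1e9
print(f"raw 4K60 pixel data: ~{gbps:.1f} Gbit/s")  # ~11.9 Gbit/s

# HDMI 1.4 carries only ~8.2 Gbit/s of effective video data, so 4K over
# it is stuck at 30 Hz (or 4:2:0). HDMI 2.0 (~14.4 Gbit/s effective) and
# DP 1.2 (~17.3 Gbit/s effective) both clear the bar for 4K60, which is
# why a simple passive adapter between the standards doesn't cut it.
```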

Edited by SharpeXB

