
Guide: Nvidia Control Panel Profile (GTX780-1080ti)


Recommended Posts

=TBAS=Sshadow14
Posted (edited)

Hey guys, after seeing many random posts with incorrect profile settings and people asking for them,
I decided to add this little post with a basic guide to what the profile should be and how to set it up.

FIRSTLY
DO NOT EVER SET UP THE GLOBAL PROFILE TO RUN GAMES WITH
(This can cause Windows failures and issues.)
E.g. setting power management in the global profile to prefer maximum performance is very bad for GPU lifespan, as the card will run at max clocks even while the PC is on a screensaver or idling overnight (see the sketch below).
E.g. turning on FXAA globally can cause errors in video programs, Internet Explorer, and even problems loading Windows after the settings are applied.
****I take no legal or financial responsibility if you somehow break something****
ALWAYS MAKE A PROFILE FOR EACH INDIVIDUAL GAME: UNDER PROGRAM SETTINGS, ADD >> IL2.EXE FROM RECENTLY RUN PROGRAMS.
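
If you want to sanity-check the power management point above, here is a minimal sketch that polls the card with the stock nvidia-smi tool while the PC is idling. The query fields are standard nvidia-smi fields; the 5% utilisation and 1000 MHz thresholds are only illustrative numbers I picked, not official figures.

```python
# Minimal sketch: poll idle GPU clocks via nvidia-smi to see whether a global
# "prefer maximum performance" setting is holding the card at full 3D clocks.
# Assumes nvidia-smi is on PATH; the 1000 MHz / 5% thresholds are illustrative
# values only, not official figures.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=clocks.gr,clocks.mem,power.draw,utilization.gpu",
         "--format=csv,noheader,nounits"]

def sample():
    out = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    core_mhz, mem_mhz, watts, util = (float(x) for x in out.split(","))
    return core_mhz, mem_mhz, watts, util

if __name__ == "__main__":
    for _ in range(5):                       # take a few samples while the PC idles
        core, mem, watts, util = sample()
        print(f"core {core:.0f} MHz  mem {mem:.0f} MHz  {watts:.0f} W  util {util:.0f}%")
        if util < 5 and core > 1000:         # idle desktop but clocks still high
            print("  -> card is not downclocking at idle; check the power management mode")
        time.sleep(2)
```

With a healthy per-game profile, the core clock should drop well below its 3D boost clock once the desktop goes idle.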


Happy Flying
Sshadow14

Edited by =TBAS=OccludedLight14
chiliwili69
Posted (edited)

I have tested your suggested settings on my 1070 card and I obtain an average of 115 fps (instead of 122 with my previous settings).

I tested each set of settings twice to be sure, and the four tests (2 with your settings, 2 with the defaults) are shown in the graph below:

[Graph: fps results of the four test runs]

 

The test uses the "Balapan Test" which is a heavy-duty recorded flight described in this post:

https://forum.il2sturmovik.com/topic/29322-measuring-rig-performance-common-baseline/
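
For anyone repeating this kind of A/B comparison, here is a minimal sketch of how such averages can be computed from a per-frame time log. The log format (one frame time in milliseconds per line) and the file names are assumptions for illustration, not the format the linked test actually produces.

```python
# Minimal sketch: compare the average fps of two recorded runs from per-frame
# times. The input format (one frame time in milliseconds per line) and the
# file names are assumed for illustration only.
from statistics import mean

def avg_fps(path):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    return 1000.0 / mean(frame_ms)           # mean frame time -> frames per second

if __name__ == "__main__":
    for label, path in [("suggested profile", "run_profile.txt"),
                        ("default settings", "run_default.txt")]:
        print(f"{label}: {avg_fps(path):.1f} fps average")
```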

 

The default settings of the nvidia card are:

[Screenshot: default Nvidia Control Panel 3D settings]

I only changed the power management setting, for optimum performance.

Edited by chiliwili69
Posted

Why run 120 fps on a 60 Hz monitor? Is it not better to limit it to 60 fps?

=TBAS=Sshadow14
Posted

Yes, and the higher FPS is partly because the image is at slightly lower quality.
The default maximum quality without a profile is medium (quality), but with a profile you can turn it up a little more.

Having 120 fps on a 60 Hz monitor is OK, as Fast Sync does the best job it can to prevent tearing in most cases.
 

Posted (edited)

I read somewhere, about SLI and CrossFireX testing, that it is not only a matter of high FPS, but also of keeping the frame rate constant at one level. So I would think that, in the case of a 60 Hz monitor, capping the FPS at 60 would give you a straight line in your FPS graph, and if your video card does not have to produce extra frames it should also run cooler.

Be aware that I'm not an FPS expert and am still trying to figure out Nvidia's V-Sync.

 

The sources are Tom's Hardware and Hardware.info.

Edited by Dutch2
chiliwili69
Posted

 

 

Quote:

Why run 120 fps on a 60 Hz monitor? Is it not better to limit it to 60 fps?

 

I was just doing that testing to find out how far my rig can go and to record that performance, since I am going to upgrade the CPU (to a 4790K) and will want to measure the improvement both on the monitor and in VR.

 

Before VR, I used my G-Sync 60 Hz monitor at 4K. Most of the time it was at 60 fps, but in complex scenes, when fps dropped below 60, G-Sync handled the lower frame rate very well and there was no tearing.

 

Personally, I don't notice any difference between playing on a 60 Hz monitor and a 120 Hz monitor.

 

There is a blog post about this which says that 60 fps is "enough" for gaming on monitors, but myths and placebos are everywhere:

https://xcorr.net/2011/11/20/whats-the-maximal-frame-rate-humans-can-perceive/

=TBAS=Sshadow14
Posted

If you have a 60 Hz monitor and turn on Fast Sync (the bottom option in the profile), it will allow 60-120 fps (if your system can pull 120) and it will remove 98% of the tearing; that is more than high enough quality for gaming and IL-2.

This is not elite esports gaming where a person needs 144 Hz / 144 fps.
 

Posted

By saying "ALL Nvidia cards from GTX780-GTX1080Ti", do you mean the xx80 series only, or does this also apply to the 1050?

=TBAS=Sshadow14
Posted (edited)

From the GTX 780, including all cards up to the GTX 1080 Ti (except all mobile versions of cards, the GTX 950M for example).
These settings are not for VR.
 

Edited by =TBAS=Sshadow14
Posted (edited)

The Nvidia Tweak Guide is very informative on what settings to use in-game and globally. I have a 1070 and have texture filtering set to Allow; it's my understanding that Battle of Stalingrad is DirectX and not OpenGL like the legacy IL-2.

 

http://www.tweakguides.com/NVFORCE_1.html

 

 

 

Texture Filtering - Negative LOD Bias: LOD is short for Level of Detail, and adjusting the LOD Bias is a method of sharpening details on textures. The LOD Bias controls texture detail by determining when different Mipmaps are used. Mipmaps are a precomputed series of textures each of a certain resolution used to improve performance. When you look at a surface close to you, a higher resolution mipmap is loaded; as you move further away from that surface, progressively lower resolution mipmaps of it are displayed instead. The default LOD Bias in a game is typically 0.0, but by using a negative value for LOD Bias (e.g. -1.5), you can force mipmap levels to be moved further away, which can improve texture sharpness at the cost of introducing shimmering when textures are in motion. In general, it is better to just use Anisotropic Filtering to improve texture detail, rather than lowering LOD Bias, as there is no shimmering and the performance impact is minor.

The available options for this setting are Allow and Clamp. Modern games automatically set the LOD Bias, which is why this setting exists, so that you can either select Clamp to lock out and thus forcibly prevent any negative LOD Bias values from being used, or Allow it. Unfortunately, Nvidia has explicitly noted in its release notes for the GeForce drivers for several years now that: "Negative LOD bias clamp for DirectX applications is not supported on Fermi-based GPUs and later." In other words, this setting currently has no impact on the majority of games on GTX 400 and newer GPUs; you cannot prevent negative LOD bias in most games.

It is recommended that Texture Filtering - Negative LOD Bias be set to Clamp under Global Settings, and that Anisotropic Filtering be used instead to improve texture clarity. At the moment this will only work for OpenGL games, which are relatively rare. If Nvidia re-introduces this feature for DirectX games, then the recommendation above will remain the same for optimal image quality.
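
To make the mipmap/LOD description above a bit more concrete, here is a rough sketch of the usual LOD calculation: log2 of the texture footprint per pixel, plus the bias, clamped to the available mip levels. Exact hardware behaviour differs, so treat the numbers as illustrative only.

```python
# Rough sketch of how a negative LOD bias shifts mipmap selection. The formula
# (mip level = log2 of texels covered per pixel, plus the bias, clamped to the
# available levels) follows the usual graphics-API definition; real hardware
# adds refinements, so the output is illustrative only.
import math

def mip_level(texels_per_pixel, lod_bias, max_level):
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), max_level)

if __name__ == "__main__":
    # A surface whose texture is minified 4x in screen space, 10 mip levels.
    for bias in (0.0, -1.5):
        lvl = mip_level(texels_per_pixel=4.0, lod_bias=bias, max_level=10)
        print(f"LOD bias {bias:+.1f} -> mip level {lvl:.1f}")
    # Bias 0.0 selects level 2.0; bias -1.5 selects level 0.5, i.e. a sharper
    # (higher resolution) mipmap, which is where the extra shimmering comes from.
```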

 

 

Cheers

 

Hoss

Edited by 1./JG54_Hoss
=TBAS=Sshadow14
Posted

Hardware-level LOD biasing was disabled a couple of years back for all Nvidia cards past the Fermi GPUs, or something like that.

Long story short:
set to Clamp or Allow, it does nothing under DirectX 9, 10, 11 or 12 (I just set it to Clamp, as this would reduce shimmering if it worked) - placebo.
But thanks for pointing that out.

Posted

Fast Sync doesn't work as you describe. It doesn't allow more frames to be displayed than your monitor can support, and it doesn't support displaying 60-120 fps on a 60 Hz monitor.

 

Its primary purpose is to reduce the input lag that V-Sync causes (especially in high-fps games) by letting the game push out as many frames as it can, from which the most recent complete frame is picked for display at each refresh cycle.

 

Because it ideally requires a frame rate of at least double the refresh rate, it can actually introduce graphical stutter on systems incapable of producing those high frame rates consistently. In simple terms, this is because the raw frames it needs in order to keep the scene looking consistent and smooth haven't been produced in time further up the pipeline.
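
A toy simulation illustrates that point. The frame-time jitter and render rates below are made-up numbers, and the model (scan out the newest completed frame at each 60 Hz refresh and discard the rest) is a simplification of how Fast Sync behaves, but it shows why roughly double the refresh rate is needed to avoid repeated frames.

```python
# Toy model of Fast Sync on a 60 Hz display: the game renders frames as fast
# as it can (with some frame-time jitter), and at each refresh the newest
# completed frame is shown; the rest are discarded. If no new frame finished
# since the last refresh, the old one repeats, which is perceived as stutter.
# All numbers are made up for illustration.
import random

REFRESH_HZ = 60.0

def repeated_refreshes(avg_fps, jitter=0.35, seconds=2.0, seed=1):
    rng = random.Random(seed)
    done, t = [], 0.0
    while t < seconds:                                  # frame completion times
        t += (1.0 / avg_fps) * rng.uniform(1.0 - jitter, 1.0 + jitter)
        done.append(t)
    repeats, last = 0, -1
    for k in range(1, int(seconds * REFRESH_HZ) + 1):
        now = k / REFRESH_HZ
        newest = sum(1 for d in done if d <= now) - 1   # index of newest finished frame
        if newest == last:                              # nothing new -> repeat old frame
            repeats += 1
        last = newest
    return repeats

if __name__ == "__main__":
    total = int(2.0 * REFRESH_HZ)
    for fps in (60, 75, 120):
        print(f"~{fps:>3} fps rendered -> {repeated_refreshes(fps)} repeated refreshes out of {total}")
```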

 

A player can therefore end up with poorer graphics performance while at the same time generating more heat and using more power needlessly.

 

Bottom line - Fast Sync isn't the ONLY option. It's one option to try if your fps is generally able to support it. Otherwise V-Sync may still give you a smoother gaming experience.

 

As for "Force on" - I'm not sure what you are referring to, as I don't think there is a "Force on" option in the NCP for V-Sync. If you mean "On", that's V-Sync, and it comes with all the input lag that V-Sync brings because of the way the tech works.

Posted

Sorry, I cannot seem to find this guide.

-332FG-Gordon200
Posted

The Nvidia Control Panel can be found by right-clicking on your desktop if you have an Nvidia GPU.

Posted

Thank you for the reply. I can find the control panel. What I am looking for is the actual guide to explain and recommend certain settings/options.

 

=SqSq=SignorMagnifico
Posted

Has anyone figured out how to get anti-aliasing in the Nvidia Control Panel or Nvidia Inspector to work? I have a feeling that in-game anti-aliasing is causing planes to disappear when clouds are rendered behind them.

Posted
On 11/9/2018 at 8:42 PM, SShrike said:

Thank you for the reply. I can find the control panel. What I am looking for is the actual guide to explain and recommend certain settings/options.

 

It's here at Tweakguides.com

 

http://www.tweakguides.com/NVFORCE_1.html

Posted

Salutations,

 

I desire to delete my customized Program Settings for IL2 in the Nvidia Control Panel. Under Manage 3D settings, the Remove box is greyed out and I can't clear them. Frustrating.

 

How can I delete these profiles... or where can I find them for deletion? Thanks.

Posted
1 hour ago, Thad said:

Salutations,

 

I desire to delete my customized Program Settings for IL2 in the Nvidia Control Panel. Under Manage 3D settings, the Remove box is greyed out and I can't clear them. Frustrating.

 

How can I delete these profiles... or where can I find them for deletion? Thanks.

Hey Thad,

I believe you can just click on your IL-2 profile and then, in the box to the right of the greyed-out Remove button, select the Restore button; this will clear it back to the default values from when you installed the Nvidia driver.

cheers-sf

Posted (edited)

No go. The Restore button to the right is greyed out too. I just wonder where these settings are saved on my computer.

Edited by Thad
=TBAS=Sshadow14
Posted

This thread is now too old, no longer applicable, and obsolete.

Mods, can you please lock/delete it ASAP.

 

Thanks muchly

 
