Livai Posted September 1, 2018 (edited)

Jason said something about new graphical effects:

Quote
Guys,
About the graphical effects mentioned: these are WIP and not tested in beta yet, but this is the current list. Don't hold us to this because I don't know what will make the final cut.
1. Smoky tracks from tracers, like MG and AC, have been changed. The tracks are now synchronized with the speed of the tracers, and they become more realistic in character with a new texture.
2. The texture of the flashes for hits and explosions is slightly changed.
3. Improved flame effect.
4. The fuel leak effect was changed; it became less noticeable and more realistic.
5. Changed the effect of smoke from a damaged engine.
6. Changed the wingtip effect, trying to achieve a wavy, more believable character.
7. Added the effect of bullets hitting a wooden plane.
8. Improved visualization of light aircraft tracers.
Jason

Source: https://forum.il2sturmovik.com/topic/39224-developer-diary-200-discussion/

A lot is going on with new graphical effects this month, here, there and everywhere. New graphics cards are coming out that support new graphical effects, and many game engines are pushing forward. The question is: will IL-2 follow in their footsteps too?

I started this topic because I am impressed by how far the War Thunder game engine has come. I never expected that their engine could create such a good-looking WW2 shooter as "Enlisted". Now that IL-2 goes into ground forces with "Tank Crew" to simulate scenarios other than planes, will IL-2 follow the same path with graphical effects, fall behind them and go its own way, or push beyond them the way IL-2 Cliffs of Dover tried?

BTW, looking at the "Enlisted Volokolamsk map Pre-Alpha", I wonder if our "Clash at Prokhorovka" map from Tank Crew will reach the same quality? I mean much better, sharper, more detailed textures for mud, grass, trees and snow, why not some 4K textures where needed, graphical improvements... What do you think about this?

Edited September 1, 2018 by Livai
CanadaOne Posted September 1, 2018

BOX is pretty delicious as far as eye candy goes. I still find the rain-on-the-cockpit effect very, very cool. I see no reason BOX won't keep up with the times and remain top shelf in the eye candy department.
ShamrockOneFive Posted September 1, 2018

When have you known these guys to sit still when it comes to visuals? I wouldn't expect the ray-traced stuff any time soon (it needs DX12 for a start), but we know they will keep IL-2 looking great.
Ehret Posted September 1, 2018

Great graphics don't have to come from new tech exclusively. 95% of it lies in the hands of game artists, and current tech is flexible enough to provide some innovations, too.
Cpt_Siddy Posted September 1, 2018 (edited)

Ray tracing is valuable tech in complex environments, as it eases the workload on artists who otherwise need to do the lighting the old-fashioned way. For an open-space environment like IL-2? It is meh...

Also, the new RTX stuff, yeah, it's there to make things happen faster, not look better. Instead of doing all the lighting manually in a level, you can just slap the level together, push the RTX button and receive the "muh graphics" bacon. Real-time, hardware-accelerated ray-tracing tech is not a new thing; AMD did it about two years ago, as did some other small startups. Just google it before gobbling up all the RTX Kool-Aid.

What the new RTX cards shine at is 4K-resolution post-processing with the tensor cores and some software magic using them. It still looks like dog poop, but once they get it right, it might usher in the era of proper 4K gaming and... 8K VR.

Edited September 1, 2018 by Cpt_Siddy
IckyATLAS Posted September 2, 2018 (edited)

At the moment I use a GTX 1080 Ti card, and I have some doubts about the initial, immediate performance gains of an RTX 2080 Ti for BOX.

For BOX we have to take the RTX effects out of the picture, because to be effective they require the game to be designed for them (according to NVIDIA), which at the moment is not the case. So if we take the RTX out, what remains is the tensor-core-implemented, pre-trained neural net that will work on pixel prediction to improve performance for high-resolution images like 4K. Here is a comparative table of both cards as it appeared in a Trusted Reviews website article on 22 August:

The hardware improvements of 27% in memory bandwidth and 21% additional CUDA cores would suggest at least a 20% improvement in raw hardware terms. Then comes the Tensor Core Effect (TCE), which should boost performance in a noticeable way, mainly on 2K-4K images. This performance gain is unknown and I have seen no numerical benchmark on it.

When we look through past experience at how raw hardware performance trickles down into effective fps impact (before the game is updated to better use the new hardware, or the driver is improved for the game's requirements), I would imagine that an initial 7% gain in fps on the present BOX, as is, would be realistic. If you are at 60 fps then you go to 64 fps; if you are at 30 fps then you move to 32. Frankly this makes no difference at all. If we now speculate that on 4K images the TCE would add, say, another 5%, we get a total of 12%. You then go from 60 to 67 fps and 30 to 33, or 28 to 31. Visually the most noticeable effect would be in the low fps range, as that is where we are most sensitive, but 60 to 64 not really, and 100 to 112 not at all.

The RTX 2080 Ti is a good buy because it is innovative in the longer term. It will lead to better visual effects without performance degradation in the coming months, once drivers, the TCE and game devs have released code to use the hardware potential of the board optimally. The real gain will not be a matter of pure fps (a little, for sure), but image quality in terms of visual effects, without fps degradation, that could not be reached otherwise.

Edited November 29, 2018 by IckyATLAS grammatical correction
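As a sanity check on the arithmetic above, here is a minimal sketch of the same back-of-envelope estimate. The trickle-down factor is only an illustration of the post's own guess (about 20% raw gain turning into roughly 7% effective fps), and the optional TCE bonus is the speculated 5%; none of these numbers are measured benchmarks.

```python
# Hypothetical helper reproducing the rough estimate from the post above.
def estimated_fps(base_fps, raw_gain=0.20, trickle_down=0.35, tce_gain=0.0):
    """Scale fps by the share of the raw hardware gain assumed to reach the game, plus any TCE bonus."""
    effective_gain = raw_gain * trickle_down + tce_gain   # 0.20 * 0.35 = 0.07, i.e. ~7%
    return base_fps * (1.0 + effective_gain)

for fps in (30, 60, 100):
    plain = int(estimated_fps(fps))                      # hardware gain only
    with_tce = int(estimated_fps(fps, tce_gain=0.05))    # plus the speculated tensor-core bonus
    print(f"{fps} fps -> {plain} fps, or {with_tce} fps with TCE")
# 30 fps -> 32 fps, or 33 fps with TCE
# 60 fps -> 64 fps, or 67 fps with TCE
# 100 fps -> 107 fps, or 112 fps with TCE
```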
Ehret Posted September 2, 2018 (edited)

46 minutes ago, IckyATLAS said:
So if we take the RTX out, what remains is the tensor-core-implemented, pre-trained neural net that will work on pixel prediction to improve performance for high-resolution images like 4K. Here is a comparative table of both cards as it appeared in a Trusted Reviews website article on 22 August:

This "neural net pixel prediction" sounds like a fancy way to say up-scaling with interpolation... Pixels need to be rendered and (hopefully) not faked. Does the company admit that the 2080 lacks the raw fill-rate to achieve something like 4K?

18 hours ago, Cpt_Siddy said:
What the new RTX cards shine at is 4K-resolution post-processing with the tensor cores and some software magic using them. It still looks like dog poop, but once they get it right, it might usher in the era of proper 4K gaming and... 8K VR.

For 4x more pixels we need 4x more raw fill-rate, unless we want to degenerate to console-like "ways" of applying lipstick on pigs... (and pay the same, if not more, for that). What happened... where is the generational increase in raw power?

Edited September 2, 2018 by Ehret
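For readers who want the "4x more pixels" point as a worked number, here is a trivial sketch. It only counts pixels to shade per second at a target frame rate and ignores per-pixel shading cost, so it is a lower bound for illustration, not a GPU benchmark.

```python
# Pixels that must be filled per second at a given resolution and frame rate.
def pixels_per_second(width, height, fps):
    return width * height * fps

p_1080p = pixels_per_second(1920, 1080, 60)   # ~124 million pixels/s
p_4k = pixels_per_second(3840, 2160, 60)      # ~498 million pixels/s
print(p_4k / p_1080p)  # 4.0 -- native 4K at the same fps needs ~4x the raw fill-rate
```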
Cpt_Siddy Posted September 2, 2018

1 hour ago, Ehret said:
For 4x more pixels we need 4x more raw fill-rate, unless we want to degenerate to console-like "ways" of applying lipstick on pigs... (and pay the same, if not more, for that). What happened... where is the generational increase in raw power?

The RTX cards use "AI"-accelerated lipstick. Remember those benchmarks with "DLSS" on or off? Well, DLSS is the lipstick system accelerated by the tensor cores.
Ehret Posted September 2, 2018 (edited)

Just now, Cpt_Siddy said:
The RTX cards use "AI"-accelerated lipstick. Remember those benchmarks with "DLSS" on or off? Well, DLSS is the lipstick system accelerated by the tensor cores.

Well... hopefully there is a bit of super-sampling somewhere and not just a variable blur. Then why call those new cores "tensors"? To hide that the new ALUs aren't as flexible as the previous shaders and have to work on fixed-size matrices, maybe?

Edited September 2, 2018 by Ehret
Cpt_Siddy Posted September 2, 2018 (edited)

2 hours ago, Ehret said:
Well... hopefully there is a bit of super-sampling somewhere and not just a variable blur. Then why call those new cores "tensors"? To hide that the new ALUs aren't as flexible as the previous shaders and have to work on fixed-size matrices, maybe?

The idea is that the AI samples some amount of pixel space, compares it to the surrounding pixel space and makes an "educated" guess at what the empty pixels are supposed to be. The "dumb" algorithms just averaged the missing pixels, but the "artificially intelligent" program uses some sort of voodoo to actually recognise that this is supposed to be a line and that is part of an object, and fills in the missing parts "intelligently".

Theoretically, the RTX card can draw frames in this order: the general graphical work on the CUDA cores, then ray trace the resulting mess, then use the tensor cores to touch up and fill in the missing parts. And all three components can work independently, so you can have three frames in the processing pipeline (or work on one frame in stages, depending on how they set up the data exchange, since you cannot ray trace or AI-touch-up what you have not rendered in some way), optimizing your compute power.

Of course, you need a game engine that is designed to leverage this technology, because the engine must order the system to render the image in a very specific way, unless some sort of universal standard can be agreed upon...

Personally, I will wait for the next-gen 7nm iteration of this, where all the kinks have been ironed out and the technology is widely accepted. Also, I am curious to see what AMD and Intel are cooking up in their garages.

Edited September 2, 2018 by Cpt_Siddy
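Not NVIDIA's DLSS, whose internals are not public, but a minimal sketch of the contrast described above: a "dumb" upscaler that only blends neighbouring pixels, versus the idea of a learned upscaler that replaces that fixed blend with trained weights. It assumes NumPy and a single-channel image.

```python
import numpy as np

def bilinear_upscale_2x(img):
    """Naive 2x upscale: every output pixel is a weighted average of its low-res neighbours."""
    h, w = img.shape
    out = np.zeros((h * 2, w * 2), dtype=float)
    for y in range(h * 2):
        for x in range(w * 2):
            sy, sx = y / 2.0, x / 2.0            # map the target pixel back to the low-res grid
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            out[y, x] = ((1 - fy) * (1 - fx) * img[y0, x0] +  # blend the four surrounding samples;
                         (1 - fy) * fx * img[y0, x1] +        # no new detail can appear, edges smear
                         fy * (1 - fx) * img[y1, x0] +
                         fy * fx * img[y1, x1])
    return out

low_res = np.random.rand(4, 4)
print(bilinear_upscale_2x(low_res).shape)  # (8, 8)

# A learned upscaler keeps the same input/output shapes but replaces the fixed blend
# with weights trained against high-resolution ground truth, so it can "guess" lines
# and object boundaries instead of averaging them -- which is the pitch behind running
# it on the tensor cores after the CUDA and ray-tracing work is done.
```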
Ehret Posted September 2, 2018

So... they just cannot provide enough raw fill-rate and had to resort to tricks. No matter what those tricks are, this is bad news, actually. Even worse, the tricks demand extra software support, and getting something (half) decent at higher resolutions will depend on extra factors. Before, one would get a faster graphics card and be done. Apparently, not now...
Cpt_Siddy Posted September 2, 2018 (edited)

Well, think of this from the perspective of the available technology. You are reaching the limits of the transistor; 7nm is frack all and electrons are unruly little critters. I mean, jesus wept, silicon below 7nm will start having huge problems with tunneling, so you have to do lots of error correction or other exotic trickery... thus raising the cost. And while we have made 1-atom transistors in a lab, that is the END POINT of physics (caveat: at least the physics we have discovered). There is also the fact that most of the exotic logic gates in the lab are going to be a bitch and a half to scale up to fab-level production.

NVIDIA is doing the correct thing here and exploring other options to boost fidelity and frames within the context of existing technology. And the keyword in all this mess is EXISTING tech, since multi-GPU support, the obvious way to remedy this problem, is non-existent in 99.999999% of all consumer tech today. Doing some AI integration to extend the capabilities of current tech is smart: you can do it on one silicon die, with some software magic that you can ram down the throats of developers at gunpoint... because you are a frack-huge conglomerate, like NVIDIA. And if done right, the tensor-core solution to our ever-increasing need for speed and graphical crack cocaine can help for the next few generations.

Now... if AMD would stop sniffing glue in the basement and actually expand their Infinity Fabric stuff to work with matrix computation, you could flood the market with modular PC designs, where you can seamlessly increase your performance by just adding three 300-dollar cards to achieve results not even a 1200-dollar card can reach. I am all for a multi-GPU approach to this problem, but please bear in mind that most of the existing microcode and such is from the '90s or '80s... There will be a point where a whole new... EVERYTHING is going to be needed to see the usual gains you have grown used to.

Also, a little fun fact: the silicon giants spend more dough annually on R&D than the Apollo program.

Edited September 2, 2018 by Cpt_Siddy
LLv34_Flanker Posted September 5, 2018

S! I would like to see BoX have a graphics engine that actually draws stuff beyond 5 km or so. Try altitude bombing and watch a building pop into view when you are almost on top of it. Or fly at medium altitude: you cannot see huge hangars on a field unless you are close to them, etc. The stupendous LOD circle needs to be increased, and with plane speeds increasing with the 262/Tempest etc., the 8 km dot range will only cause frustration. Then we could talk about DX12 and doodahmagic effects.. :P
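A toy sketch, not the actual BoX engine code, of why buildings "pop" into view: anything beyond a hard draw/LOD cutoff is simply never submitted for rendering, so raising the cutoff trades extra draw work for visibility at range. The 5 km cutoff echoes the post above; the object list is made up.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance_km: float  # distance from the camera/plane

def drawn_objects(objects, cutoff_km=5.0):
    """Only objects inside the draw radius get rendered; the rest pop in once you get closer."""
    return [o.name for o in objects if o.distance_km <= cutoff_km]

scene = [SceneObject("hangar", 7.5), SceneObject("factory", 4.2), SceneObject("bridge", 12.0)]
print(drawn_objects(scene))                  # ['factory'] -- the hangar pops in only inside ~5 km
print(drawn_objects(scene, cutoff_km=15.0))  # everything visible, at the cost of more draw work
```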
Livai Posted September 10, 2018 (edited)

Why do BoX's graphics look really old and outdated when I look at these? I wonder why. Maybe because they need some updates. This is how next-gen graphics look to me.

Edited September 10, 2018 by Livai
LizLemon Posted September 10, 2018

If doing the shadows is as drop-in as NVIDIA claims, then it would be a pretty nice improvement to the game. GI would be nice as well, but the game is sort of faking that as is. One big improvement they could pretty much do right now with minimal effort would be making the cockpits use the same shader as the exterior of the plane, giving the cockpits some reflections and a better sense of place. The other thing they really need to do is move over to a PBR pipeline, but that would be a lot of work.
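"PBR pipeline" here usually means a physically based BRDF such as Cook-Torrance with GGX. The following is not IL-2's shader, just a generic single-light sketch of that model, where the material response comes from albedo/metallic/roughness parameters instead of per-asset ad-hoc shading; all the sample values are made up.

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def pbr_shade(N, V, L, albedo, metallic, roughness, light_color):
    """Single directional light: Cook-Torrance specular plus Lambert diffuse."""
    N, V, L = normalize(N), normalize(V), normalize(L)
    H = normalize(V + L)
    n_dot_l = max(float(np.dot(N, L)), 0.0)
    n_dot_v = max(float(np.dot(N, V)), 1e-4)
    n_dot_h = max(float(np.dot(N, H)), 0.0)
    v_dot_h = max(float(np.dot(V, H)), 0.0)

    a = roughness * roughness
    # GGX normal distribution: how many microfacets face the half vector.
    d = a * a / (np.pi * ((n_dot_h * n_dot_h) * (a * a - 1.0) + 1.0) ** 2)
    # Schlick Fresnel: dielectrics reflect ~4%, metals tint the reflection with their albedo.
    f0 = 0.04 * (1.0 - metallic) + np.asarray(albedo, dtype=float) * metallic
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    # Smith/Schlick-GGX geometry term: microfacet self-shadowing.
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_v / (n_dot_v * (1.0 - k) + k)) * (n_dot_l / (n_dot_l * (1.0 - k) + k + 1e-7))

    specular = d * f * g / (4.0 * n_dot_v * n_dot_l + 1e-7)
    diffuse = (1.0 - f) * (1.0 - metallic) * np.asarray(albedo, dtype=float) / np.pi
    return (diffuse + specular) * np.asarray(light_color, dtype=float) * n_dot_l

# e.g. a slightly rough bare-metal panel lit from above, viewed at an angle:
print(pbr_shade(N=[0, 1, 0], V=[0, 1, 1], L=[0, 1, 0],
                albedo=[0.91, 0.92, 0.92], metallic=1.0, roughness=0.35,
                light_color=[1.0, 1.0, 1.0]))
```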
Jackrabbit710 Posted September 21, 2018

Anyone benchmarked a 2080 Ti yet? Mine is apparently delayed for at least a week, but will report back when I can.
Henree Posted September 28, 2018

The ray tracing is only somewhat viable at 1080p, so you can forget about 4K ray tracing unless you want Star Citizen performance while playing Pac-Man. Other than that you get 2 or 3 extra frames here and there for a lot more money. It is an experimental feature at best, which will probably work on the 3080 Ti; a second 1080 Ti is a much better investment. (Unless the price of the 2080 drops drastically, it will just not be profitable to develop for it, as the user base will remain tiny.)
Ehret Posted September 30, 2018

On 9/10/2018 at 3:55 PM, Livai said:
Why do BoX's graphics look really old and outdated when I look at these? I wonder why. Maybe because they need some updates. This is how next-gen graphics look to me.

Those are graphics demos; as such they will look better on principle. There are priorities and compromises. You looked at the Unigine sim, right? How do you like the cockpit presented there? How big is the flyable space? What can one do there aside from wandering around? So far, most stunning-looking "demos" have resulted in disappointing games, if any games at all.
Livai Posted October 30, 2018

On 9/30/2018 at 8:54 AM, Ehret said:
Those are graphics demos; as such they will look better on principle. There are priorities and compromises. You looked at the Unigine sim, right? How do you like the cockpit presented there? How big is the flyable space? What can one do there aside from wandering around? So far, most stunning-looking "demos" have resulted in disappointing games, if any games at all.

"Stunning-looking demos resulted in disappointing games" is questionable, because developers tend to downgrade the graphics, which is faster and easier than looking into the code for what causes the problem. That is what I call laziness. The last demo, from 2013, had a 262x262 km flyable space. Now you have 400 km visibility distance, and so on...

The problem? The cockpits? As far as I have seen, they use real cockpits. The point is that the graphics from Unigine could be done with the current graphics engine BoX already has, if you coded the tech Unigine uses into the game engine. I am not sure Unigine's engine is intended to simulate anything as complicated as the BoX engine does, but the base is there: the base where the BoX engine lacks, and vice versa. The same can be said if we compare CloD Team Fusion with BoX; however, Team Fusion is always a step ahead. We have seen what BoX can already do, but we have never seen what CloD is capable of simulating in the right hands.
IckyATLAS Posted November 19, 2018

It is amazing that we have so many different graphics engines around. Why does BoX have its own and not use one "standard"? It is strange that we have not ended up with some consolidation around a graphics engine that does it all, at least in terms of animated 3D graphics. Rendering, lighting, shading etc. have all been known for a long time. Why build one all over again? Having a standard tool would allow devs to forget about the engine and just concentrate on the game itself.
AristocratPanda Posted November 19, 2018

I don't know. Maybe coding an ambitious flight physics and damage model into a standard engine like that is not an easy task? I don't recall many games on those engines with extreme physics, ballistics and damage modeling. Also, you have to know how to handle all this on a very big map with limited consumer hardware resources. Can anyone remind me of one?
kissklas Posted November 21, 2018

On 11/20/2018 at 12:01 AM, Kemal said:
I don't know. Maybe coding an ambitious flight physics and damage model into a standard engine like that is not an easy task? I don't recall many games on those engines with extreme physics, ballistics and damage modeling. Also, you have to know how to handle all this on a very big map with limited consumer hardware resources. Can anyone remind me of one?

It's more about what you need the physics to do. Standard physics in games is often not very reliable, and there is usually a limit to which elements are accounted for. Usually global gravity and forces like wind may be used, but they are very simple. When an object falls, air density, lift, drag and so forth are not calculated in any way. A simulator uses a very different type of physics simulation, with a focus on the things that are important to flying.
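A minimal sketch of that difference, with made-up constants: the "game" step only applies constant gravity, while the "sim" step also computes drag and lift from air density, speed and wing area. Lift is applied straight up here for simplicity, which is only roughly right for level-ish flight.

```python
RHO = 1.225   # sea-level air density, kg/m^3 (assumed)
G = 9.81      # gravity, m/s^2

def game_step(vy, dt):
    """Typical game physics: constant gravity, nothing about the air."""
    return vy - G * dt

def sim_step(vx, vy, dt, mass=3000.0, wing_area=22.0, cd=0.03, cl=0.15):
    """Sim-style step: aerodynamic forces from dynamic pressure (all values illustrative)."""
    speed = (vx * vx + vy * vy) ** 0.5
    q = 0.5 * RHO * speed * speed      # dynamic pressure
    drag = q * wing_area * cd          # opposes the direction of motion
    lift = q * wing_area * cl          # applied straight up: a level-flight simplification
    if speed > 0.0:
        ax = -drag * (vx / speed) / mass
        ay = (lift - drag * (vy / speed)) / mass - G
    else:
        ax, ay = 0.0, -G
    return vx + ax * dt, vy + ay * dt

print(game_step(0.0, 1.0))        # -9.81: falls the same whether parked or at 600 km/h
print(sim_step(120.0, 0.0, 1.0))  # at ~120 m/s the wing nearly balances the weight, and drag bleeds speed
```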
JonRedcorn Posted November 23, 2018

On 9/2/2018 at 2:17 PM, Ehret said:
So... they just cannot provide enough raw fill-rate and had to resort to tricks. No matter what those tricks are, this is bad news, actually. Even worse, the tricks demand extra software support, and getting something (half) decent at higher resolutions will depend on extra factors. Before, one would get a faster graphics card and be done. Apparently, not now...

You have no idea what you're talking about. The 2080 is a more than capable 4K card. The 2080 Ti is an excellent 60+ fps 4K card. Turing was a midway card anyway.
Ehret Posted November 23, 2018

2 hours ago, 15th_JonRedcorn said:
You have no idea what you're talking about. The 2080 is a more than capable 4K card. The 2080 Ti is an excellent 60+ fps 4K card. Turing was a midway card anyway.

On the contrary, you should keep in mind that there was an information embargo imposed by NVIDIA, and the first reviews could only be read on or after September 19. Before that, only rumors and wild speculation were available, and we could talk only in those terms. I made that post on September 2, well before the 19th, so excuse me. You are nitpicking senselessly.
JonRedcorn Posted November 23, 2018

3 minutes ago, Ehret said:
On the contrary, you should keep in mind that there was an information embargo imposed by NVIDIA, and the first reviews could only be read on or after September 19. Before that, only rumors and wild speculation were available, and we could talk only in those terms. I made that post on September 2, well before the 19th, so excuse me. You are nitpicking senselessly.

Unfortunately that doesn't negate the copious amounts of idiocy on display in this thread. I literally felt dumber after reading it.
LLv34_Flanker Posted November 26, 2018

S! ArmA and its engine are hardly a showcase of efficient coding. Aren't they working on a new one?
icecream Posted November 28, 2018

On 11/26/2018 at 1:16 AM, LLv34_Flanker said:
S! ArmA and its engine are hardly a showcase of efficient coding. Aren't they working on a new one?

Probably the Enfusion engine from DayZ.
jollyjack Posted October 28, 2019

On 9/1/2018 at 5:58 AM, CanadaOne said:
BOX is pretty delicious as far as eye candy goes. I still find the rain-on-the-cockpit effect very, very cool.

Yep, keep an umbrella at hand when you play IL-2... compliments to the dev department.
LLv34_Flanker Posted October 29, 2019

S! While the rain effect is nice, it looks like you are driving a car, not going 400-500 km/h. I have been in quite a few planes going fast, and the droplets are not like that. Nevertheless, a nice touch!