
Any chance IL-2 Korea comes to Mac?


Recommended Posts

354thFG_Drewm3i-VR
Posted

 

Besides workarounds with CrossOver, etc.

 

Especially considering how disappointing the latest GPUs from NVIDIA are likely to be. It seems there is not much innovation ongoing these days anywhere within the realm of Windows--excepting AI.

 

Any insight? @LukeFF

 

With how powerful and power-efficient Apple's own ARM chips are quickly becoming, to not do so may be a missed opportunity. I know I would love to forever ditch my PC and just have one MBP!

 

Just figured I would ask.

  • Upvote 1
Posted

The Apple chips are actually not that good at gaming, and Apple doesn't support the same APIs, so supporting Macs would essentially mean creating and maintaining two different game engines. That's not going to happen.

AEthelraedUnraed
Posted
On 1/18/2025 at 1:21 AM, 356thFS_Drewm3i-VR said:

It seems there is not much innovation ongoing these days anywhere within the realm of Windows--excepting AI.

Wrong :)

And yes, while a part of modern innovation is directed towards AI, that's because AI is becoming much more important for computing. Take DLSS for example, which runs largely on tensor cores.

 

On 1/18/2025 at 1:21 AM, 356thFS_Drewm3i-VR said:

With how powerful and power-efficient Apple's own ARM chips are quickly becoming, to not do so may be a missed opportunity. I know I would love to forever ditch my PC and just have one MBP!

The one big "advantage" Apple has always had is that their OS is custom-made for their hardware, since they control both. That means that they can do lots of small optimisations that Windows cannot do, for the simple fact that 99% of Windows hardware wouldn't be compatible with it. Of course, this "monopoly" on hardware in Apple systems also means that they charge exorbitant prices for it.

 

Yet even with all those optimisations, NVidia's, AMD's and Intel's raw power far exceeds Apple's, if you're not too concerned about energy efficiency (i.e. have good cooling). For example, in this and this test, Apple has a very small advantage when running single-core processes, but drops to less than half the i9's performance when running multi-threaded processes (which you're likely to see in gaming).

 

Still, Apple systems are currently more energy efficient than their competitors, with tradeoffs that may be acceptable depending on your use case. Whether that's worth paying their ridiculous prices and essentially becoming vendor-locked... well, let's just say that there are also people who make their whisky and coke with €2000 whisky 💸

 

On 1/18/2025 at 1:21 AM, 356thFS_Drewm3i-VR said:

Any chance IL-2 Korea comes to Mac?

Highly unlikely. Never say never, but I am prepared to take bets :)

354thFG_Drewm3i-VR
Posted (edited)
8 hours ago, AEthelraedUnraed said:

Wrong :)

And yes, while a part of modern innovation is directed towards AI, that's because AI is becoming much more important for computing. Take DLSS for example, which runs largely on tensor cores.

 

The one big "advantage" Apple has always had is that their OS is custom-made for their hardware, since they control both. That means that they can do lots of small optimisations that Windows cannot do, for the simple fact that 99% of Windows hardware wouldn't be compatible with it. Of course, this "monopoly" on hardware in Apple systems also means that they charge exorbitant prices for it.

 

Yet even with all those optimisations, NVidia's, AMD's and Intel's raw power far exceeds Apple's, if you're not too concerned about energy efficiency (i.e. have good cooling). For example, in this and this test, Apple has a very small advantage when running single-core processes, but drops to less than half the i9's performance when running multi-threaded processes (which you're likely to see in gaming).

 

Still, Apple systems are currently more energy efficient than their competitors, with tradeoffs that may be acceptable depending on your use case. Whether that's worth paying their ridiculous prices and essentially becoming vendor-locked... well, let's just say that there are also people who make their whisky and coke with €2000 whisky 💸

 

Highly unlikely. Never say never, but I am prepared to take bets :)

I think it would be apt to compare apples to apples, so say an M4 Max/Ultra against something like the i9 or AMD equivalent listed above. I was never an Apple guy until recently, when I purchased an M1 MacBook Air in late 2021 that is simply a joy to use. The OS, build quality, screen, keyboard, etc. made both my desktop at the time (5800X3D + RTX 3080 + 32 GB RAM) and my subsequent Windows laptop (Razer Blade 18, i9-13950HX + RTX 4090 + 32 GB mobile) completely superfluous, minus flight simming in VR. Even for video production, the M1 is so similar in performance that I use it so I'm not bound to a desk and wall socket. 

 

I really think Apple is going to dominate the consumer sector for the next decade and that game devs would be wise to adapt earlier rather than later. It would seem like a huge missed opportunity, because not many people are going to buy a $1-2K MacBook plus a $2-3K gaming PC when the MacBook does everything and more (and just works better) than the PC. Literally only IL-2 is somewhat keeping me on Windows (before Mac, I used to dual-boot Linux).

 

 

Edited by 356thFS_Drewm3i-VR
AEthelraedUnraed
Posted
10 hours ago, 356thFS_Drewm3i-VR said:

I really think Apple is going to dominate the consumer sector for the next decade

Yeah, that is just not going to happen, for reasons that include:

- exorbitant prices,

- vendor lock,

- very few models to choose from,

- lack of customisability,

- you can get better hardware for a cheaper price, on Windows,

- objectively bad design that's more concerned with looks than anything else,

- really, really bad repairability compared to most Windows laptops.

 

10 hours ago, 356thFS_Drewm3i-VR said:

Even for video production, the M1 is so similar in performance that I use it so I'm not bound to a desk and wall socket. 

You're literally quoting the one sector that Macs excel at. Everything in the design sector is written for Mac first, then backported to Windows.

  • Upvote 1
354thFG_Drewm3i-VR
Posted (edited)
On 1/20/2025 at 6:02 AM, AEthelraedUnraed said:

Yeah, that is just not going to happen, for reasons that include:

- exorbitant prices,

- vendor lock,

- very few models to choose from,

- lack of customisability,

- you can get better hardware for a cheaper price, on Windows,

- objectively bad design that's more concerned with looks than anything else,

- really, really bad repairability compared to most Windows laptops.

 

You're literally quoting the one sector that Macs excel at. Everything in the design sector is written for Mac first, then backported to Windows.

I guess we'll have to agree to disagree, but I don't think you're being fair to the competition: for example, the video you linked shows a pre-M1 MacBook Air with an Intel chip--the new ones neither have nor need fans at all.

 

You can also get a new M1 MacBook Air that is as powerful as most mainstream Windows laptop computers still in 2025, for $650. There are models of various sizes and form factors (13, 14, 15, 16), from $650-$3,500 and everywhere in between. 

 

The $599 Mac mini is as powerful as almost any Windows gaming desktop under $2,000: 

 

 

 

The truth is that the base models are still so powerful that they work for most users: as a result, Mac already possesses 15% of the consumer market. 

Edit: percentage corrected*

 

Again, the future of ARM is just beginning. x86 is expiring and the devs should consider the possibility that Mac becomes the new Windows for most consumers because the trend is headed that way, especially with games like Cyberpunk 2077 coming natively: once developers figure out how much power and efficiency they can leverage with these chips, they will likely begin making games for Mac and then most will have no need of Windows at all.

 

I really do think the devs should consider this if at all possible... they would be shooting themselves in the foot not to.

Edited by 356thFS_Drewm3i-VR
Wrong information
AEthelraedUnraed
Posted
7 hours ago, 356thFS_Drewm3i-VR said:

I guess we'll have to agree to disagree, but I don't think you're being fair to the competition: for example, the video you linked shows a pre-M1 MacBook Air with an Intel chip--the new ones neither have nor need fans at all.

Alright, I'll admit that "bad design" is somewhat subjective, as its definition depends on what you consider important. I do wonder why you think my other points are "unfair to the competition" though.

 

7 hours ago, 356thFS_Drewm3i-VR said:

There are models of various sizes and form factors (13, 14, 15, 16), from $650-$3,500 and everywhere in between.

Oh wow - a grand choice of, what's it, 17 current-gen laptop models. Of which I can afford 4.

I hope you don't seriously want to argue that Apple has anywhere near the amount of choice there is in Windows computers.

 

8 hours ago, 356thFS_Drewm3i-VR said:

You can also get a new M1 MacBook Air that is as powerful as most mainstream Windows laptop computers still in 2025, for $650.

[...]

The $599 Mac mini is as powerful as almost any Windows gaming desktop under $2,000: 

The cheapest first-hand Apple (let's leave second-hand and refurbished products out of the equation since prices are too unpredictable) I could find in mainstream stores was €949, an Apple MacBook Air with 8-core M2 chip. Actually, this one was €999 but I couldn't find the link for the €949 model anymore so I'll give you the €50 for free. For the same price, you can find many laptops including this one: double the RAM, double the disk size, a CPU that's 1.6 times as fast, and a GPU that's almost 2x as fast (couldn't find GPU tests for the 8-core M2 so I tested the 10-core; again, you get that one for free). Moreover, the GPU has 6GB VRAM of its own rather than having to share it with the CPU.

 

Let's have a look at the Mac Mini. Again, prices are a bit more expensive (which makes sense since I'm in a different part of the world - also remember that Dutch listed prices by law include sales tax). The cheapest one is €719. I'll grant you that Apple has found a bit of a niche here, with a gap in Windows PCs (high-performance desktops cost more; cheap desktops perform less). For €80 more however, you can get this PC, which has a 20% slower CPU, but in return offers you 4x the SSD size and a much, much better GPU (around 3x the FPS in most game benchmarks) that has 8GB dedicated VRAM. But still, it's a little bit more expensive. Let's call this one a tie. Still, it's a very far cry from "as powerful as almost any Windows gaming desktop under $2,000".

 

9 hours ago, 356thFS_Drewm3i-VR said:

The truth is that the base models are still so powerful that they work for most users

Well yes, but given that "most users" don't do much more than watch cat videos and write the occasional e-mail, so are Chromebooks, which you can get first-hand for as little as €249.

 

9 hours ago, 356thFS_Drewm3i-VR said:

Mac already possesses 35-40% of the consumer market.

I don't know where you found those stats, but they belong to the realm of fantasy. Current market share of Apple PCs is around 10 to 15%. Even in their powerbase the US, new shipments account for around 15% of all PCs.

 

10 hours ago, 356thFS_Drewm3i-VR said:

Again, the future of ARM is just beginning.

Eh, not quite. ARM is already 40 years old. Also, ARM chips are not exclusive to Apple. Even if ARM magically displaces Intel and AMD, which I doubt it will, there's no reason to assume that would be on Apple systems only.

 

And I doubt that ARM will suddenly displace Intel and AMD in the near future. Yes, their processors give good performance for very good energy usage, but I disagree that their designs are that much out of the ordinary. One of the reasons (though admittedly not the only one) why the M4 processor is so energy-efficient is that it runs on 3nm tech while Intel's flagship, the i9 14900, still runs on 10nm. It's no secret who ASML's first customer for their latest lithography machine was - and it ain't Apple or ARM.

 

10 hours ago, 356thFS_Drewm3i-VR said:

x86 is expiring and the devs should consider the possibility that Mac becomes the new Windows for most consumers because the trend is headed that way, especially with games like Cyberpunk 2077 coming natively: once developers figure out how much power and efficiency they can leverage with these chips, they will likely begin making games for Mac and then most will have no need of Windows at all.

This is just wishful thinking on your part. If we're looking at trends, then surely Linux will become the new Windows since its market share has increased by 158% over the past decade, versus 63% for Apple. As much as I'd love to see that happen, I don't think we ever will.

 

I grant you one thing though, namely that more and more games run natively on Mac (as well as Linux). But that has as much to do with many game engines (e.g. both Unreal and Unity) offering cross-platform support out-of-the-box as with anything else. And I think that's a good thing. I'd like to see everyone able to use his/her preferred PC, and that includes Mac. Don't get me wrong - I do think that Mac PCs are high-quality and give good performance, and for some tasks (especially anything artistic), they are very likely the best choice out there. But for many other use cases, there are other options that give better performance for a cheaper price, without all the other limitations of Apple (lack of customisation/extension options, vendor lock, little choice between models, comparatively difficult repairs).

  • Like 1
AEthelraedUnraed
Posted

Anyhow, with all of the above being said...

11 hours ago, 356thFS_Drewm3i-VR said:

I really do think the devs should consider this if at all possible... they would be shooting themselves in the foot not to.

Since IL2 uses its own engine that's DirectX 11 based, porting it to Mac would mean significant re-programming of major parts of the engine. I'd say it's about as much work as programming an entire new module. Given Apple's current market share and its trends, I think they'd be shooting themselves in the foot if they did. This might be different for Combat Pilot which uses Unreal 5 and therefore should be relatively easy to port. But for IL2, I think it's extremely unlikely you're ever going to see a Mac version.

 

Note that there are several options for running Windows games on a Mac, e.g. Parallels or Boot Camp; if you've got a good enough model, you could most likely get acceptable performance.

Posted (edited)
12 hours ago, 356thFS_Drewm3i-VR said:

The $599 Mac mini is as powerful as almost any Windows gaming desktop under $2,000: 

 

 

Did you actually watch that video? Because it clearly shows that the Mini is less than half as fast as their desktop in the game they tested. They built that desktop to almost the same price point as the Mini, so the Mini can't even beat a $600 PC, let alone a $2,000 PC. But I can build something that is actually usable for modern games, with much better components than that desktop, for $900: https://pcpartpicker.com/list/73bbkf

 

And that desktop has 32 GB of RAM instead of 16 GB for the Mini - and the Mini's is unified memory shared with the GPU, so it's even worse than it looks. The desktop I specced has 1 TB of storage instead of a puny 256 GB for the Mini. Is it still 2015, when 256 GB was a barely acceptable amount of storage? Apple has this trick where they offer a base model whose price looks enticing, but then you need to upgrade it to get to somewhat decent specs, and the cost balloons. Even the $999 model only has 24 GB of RAM and 512 GB of storage. Getting 32 GB of RAM and 1 TB of storage increases the price to $1399, so almost 50% more than my build.

 

And my build supports RAM, GPU, CPU and storage upgrades with off-the-shelf components, so you can upgrade for a good price.

 

This has been the case for a long time with Apple. If you take the exact specs that Apple is willing to give you for their 'cheap' prices, then it doesn't look crazy next to a comparable system. But in reality, people often have their own demands and have to find a computer that matches them. Then you end up having to add these absurdly priced upgrades, or move up to a more expensive product category (even though you might only need to do that for one feature). With a PC, you can build affordable systems for a much larger variety of needs.

 

12 hours ago, 356thFS_Drewm3i-VR said:

 

Again, the future of ARM is just beginning. x86 is expiring...

 

These claims are just assumptions that show you've drunk the Kool-Aid. The end of x86 has been predicted for ages. For example, in the '90s, PowerPC was going to beat x86:

 

PowerPC-Future.jpg

 

In reality, PowerPC started off competitively, but then fell behind. In fact, Apple went from PowerPC to Intel, because Intel was so much ahead. With Apple's chips, we see that they had a banger with M1, but following generations had modest speed improvements. And history has shown that backwards compatibility is worth a whole lot to people, so there needs to be a major reason to migrate everything.

 

Apple did very well to optimize their system for responsiveness, and for the use cases common to 'creative' professions, but their design is not optimized for gaming. And the Apple ecosystem is simply not able to cater to the large variety of needs that the PC ecosystem caters to.

Edited by Aapje
  • Upvote 1
[CPT]Crunch
Posted

You going to port all my hardware peripherals over to Apple too? Because I ain't jumping without them. It's dead on arrival till that happens.  

  • Upvote 1
354thFG_Drewm3i-VR
Posted
12 hours ago, AEthelraedUnraed said:

Alright, I'll admit that "bad design" is somewhat subjective, as its definition depends on what you consider important. I do wonder why you think my other points are "unfair to the competition" though.

12 hours ago, AEthelraedUnraed said:

 

Oh wow - a grand choice of, what's it, 17 current-gen laptop models. Of which I can afford 4.

I hope you don't seriously want to argue that apple has anywhere near the amount of choice there is in Windows computers.

Choice? No, but there are many options for many budgets, and Apple systems also come feature-packed with great software like Pages, Numbers, and Keynote. Those apps in themselves add a lot of value IMO.

12 hours ago, AEthelraedUnraed said:

 

The cheapest first-hand Apple (let's leave second-hand and refurbished products out of the equation since prices are too unpredictable) I could find in mainstream stores was €949, an Apple MacBook Air with 8-core M2 chip. Actually, this one was €999 but I couldn't find the link for the €949 model anymore so I'll give you the €50 for free. For the same price, you can find many laptops including this one: double the RAM, double the disk size, a CPU that's 1.6 times as fast, and a GPU that's almost 2x as fast (couldn't find GPU tests for the 8-core M2 so I tested the 10-core; again, you get that one for free). Moreover, the GPU has 6GB VRAM of its own rather than having to share it with the CPU.

Here is an M1 Air for $650: https://www.walmart.com/ip/Apple-MacBook-Air-13-3-inch-Laptop-Space-Gray-M1-Chip-8GB-RAM-256GB-storage/609040889?classType=VARIANT&athbdg=L1102&from=/search

 

And the computer you posted above is a POS: you can't compare a plastic behemoth to an anodized aluminum unibody with a high-res glass screen! One device works and feels premium (including the trackpad and keyboard, which I find vital as a writer and web developer), the other creaks, is bulky, and is overall poor quality. I personally had MSI's "premium" laptop at one point (GS series) and it was a POS in build quality compared to Razer and Apple.

12 hours ago, AEthelraedUnraed said:

 

Let's have a look at the Mac Mini. Again, prices are a bit more expensive (which makes sense since I'm in a different part of the world - also remember that Dutch listed prices by law include sales tax). The cheapest one is €719. I'll grant you that Apple has found a bit of a niche here, with a gap in Windows PCs (high-performance desktops cost more; cheap desktops perform less). For €80 more however, you can get this PC, which has a 20% slower CPU, but in return offers you 4x the SSD size and a much, much better GPU (around 3x the FPS in most game benchmarks) that has 8GB dedicated VRAM. But still, it's a little bit more expensive. Let's call this one a tie. Still, it's a very far cry from "as powerful as almost any Windows gaming desktop under $2,000".

I am sure a purpose-built desktop is better for gaming, but the Mac mini is TINY: for such a small and affordable system, it is genius. It can play games reasonably well and future generations of it could absolutely play something like IL-2 in 5 years or so if it keeps improving. Who wants a bulky desktop in 2025 besides a few niche gamers? That is actually my point.

12 hours ago, AEthelraedUnraed said:

 

Well yes, but given that "most users" don't do much more than watch cat videos and write the occasional e-mail, so are Chromebooks, which you can get first-hand for as little as €249.

To compare a cheap, plastic Chromebook to a premium-feeling and performing ($650) MacBook Air is very disingenuous. For instance, I make YouTube videos on my M1 Air in DaVinci Resolve and the experience is on par with my previous desktop and current 18" desktop replacement. It's a bit slower, but sips battery power and still feels very fast--even with only 8GB of unified memory.

12 hours ago, AEthelraedUnraed said:

 

I don't know where you found those stats, but they belong to the realm of fantasy. Current market share of Apple PCs is around 10 to 15%. Even in their powerbase the US, new shipments account for around 15% of all PCs.

Yes, you are correct and I stand corrected: the current market share is about 16% of the US consumer market.

12 hours ago, AEthelraedUnraed said:

 

Eh, not quite. ARM is already 40 years old. Also, ARM chips are not exclusive to Apple. Even if ARM magically displaces Intel and AMD, which I doubt it will, there's no reason to assume that would be on Apple systems only.

Yes it's "old" in that sense, but the whole SoC design is very new and revolutionary.

12 hours ago, AEthelraedUnraed said:

 

And I doubt that ARM will suddenly displace Intel and AMD in the near future. Yes, their processors give good performance for very good energy usage, but I disagree that their designs are that much out of the ordinary. One of the reasons (though admittedly not the only one) why the M4 processor is so energy-efficient is that it runs on 3nm tech while Intel's flagship, the i9 14900, still runs on 10nm. It's no secret who ASML's first customer for their latest lithography machine was - and it ain't Apple or ARM.

Intel is still on par, but at what cost? Much worse performance per watt and an overall stalemate. The point is what could be done in the near future with a future high-performance Apple chip--if trends stay as they are now. I think it would be unwise to simply dismiss these disruptive leaps in tech. When the M1 came out, a literally fanless MacBook Air was winning in IPC and single-core performance vs. Intel and AMD's best. The situation now is very competitive, which is great, but we haven't really had a chance to see what Apple can do with gaming if given a real chance. Again, I simply asked a question, which upset the anti-Apple brigade.

12 hours ago, AEthelraedUnraed said:

 

This is just wishful thinking on your part. If we're looking at trends, then surely Linux will become the new Windows since its market share has increased by 158% over the past decade, versus 63% for Apple. As much as I'd love to see that happen, I don't think we ever will.

Me too, but I gave up on Linux and its terrible power efficiency.

12 hours ago, AEthelraedUnraed said:

 

I grant you one thing though, namely that more and more games run natively on Mac (as well as Linux). But that has as much to do with many game engines (e.g. both Unreal and Unity) offering cross-platform support out-of-the-box as with anything else. And I think that's a good thing. I'd like to see everyone able to use his/her preferred PC, and that includes Mac. Don't get me wrong - I do think that Mac PCs are high-quality and give good performance, and for some tasks (especially anything artistic), they are very likely the best choice out there. But for many other use cases, there are other options that give better performance for a cheaper price, without all the other limitations of Apple (lack of customisation/extension options, vendor lock, little choice between models, comparatively difficult repairs).

These are fair points and I agree. I have been a PC user for decades, but I have been blown away by my MacBook's capabilities and qualities for writing, creative work, web browsing, light gaming, and web development. I would love to see games offered on every platform with good cross-compatibility. "Exclusives"--of which all tech companies are guilty--irk me. 

10 hours ago, Aapje said:

 

Did you actually watch that video? Because it clearly shows that the Mini is less than half as fast as their desktop in the game they tested. They built that desktop to almost the same price point as the Mini, so the Mini can't even beat a $600 PC, let alone a $2,000 PC. But I can build something that is actually usable for modern games, with much better components than that desktop, for $900: https://pcpartpicker.com/list/73bbkf

 

And that desktop has 32 GB of RAM instead of 16 GB for the Mini - and the Mini's is unified memory shared with the GPU, so it's even worse than it looks. The desktop I specced has 1 TB of storage instead of a puny 256 GB for the Mini. Is it still 2015, when 256 GB was a barely acceptable amount of storage? Apple has this trick where they offer a base model whose price looks enticing, but then you need to upgrade it to get to somewhat decent specs, and the cost balloons. Even the $999 model only has 24 GB of RAM and 512 GB of storage. Getting 32 GB of RAM and 1 TB of storage increases the price to $1399, so almost 50% more than my build.

I may have linked the wrong video (oops). Storage doesn't matter that much to me since I keep most of my files externally, but that's a fair point. IMO, the RAM thing is overrated, as in my experience even 8 GB of RAM has not been too limiting for web development, video editing, photo editing, and light gaming.

10 hours ago, Aapje said:

 

And my build supports RAM, GPU, CPU and storage upgrades with off-the-shelf components, so you can upgrade for a good price.

 

This has been the case for a long time with Apple. If you take the exact specs that Apple is willing to give you for their 'cheap' prices, then it doesn't look crazy next to a comparable system. But in reality, people often have their own demands and have to find a computer that matches them. Then you end up having to add these absurdly priced upgrades, or move up to a more expensive product category (even though you might only need to do that for one feature). With a PC, you can build affordable systems for a much larger variety of needs.

Fair enough. My point wasn't to tell everyone to go Apple, but to determine feasibility of IL-2 Korea being on Mac. I didn't mean to start a flame war.

10 hours ago, Aapje said:

 

 

These claims are just assumptions that show you've drunk the Kool-Aid. The end of x86 has been predicted for ages. For example, in the '90s, PowerPC was going to beat x86:

 

PowerPC-Future.jpg

 

In reality, PowerPC started off competitively, but then fell behind. In fact, Apple went from PowerPC to Intel, because Intel was so much ahead. With Apple's chips, we see that they had a banger with M1, but following generations had modest speed improvements. And history has shown that backwards compatibility is worth a whole lot to people, so there needs to be a major reason to migrate everything.

I have drunk the Kool-Aid, though it is not Apple's but my own, brewed from my experience with the M1 Air after owning many (frustrating but "premium") Windows gaming laptops beforehand: the poor battery life, fan noise, heat, poor-quality trackpads and keyboards, plastic builds, etc. drove me up the wall, so I bought a MacBook for work and a desktop for play (IL-2). I then sold the desktop and got a desktop replacement since even that was too much to move around. Now I once again wish I could have just one premium device that does it all (but I never want to go back to Windows). Windows laptops can't touch the efficiency and on-battery power of Apple devices. Thus, I would (selfishly) be in favor of IL-2 Korea coming to Apple, but it seems impossible due to the legacy code base.

10 hours ago, Aapje said:

 

Apple did very well to optimize their system for responsiveness, and for the use cases common to 'creative' professions, but their design is not optimized for gaming. And the Apple ecosystem is simply not able to cater to the large variety of needs that the PC ecosystem caters to.

 

354thFG_Drewm3i-VR
Posted

My intention in starting this thread was not to initiate a Mac vs. PC flame war. Each has its strengths and weaknesses, but it would be nice to have platform options with games like IL-2; if that isn't possible, I understand, but I figured it can't hurt to ask.

AEthelraedUnraed
Posted
7 hours ago, 356thFS_Drewm3i-VR said:

Choice? No, but there are many options for many budgets

Yeah, many higher budgets.

 

7 hours ago, 356thFS_Drewm3i-VR said:

Here is an M1 Air for $650: https://www.walmart.com/ip/Apple-MacBook-Air-13-3-inch-Laptop-Space-Gray-M1-Chip-8GB-RAM-256GB-storage/609040889?classType=VARIANT&athbdg=L1102&from=/search

Alright then. Not available here, but let's go with it. I presume the $650 is without sales tax, as is often the case in US web stores (I changed the delivery to Anchorage, AK, which doesn't charge sales tax, and the price didn't change). That means it's €754 converted to EUR and with the Dutch 21% sales tax. For the same price, you get this laptop: slightly better CPU, 2x better GPU with 4GB VRAM of its own, twice the RAM, twice the SSD space. Overall almost twice the bang for your buck.

 

7 hours ago, 356thFS_Drewm3i-VR said:

And the computer you posted above is a POS: you can't compare a plastic behemoth to an anodized aluminum unibody with a high-res glass screen! One device works and feels premium [...], the other creaks, is bulky, and is overall poor quality.

[...]

the Mac mini is TINY [...] Who wants a bulky desktop in 2025

[...]

To compare a cheap, plastic Chromebook to a premium-feeling and performing ($650) MacBook Air is very disingenuous.

Your point has now literally become that with Apple, you pay for design. I'm glad that we're finally in agreement.

 

Whether that's worth the premium price to you, well, to each his own I guess. I've got a Lenovo laptop with a cheap-looking plastic cover that looks as if it could spontaneously disintegrate when handled roughly - but most importantly it doesn't, and it has successfully protected my hardware from damage. Even after 5 years, it's still powerful enough to play any current game on mid settings. For me, those two things (a functional cover and good performance) are all that matter, and I don't give a rat's derrière about "premium feel" and "anodized aluminium unibodies." So I don't think comparing a Chromebook to a MacBook is disingenuous at all when a Chromebook would be good enough for most people's everyday home usage, at almost a quarter of the price of the MacBook you linked.

 

8 hours ago, 356thFS_Drewm3i-VR said:

Yes it's "old" in that sense, but the whole SoC design is very new and revolutionary.

You mean SoCs in general or specifically the M1? The SoC concept dates back to the '90s and comes with advantages - but also disadvantages, which is why it hasn't seen wider usage. If you mean the M1 specifically, then sure, it's a good design, but nothing revolutionary either, with the core design features introduced 8 years prior to the M1 and incorporated in Intel chips slightly earlier than in Apple's.

 

Anyhow, I think the whole "ARM vs x86" discussion is quite separate from "Apple vs others," really. As I linked in my previous post, there are Windows laptops on ARM chips while Apple in the past has had Intel chips. We don't know if Apple will remain on ARM forever or if Windows will remain mostly x86.

 

5 hours ago, 356thFS_Drewm3i-VR said:

My intention in starting this thread was not to initiate a Mac vs. PC flame war. Each has its strengths and weaknesses, but it would be nice to have platform options with games like IL-2; if that isn't possible, I understand, but I figured it can't hurt to ask.

In your opening post you very heavily implied that current non-Apple brands are at a relative standstill in performance. The opening shots were yours. If you didn't want to start a flame war (PS I think we've kept it civil so far), you could have kept your opinions about "how disappointing the latest GPUs from NVIDIA are likely to be" and how there's "not much innovation ongoing [...] within [...] Windows" in contrast to "how powerful [...] Apple's own ARM chips are quickly becoming" to yourself.

 

Anyhow, I think your main question should be answered by now: there's little chance IL2 ever comes to Mac.

Posted
8 hours ago, 356thFS_Drewm3i-VR said:

My intention in starting this thread was not to initiate a Mac vs. PC flame war. Each has its strengths and weaknesses, but it would be nice to have platform options with games like IL-2; if that isn't possible, I understand, but I figured it can't hurt to ask.

 

It isn't realistically possible. Sales would be poor, given the lack of flight simmers that have Macs, and the cost in money and labor would be too much for 1CGS to bear.

 

Even MSFS 2024 won't come to Macs, and their business case for doing so is substantially better than for IL-2 (but still poor).

  • Sad 1
354thFG_Drewm3i-VR
Posted
13 hours ago, AEthelraedUnraed said:

Your point has now literally become that with Apple, you pay for design. I'm glad that we're finally in agreement.

 

Whether that's worth the premium price to you, well, to each his own I guess. I've got a Lenovo laptop with a cheap-looking plastic cover that looks as if it could spontaneously disintegrate when handled roughly - but most importantly it doesn't, and it has successfully protected my hardware from damage. Even after 5 years, it's still powerful enough to play any current game on mid settings. For me, those two things (a functional cover and good performance) are all that matter, and I don't give a rat's derrière about "premium feel" and "anodized aluminium unibodies." So I don't think comparing a Chromebook to a MacBook is disingenuous at all when a Chromebook would be good enough for most people's everyday home usage, at almost a quarter of the price of the MacBook you linked.

 

You mean SoCs in general or specifically the M1? The SoC concept dates back to the '90s and comes with advantages - but also disadvantages, which is why it hasn't seen wider usage. If you mean the M1 specifically, then sure, it's a good design, but nothing revolutionary either, with the core design features introduced 8 years prior to the M1 and incorporated in Intel chips slightly earlier than in Apple's.

 

Anyhow, I think the whole "ARM vs x86" discussion is quite separate from "Apple vs others," really. As I linked in my previous post, there are Windows laptops on ARM chips while Apple in the past has had Intel chips. We don't know if Apple will remain on ARM forever or if Windows will remain mostly x86.

 

In your opening post you very heavily implied that current non-Apple brands are at a relative standstill in performance. The opening shots were yours. If you didn't want to start a flame war (PS I think we've kept it civil so far), you could have kept your opinions about "how disappointing the latest GPUs from NVIDIA are likely to be" and how there's "not much innovation ongoing [...] within [...] Windows" in contrast to "how powerful [...] Apple's own ARM chips are quickly becoming" to yourself.

 

Anyhow, I think your main question should be answered by now: there's little chance IL2 ever comes to Mac.

Again, these are not necessarily my opinions to be honest.

 

With Mac M1 and later I think users get:

 

-great performance

-great efficiency 

-great bang for buck

-great build/component quality

-a great, streamlined OS

-and great, bundled software

 

All for an entry price around $1,000. To me it makes sense to pay a bit more for all of those features and I always end up spending far more on gaming PCs anyway. Having one device to rule them all--even if premium--would be a cost savings for users like me who need a good professional PC and one for gaming.

 

And regarding AMD/Intel/Nvidia,

 

-Nvidia 4000 was great...three years later, 5000 looks to be a modest bump at great cost with a priority placed on AI (which for gamers, isn't very useful).

 -Intel 12th-14th gen were pretty good, but the chips now have a high failure rate; their new Core Ultra 200 chips perform worse in games than 14th gen.

-AMD is doing well but has stagnated a bit as of late, despite their X3D chips being the best Windows/x86 chips for gaming at present; AMD has also not challenged incumbent Nvidia in GPUs.

 

Really, the only ones that truly pushed the bar here were AMD and Apple in 2020/2021.

 

Anyway, good discussion and you make a lot of good counterpoints that I can appreciate.

 

 

Posted

In the end, with Korea in itself not being a universally acclaimed choice amongst flight sim players, I suppose 1C will closely monitor income from this new game once it's out (impossible to predict now) and adjust hardware compatibility development accordingly.

AEthelraedUnraed
Posted
10 hours ago, 356thFS_Drewm3i-VR said:

-great performance

-great efficiency 

-great bang for buck

-great build/component quality

-and a great, streamlined OS

-great, bundled software

As for "great build/component quality," I agree although one has to be careful not to conflate "great build quality" with "fancy design." Most high-end and business models of other manufacturers have good build quality, even if they have a less sleek design. Good build/component quality is not unique to Apple.

 

"A great, streamlined OS" and "great, bundled software" are personal opinions and can just as easily go the other way. Personally I'd rather not be forced to buy a whole software pack that I'll likely barely use. If I didn't get a free license through my employer, I wouldn't have MS Office installed and I certainly wouldn't have been happy if my PC costed extra money because it included bloatware like that.

 

Regarding "great performance" and "great bang for buck"; as shown multiple times above, you can get about twice the performance for the same buck if you choose other brands. Within their price class, Apple systems are shown to have sub-par performance.

 

11 hours ago, 356thFS_Drewm3i-VR said:

All for an entry price around $1,000. To me it makes sense to pay a bit more for all of those features

Of course, how much money you're prepared to spend on things is always a personal question. If you want a sleek looking laptop, you like macOS and the bundled software, you want good battery life and you're prepared to sacrifice a bit of performance for all that (or equivalently, spend a bit more), then sure, buy a Mac, who am I to judge?

 

11 hours ago, 356thFS_Drewm3i-VR said:

Having one device to rule them all--even if premium--would be a cost savings for users like me who need a good professional PC and one for gaming.

Absolutely, and the cost savings can be even greater if this one device is not a Mac. ;)

 

11 hours ago, 356thFS_Drewm3i-VR said:

5000 looks to be a modest bump at great cost with a priority placed on AI (which for gamers, isn't very useful).

I assume that with "priority placed on AI" you mean the increased number of CUDA cores? It's true that this is great if you want to train neural nets, but it's still useful for gaming, where things like physics can be run on them. They also allow for better/faster frame gen and upscaling algorithms such as DLSS 4 with multi-frame generation, which mean that on some games, the €654 5070 performs better than the €1959 4090. So I don't entirely agree that focusing on AI is necessarily less useful for games than adding raw horsepower; depending on what games you play, AI can give a huge improvement in performance.

 

1 hour ago, Art-J said:

In the end, with Korea in itself not being a universally acclaimed choice amongst flight sim players, I suppose 1C will closely monitor income from this new game once it's out (impossible to predict now) and adjust hardware compatibility development accordingly.

Frankly I don't think that how well Korea performs has anything to do with it. Even if Korea sells excellently, the costs of porting IL2 to Mac are too great while the market is too small.

Posted (edited)
7 hours ago, Art-J said:

In the end, with Korea in itself not being a universally acclaimed choice amongst flight sim players, I suppose 1C will closely monitor income from this new game once it's out (impossible to predict now) and adjust hardware compatibility development accordingly.

 

A poorly performing module would only potentially have an impact on the choice of future modules. Why would it affect a porting decision?

 

6 hours ago, AEthelraedUnraed said:

They also allow for better/faster frame gen and upscaling algorithms such as DLSS 4 with multi-frame generation, which mean that on some games, the €654 5070 performs better than the €1959 4090.

 

Please don't spread misinformation like that. Frame generation is a smoothing technique that results in less abrupt transitions between frames. However, it doesn't improve the responsiveness of the game, which only happens when the number of real rendered frames goes up. So the generated frames are not equal in quality to the rendered frames, also because of the artifacts that the generated frames have.

 

There is a reason why Nvidia chose MSFS as their showcase for the technology, because in that game, people tend to make very slow movements, and often move their head slowly, admiring the view. In a combat flight sim, responsiveness matters a lot more, so you don't still see the opponent flying straight, while they are actually already turning.

Edited by Aapje
AEthelraedUnraed
Posted (edited)
5 hours ago, Aapje said:

Please don't spread misinformation like that. Frame generation is a smoothing technique that results in less abrupt transitions between frames. However, it doesn't improve the responsiveness of the game, which only happens when the number of real rendered frames goes up. So the generated frames are not equal in quality to the rendered frames, also because of the artifacts that the generated frames have.

 

There is a reason why Nvidia chose MSFS as their showcase for the technology, because in that game, people tend to make very slow movements, and often move their head slowly, admiring the view. In a combat flight sim, responsiveness matters a lot more, so you don't still see the opponent flying straight, while they are actually already turning.

There is no misinformation. Yes, frame generation produces less abrupt frame transitions (I'm against the word "smoothing" because that's not what it is, but let's not get too semantic) rather than "actual" frames, but in most (almost all?) games, input is handled in a separate loop from the graphics, and usually on the CPU rather than the GPU. So yes, it doesn't improve responsiveness, but neither does it do anything to reduce it, nor is it something that would generally be affected by a GPU upgrade at all. If you mean visual responsiveness, then yes, it reduces that a bit (by around 50ms) - but most importantly with little added latency over earlier DLSS versions. So if you already use older DLSS frame generation, this is an almost-free doubling in framerate.
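To illustrate what I mean by input being handled separately, here is a minimal sketch (plain Python, with made-up callback names - not any particular engine's code) of the usual fixed-timestep structure:

import time

TICK = 1.0 / 120.0  # fixed input/simulation rate, independent of what the GPU displays

def game_loop(poll_input, update_sim, render_frame):
    # Classic fixed-timestep loop: input and game state advance every TICK on the CPU,
    # while rendering (and any GPU-side frame generation) runs as fast as it can.
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= TICK:
            update_sim(poll_input(), TICK)  # responsiveness is governed here...
            accumulator -= TICK
        render_frame()  # ...while interpolated frames only change how often this picture refreshes

Generated frames are inserted on the presentation side of that last call, which is why they never touch the input/simulation loop.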

 

I'm not sure why you think MSFS is the official showcase for their new DLSS 4 multi-frame generation; in the official DLSS 4 comparison it's Cyberpunk, as is the case in their official "sales pitch" and in most other DLSS 4 videos I've seen. In fact, I haven't seen any MSFS videos at all yet. If you know of any, please share them because I'd be interested, but MSFS is definitely not the "showcase for the technology".

 

Please also note that I explicitly said "on some games". Not "on most" games, not even "on many games." Certainly not "on combat flight simulator games." DLSS 4 apparently takes some developer effort to implement, which not every developer will be able to do. Furthermore, frame generation is not equally suitable to every type of game, as you correctly observed. I agree with you that I expect mediocre results at best in a combat flight sim, for not only the reasons you cite but also the fact that there are many entirely different rendering cases that are all equally important - correctly showing a 1 pixel aircraft at 20km distance is every bit as important as the "eye candy" up close.

 

But anyhow, everything I've read and seen so far points to DLSS 4 being a significant improvement over 3, with less ghosting, shimmering and other artifacts, at very little increased latency and reduced VRAM usage. I don't expect it to work perfectly and I don't expect it to work in all cases, but I have never claimed that. What I did claim - and I stand by it - is that for some games, DLSS 4 brings such improvements that a new-gen card can in effect outperform a more expensive older-gen card (actually, it was a paraphrased citation from the linked article rather than something I claimed myself, but I do stand by it). Especially in the context of the statement I was responding to - that NVidia is at a relative standstill because they focus on AI which for gamers "isn't very useful" - I think a 4x FPS increase on modern games is enough to warrant my belief that modern AI-based technologies can in fact be very useful for gamers.

Edited by AEthelraedUnraed
Posted (edited)

The responsiveness depends on the number of real frames. So if you run at 30 FPS of rendered frames, and then use DLSS 4 to increase that to 90 frames, you will still have the responsiveness of 30 FPS. With 30 FPS of real frames, it takes 33 milliseconds for a change in the game world to be reflected on the screen, while it takes 11 milliseconds when 90 real frames are rendered per second.
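To make that arithmetic explicit, a minimal sketch (plain Python, same numbers as above):

def intervals_ms(rendered_fps, displayed_fps):
    # interval between new game states vs. interval between frames shown on screen
    return 1000.0 / rendered_fps, 1000.0 / displayed_fps

# 30 FPS rendered, frame generation fills it out to 90 FPS displayed
state_ms, shown_ms = intervals_ms(30, 90)
print(f"new game state every ~{state_ms:.0f} ms, new frame shown every ~{shown_ms:.0f} ms")  # ~33 ms, ~11 ms

# 90 FPS actually rendered: both intervals drop to ~11 ms
state_ms, shown_ms = intervals_ms(90, 90)
print(f"new game state every ~{state_ms:.0f} ms, new frame shown every ~{shown_ms:.0f} ms")  # ~11 ms, ~11 ms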

 

So you can't just do what Nvidia did, and treat that 90 FPS (on a 5070), where 60 are interpolated, as being the same as 90 FPS of rendered frames on a 4090.

 

And Nvidia definitely did present DLSS 3 with MSFS. DLSS 4 indeed claims to improve it, and perhaps they selected Cyberpunk because the artifacting of DLSS 3 was quite bad there. But for DLSS 3 they weren't making artifacting comparisons, but comparing frame gen to no frame gen, so that is where they needed to minimize the impact of the extra latency, to present frame generation in as good a light as possible.

 

Quote

is that for some games, DLSS 4 brings such improvements that a new-gen card can in effect outperform a more expensive older-gen card.

 

No, it can't. Because even though the FPS counter may show a similar or even higher number, the DLSS 4-generated frames are not the same quality as the rendered frames.

 

It's like arguing that there is no difference between sitting in business class in a plane versus sitting in economy, because in both cases you get to your destination in the same time. But that ignores that the experience is better in business class.

Edited by Aapje
AEthelraedUnraed
Posted (edited)
10 hours ago, Aapje said:

The responsiveness depends on the number of real frames. So if you run at 30 FPS of rendered frames, and then use DLSS 4 to increase that to 90 frames, you will still have the responsiveness of 30 FPS. With 30 FPS of real frames, it takes 33 milliseconds for a change in the game world to be reflected on the screen, while it takes 11 milliseconds when 90 real frames are rendered per second.

I have never claimed otherwise, although I would call this "latency" rather than "responsiveness." The real calculations are a bit more involved than yours since there's also some overhead introduced by the algorithm. I based my 50ms on this video, but I forgot to subtract the latency of a "real" frame from it. So an actual latency increase of ~40ms might be a better number.

 

10 hours ago, Aapje said:

And Nvidia definitely did present DLSS 3 with MSFS.

We're talking about the 5000 series and DLSS 4 here. The 4000 series and DLSS 3 have nothing to do with it.

 

Although MSFS is an excellent example of a case where latency doesn't really matter. If you're flying Cessnas low-level over Paris, a 40ms extra latency likely won't bother you while the increase in image quality can make a huge difference.

 

10 hours ago, Aapje said:

No, it can't. Because even though the FPS counter may show a similar or even higher number, the DLSS 4-generated frames are not the same quality as the rendered frames.

I guess that depends on your definition of "quality". They are not ground-truth, no. But if you cannot see the difference as a human, then I would argue that they are essentially the same quality. To quote from the review I linked: "I honestly couldn’t tell the difference [...] The game was smooth and responsive, and I couldn’t see any notable glitches. [The 5070] can genuinely offer a 4090-level experience in a game that’s well optimized for it." Go argue with the reviewer if you don't agree, but if you can't tell the difference, then in my book it's the same performance.

 

10 hours ago, Aapje said:

It's like arguing that there is no difference between sitting in business class in a plane versus sitting in economy, because in both cases you get to your destination in the same time. But that ignores that the experience is better in business class.

I don't think this analogy holds. A better analogy is JPEG vs PNG, when applied to photographs. JPEG is lossy while PNG is lossless. Still, for real-world photographs, the difference is so small that you usually cannot see the difference while the reduction in file size is dramatic. JPEG doesn't work well in all cases (e.g. text or logos), as DLSS 4 will likely also not work well in all cases. But in the cases where it does work well, the performance of JPEG is such that it essentially gives you free size reduction.

 

Anyhow, whether or not you consider it a degradation of performance that part of your frames is generated is beside the point. The point here is that recent AI-related developments in the 5000 series can in fact be useful for gamers.

Edited by AEthelraedUnraed
Posted (edited)

 

5 hours ago, AEthelraedUnraed said:

To quote from the review I linked: "I honestly couldn’t tell the difference [...]

 

PC Gamer has zero credibility with me, though. They are known to be trash. For example, here he bases his conclusion on a few minutes of gameplay, on a demo set up to show the absolute best-case scenario. A proper test requires tuning the game for one's preferences/needs, not to come out best in a demo. We now have no clue whether the differences will be amplified when using other settings, but it makes a major difference to your claim whether the 5070 can actually keep up for all reasonable settings or merely in a cherry-picked demo.

 

Furthermore, properly testing more subtle latency differences, which can still have a significant effect on PvP competitiveness, tends to require actual skill and sufficient time. People differ, and if Ben Hardwidge from PC Gamer doesn't notice a difference, that absolutely doesn't mean that others won't. So it's better to actually measure it. Nowhere in his reporting does he write down any of these caveats. It's just typical trash gamer reporting, which has been an issue in gaming journalism for a very long time.

 

Besides, your claim wasn't that some people won't notice a difference, but that performance of the 5070 would be better in some cases. If performance is worse when you measure it objectively, but some people don't notice that, then the performance is still objectively worse.

 

Quote

A better analogy is JPEG vs PNG, when applied to photographs. JPEG is lossy while PNG is lossless. Still, for real-world photographs, the difference is so small that you usually cannot see the difference while the reduction in file size is dramatic.

 

A lot of people do notice the artifacts of frame generation though. When a 5070 is running 210 FPS with artifacts and increased latency, while the 4090 is running 200 FPS without artifacts and less latency, then the 5070 is faster than the 4090 in the same way that I'm faster than the fastest marathon runners if I get to use a car and they have to go on foot.

 

Quote

The point here is that recent AI-related developments in the 5000 series can in fact be useful for gamers.

 

That is a completely different claim than the one I objected to, though. However, even then you should really caveat it with 'for some gamers,' because there are plenty of people who do not consider it useful, and who refuse to use these features.

Edited by Aapje
AEthelraedUnraed
Posted (edited)
2 hours ago, Aapje said:

PC Gamer has zero credibility with me, though. They are known to be trash. For example, here he bases his conclusion on a few minutes of gameplay, on a demo set up to show the absolute best-case scenario. A proper test requires tuning the game for one's preferences/needs, not to come out best in a demo. We now have no clue whether the differences will be amplified when using other settings, but it makes a major difference to your claim whether the 5070 can actually keep up for all reasonable settings or merely in a cherry-picked demo.

 

Furthermore, properly testing more subtle latency differences, which can still have a significant effect on PvP competitiveness, tends to require actual skill and sufficient time. People differ, and if Ben Hardwidge from PC Gamer doesn't notice a difference, that absolutely doesn't mean that others won't. So it's better to actually measure it. Nowhere in his reporting does he write down any of these caveats. It's just typical trash gamer reporting, which has been an issue in gaming journalism for a very long time.

Again, I never claimed it works "for all reasonable settings." I very explicitly said multiple times already that it most likely doesn't, so why do you keep arguing as if I did? In fact, for my claim to hold, it would be enough if one single cherry-picked game showed increased performance.

 

2 hours ago, Aapje said:

Besides, your claim wasn't that some people won't notice a difference, but that performance of the 5070 would be better in some cases. If performance is worse when you measure it objectively, but some people don't notice that, then the performance is still objectively worse.

Again, this boils down to definitions of the word "performance." I object to your usage of performance as "being able to generate pixel-perfect copies only." In fact, all rendering techniques have artifacts of their own, so saying that an algorithm doesn't perform well because it isn't absolutely identical to a rendered frame is a bit of the pot calling the kettle black.

 

Also, please tell me how you would objectively measure whether artifacts are noticeable. By definition, noticing something is subjective, so if someone doesn't notice any artifacts and we assume he isn't lying, then we have to take that at face value. You can ask more people or test more scenarios, but the outcome remains subjective.

 

2 hours ago, Aapje said:

A lot of people do notice the artifacts of frame generation though. When a 5070 is running 210 FPS with artifacts and increased latency, while the 4090 is running 200 FPS without artifacts and less latency, then the 5070 is faster than the 4090 in the same way that I'm faster than the fastest marathon runners if I get to use a car and they have to go on foot.

And I argue that if your goal is not to win a marathon but to get to your destination faster, then taking a car is indeed the much better option.

 

You're free to use whatever definitions of "performance" you like - but what I claimed was done so under my definitions.

 

2 hours ago, Aapje said:

However, even then you should really caveat it with 'for some gamers,' because there are plenty of people who do not consider it useful, and who refuse to use these features.

That is exactly my whole point. The claim I was responding to is that AI tech isn't very useful for gamers. That claim is too strong, because there are in fact many gamers who consider it useful. If the original claim had been "5000 looks to be a modest bump at great cost with a priority placed on AI (which for some gamers, isn't very useful)," I wouldn't have bothered to reply.

Edited by AEthelraedUnraed
354thFG_Drewm3i-VR
Posted
7 hours ago, AEthelraedUnraed said:

That is exactly my whole point. The claim I was responding to is that AI tech isn't very useful for gamers. That claim is too strong, because there are in fact many gamers who consider it useful. If the original claim had been "5000 looks to be a modest bump at great cost with a priority placed on AI (which for some gamers, isn't very useful)," I wouldn't have bothered to reply.

Most of the AI tech I was referring to inside the 5000 series had nothing to do with CUDA cores, but with the AI-specific sections (Tensor + Ray Tracing cores) dedicated to running, training, and developing neural networks. These, I still maintain, have little practical value to gamers (and often little to creative professionals), but seem to be geared to exploiting the AI enterprise/business bubble. Thus, the CUDA core counts on the 5000 series are higher, but not as high as one would expect after three years on a smaller TSMC node.

 

See here:

 

Nvidia cares much more about AI than gaming these days, which is why they're designing cards with minimal gaming gains but exponential AI-application gains. And they can do this, because they have zero true competition (just like Intel of old). They are simply paying lip service to gamers, with absurd prices ($2,000 for a 5090 FE lol) to boot.

Posted
17 hours ago, AEthelraedUnraed said:

Again, I never claimed it works "for all reasonable settings."

 

You claimed that "on some games, the €654 5070 performs better than the €1959 4090," without any caveats that even for those games it probably only works in specific situations. By leaving out those caveats, you made a misleading statement, in my opinion.

 

Quote

I very explicitly said multiple times already that it most likely doesn't, so why do you keep arguing as if I did?

 

I went back to see if I'm crazy, and I absolutely do not see you argue this.

 

Quote

 The claim I was responding to is that AI tech isn't very useful for gamers. That claim is too strong, because there are in fact many gamers who consider it useful.

 

The most useful application of AI tech in GPUs is temporal upscaling. I think that you made a bad choice to point to frame generation. I think that calling something 'very useful' is a pretty high bar, and I don't think that claiming that it works well in a few games, under some conditions, clears that bar for me. I'd call it somewhat useful, or not totally useless, if I'm a little less generous.

 

Anyway, we have this review of DLSS 4:

 

 

 

As expected, it's essentially more of the same, simply increasing the issues and benefits that exist with 2x frame gen. So: more latency and more visible artifacts. And the issue remains that it works much better when going from high frame rates to even higher frame rates, and that is even more the case when doing 3x or 4x frame gen. That means that it works best for a specific set of circumstances that is probably relatively rare:

  • A very high refresh rate monitor (240+ rather than 144 or below)
  • A game with relatively static content
  • A game (or game settings) that is so undemanding relative to the power of the GPU that you can already produce a fairly high frame rate before you enable frame gen

That last one is the real killer, because it means that the tech makes your GPU age faster, rather than making it last longer. Because once you cannot maintain a high enough frame rate of real frames, or if you have to lower the quality settings so much that you have high FPS but very ugly visuals, you are probably better off disabling the tech to minimize latency and/or to improve the graphics settings. It also means that the lower-end the card is, the less useful the tech is, because you will encounter fewer games where you can use it without making things worse.
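To put rough numbers on that last point, here's a minimal back-of-the-envelope Python sketch. The values are illustrative assumptions (interpolated frame gen holding back roughly one real frame, no other overhead counted), not measurements from any real card:

```python
# Back-of-the-envelope sketch of why frame generation favours an already-high
# base frame rate. Assumptions (illustrative only):
#  - interpolation has to hold back one real frame, so it adds roughly one
#    real frame time of latency (other overhead ignored)
#  - displayed FPS is roughly base FPS times the frame-gen multiplier

def frame_gen_estimate(base_fps: float, multiplier: int) -> tuple[float, float]:
    """Return (displayed_fps, added_latency_ms) under the assumptions above."""
    real_frame_time_ms = 1000.0 / base_fps
    displayed_fps = base_fps * multiplier
    added_latency_ms = real_frame_time_ms  # one real frame held back
    return displayed_fps, added_latency_ms

for base_fps in (30, 60, 120):
    for multiplier in (2, 4):
        shown, added = frame_gen_estimate(base_fps, multiplier)
        print(f"base {base_fps:>3} FPS, {multiplier}x frame gen: "
              f"~{shown:>3.0f} FPS shown, ~{added:4.1f} ms extra latency")
```

At 120 real FPS the extra latency is only about 8 ms; at 30 real FPS it's about 33 ms on top of an already sluggish game, which is exactly why the tech helps least where you would want it most.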

 

 

AEthelraedUnraed
Posted
10 hours ago, 356thFS_Drewm3i-VR said:

Most of the AI tech I was referring to inside the 5000 series had nothing to do with CUDA cores, but with the AI-specific sections (Tensor + Ray Tracing cores) dedicated to running, training, and developing neural networks. These, I still maintain, have little practical value to gamers (and often little to creative professionals), but seem to be geared to exploiting the AI enterprise/business bubble. Thus, the CUDA core counts on the 5000 series are higher, but not as high as one would expect after three years on a smaller TSMC node.

 

See here:

 

Nvidia cares much more about AI than gaming these days, which is why they're designing cards with minimal gaming gains but exponential AI-application gains. And they can do this, because they have zero true competition (just like Intel of old). They are simply paying lip service to gamers, with absurd prices ($2,000 for a 5090 FE lol) to boot.

You do know that the increased Tensor Core count is the reason they changed the DLSS architecture from a CNN to a Transformer, right? DLSS 4 is supposedly mostly run on the Tensor Cores. Sure, Tensor Cores are perhaps more useful for training NNs than for gaming, but your claim is that they are "of little practical value to gamers," and that is a much stronger claim that I do not agree with.

 

And the ray tracing cores you mention have relatively little value for machine learning; they're mainly useful for, lo and behold, ray tracing, as done in games.
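If you want a feel for what the Tensor Cores actually buy a transformer-style upscaler, here's a small PyTorch sketch (an assumption-laden illustration, not anything from NVIDIA) that times the same large matrix multiplication in FP32 and FP16. On recent NVIDIA GPUs the half-precision path is typically routed through the Tensor Cores, which is what makes running a transformer inside a per-frame budget plausible at all; the exact speedup will vary per card.

```python
# Sketch: time the same matmul in FP32 vs FP16 on the GPU. On recent NVIDIA
# cards the FP16 path is typically executed on the Tensor Cores.
import time
import torch

if not torch.cuda.is_available():
    raise SystemExit("No CUDA device found; this sketch needs an NVIDIA GPU.")

device = torch.device("cuda")
N = 4096  # large enough that the GPU is actually kept busy

def time_matmul(dtype: torch.dtype, iters: int = 50) -> float:
    a = torch.randn(N, N, device=device, dtype=dtype)
    b = torch.randn(N, N, device=device, dtype=dtype)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        _ = a @ b
    torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

fp32_ms = time_matmul(torch.float32) * 1000
fp16_ms = time_matmul(torch.float16) * 1000
print(f"FP32 matmul: {fp32_ms:.2f} ms")
print(f"FP16 matmul: {fp16_ms:.2f} ms (typically the Tensor Core path)")
```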

 

4 minutes ago, Aapje said:

You claimed that "on some games, the €654 5070 performs better than the €1959 4090," without any caveats that even for those games it probably only works in specific situations. By leaving out those caveats, you made a misleading statement, in my opinion.

I'm not writing a scientific paper here; I'm writing a single sentence in a forum post. Would it have been better if I had phrased it "some situations" rather than "some games"? In hindsight, yes. In the games where it does work well, there are probably situations where it doesn't, and in games where it doesn't work well, there are probably situations where it does. But that doesn't make my post misleading. In the context of a forum post, I think it goes without saying that many ifs and buts are left out, as they are not important to get the gist of what I'm saying.

 

1 hour ago, Aapje said:

The most useful application of AI tech in GPUs is temporal upscaling. I think that you made a bad choice to point to frame generation. I think that calling something 'very useful' is a pretty high bar, and I don't think that claiming that it works well in a few games, under some conditions, clears that bar for me. I'd call it somewhat useful, or not totally useless, if I'm a little less generous.

Then talk to 356thFS_Drewm3i-VR if you don't agree with the "very useful" phrasing; it's his, not mine - except for a single sentence in direct response to his phrasing, where I said that it "can in fact be very useful for gamers." Can, not "is," and certainly not "is in all cases."

 

And again, we have entirely different views of what is useful. I'm in the camp that would turn DLSS frame gen on; you're someone who would leave it off. And that's fine. It's a personal preference, and it doesn't make either of us wrong. But the fact is that there are many gamers who do consider frame gen very useful, and that counters the original claim, which was the purpose of that sentence.

 

1 hour ago, Aapje said:

Anyway, we have this review of DLSS 4:

 

 

 

As expected, it's essentially more of the same, simply increasing the issues and benefits that exist with 2x frame gen. So: more latency and more visible artifacts. And the issue remains that it works much better when going from high frame rates to even higher frame rates, and that is even more the case when doing 3x or 4x frame gen. That means that it works best for a specific set of circumstances that is probably relatively rare:

  • A very high refresh rate monitor (240+ rather than 144 or below)
  • A game with relatively static content
  • A game (or game settings) that is so undemanding relative to the power of the GPU that you can already produce a fairly high frame rate before you enable frame gen

That last one is the real killer, because it means that the tech makes your GPU age faster, rather than making it last longer. Because once you cannot maintain a high enough frame rate of real frames, or if you have to lower the quality settings so much that you have high FPS but very ugly visuals, you are probably better off disabling the tech to minimize latency and/or to improve the graphics settings. It also means that the lower-end the card is, the less useful the tech is, because you will encounter fewer games where you can use it without making things worse.

 

 

He has the same opinions about quality and performance as you have. I could completely live with the artifacts he shows in the 30-to-120-FPS parts, and I think you usually have to look for them to see them. Perhaps you have better eyes than I do, I don't know, but the fact remains that I think the quality loss is very acceptable. As for the latency, I don't play many games where the latency would be an issue, and for the few where it is, I'd just leave frame gen off. Much of this "specific set of circumstances" is nothing more than applying your own subjective values to the rest of the world and expecting that everyone agrees.

 

So again for the umpteenth time, this is just conflicting definitions and opinions. You can argue until the cows come home that your definition of "performance" is better than mine, but it isn't. Multi-frame gen, like all technologies, has its advantages and drawbacks, and whether or not one outweighs the other is personal.

Zooropa_Fly
Posted

I bet you lot are a great laugh at a party.

  • Haha 3
  • Upvote 1
AEthelraedUnraed
Posted
8 hours ago, Zooropa_Fly said:

I bet you lot are a great laugh at a party.

Lol, yeah the whole thing must seem a bit absurd, doesn't it? :P

 

For what it's worth, I do think that @Aapje is obviously very knowledgeable about technology, and I harbour no ill feelings towards him. The cases where we have lengthy discussions (it's not the first time...) usually come down to nitpicking over some small detail, where we basically agree about the main things but are in opposing camps about something else, and the "truth," if it exists, is highly dependent on what you're trying to accomplish and what prior assumptions you make.

 

I think that's the case here, and I think this discussion has run its course. As my last posts have largely consisted of repeating myself, I won't bother to write another one on this subject.

Zooropa_Fly
Posted (edited)

No no - you chaps keep at it if you're not finished.

I was trying to keep up, but lost the plot a bit.

Maybe I'll start again at the top :biggrin:

Edited by Zooropa_Fly
  • Haha 2
Lusekofte
Posted
On 1/23/2025 at 11:25 AM, Art-J said:

In the end, with Korea in itself not being a universally acclaimed choice amongst flight sim players, I suppose 1C will closely monitor income from this new game once it's out (impossible to predict now) and adjust hardware compatibility development accordingly.

I had expected a lot more talk about it on social platforms from DCS players; I guess people need to advertise this product more. I am simply not interested yet, but I might be, because I think this will be a good one.

  • 3 months later...
Chidorin
Posted

+1, with visionOS support, so that by the time it's widely accessible everything works perfectly and there are devs familiar with the Mac's specifics.

  • 1 month later...
S10JlAbraxis
Posted (edited)

Months later - interesting discussion. I custom-build my own PCs specifically for gaming, with the highest-quality components, and am constantly updating since I enjoy it. Until Macs drop at least 50% in price, fully support all Windows games, and can be custom-built from parts, there is no way I will ever own one, even though they have some very nice selling points. Mac will never be a good choice for gaming.

Edited by S10JlAbraxis
  • Upvote 1
Dragon1-1
Posted

Apple is phasing out Rosetta in 2027. That tells you all you need to know about the chances of anything not explicitly multiplatform from the start (usually meaning one of the recent mainstream engines like UE5) coming to Mac in the future.

SOLIDKREATE
Posted (edited)

The most expensive Mac would get 'Seal Clubbed' by my last-gen i9 14900KF, 48GB 6600MHz RAM, and RTX 4070 Ti Super. Now don't get me wrong, they look amazing and are very, very well constructed; they just refuse to support anything Windows-related. I would, however, love to have the iPad Pro with the Apple Pencil for making certain parts of my skins (.pngs for certain layers in the .psd). I have a KAMVAS 24in pen display, but I can't sit on the couch and draw with that.

 

 

"The Other Woman" - My Wife calls it that 😁

 

PXL-20250713-044905628-MP.jpg

Edited by SOLIDKREATE
