Make sure you do your research on your target region! For example, most players in the US are playing on shitty laptops. It comes as a surprise to many because of echo chambers like r/battlestations, but that's what polling tends to show. There'll always be a market for next-gen hardware games, but there will likewise always be a market for games that just run well on shitty laptops, because it turns out most people can't afford, or don't have space for, a nice desktop.
Conspiracy answer?
Because somebody's gotta keep the graphics card companies in business.
Me? I've found that now that I'm pushing 40 and can finally afford whatever kind of hardware I could ever want, I just don't care about realistic graphics that much. More often than not they just make the game unplayable for me, and it's got nothing to do with FPS. Grandpa can't keep up with billions of particles and scenes where the contrast between shadows and highlights makes the screen look like analogue TV static. Diablo II looked playable. The Diablo II remake looks like a goddamn migraine nightmare.
Just do what Nintendo does and optimize your game so it can run on a weak handheld console. There are just too many AAA games with bad optimization because of $$
I understand what you're saying, but the Switch regularly hits its limit with Zelda: Tears of the Kingdom, and you can feel it when the FPS drops every couple of minutes, despite it being a 30 FPS game (which is unacceptable nowadays). Optimization can be a game changer, but you can only optimize to a certain point.
My view on this is that it's normal. Game requirements go up over time; the only difference is how fast the process goes.
In particular, it's fastest roughly 2-3 years after a new console generation is released. At first, games still target the previous generation, so the jump isn't that big. Then dedicated titles show up and suddenly PC gamers go "oh noes, our requirements".
PS5 is fast: 16GB memory, roughly 5GB/s bursts from the SSD, 10 Tflops of compute power. I think a close equivalent in a PC would be 3700X + 16GB RAM + 6700XT + NVMe SSD. That's a lot more than most people have at home.
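Quick back-of-the-envelope for that 10 Tflops figure. The shader layout numbers here (36 RDNA 2 compute units, 64 FP32 ALUs per CU, 2.23 GHz boost clock) aren't in the comment above; they're the publicly quoted PS5 GPU specs, assumed for illustration:

```python
# Peak FP32 throughput = CUs x ALUs per CU x 2 ops/clock (FMA) x clock.
# Layout assumed from Sony's published PS5 GPU specs.
def peak_tflops(compute_units, alus_per_cu, ops_per_clock, clock_ghz):
    # Giga-ops/s divided by 1000 gives teraflops
    return compute_units * alus_per_cu * ops_per_clock * clock_ghz / 1000

ps5 = peak_tflops(36, 64, 2, 2.23)
print(f"PS5 peak: {ps5:.2f} TFLOPS")  # ~10.28, matching the "10 Tflops" claim
```

Same formula works for any GPU if you plug in its CU/SM count and clock, which is why spec-sheet TFLOPS comparisons are easy but only tell part of the story.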
Compare that to the PS4, which ran on a CPU that was beaten by a 4th-gen i3 and a GPU comparable to a GTX 1050. Of course those ports ran smoothly, since you could in theory beat a PS4 with an office computer by the end of its lifecycle.
But people WERE complaining about requirements every single time, after every console release: PS2, Xbox 360, PS3, PS4. It was less noticeable in the PS2 era, obviously, since hardware performance literally doubled every 18 months. But now that progress is much slower, it will take longer before PCs catch up to consoles on average.
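To see how brutal "doubled every 18 months" is over a console generation, here's the compounding spelled out (the ~6-year generation length is my assumption for illustration):

```python
# Compound growth: performance multiplies by 2 every doubling period.
def perf_multiplier(years, doubling_period_years=1.5):
    return 2 ** (years / doubling_period_years)

print(perf_multiplier(1.5))  # 2.0x, by definition
print(perf_multiplier(6))    # 16.0x over an assumed ~6-year generation
```

At that pace a mid-generation PC trivially lapped the console; at today's slower pace the same gap takes far longer to open up.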
It will stop being a problem in 2 years' time again, unless Sony makes a mid-generation refresh and shoves a 7900 XTX equivalent inside a PS5 Pro (at which point you can bet ultra settings in the latest games at 4K will ask you for 20GB of VRAM).
And then it will become a problem again in about 5 years as the PS6 comes out and dedicated games for it become a thing.
> I think a close equivalent in a PC would be 3700X + 16GB RAM + 6700XT + NVMe SSD.
According to tests done by Digital Foundry, the PS5 has CPU performance comparable to a Ryzen 3600 and GPU performance comparable to an RTX 2070 Super (which in current-gen terms means RTX 3060 / RX 6600 XT ballpark). Those were tests done with actual released games running in real time with an FPS counter, not guesses based on specs.
Compared to a 3700X, the PS5 has lower clocks (3.5 GHz vs 4.2 GHz), much less cache (8 MB vs 32 MB), and uses memory with worse latency for CPU work (GDDR6 vs DDR4), further reducing CPU performance.
Also, the SSD itself is unremarkable on the PS5; what makes it good is the middleware stack around it (a more efficient API plus hardware-accelerated decompression), which is why PC needs DirectStorage to get the same functionality. But hardware-wise, with DirectStorage, any non-bottom-of-the-barrel NVMe SSD will match or outperform the PS5's.
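The decompression point is easy to quantify: effective load speed is raw read speed times the compression ratio, and the hardware decompressor keeps that multiplication off the CPU. The ~5.5 GB/s raw figure is Sony's published spec; the 2:1 ratio below is a rough assumption, not a measured number:

```python
# Effective streaming throughput when assets are stored compressed and
# decompressed on the fly. Ratio of 2:1 is an illustrative assumption.
def effective_gbps(raw_read_gbps, compression_ratio):
    return raw_read_gbps * compression_ratio

print(effective_gbps(5.5, 2.0))  # 11.0 GB/s if everything compressed 2:1
```

The same math is why a fast PC NVMe drive plus DirectStorage's GPU decompression can land in the same ballpark or beyond.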
What hardware requirements do you consider "too high"?
I have seen complaints recently about games listing an RTX 20xx graphics card in their recommended hardware requirements. Those cards came out *5 years ago*.
It has been like that like, forever?
But I feel it actually slowed down between 2010 and 2020 and is getting faster again. In the 2000s, hardware requirements grew much faster. At least that's how I remember it.
I played Wing Commander 2 with the speech expansion in VGA. And thanks to my Am286 at 16 MHz it was blazing fast. But a classmate had to play in EGA mode with no sound because of low memory and hard disk space. Most classmates, though, never got to play Wing Commander 2 on their Commodore 64 or NES. The SNES version was such a mess. But those systems had cool new games too.
I mean, it's kind of a natural progression. When graphics technology gets better and more processes run in the game simultaneously, of course it's going to need an increase in hardware requirements. It's not like companies raise requirements just to raise them, most of the time. On the flip side, I haven't personally seen a big hike in requirements across the gaming industry. My friend has a 1080 Ti, an i7 something-or-other, and 32GB of RAM, yet he's able to keep up with my PC extremely well. Even Call of Duty MW2 (2022) recommends only a 1060 and an i5-6600. The only game I can think of where the latest hardware is required to play smoothly is Star Citizen, but that game is so poorly optimized that it's unfair to use it as a proper example.
Increasing hardware requirements has been the trend for the 30-ish years I've been paying attention.
There are WAY more games coming out every week that run with low hardware requirements.
I guess I'll just continue playing windows 95 games.
Oh HELL yea! https://archive.org/details/300_Arcade_Games_COSMI_2000