Controversial counterpoint: Having standardised hardware causes optimisation.
What do I mean?
In game development, people often argue that game consoles hold back PC games. That is true up to a point: more time goes into optimisation at the cost of features, but optimising for consoles also means PC players reap the benefit of a decent performance baseline, even on low-end hardware.
Right now I am developing a game for PC, and my dev team are happy to set the system requirements at an 11th-generation i7 and a 40-series (4070 or higher) graphics card. Obviously that makes our target demographic very narrow, but from their perspective the game runs, so why would I be upset?
For over a decade memory was so cheap that most people ended up maxing out their systems; the result is that every program is Electron.
For the last ten years memory has been more constrained, and suddenly a lot of Electron became less shitty (it's still shitty); you could tell that at least some companies started working to reduce their memory requirements (or at least not increase them).
Now we are getting faster CPUs, the constraint is gone, and since the M-series chips came out I am certain that software that used to be usable on Intel Macs is getting slower and slower, especially the Electron stuff, which seems to perform particularly well on M-series chips.