The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply no longer bother to optimize for the low end, and so they end up gatekeeping games and excluding millions of devices: for recent games, a discrete GPU is required even at the lowest settings.
True. Optimization is completely dead. Long gone are the days of a game being amazing because the devs managed to pull crazy graphics for the current hardware.
Nowadays a game is only poorly optimized if it's literally unplayable or laggy, and you're forced to constantly upgrade your hardware with no discernible performance gain otherwise.
Crazy take. In the late 90s/early 00s your GPU could be obsolete nine months after you bought it. The “optimisation” you're talking about existed because the PS4 generation's CPU was so weak, and tech was moving so fast, that any PC bought from 2015 onwards could easily brute-force anything built for that generation.
NVIDIA, like everyone else on a bleeding edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores but some compute units are faulty. You fuse them off and now the chip is a GP102 gaming GPU.
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
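The economics of selling partially defective chips can be sketched with a textbook Poisson defect-yield model. This is an illustrative sketch only: the die areas and the defect density below are made-up numbers, not actual TSMC or NVIDIA figures.

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dice expected to be fully defect-free (Poisson model)."""
    defects_per_die = (die_area_mm2 / 100.0) * defects_per_cm2  # cm^2 conversion
    return math.exp(-defects_per_die)

# Hypothetical numbers: a ~750 mm^2 flagship die vs a ~100 mm^2 small die,
# both at an assumed defect density of 0.1 defects/cm^2.
big_die_yield = poisson_yield(750, 0.1)    # roughly 47% defect-free
small_die_yield = poisson_yield(100, 0.1)  # roughly 90% defect-free
```

Under these assumptions, over half of the large dice carry at least one defect, which is why fusing off faulty units and selling the chip in a cut-down configuration beats throwing it away.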
Nvidia doesn't share dies between their high-end datacenter products like B200 and consumer products. The high-end consumer dies have many more SMs than a corresponding datacenter die. Each has functionality that the other does not within an SM/TPC, nevermind the very different fabric and memory subsystem (with much higher bandwidth/SM on the datacenter parts). They run at very different clock frequencies. It just wouldn't make sense to share the dies under these constraints, especially when GPUs already present a fairly obvious yield recovery strategy.
You can't turn a GB200 into a GB202 (which I assume is what you meant since GP102 is from 2016), they are completely different designs. That kind of salvage happens between variants of the same design, for example the RTX Pro 6000 and RTX 5090 both use GB202 in different configurations, and chips which don't make the cut for the former get used for the latter.
AMD will be very happy when they do. They are already making great cards, currently running an RX7800XT (or something like that), and it's amazing. Linux support is great too
I got an.. AMD (even today I still almost say “ATI” every time) RX6600 XT I think, a couple years ago? It’s been great. I switched over to Linux back in the spring and yes the compatibility has been fine and caused no issues. Still amazed I can run “AAA” games, published by Microsoft even, under Linux.
PC gaming will be fine even without 8K 120fps raytracing. It will be fine even if limited to iGPUs. Maybe even better off if it means new titles are actually playable on an average new miniPC.
More realistically I guess we get an AMD/Intel duopoly looking quite similar instead.
It will probably be a bigger blow to people who want to run LLMs at home.
That doesn't seem very plausible. How many people are driven away from CounterStrike or League of Legends because the graphics aren't as good as Cyberpunk or whatever?
There's a LOT of games that compete with AAA massive-budget games on aggregate: Dwarf Fortress, CS, League, Fortnite. People are still playing Arma 2, DayZ, Rust, etc. Rainbow Six: Siege still has adherents and even cash-payout tournaments. EVE Online, Ultima Online, Runescape, still goin'
These games have like no advertising and are still moneymakers. Eve and UO are like 20 and 30 years old. Heck, Classic WoW!
Dunno (maybe WoW?), but is it the most expensive graphics hardware giving AAA all the money/air, or is it because they have a great reputation as games, solid consistent advertising, a strong network effect, and a spot at the top of new-release lists?
I feel like league of legends has, wrt the genshin $s, I honestly haven’t checked!
Huh? No? It means that the overall platform is already at a 'good enough' level. There can always be improvement, but in terms of pure visuals, we are already at a point where some studios choose simple representations ( see some 2D platformers ) as a stylistic choice.
It is not a question of want. Gaming will exist in some form so I am simply uncertain what you are concerned about.
Can you elaborate a little? What, exactly, is your concern here? That you won't have nvidia as a choice? That AMD will be the only game in town? That gpu market will move from duopoly ( for gaming specifically ) to monopoly? I have little to go on, but I don't really want to put words in your mouth based on minimal post.
Not a locked ecosystem console or a streaming service with lag!
I think if Nvidia leaves the market for AI, why wouldn't AMD and Intel, along with the memory cartel? So the DIY market is gone. That kills lots of companies and creators that rely on the gaming market.
It’s a doom spiral for a lot of the industry. If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.
There was no DIY market on 8 and 16 bit home computers with fixed hardware, yet bedroom coding (aka indies) not only did thrive, they were the genesis of many AAA publishers, and to this day those restrictions keep the Demoscene alive and recognised as World culture heritage.
PC was largely ignored for gaming until EGA/VGA cards, alongside AdLib/Sound Blaster, became widespread in enough households to warrant the development costs.
Interesting. Does nvidia offer control? Last time I checked they arbitrarily updated their drivers to degrade unwelcome use case ( in that case, for crypto ). It sounds to me like the opposite of that.
Separately, do you think they won't try to ingratiate themselves to gamers again once AI market changes?
Do you not think they are part of the cartel anyway ( and the DIY market exists despite that )?
<< So DIY market is gone.
How? One use case is gone. Granted, not a small one and one with an odd type of.. fervor, but relatively small nonetheless. At best, DIY market shifts to local inference machines and whatnot. Unless you specifically refer to gaming market..
<< That kills lots of companies and creators that rely on the gaming market.
Markets change all the time. EA is king of the mountain. EA is filing for bankruptcy. Circle of life.
Edit: Also, upon some additional consideration and in the spirit of Christmas, fuck the streamers ( aka creators ). With very, very limited exceptions, they actively drive what is mostly wrong with gaming these days. Fuck em. And that is before we get to the general retardation they contribute to.
<< It’s a doom spiral for a lot of the industry.
How? For AAA? Good. Fuck em. We have been here before and were all better for it.
<< If gaming is just PlayStation and switch and iGPUs there is a lot less innovation in pushing graphics.
Am I reading that right? AMD and Intel are just for consoles?
<< It will kill the hobby.
It is an assertion without any evidence OR a logical cause and effect.
Ahaha are you trolling for entitled gamers? Yeah wouldn't want the real world having to face those. No worries: as long as there are people willing to drop money into expensive gear, somebody will sell it.
That's one hell of a long shot. Are your views applicable to the rest of the entertainment industry? There's plenty of people wasting away in front of Netflix, after all. Or why stop at entertainment: what about any "useless" hobby done repeatedly for fun with no real productive output? Is any comparable "pleasurable" activity that also hooks a minority of people in an unhealthy way bad, or just gaming?
But what's most insane is trying to draw any parallels between gaming and these other things - things that were literally engineered to ruin human lives, biologically (hard drugs) or psychologically (gambling). The harm and evil caused by these two industries is incomprehensible (especially the legal parts of them, like alcohol and casino gambling/sports betting/online gambling), and trying to fit gaming in among them both downplays the amount of suffering inflicted by gambling and hard drugs, and villainizes normal people - like the hundreds of millions of people who play games in a sane, non-problematic way, or indie game devs who make games because they want to express themselves artistically.
Anyways, I gotta log off HN for a while. I can feel my gaming withdrawal kicking in. I've bankrupted myself four times by only spending my money on gaming, and I've been in and out of rehab centres and ERs as I've been slowly destroying my body with gaming in a spiral of deadly addiction. I think I'll have to panhandle and threaten strangers on the street to buy some Steam cards.
But if this does happen, it will in my opinion be the start of a slow death of the democratization of tech.
At best it means we're going to be relegated to last-gen tech, if even that, as this isn't a case of SAS vs SATA or U.2 vs M.2, but of the raw tech itself (chips).
If NVIDIA exits the market, there is still AMD, Intel and PowerVR (Imagination Technologies is back at making discrete PC GPUs, although currently only in China).
Is that due to some kind of issue with the architecture, or just a matter of software support?
In the latter case, I'd expect patches for AMD or Intel to become a priority pretty quickly. After all, they need their products to run on systems that customers can buy.
NVIDIA would still have service contract obligations to fulfil, and would provide support for its existing products for a period of time.
Don’t worry about Nintendo. Their pockets are deep and they are creative enough to pivot. They would retool their stack to support another ARM chip, or another arch entirely.
What goes into a Nintendo console is not prime silicon. When it's time to design the next console, I am sure Nvidia will still be more than happy to give them a design they have lying around in a drawer somewhere if it means they ship 100M units.
Nothing of value would be lost if all the PC gaming went away. It would be a huge improvement in life skills and mental health and even dating prospects for millions.
Really, I have been gaming since before even getting my Timex 2068 in the mid 80's, starting with Game & Watch handhelds, and I don't get the "build your aquarium" culture of many PC gamers nowadays.
It is so bad that it is almost impossible to buy a traditional desktop in regular computer stores; there are only fish tanks with rainbows on sale.
I was tempted to respond with an offhand comment about the size of the industry or similar, but what axe do you have to grind about PC gaming? You'd prefer folks go to the far more injurious mobile gaming space?
If it does, I think it would be a good thing.