A shame, really. 68k was (and is) much more approachable for those learning assembly. No need to deal with 64k segmented memory, for instance.
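For context on that segmentation pain: on a real-mode 8086/8088, every physical address is a 16-bit segment register shifted left four bits plus a 16-bit offset, so no single pointer covers more than 64 KB without explicit segment arithmetic. A minimal C sketch of the translation (illustrative only; the function name is mine):

    #include <stdio.h>
    #include <stdint.h>

    /* 8086 real-mode translation: segment * 16 + offset, wrapping
     * at the 20-bit (1 MB) boundary. Each segment spans only 64 KB. */
    static uint32_t real_mode_addr(uint16_t segment, uint16_t offset)
    {
        return (((uint32_t)segment << 4) + offset) & 0xFFFFF;
    }

    int main(void)
    {
        /* Many segment:offset pairs alias the same physical byte. */
        printf("%05X\n", (unsigned)real_mode_addr(0x1000, 0x0010)); /* 10010 */
        printf("%05X\n", (unsigned)real_mode_addr(0x1001, 0x0000)); /* 10010 */
        return 0;
    }

The 68k, by contrast, gives you one flat address space, which is exactly what makes it friendlier to learn on.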
As an aside… National Semiconductor also had an ill-fated architecture in the NS32000, which I also wish had taken off. On paper, it did a lot right (VAX-like design, flat memory model, 32-bit almost immediately out of the gate), yet NS was woefully incapable of producing masks without bugs. It took them many tries to get it right, and by then they had already been beaten to market by their competition.
Then, to add insult to injury, NS' own compiler for the NS32000 produced rather unoptimized code. It took GNU porting GCC to the platform in 1987 for the chips to realize their potential, years after NS had missed its chance.
If NS had had their act together… dare I say an IBM PC built around their CPU would have been possible, and more interesting than the 8088 IBM ultimately went with.
AFAIK, NS used the Green Hills compiler; at least, my ns32532 dev system comes with it. It's not great, but not terrible. I personally don't remember the compiler being in the top 5 show-stopper issues with the 32k (the first 3 were 'cpu bugs', 'mmu bugs' and 'fpu bugs'). And the chip itself was slow, particularly if you used the MMU.
The 32000 line (like the 68000) found a very long life as an embedded processor, particularly in the printer/fax space (ns32cg16 and follow-ons, ns32gx32).
The 32332 was a nice processor. The 32532 was very, very nice. Both way too late.
Given what IBM was trying to deliver with the PC, I doubt they'd have looked at the 32000. Single source, few i/o support chips, relatively expensive, etc., etc. Way more likely that a non-Intel IBM PC would have had a Z8000 inside (and not a 68k, for mostly IBM political reasons).
That said, I’d possibly contest you on the single-source issue. IBM likely would have told NS… much like they told Intel back in the day… that if they wanted to do business, they needed to ensure second sourcing was possible.
Judging by how desperate NS was to make deals, I’m quite sure that hurdle would have been overcome easily, with AMD or even MOS Technology stepping up to fill the void.
If we want to pick nits, NS had a second source: TI. But that was, afaik, just paperwork at that point (and I honestly don't know if TI ever produced a 32k processor). It takes time to bring a second source on-line. And given the trouble NS had building the chips themselves, if I was IBM, I'd have Questions.
That said… even if NS could wave a magic wand and produce a second source, there were plenty of other reasons to discount the 32k, and I've never seen the slightest evidence that IBM ever considered it.
Dream away. How much weirder a dream would it be if IBM had gone Zilog? Fanbois endlessly arguing the superiority of segmented over paged memory? Terrible marketing from an oil company? Wordstar still the standard? I sorta like that multiverse option.