In addition, from what I vaguely recall, one of the big selling points of NT LAN Manager was the ability to deploy network applications directly on the server.
Novell responded with Netware Loadable Modules, but they weren’t as versatile and needed specialised knowledge/tools.
Yep, Netware ran entirely in Ring 0. In Linux terminology, it was a kernel with no user space, and NLMs were kernel modules. Very fast for file serving, but any application could crash the system. Stability was largely a result of lots of updates. NT had userspace, protected memory, etc., and a GUI for setting up TCP/IP.
Netware existed and thrived before NT LAN Manager. NT LAN Manager seemed like the one MS product that couldn't make inroads against established competition. It simply wasn't as good as Netware.
The way I remember it, NLMs were pretty stable. Anything on Windows was not stable, userspace or otherwise. Netware's TUI was just as good as NT's GUI for what it needed to do. It wasn't a liability. Netware's superior directory service was more important.
Netware's demise was the transition from IPX to TCP/IP and the explosion of the WWW. And from my perspective it wasn't really NT that knocked Netware down. It was Linux and Solaris. Novell kinda saw that coming and tried to figure out a future with SuSE. They just never got the combination of their directory server with Linux right in time. Microsoft stumbled around for some years, but they got their directory services figured out before Novell got their OS story straight in the new world.
LAN Manager is a whole family of programs. Don't confuse LanMan the family with the one implementation in NT -- LanMan is quite a bit older than WinNT.
LanMan is an opened-up version of 3Com's proprietary DOS-based server OS, 3+Share. I installed many 3+Share boxes in the late 1980s and early 1990s.
3+Share used NetBEUI but LanMan was protocol-neutral, which was rare and exceptional back then. E.g. AppleShare only ran over AppleTalk, Netware only ran over IPX/SPX, and Unix spoke unto Unix -- and nothing but Unix -- over TCP/IP. (Add-ons to run TCP/IP on other OSes existed, but most of them cost money. Often more money than the OS itself, in the case of DOS. And many had proprietary APIs: so for example Quarterdeck DESQview/X used TCP/IP but it couldn't talk to the free TCP/IP stacks Microsoft and IBM eventually distributed. Two different TCP/IP stacks, natch.)
LanMan ran on OS/2 1.x, on various proprietary Unixes, and on DEC VMS, which DEC marketed as part of its PATHWORKS suite: file/print serving via LanMan, plus email, terminal emulation, X11 servers for DOS and Windows... all over the DECnet protocol.
I think this depends on what NLMs you were running. An old job had a NetWare 3.12 server running Btrieve/Pervasive, and it ABENDed often enough that I learned how to use the debugger to get the console back and dismount volumes to avoid triggering VREPAIR on restart.
> NT had userspace, protected memory, etc, and a GUI for setting up TCP/IP.
That's because Microsoft hired Dave Cutler who previously worked on VMS and knew what he was doing. Microsoft even had their own Unix, but didn't know what to do with it.
Microsoft Xenix (never knew more about it than the name).
For small to medium-sized businesses, Netware had the advantage that with IPX networking there was nearly no configuration necessary:
no subnetting, no assigning of IP addresses to clients, no DHCP service to run.
The availability of software on the server was limited (I remember backup services and licensing software). But for central file service and printing it was rock solid, without any issues even in somewhat larger environments (for the time, around 1995).
(IIRC >200 clients on a single 486 CPU and 4 MB RAM)
> Microsoft Xenix (never knew more about it than the name).
For a year or two there, the only other commercial Unix workstation not made by Sun could be had from Radio Shack: the TRS-80 Model 16 running Xenix. Enough small businesses ran Xenix, with up to 3 simultaneous users on a single stock machine (console + 2 terminals), that Radio Shack kept supporting these things until the late 1980s. With up to an 8 MHz CPU, up to 7 MiB of RAM, and an actual (external) MMU, the Model 16 could handle more workload, and theoretically more stably, than an x86 machine running Xenix until about the time Xenix/386 came out.
Apollo made competitive workstations at the time, until they got swallowed by HP. The Unix workstation market was bigger than Sun, but since Sun was the most successful, nobody remembers how competitive that segment was. The Model 16 was a footnote, not a competitor.
Apollo's Domain/OS (formerly AEGIS) was impressive, but did not gain a full POSIX layer until later in the 80s, as I understand it. So the Model 16 really was the only other commercial Unix workstation, besides Suns, in early 1983. This advantage wouldn't last long; by 1984 other Unix desktops like the HP Integral had emerged.
I believe Apollo had a proprietary OS with limited Unix compatibility. So maybe the grandparent poster is right about the Model 16 being the only other non-Sun desktop Unix for a while, as long as you define Unix tightly enough.
Sun gets the crown because prior to the Sun-1 there wasn't really any such thing as a UNIX workstation. You had a terminal connected to a host running UNIX (or VMS), and that was that. My pet theory is that Sun succeeded against Apollo because Sun decided to sell to Wall St quants for their day-job number crunching, whereas Apollo (and later HP) sold to engineers doing simulations and CAD. Naturally the quants told their colleagues and the stock went brrr.
Later entrants like SGI targeted their workstations at media creatives (helpfully, Apple was in crisis by this time, so A/UX wasn't remotely a problem). IBM and DEC just produced me-too workstations, but there was nothing special about AIX or Ultrix unless you were already a customer.
The UNIX wars of the 90s were basically the UNIX vendors trying to take over the whole market and not just their classic turf.
It was so expensive that we shared a PC tower with the whole class.
Not timesharing; rather, we would prepare our C applications on MS-DOS 3.3 with Turbo C 2.0, using mocks for the UNIX APIs, and then take turns of 15 minutes per group trying to make it work on the Xenix tower.
In other words you have a network, with lots of small computers (clients) talking to one or more big computers (servers).
That model has been so pervasive since the 1990s that you seem to assume it's how everything worked. It is not. Xenix was strong in the earlier era of host-based computing.
The core concept is that you only have 1 computer, the host. It's kept in a special server room somewhere and carefully managed. On users' desks they have just terminals, which are not computers. They are just screens and keyboards, with no "brains". Keystrokes go over the wire to the host, and the host sends back text that the terminal displays.
No network, no computers in front of users.
In the '70s and early '80s this was the dominant model because computers were so expensive. Before microprocessors, host machines cost tens of thousands to hundreds of thousands of $/£, and companies could only afford one of them.
Most were proprietary: proprietary processors running proprietary OSes with proprietary apps in proprietary languages.
Some companies adapted this model to the microprocessor era. For instance, Alpha Micro sold 680x0 hosts running AMOS (Alpha Micro OS), a clone of a DEC PDP OS. It sold its own terminals etc. It was cheaper, and it used VHS videocassettes as removable media instead of disks.
Unix replaced a lot of this: proprietary versions of the same basic OS, on those proprietary processors, but with open standards languages, open standard terminals, etc.
Xenix was the dominant Unix for x86 hosts. It let you turn an 80386 (or at a push a 286) PC into a host for a fleet of dumb terminals.
Stock Xenix came with no networking, no C compiler, no X11, no graphics, no GUI, nothing. Each box was standalone and completely isolated.
But a 386 with 4MB of RAM could control 10 or 20 terminals and provide computing to a whole small business.
No Ethernet, no TCP/IP, no client/server stuff.
Client server is what killed Xenix's market. When PCs became so cheap that you could replace a sub-$1000 terminal with a sub-$1000 PC, which was way more flexible and capable, then Xenix boxes with dumb terminals were ripped out and replaced with a PC on every desk.
Not even the articles on the webpage you've linked talk about "...big computers (servers) [...] with dumb terminals", nor do they say that the concept of client/server is
> The core concept is that you only have 1 computer, the host. [...] On users' desks they have just terminals, which are not computers
Contrary to what you write, the linked page starts with
"In a client-server system, a large number of personal computers communicate with shared servers on a local area network" and later continues with explicit references to Microsoft's (N)OS.
And then the reference for NOS leads us to
"There are only a few popular choices – Novell, UNIX, Linux, and Windows. The complexity of NOS forces a simple overview of the features and benefits."
So I don't really understand your point that Netware was neither a file server nor a print server.
The issue with UNIX on PCs was the $1000 or whatever licensing cost.
Just as some trivia: Novell bought UNIX System V R4 from AT&T and planned to merge it with Netware to create "SuperNOS", which would have been a direct competitor to NT. But they never got it out the door and spun off UNIX to the (old) SCO.
Around 2006/2007, I was playing around with NetWare 6.5 at work. We had heaps of it but lots of talk about replacing NetWare/eDir/GroupWise with Windows/AD/Exchange (which I think finally did happen after I left the place). My recollection was it was quite unstable - because, having come from Linux, I was playing with bash and SSH. bash would crash a lot (something that very rarely happens on Linux) but it wouldn’t bring down the whole server (which was a dev/test NetWare server anyway). I don’t remember what exactly I was trying to do: I had some work-related justification, which I forget now - something something identity management - but my real reason was just to explore the system. The instability of it convinced me to not take any ideas I had any further.
The software I was working on in the late '80s made use of Btrieve, an ISAM database server running on Netware. IIRC there was also a SQL server of some sort that we used with it, mostly for reporting.