All of Microsoft's applications used to do that! The 16-bit versions of Word, Excel, PowerPoint, etc. were all implemented using bytecode. That technology was only integrated into Microsoft C and made public in 1992 [1]; before that, the Microsoft applications group used their own private C toolchain. A copy of it can be found in the Word 1.1 source release [2] ("csl").
There was a proprietary programming language called CSL, also named after CharlesS. This language, based on C, had a virtual machine, which made it easier (in theory) to run on other operating systems, and it also had a good debugger—attributes that were lacking in Microsoft's relatively immature C product.
CharlesS is Charles Simonyi, an ex-Xerox PARC employee who was hired away by Microsoft, worked on MS Word, and created the Hungarian naming system (the Apps version is the definitive version, not the bastard watered-down Systems version used in the Windows header files) - see https://en.wikipedia.org/wiki/Hungarian_notation.
The blog post included excerpts from internal MS docs for apps developers. An OCR version of one such page in the blog post follows:
====
One of the most important decisions made in the development of Multiplan was the decision to use C compiled to pseudo-code (Pcode). This decision was largely forced by technological constraints. In early 1981, the microcomputer world was mainly composed of Apple II's and CP/M-80 machines; they had 8-bit processors, and 64K of memory was a lot; 128K was about the maximum. In addition, each of the CP/M-80 machines was a little different; programs that ran on one would not automatically run on another. Pcode made the development of ambitious applications possible; compiling to machine code would have resulted in programs too big to fit on the machines (even with Pcode it was necessary to do a goodly amount of swapping). It also allowed us to isolate machine dependencies in one place, the interpreter, making for very portable code (all that was necessary to port from one machine to another was a new interpreter). For Multiplan, this was an extremely successful strategy; it probably runs on more different kinds of machines than any other application ever written, ranging from the TI/99 to the AT&T 3B series.
Of course, Pcode has its disadvantages as well, and we've certainly run into our share. One disadvantage is that it's slow; many of our products have a reputation for slowness for exactly that reason. There are of course ways to speed up the code, but to get a great deal of speed requires coding a goodly amount in tight hand-crafted assembly language. Another disadvantage is our Pcode's memory model. Since it was originally designed when most machines had very little memory, the original Pcode specification supported only 64K of data; it was not until Multiplan 1.1 was developed in early 1983 that Pcode was extended to support larger data spaces. A final disadvantage of Pcode is that we need our own special tools in order to develop with it; most obviously these include a compiler, linker, and debugger. In order to support these needs, there has been a Tools group within the Applications Development group almost from the beginning, and we have so far been largely unable to take advantage of development effort in other parts of the company in producing better compilers and debuggers. (It should be noted that the Tools group is responsible for considerably more than just Pcode support these days.)
Although portability was one of the goals of using Pcode, it became apparent fairly early on that simply changing the interpreter was not sufficient for porting to all machines. The major problem lay in the different I/O environments available; for example, a screen-based program designed for use on a 24 by 80 does not adapt well to different screen arrangements. To support radically different environments requires radically rewriting the code; we decided the effort was worth it for two special cases: the TRS-80 Model 100 (first laptop computer) and the Macintosh. In retrospect, the Model 100 was probably not worth the effort we put into it, but the Macintosh proved to be an extremely important market.
====
P-Code was a very important technology for Microsoft's early apps. Cross-platform support was one reason, as Steven writes. It was also very important for reducing the memory size of code. When I started, we were writing apps for 128K (K, not M or G) RAM Macs. There were no hard drives, only 400K floppy disks. (Did I mention we all had to live in a lake?)
P-Code was much smaller than native code, so it saved RAM and disk space. Most Macs had only one floppy disk drive. A market risk for shipping Excel was requiring 512K Macs with two disk drives, which allowed the OS and Excel code to live on the first drive and the user's data on the second. Mac OS did not have code-swapping functions; each app had to roll its own from Memory Manager routines, so the P-Code interpreter provided that function as well.
On early Windows versions of Excel the memory-savings aspect was extremely important. The size of programs grew as fast as typical RAM and hard disk sizes for many years, so saving code size was a primary concern. Eventually Moore's Law won and compilers improved to the point where the execution trade-off was no longer worth it. When Windows 95 introduced 32-bit code, these code-size dynamics returned for a different reason: I/O bandwidth. 16-bit Excel with P-Code outperformed 32-bit Excel in native code in any scenario where code swapping was needed. Waiting for the hard drive took longer than the ~7x execution overhead of the P-Code interpreter.
I am surprised to hear Steven say that the app teams, and Excel in particular, were looking in any serious way at the Borland tools. The reality was that the CSL compiler had a raft of special features, and our only hope of moving to a commercial tool was getting the Microsoft C team to add the features we needed. This was the first set of requirements, one that came from being the earliest GUI app developers. Because of the early performance constraints, a lot of "tricks" were used that became barriers to moving to commercial tools. Eventually this was all ironed out, but it was thought to be quite a barrier at the time. Around this time the application code size was starting to press against the limits of the CSL P-Code system, and we really needed commercial tools.
And Steve Sinofsky's reply:
Technically it was the linker, not the compiler. The Excel project was getting big, and the apps Tools team was under resource pressure to stop investing in proprietary tools, while at the same time the C Tools group was under pressure to win over the internal teams. It was *very* busy keeping the Systems team, particularly the NT team, happy. We're still 5 years away from Excel and Word getting rid of P-Code. Crazy to think about. But the specter of Borland was definitely used by management to torment the Languages team, which was given a mission to get Microsoft using its own tools internally.
C was very much the JavaScript of its day: people hand-rolled toolchains and compilers targeting the bytecode they knew and loved (or despised).
This is awesome history. Formal history only remembers the whats and the whens; rarely does it catalog the whys or the hows. The decision-making process of those early programmers helped shape a whole industry.
Rosetta uses software emulation for x87 floating point. That's slow, but in practice it doesn't matter much. Mac software never had a reason to use x87 FP; every Intel Mac had at least SSE3 support.
Looks like a demonstration that using `long double` math requires dipping into x87 instructions, specifically the `fldt` instruction: "floating point load ten bytes".
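For anyone curious, here is a minimal sketch of that effect (my own illustration, not the linked demo, and assuming x86-64 with GCC or Clang, where `long double` is the 80-bit x87 extended-precision type). Arithmetic on `long double` gets lowered to x87 instructions such as `fldt`, while plain `double` math stays in SSE registers; compiling with `-S` and reading the assembly shows the difference:

```c
#include <stdio.h>

/* double math: typically compiled to SSE (mulsd) on x86-64 */
double scale_d(double x) { return x * 1.0000001; }

/* long double math: typically compiled to x87 (fldt/fmulp/fstpt) */
long double scale_ld(long double x) { return x * 1.0000001L; }

int main(void) {
    printf("double:      %.17g\n",  scale_d(1.0 / 3.0));
    printf("long double: %.21Lg\n", scale_ld(1.0L / 3.0L));
    return 0;
}
```

Build with something like `gcc -O2 -S x87_demo.c` and look for `fldt` in the output for `scale_ld` but not `scale_d`.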
The signedness of `char` is implementation-defined; it is typically signed on x86 but unsigned on ARM. So assigning a plain `char` to a wider integer type is suspicious: did the programmer expect sign-extension or zero-extension?
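A minimal sketch of the trap (a hypothetical example, not code from the project under discussion):

```c
#include <stdio.h>

int main(void) {
    char c = '\xFF';      /* bit pattern 0xFF */
    int widened = c;      /* sign- or zero-extended, depending on the platform */

    /* Where char is signed (typical x86 ABIs): widened == -1.
     * Where char is unsigned (typical ARM ABIs): widened == 255. */
    printf("widened = %d\n", widened);

    /* Spelling out the intended signedness removes the ambiguity: */
    int as_signed   = (signed char)c;     /* always sign-extends to -1  */
    int as_unsigned = (unsigned char)c;   /* always zero-extends to 255 */
    printf("signed: %d, unsigned: %d\n", as_signed, as_unsigned);
    return 0;
}
```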
It's not implementation-defined in Java because there aren't any unsigned types.
Personally I think explicit typecasts are even more suspicious, because introducing explicit semantics is worse than implicit semantics if the explicit ones are wrong.
To save people opening the link... in France it would be a judge, not a prosecutor. France has an Inquisitorial rather than the Adversarial legal system the UK and US have. Put simply, a judge doesn't merely decide between the two cases presented to them; they try to establish the facts.
Edit: I said 'UK' where I should have said 'England and Wales'. Scotland and Northern Ireland have their own legal systems; although I believe both have Adversarial systems, they differ in some ways. The US system could, however, be seen as a continuation of the English system.
A lot of game devs of that era sadly treated the AdLib/Sound Blaster OPL2 chip as nothing more than a very poor MIDI synth, but it was capable of much better. For example, listen to some of Stéphane Picq's music: https://vgmrips.net/packs/composer/stephane-picq
For sure. That the MU80 (or later) is the best way to hear MIDI game music from the 90s and early 2000s should take nothing away from the passionate musicians and coders who did brilliant (and sometimes unnatural) things to extract better results from the innately weaker hardware of early PC sound cards (Sound Blaster et al).
Yamaha's many various OPL chips were based on cost and feature reduced versions of the FM synthesis engine in Yamaha's professional standalone music synthesizers. These chips powered not only PC sound cards but also a huge variety of 80s and 90s arcade machines, game consoles and home computers (https://en.wikipedia.org/wiki/Yamaha_OPL).
It's fascinating to explore the evolution of the OPL technology as this "lower cost, lower end" alternative to pro gear gained in sophistication and power, culminating in 1994's OPL4 chip (aka YMF278) and its even more capable cousin the YMF292 used in the Sega Saturn console. These chips had impressive synthesis power for their consumer console/sound card price point. By the late 90s, onboard music synthesis in sound cards, consoles and arcade machines began dying out as increasing storage and CPU power permitted playback and mixing of fully sampled music tracks. By the early 2000s the only remaining consumer application of Yamaha's powerful onboard synthesis chips was generating ringtones in mobile phones. It's kind of sad that such a rich musical legacy ended up playing 5-second mono tracks on half-inch speakers (although those mobile phone chips probably made more money for Yamaha than all their pro music gear until that point combined).
In those days, malloc would use sbrk to allocate memory. And yes, mmap was designed to memory-map files; using it to allocate anonymous pages came later.
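A minimal sketch of the contrast (my own illustration, assuming a POSIX-ish system; sbrk is non-standard and deprecated on modern platforms, and MAP_ANONYMOUS was itself a later addition, so this is illustrative only):

```c
#define _DEFAULT_SOURCE   /* for sbrk and MAP_ANONYMOUS on glibc */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    /* Old style: extend the program break by 4 KiB, the way early
     * mallocs carved out heap space. */
    void *old_brk = sbrk(4096);
    if (old_brk == (void *)-1)
        perror("sbrk");
    else
        printf("sbrk gave us memory at %p\n", old_brk);

    /* Later style: map anonymous pages not backed by any file. */
    void *pages = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (pages == MAP_FAILED) {
        perror("mmap");
    } else {
        memset(pages, 0, 4096);
        printf("mmap gave us memory at %p\n", pages);
        munmap(pages, 4096);
    }
    return 0;
}
```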
I always wondered how secure the AS/400 actually is. The original implementation might have checked tag bits in hardware (I don't know), but the current (PowerPC) implementation relies on the compiler generating a "check tag bits" sequence every time a pointer is dereferenced [1]. So it seems that any arbitrary-code-execution vulnerability would be absolutely devastating. And the "SLIC" is not a small microkernel: it also contains the compilers, the database and other system components. It'd be hard to believe there would be no exploitable bugs in there.
[1] https://sandsprite.com/vb-reversing/files/Microsoft%20P-Code...
[2] https://github.com/danielcosta/MSWORD