For the disassembled code and the ROM images discussed here, see this repository
Something intriguing turned up recently over at the Vintage Computer Federation Forums. Member GearTechWolf occasionally rescues and dumps random ROM chips that show up on eBay, and makes the contents available so they aren’t lost to the ages. One of his hauls turned up two pairs of EPROMs labeled “IBM” in plain dot-matrix: one showing part numbers (and “© IBM CORP 1981,1985”), the other tagged with a specific date (“25/05/90”).
Enigmatic IBM EPROMs
They came with no further identification, and no hints about their origins, or what machines they may have come from. And just to establish that proper setting of suspense and mystery, neither pair could be content-matched against any known IBM firmware.
Much poking and prying commenced. I didn’t delve very deeply into the ‘25/05/90’ odd/even pair, but it seemed to be some sort of PS/2 BIOS: it proved to contain those telltale extra VGA fonts (on which past verbiage abounds), and to share some other bits and pieces with known Model 35 SX/40 SX ROMs. Seeing that, I prodded the Ardent Tool crew, and Major Tom identified it as rev. 2 of the 35-/40-SX firmware - earlier than other known variants. It’s now up for download on the System ROMs page.
The more intriguing one for me was the ‘1981, 1985’ duo (yellow labels in the photo). A cursory look in a hex viewer revealed the following:
- EPROM 6448246 has the even addresses, 6448238 the odd addresses.
- The internal part numbers are 6480442 and 6480441, respectively.
- The BIOS date stamp in the standard location (F000:FFF5) is 03/08/85.
- At F000:330A, there’s yet another date stamp - 02/14/85.
- The model byte (in the second-to-last position) is FCh.
In 1985, the FCh model byte could only mean the 5170 (PC/AT),1 and the even/odd byte interleaving does point at a 16-bit bus. But there are three known versions of the PC/AT BIOS released during the 5170 family’s lifetime, corresponding to the three AT motherboard types. This one here is clearly not one of them: its date stamps and part numbers don’t match, and the actual contents are substantially different besides.
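The reassembly and probing described above can be sketched in a few lines. This is a hypothetical helper, not anything from the BIOS itself; only the offsets are the well-known PC-family locations (the ASCII date stamp at F000:FFF5 and the model byte at F000:FFFE, second-to-last in the segment).

```python
def interleave(even: bytes, odd: bytes) -> bytes:
    """Stitch an even/odd EPROM pair back into one 16-bit-bus image."""
    image = bytearray(len(even) + len(odd))
    image[0::2] = even  # chip holding the even addresses
    image[1::2] = odd   # chip holding the odd addresses
    return bytes(image)

def probe(image: bytes) -> dict:
    """Read the standard ID stamps from a 64 KB image of segment F000h."""
    assert len(image) == 0x10000
    return {
        "date": image[0xFFF5:0xFFF5 + 8].decode("ascii", "replace"),
        "model_byte": image[0xFFFE],  # FCh on a PC/AT-class machine
    }
```

Running `probe` on the interleaved '1981, 1985' pair is what yields the 03/08/85 date string and the FCh model byte quoted above.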
My first thought was that this may have come from one of those more shadowy members of the 5170 family: perhaps the AT/370, the 3270 AT/G(X), or the rack-mounted 7532 Industrial AT. But known examples of those carry the same firmware sets as the plain old 5170, so their BIOS extensions (if any) came in the shape of extra adapter ROMs. Whatever this thing was - some other 5170-type machine, a prototype, or even just a custom patch - it seemed I’d have to inquire within for any further clues.
The PC/AT BIOS: Known Versions
This was a good time to brush up on the three official revisions of the AT BIOS: how to tell them apart, and how they correspond to hardware options. The following table was compiled mostly from the pages at Minus Zero Degrees (IBM 5170 BIOS Revisions) and PC DOS Retro (IBM PC BIOS version history), and from the info in Ralf Brown’s Interrupt List. Where the sources weren’t in total agreement, I went with what seemed to conform with IBM’s published source code listings.
| PC/AT BIOS revision | Rev. 1 | Rev. 2 | Rev. 3 |
|---|---|---|---|
| Date (US format) | 01/10/84 | 06/10/85 | 11/15/85 |
| P/N (internal/mask ROM) | U27, even: 6181028<br>U47, odd: 6181029 | U27, even: 6480090<br>U47, odd: 6480091 | U27, even: 62X0820<br>U47, odd: 62X0821 |
| P/N (EPROM label) | U17/U27, even: 6181024/5<br>U37/U47, odd: 6181026/7 | U27, even: 6448896<br>U47, odd: 6448897 | U27, even: 61X9266<br>U47, odd: 61X9265 |
| ID bytes: Model, Submodel, Revision level | FCh, N/A, N/A a | FCh, 00h, 01h | FCh, 01h, 00h |
| Technical Reference (with source code listing) | March 1984 | September 1985 | March 1986 |
| PC/AT model and mainboard type | 068, 099 (Type 1) | 239 (Type 2) | 319, 339 (Type 3) |
| CPU clock supported | 6 MHz (not tested in POST) | 6 MHz (tested in POST) | 8 MHz (tested in POST) |
| Keyboards supported | 84-key AT | 84-key AT b | 84-key AT<br>101/102-key Enhanced |
| Floppy drive types supported | 360 KB<br>1.2 MB | 360 KB<br>1.2 MB<br>720 KB | 360 KB<br>1.2 MB<br>720 KB<br>1.44 MB |
| Hard drive types supported | 14 (types 01-14) | 22 (types 01-14, 16-23) | 22 (types 01-14, 16-23) |
| Checks for “multiple data rate” drive controller? c | No | Yes | Yes |
a. Rev. 1 has a model ID byte, but the function to return submodel and revision (INT 15h/AH=C0h) did not exist yet.

b. Rev. 2 appears to have partial/unfinished support for the Enhanced Keyboard: INT 09h performs the necessary decoding (or some of it), but none of the enhanced INT 16h functions are implemented.

c. Rev. 2 and 3 show a “601 Diskette Error” if not found (see the above -0° page for more info).
That’s the low-down on what we have to compare against. In this yet-unknown revision, both of the date stamps within the BIOS image (03/08/85 and 02/14/85) place it in-between revisions 1 and 2. So for the sake of convenience (read: laziness), I’ll be stunningly original and refer to it below as “rev. 1.5”.
Are You AT Enough?
Before I embarked on some actual reverse-engineering, I thought I’d try a little experiment first: how would 86box’s IBM AT emulation fare with this firmware? Easy enough to find out; just substitute these two ROM images for the expected even/odd pair (86box uses the rev. 3 BIOS by default) and see what happens.
Since I don’t have an actual IBM 5170, that’s as close to an “AT compatibility test” as I could get, but this firmware appeared to pass muster:
Our ‘rev. 1.5’ BIOS makes it through POST with no issues. The IBM AT Advanced Diagnostics v2.07 disk loads up, and dutifully reports the firmware’s P/N and date string; System Checkout (which lists the installed hardware) and the SETUP procedure (which is where you configure it) both run as expected.
Evidently this is some manner of PC/AT BIOS, or close enough to make Diagnostics happy. Poking at it under an emulator isn’t going to tell us a whole lot more than this, however: we don’t know exactly what this revision assumes about the hardware, so we can’t expect to tell compatibility issues from configuration errors, or just sketchy emulation. At this juncture (if you’ll excuse the imagery) we might as well roll up our sleeves and start rummaging through the entrails.
Findings
For more detailed notes about the disassembly itself, have a look at the repository. I’ll just mention that the goal was to figure out just where and how this ‘rev. 1.5’ BIOS differs from the others, and this would have been much more difficult if it wasn’t for two things:
- The published source listings for all official AT BIOS versions, from the respective editions of the PC/AT Technical Reference - available on Bitsavers (and on the Internet Archive: #1, #2, #3).
- The excellent reconstructions of the source code by Vernon Brooks over at PC DOS Retro, which have the listings in plain text (and they can be successfully reassembled, too).
I didn’t go quite as far as trying to reconstruct a version that actually builds, but thanks to the above I believe I have things mostly figured out, so here’s my analysis.
The Code Base
Just to get this out of the way, this is very clearly not a custom patch, or a little localized modification. Most obviously, all offsets/addresses differ from the other versions, and not just by a simple, easily-explained shift. The exceptions are certain entry points and tables (mostly in the top 4 KB) deliberately forced to fixed addresses, something that’s been done in every PC-family BIOS.2
In fact, with some (important) differences which will be detailed below, overall the code base appears to be something you’d expect if you were looking at a snapshot of some interim state between rev. 1 and 2. Some sections are closer to their rev. 1 counterparts, others to rev. 2; many contain elements of both, or follow the general logic of one version while still showing certain practices more common in the other.
For instance, a routine may perform essentially the same thing it did in rev. 1, but include certain optimizations which are mostly found only in rev. 2, such as immediate multi-bit shifts (you might see SHL BL, 2 in place of two SHL BL, 1 instructions),3 or updating segment registers with PUSH/POP (instead of MOVing the value through a go-between).
Other routines appear more or less in the same form they have in rev. 2, while signs of the rev. 1 coding style still persist. For example, all the I/O required to access CMOS memory is done inline, as opposed to rev. 2 which calls two new routines for this purpose (CMOS_READ and CMOS_WRITE). Or the encoding of jumps: two-byte (short) jumps are often found padded with a NOP instruction, like in rev. 1, something that no longer happens in the later iterations.4
The more interesting parts, of course, are the sections which are unique to ‘rev. 1.5’ and don’t have direct counterparts elsewhere, but I’ll be getting to them in a bit. Those aside, the whole thing does look like an authentic version of the BIOS code, caught in some intermediate state of development between revisions 1 and 2 - including some (but not all) of the changes that later made it to the second revision, as well as a few modifications that didn’t.
Top-Level Organization (and Build Environment)
The other editions of the AT BIOS were all generated from multiple source files. Here we only have the final image, but by comparing the overall structure against the other versions, we can deduce the breakdown into separate source modules. The arrangement of the code and data here is the same as in rev. 1, which suggests that ‘rev. 1.5’ was built before the restructuring that can be seen in the second revision.
Rev. 1 was apparently assembled with MASM v1.0, but rev. 2 switched to v2.0, as we’re told by the page titles in IBM’s source listing. If the structural overhaul was down to that change, then ‘1.5’ was likely still built with MASM v1.0... a form of cruel and unusual punishment if there ever was one, but perhaps they had some inside scoops from Microsoft on how to deal with all the errata in that famously bug-infested mess of an assembler.
Functionality (and Hardware Support) Comparison
Now for some of the actual similarities and differences between ‘rev 1.5’ and its older/younger siblings. The code may be closer to the first revision in its general structure, but if we take the date stamps at face value, it postdates rev. 1 by more than a year - while the next revision was only three or four months away.
So it wasn’t a complete surprise to find quite a few similarities with rev. 2. For instance,
- It supports 720K (3.5" DSDD) floppy disks, officially introduced only in rev. 2.
- 21 different hard drive types are available: 01-14 and 16-22, just one short of rev. 2 (and 3), which add drive type 23. Only types 01-14 were recognized in rev. 1 (15 is always reserved).
- It implements INT 15h function C0h (“Get System Configuration”), which didn’t exist in rev. 1, but was present in rev. 2 (and in all later PC compatible BIOSes).
- Keyboard support is more or less the same as in rev. 2: only the 84-key AT keyboard is (fully) supported, but some code for the 101/102-key Enhanced Keyboard is already present. The hardware IRQ handler (INT 09h) attempts to detect it, and uses its expanded scan code tables, but the enhanced INT 16h services are not available.5
In certain other respects, however, there’s more in common with the first revision:
- It doesn’t attempt to verify the 286’s clock speed, a test that was added to POST in revisions 2 and 3 (at checkpoint 11h).67
- When testing the floppy/hard drive controller (“combo card” in IBM-speak), it won’t throw up a “601 diskette error” if it cannot find the “multiple data rate capability” indication bit. What this means in practice is that more third-party controllers should be supported.
- POST checkpoint codes 02 and 03 mean the same things as in rev. 1 (respectively, these tests verify the CMOS Shutdown Byte and the BIOS ROM checksum). The later revisions swap these two tests around.6
Then you’ve got those peculiar sections where ‘rev. 1.5’ does its own thing entirely. The most significant ones handle RAM testing and parity errors: this also provides our biggest clue about just what sort of AT this firmware came from, so I’ll expand on this down below.
For now, just a few notes about some of the above:
The System Configuration Table
This is where we find the machine ID bytes: model, sub-model, and revision level. A pointer to this table is returned by the (new) BIOS function INT 15h/AH=C0h. Per RBIL, this was available in the PC XT since the 1986/01/10 BIOS, in the PC/AT since 1985/06/10 (second revision), and in all subsequent PC and PS/2 machines; but evidently ‘rev. 1.5’ had it first.
The odd part is that it returns sub-model 01 and revision level 00, which is the same as the third revision BIOS - the second revision has sub-model 00, revision level 01. Perhaps the meaning (or the order) of these bytes was still not quite final at this point. On the other hand, the fourth (‘feature’) byte is 70h, like in rev. 2 and 3.
F000:E6F5 08 00 CONF_TBL dw 8 ; LENGTH OF FOLLOWING TABLE
F000:E6F7 FC db MODEL_BYTE ; SYSTEM MODEL BYTE
F000:E6F8 01 db SUB_MODEL_BYTE ; SYSTEM SUB MODEL TYPE BYTE
F000:E6F8 ; [* 1, like rev. 3 (0 in rev. 2) *]
F000:E6F9 00 db BIOS_LEVEL ; BIOS REVISION LEVEL
F000:E6F9 ; [* 0, like rev. 3 (1 in rev. 2) *]
F000:E6FA 70 db 1110000b ; 10000000 = DMA CHANNEL 3 USE BY BIOS
F000:E6FA ; 01000000 = CASCADED INTERRUPT LEVEL 2
F000:E6FA ; 00100000 = REAL TIME CLOCK AVAILABLE
F000:E6FA ; 00010000 = KEYBOARD SCAN CODE HOOK 1AH
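The feature byte at F000:E6FA can be unpacked with a small throwaway helper. This is just a sketch; the bit names are taken straight from the comments in the listing above.

```python
# Bit meanings of the configuration table's feature byte, per the listing.
FEATURE_BITS = {
    0x80: "DMA channel 3 used by BIOS",
    0x40: "cascaded interrupt level 2",
    0x20: "real-time clock available",
    0x10: "keyboard scan code hook",
}

def decode_features(value: int) -> set:
    """Return the set of features flagged in a feature byte."""
    return {name for bit, name in FEATURE_BITS.items() if value & bit}
```

For the 70h byte found here (as in rev. 2 and 3), that works out to cascaded IRQ2, RTC present, and the keyboard scan-code hook, with the DMA bit clear.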
Floppy Drive Support
This is one area where the ‘rev 1.5’ code base diverges from all other revisions, but the actual functionality doesn’t seem to be too different from rev. 2. It looks more as if 3.5" 720 KB media support was shoehorned into the rev. 1 code (which only handled 1.2 MB and 360 KB disks and drives). This somewhat over-complicated the logic of many routines, especially those dealing with state/format bits and variables in the BIOS Data Area, whose formats couldn’t be changed for the sake of compatibility.
Rev. 2 refactored and simplified the floppy code, in part by implementing what the comments call a “new architecture”, along with two routines (XLAT_NEW and XLAT_OLD) which convert such data fields to a new internal format when entering a BIOS function, then back to the old format on exit. The floppy code in ‘rev 1.5’ is therefore noticeably messy compared to the other revisions, and in places it seems to use certain “reserved” state bits for temporary purposes which I haven’t fully grokked yet (see set_dskstate_* in the disassembly).
Still, the interface already has the familiar form it would retain later. For instance, INT 13h function 17h (Set Disk Type for Format), named FORMAT_SET in the code (address F000:28D3), allows you to set the new 720 KB type by specifying AL=4, like rev. 2 and all subsequent PC firmware.
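As a reference point, these are the AL media-type codes accepted by INT 13h/AH=17h from rev. 2 onward (per Ralf Brown’s Interrupt List); ‘rev. 1.5’ already honors the AL=4 entry:

```python
# INT 13h/AH=17h (Set DASD Type for Format) media-type codes in AL.
SET_DASD_TYPE = {
    0x01: "320/360 KB diskette in 360 KB drive",
    0x02: "360 KB diskette in 1.2 MB drive",
    0x03: "1.2 MB diskette in 1.2 MB drive",
    0x04: "720 KB diskette in 720 KB drive",
}
```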
Function 08h (Get Drive Parameters), which in rev. 1 was only available for hard drives, works for floppies here. It returns the same data in the same registers as rev. 2 and onward, although it’s executed differently. The data is populated from a table at F000:EF5A, which includes the parameters for 720 KB:
F000:EF62 00 F0 d720_seg dw 0F000h
F000:EF64 C7 EF d720_off dw offset DISK_BASE ; [* 720k: ptr to DISK_BASE *]
F000:EF66 09 d720_spt db 9 ; [* 720K: sectors/track *]
F000:EF67 4F 00 d720_mxt dw 79 ; [* 720K: max tracks *]
F000:EF69 01 d720_mxh db 1 ; [* 720K: max heads *]
720 KB floppy support might just be another innovation of this firmware. The first PC-family machine with 3.5" drives was the IBM JX (late 1984), but those were 360 KB only, up until 1986.8 The next one was the Convertible, with a BIOS dated September 1985; the XT didn’t get a 720K-capable BIOS until 1986, either. As the parameter table tells us, this March ’85 AT firmware knows about the full 80-track format, so this was likely the first occurrence of actual 720K media support in the PC family.
It’s All About that Base RAM
At last, the interesting part: what this BIOS does about memory - and how this appears to hint at a machine that isn’t quite your garden-variety AT. Incidentally, the whole thing could also explain a curious little riddle in IBM’s source code for the later revisions of the AT BIOS.
System Board RAM (and the Keyboard Controller)
A bit of background to keep in mind here: through the AT’s lifetime, IBM never saw fit to release a model with room for more than 512K of RAM on the system board, unlike the XT (and the XT Model 286). A 128K expansion board can be used to bring a 512K system up to the 640K “base RAM” (AKA conventional RAM) limit.
Now, most of the AT’s configuration options are kept in CMOS memory, but a couple of things still have to be set the old way - as with the PC and XT, via motherboard switches and jumpers. These settings can be read from the 8042 keyboard controller’s input port (by sending C0h to port 64h, then reading port 60h).
| 8042 input port | Meaning |
|---|---|
| Bit 7 | Keyboard inhibit switch: 0 = keyboard inhibited, 1 = keyboard not inhibited |
| Bit 6 | Display switch (primary display attached to): 0 = Color/Graphics adapter, 1 = Monochrome adapter |
| Bit 5 | Manufacturing jumper: 0 = jumper installed, 1 = jumper not installed |
| Bit 4 | RAM on the system board: 0 = enable 512K, 1 = enable 256K |
| Bits 3–0 | Reserved |
PC/AT: 8042 controller input port bit definitions
During the POST procedure, the 5170 BIOS reads these switch settings and stores them in the BIOS Data Area at address 0040:0012 - a byte that was previously unused, except on the PCjr. The AT BIOS listings label this byte MFG_TST, although the manufacturing test jumper status is just one of the bits used.
One of these settings (determined by jumper J18) specifies the amount of RAM on the system board. Type 2 and type 3 AT motherboards come with the full complement of 512 KB; on a Type 1 board, either 256 or 512 KB may be populated, so on these early 5170s this setting can take either value.
In the data byte obtained from the 8042 input port, that’s what bit 4 indicates. Seems simple enough: as far as the official documentation is concerned, this is the only jumper or switch setting that has anything to do with the amount of on-board memory.
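Decoding the documented bits of that byte (whether read live from the 8042 or from its saved copy in MFG_TST at 0040:0012) is straightforward. A minimal sketch, with masks following the table above; the function name is mine:

```python
def decode_input_port(value: int) -> dict:
    """Unpack the documented switch bits from the 8042 input-port byte."""
    return {
        "keyboard_inhibited": not (value & 0x80),    # bit 7, active low
        "primary_display_mono": bool(value & 0x40),  # bit 6: 1 = MDA
        "mfg_jumper_installed": not (value & 0x20),  # bit 5, active low
        "planar_ram_kb": 512 if not (value & 0x10) else 256,  # bit 4: 0 = 512K
    }
```

Note that bit 3, the subject of the next section, has no entry here, because officially it means nothing.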
But hold on: in the definition table, you will also notice a bit 3. In all three versions of the PC/AT Technical Reference, bit 3 is marked “undefined” or “reserved”.9 Nothing very special about that in itself, because the same goes for bits 0–2... but that’s where that anomaly in IBM’s source code comes in.
The “Base Planar Memory Extension”
The BIOS code listings include IBM’s comments for all symbolic constants, most of them at the very start (POSTEQU.INC in the reconstructed source files). There, the sources for revisions 2 and 3 (but not for rev. 1!) sneak one more entry into the list of bits in the keyboard controller’s input port: 10
C ;--------- 8042 INPUT PORT BIT DEFINITION SAVED IN @MFG_TST --------------------
= 0008 C BASE_MEM8 EQU 00001000B ; BASE PLANAR R/W MEMORY EXTENSION 640/X
= 0010 C BASE_MEM EQU 00010000B ; BASE PLANAR R/W MEMORY SIZE 256/512
= 0020 C MFG_LOOP EQU 00100000B ; LOOP POST JUMPER BIT FOP MANUFACTURING
= 0040 C DSP_JMP EQU 01000000B ; DISPLAY TYPE SWITCH JUMPER BIT
= 0080 C KEY_BD_INHIB EQU 10000000B ; KEYBOARD INHIBIT SWITCH BIT
This BASE_MEM8 would be our “reserved” bit 3. Note how it’s described: “base planar R/W memory extension 640/X” - ‘planar’ is IBMese for the motherboard. You could perhaps interpret this as a poorly-worded reference to the 128 KB Memory Expansion Option mentioned above, but that’s not it. The presence of this external card is signified by a CMOS register (33h, bit 7), not by a system board switch, and rev. 1 does support it perfectly well.11
Anyway, the POST process does just what that block comment says on the tin: around checkpoint 11, it reads the switch settings, which were temporarily stored in the DMA_PAGE+1 register a bit earlier. Then it strips off the unneeded bits, and saves the result to the @MFG_TST byte in the BIOS Data Area.
BIOS revisions 2 and 3, which know about bit 3 (as BASE_MEM8), take care to preserve it - again, unlike rev. 1:
;----- GET THE INPUT BUFFER (SWITCH SETTINGS)
05CB E4 82 IN AL,DMA_PAGE+1 ; GET THE SWITCH SETTINGS
05CD 24 F8 AND AL,KEY_BD_INHIB+DSP_JMP+MFG_LOOP+BASE_MEM+BASE_MEM8 ; STRIP BITS
05CF A2 12 00 MOV @MFG_TST,AL ; SAVE SETTINGS
“But what do they actually do with this bit once they’ve read it?”, you ask. To channel Trade Master Greenish, “that’s a good question, with a very interesting answer”: they do precisely nothing whatsoever with it at any point. Whether in the POST process or elsewhere, this piece of information is consulted a grand total of zero times.
I should mention that I couldn’t find any unofficial explanation of this bit, either. None of the usual references and books have anything better to say about it than “reserved” or “undefined”, and that includes such ne plus ultra sources as Ralf Brown, or The Undocumented PC by Frank van Gilluwe. A most curious state of affairs.
So what could “base planar memory extension 640/X” stand for, and how did revisions 2 and 3 of the AT firmware end up acknowledging its existence - only to completely ignore it?
The 640/X Factor
At this point you can likely guess where this is going: the ‘rev 1.5’ AT BIOS does pay attention to this undocumented ‘640/X’ bit. It’s checked in a couple of places, but always in the same context, and it’s a rather enlightening one: RAM parity checking.
A Perfunctory Parity Primer
For every byte of memory in the IBM PC-family architecture, there’s one bit of parity, as a basic means of detecting RAM corruption. A parity mismatch will initiate a Non-Maskable Interrupt (NMI); this is triggered by setting off one of two signals - depending on the general direction that the error came from. In the AT, it works like this:
- RAM on the system board triggers “Parity Check”, which sets bit 7 of port 61h (if enabled by setting bit 2).
- RAM expansion cards trigger “I/O Channel Check”, which sets bit 6 of port 61h (if enabled by setting bit 3).
These commonly-seen signal names may be somewhat confusing, but for our purposes it’s enough to remember that they both indicate parity errors, and both are concerned with RAM; “I/O channel” simply means the expansion bus.12 The BIOS’s NMI handler has the additional job of telling you more precisely just where things went lopsided, and in this ‘rev. 1.5’ BIOS, that’s where our mystery bit 3 crops up.
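The NMI handler’s triage of these two status bits amounts to something like the following sketch (the helper is hypothetical, but the port 61h bit masks and the resulting messages are as described above):

```python
PARITY_CHECK = 0x80  # port 61h bit 7: parity error in system-board RAM
IO_CHECK = 0x40      # port 61h bit 6: parity error on the expansion bus

def parity_message(port_61h_status: int):
    """Map a port 61h status byte to the message the BIOS would display."""
    if port_61h_status & PARITY_CHECK:
        return "PARITY CHECK 1"  # on-board RAM
    if port_61h_status & IO_CHECK:
        return "PARITY CHECK 2"  # RAM expansion card
    return None  # the NMI had some other cause
```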
Many NMIs Bring Much Honor
When the NMI service routine is invoked, it tries to determine what has roused it from its slumber. If the cause was a parity error, it will display the message “PARITY CHECK 1” (if the source was on-board RAM) or “2” (a RAM expansion card), and then promptly halt the system - not a bad idea if your RAM chips are out of whack. But first, it’ll try to reproduce the error in the first 640K; so if that’s where the problem lies, you could find out which chip is acting up.
Interestingly, this test doesn’t try to match the specific type of parity error that raised the NMI: for each 64K region, it decides which of the two parity check signals it’s going to look for. Those switch settings in MFG_TST play into this choice, so we can get some insight by looking at the logic. In all versions of the AT BIOS, the relevant code is in NMI_INT_1.
For comparison, the Rev. 1 BIOS does it as follows. The diagram is somewhat simplified, but this is the general logic:
That is, below 256K it always watches for on-board RAM parity checks; between 256K and 512K, the RAM is either on-board (in a 512K system) or on an expansion card (in a 256K system), so it selects the signal to watch for based on the size of on-board RAM. Above 512K, it has to be expansion RAM, so it always goes with the “I/O channel” error.
In revisions 2 and 3, the NMI handler’s memory test is less revealing: no matter the RAM address, it always watches for both types of parity error, and treats either one as a good enough reproduction. Likely, the designers decided that being picky about it wasn’t worth the extra code, since the test already disregards the type of error which caused the NMI to fire in the first place.
‘Rev. 1.5’, however, still insists on being pedantic. Which is fortunate for us, because the change in logic from rev. 1 is very instructive:
The crucial difference is in the region above 512K, where the ‘640/X’ flag (AKA BASE_MEM8) comes in. When the machine has 512K on the system board and the ‘640/X’ bit is set, the test in revision “1.5” will expect on-board parity errors in this block of RAM.
If following a bunch of arrows around isn’t your idea of a good time, this format may be easier on the eyes:
| Region | Rev. 1 checks for | Rev. ‘1.5’ checks for | Rev. 2, 3 check for |
|---|---|---|---|
| 0K–256K | On-board RAM error | On-board RAM error | (Any parity error) |
| 256K–512K | [256K system]: I/O channel error<br>[512K system]: On-board RAM error | [256K+‘640/X’]: I/O channel error<br>[otherwise]: On-board RAM error | (Any parity error) |
| 512K–640K | I/O channel error | [512K+‘640/X’]: On-board RAM error<br>[otherwise]: I/O channel error | (Any parity error) |
“Yeah, yeah, get to the point”: what does this mean, then?
Well, let’s imagine that the matching motherboard had something like an extra 128K bank of RAM, plus a switch or a jumper to indicate that it was populated. If the ‘640/X’ bit reflected the state of that switch, this would all make sense - and so would the “base planar” terminology from IBM’s later listings.
What doesn’t make quite as much sense is the behavior of the “256K on board” setting: whether it’s used in tandem with the ‘640/X’ flag or without it, there’s some region of memory where the error-catching logic seems to be all wrong. But it’s entirely possible (even likely, as we’ll see in a bit) that these motherboards simply had a single 512K bank, like the Type 2 and Type 3 AT boards. That would make the “256K” setting rather pointless, so the code paths involved might have been neglected or disregarded. Unless this jumper setting was repurposed to mean something else; if it was, though, the code doesn’t make it immediately clear.
The POST Memory Test Loop... and the >1MB Oddity
There’s another place where the ‘rev. 1.5’ BIOS refers to the status of the ‘640/X’ flag: the cold-boot RAM checkup. This one performs a full read/write test, but it also watches out for parity errors, as in the NMI handler. However, the latter only bothers with conventional (‘base’) memory; for the boot-up test, the POST has to enter protected mode and go through all RAM in the system, including extended memory above 1 MB (if any).
For conventional memory, when determining the type of parity check to expect for each range of addresses, the cold-boot RAM test broadly follows the same logic we’ve seen in the NMI handler. For each of the 3 official BIOS versions, the respective assumptions from the NMI routine are repeated here, and that’s still true for this revision. But past the 1 MB mark, it goes and does its own thing again.
In rev. 1, all non-conventional memory is reasonably assumed to reside on some sort of expansion board, so when testing the >1 MB region it looks for I/O Channel parity errors. Revisions 2 and 3 always check for both types of errors, regardless of the address, and they’re as non-specific about extended memory as they were about the first 640K.
But for some reason, ‘rev. 1.5’ here appears to reserve special treatment for the address range between 1 and 1.5 MB:
F000:0FFC E21_C1M: ; [* 1MB boundary *]
F000:0FFC C6 06 64 00 10 mov byte ptr ds:DS_TEMP+BASE_HI_BYTE, 16
F000:1001 C6 06 4C 00 10 mov byte ptr ds:ES_TEMP+BASE_HI_BYTE, 16
F000:1006 B0 40 mov al, IO_CHECK ; [* I/O Check mask (>1M on exp. card) *]
F000:1008 E6 87 out DMA_PAGE+6, al ; [* temporary storage *]
F000:100A 1E push ds
F000:100B
F000:100B ; [* this rev. only: get hardware configuration again *]
F000:100B
F000:100B B8 18 00 mov ax, RSDA_PTR ; [* system data area for POST *]
F000:100E 8E D8 mov ds, ax
F000:1010 A0 12 00 mov al, ds:@MFG_TST ; [* get mfg test config *]
F000:1013 1F pop ds
F000:1014 24 18 and al, BASE_MEM+BASE_MEM8 ; [* Planar RAM configuration bits: *]
F000:1016 3C 10 cmp al, BASE_MEM ; [* bit 4 (512k planar) ONLY? *]
F000:1018 75 04 jnz short E21_C1M5 ; [* no: keep I/O Check mask *]
F000:101A B0 80 mov al, PARITY_CHECK ; [* yes: use Parity check mask (planar) *]
F000:101C E6 87 out DMA_PAGE+6, al ; [* and save to temporary storage *]
F000:101E
F000:101E ; [* this rev. only: check for 1.5MB boundary (24*64K)? *]
F000:101E
F000:101E E21_C1M5:
F000:101E 80 3E 64 00 18 cmp byte ptr ds:DS_TEMP+BASE_HI_BYTE, 24
F000:1023 72 04 jb short NEXT1 ; [* continue if below 1.5MB *]
F000:1025 B0 40 mov al, IO_CHECK ; [* reset to I/O Check mask above 1.5MB *]
F000:1027 E6 87 out DMA_PAGE+6, al ; [* temporary storage *]
The logic here seems to be this: if the 1–1.5 MB region contains any RAM at all, the code checks whether the “512K on board” bit is set, and the ‘640/X’ bit is clear. If (and only if) this is the case, it watches for on-board RAM parity errors when testing these first 512K of extended memory. Otherwise, it goes with the I/O Channel (expansion RAM) check, which is what you’d expect on a standard 5170.
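In other words, for the extended-memory pass the decision reduces to something like this sketch (my restatement of the disassembly above; BASE_MEM = 10h and BASE_MEM8 = 08h in the MFG_TST byte):

```python
BASE_MEM = 0x10   # input-port bit 4: 512K planar RAM
BASE_MEM8 = 0x08  # input-port bit 3: the undocumented '640/X' flag
PARITY_CHECK, IO_CHECK = 0x80, 0x40  # port 61h masks, as before

def post_expected_mask(addr_kb, mfg_tst):
    """Parity mask the 'rev. 1.5' POST arms for an extended-memory block."""
    assert addr_kb >= 1024  # this path only applies above the 1 MB mark
    if 1024 <= addr_kb < 1536:  # the first 512K of extended memory
        # planar parity only when the 512K bit is set AND '640/X' is clear
        if (mfg_tst & (BASE_MEM | BASE_MEM8)) == BASE_MEM:
            return PARITY_CHECK
    return IO_CHECK  # otherwise assume expansion-card RAM
```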
What that could mean is anybody’s guess. But if we take this at face value, then our theoretical motherboard may have had two selectable configurations:
- 640K on board (the usual 512K plus 128K extra), which would then fill up conventional memory to the limit; OR,
- 1024K on board, set up as 512K base plus 512K extended, with the latter mapped between 1 and 1.5 MB.
In the first case, you’d set the hypothetical switch or jumper one way, causing the ‘640/X’ bit (AKA BASE_MEM8) to be set. In the second case, the switch goes the other way, which would clear it. With a full meg on board, the logic implies that the 128K expansion board can still be used, bringing your base RAM up to 640K.
The PCB Real-Estate Question
512K RAM bank from a Type 3 IBM AT board: that’s an awful lot of space there, isn’t it? [image courtesy of Rodney/knaapic.nl]
One may wonder where a megabyte of RAM might go on the 5170 mainboard (or on some plausible variant of it). But that’s not too far fetched if we assume 256 kbit DRAM chips, like the Type 2 and 3 AT boards.
In fact, looking at those later mainboards, the layout around that single 512K RAM bank seems rather cozy and spacious, which sort of stands out next to the cramped organization of the rest of the board. Those two rows of chips are flanked on both sides by curiously empty space - coincidentally, it looks like there’s just enough room there to double the chip count with a minimal change in design.
If we roll with this observation, we can just about arrive at a scenario where the first redesign of the 5170 mainboard - corresponding to our ‘rev. 1.5’ AT BIOS on the timeline - could have accommodated as much as 1024K of memory. For reasons of their own, the powers that be at IBM decided not to pursue this as a finished product, and the next AT models to hit the market (with the Type 2 mainboard, and the rev. 2 BIOS) were stripped down to one bank of 512K, leaving all that board real estate unused.
Since the ‘rev. 1.5’ BIOS code seems to imply a selectable configuration of either 1 MB or 640 KB, could such a RAM subsystem support either setup at the flip of a switch? Supposing one 16-bit bank of 512K (using 256 kbit DRAM chips), the second 16-bit bank would then have to accept either all-256 kbit or all-64 kbit DRAM, without mixing and matching capacities. Not exactly a common design on PC motherboards, but entirely feasible, and some memory add-on cards with flexible capacities did do things that way.13
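The arithmetic behind those two configurations works out neatly; a back-of-envelope check (chip counts here ignore the two parity bits per 16-bit bank, which add two more DRAMs each):

```python
# Back-of-envelope bank capacities for the two hypothetical setups.
def bank_kb(chip_kbit: int, data_bits: int = 16) -> int:
    """Usable capacity of one 16-bit bank in KB (parity chips excluded)."""
    return chip_kbit * data_bits // 8

fixed = bank_kb(256)               # first bank, 41256-type chips: 512 KB
print(fixed + bank_kb(64))         # second bank with 4164s:  640 KB total
print(fixed + bank_kb(256))        # second bank with 41256s: 1024 KB total
```

With the 64 kbit parts in the second bank, the extra 128 KB tops off conventional memory at 640 KB; with 256 kbit parts, the extra 512 KB would land between 1 and 1.5 MB as extended memory.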
Skyrocket: The Real Deal?
Conjectures are fun and all, but can this hypothetical PC/AT variant be identified with anything like an actual real-world machine? Nothing’s for sure, but there may be a rather fat clue lurking in an old Usenet post from ex-IBM employee Tony Ingenoso (thanks to David of IBM Museum for reminding me of this one).
The thread in question is ‘OT: “Skyrocket” the AT that never was’, posted to comp.sys.ibm.ps2.hardware back in 2001. Tony shares the following:
Sorting through the piles a bit I’ve (re)discovered a bit of ancient AT history – the machine that had the internal codename “Skyrocket”. Looks just like a normal AT with the primary difference being there is 640K on the planar rather than 512K as shipped in all the retail models.
How did I happen across this rather unique piece of history? Well, I was in the right place at the right time when the Boca Raton site was being shutdown and thrown to the wind. IBM was firesaleing off all sorts of gear to the employees and I’d bought a stack of AT’s for $10/each. On closer examination, one of the stack turned out to be the rare Skyrocket...
Further along the thread, we get more details:
Skyrocket’s planar is traditional “big” AT style (not the shortie 339 or XT-286 style), but uses normal DIP’s not the old Mostek/TI DIL’s (positioned on the usual place the double deckers would be on the planar though). It’s got a two rows of 41256 and two of 4164 type DIP.
I believe Skyrocket probably predates XT-286, and may have been concurrent with 339 at some point. When the project was shelved, some number of advanced prototype had been produced and were distributed about the site on an IUO basis - I had one on my desk in 86’ during OS/2 1.0 development (real 339’s were rare and all being shipped out to customers). Where the “AT” badge would be is a metalic emblem shooting star logo, which leads me to believe it was fairly well along when it was shot down...
My sense is that the Skyrocket machine was tanked because the 339 planar was ready, cost less to manufacture, and with PS/2 ready to be unveiled, it just wasn’t worth the bother to get the extra 128K onto the planar for a machine that would have a distinctly limited lifespan.
That DRAM layout is just as predicted above for the 640 KB option: one Type 2/3-style bank of 256 kbit DIPs (512 KB), plus one bank of 64 kbit DIPs (for the extra 128 KB). In a different thread, Tony also mentions that “Skyrocket” had an 8 MHz system board.
...Or is it?
Now, I’m not completely convinced beyond a shadow of a doubt that this is what we have here. Given what I can make of this BIOS, the particulars of this cancelled AT prototype do fit... but not exactly like a glove.
The most obvious smoking gun is of course the 640K system board, but there’s also that alternative 1 MB option implied by the cold-boot RAM test loop, which Tony doesn’t mention.
He makes it sound like “Skyrocket” came along fairly late in the 5170’s lifespan, possibly concurrent with the Model 339/Type 3 AT, and not all that long before the launch of the PS/2. But the date stamps in this ‘rev. 1.5’ BIOS (as well as the code!) date it to before the Type 2 boards.
A “traditional big AT style” planar (which I take to mean a ‘Type 1’ form factor) does fit the time frame; but an 8 MHz CPU clock wouldn’t be my first guess with that sort of thing, since even the Type 2 still limped along at 6 MHz.
But on the other hand:
The option to switch between 640 KB and 1 MB configurations may be technically supported in the ‘rev. 1.5’ BIOS, but perhaps the mechanism to do this (along with the extra address decoding logic) was simply never implemented in the prototype system board... or maybe there just weren’t enough 256 kbit DRAM parts to go around, and all the “Skyrocket” ATs ended up with a fixed 640 KB on board.
ROM date stamps never correlate very well with actual availability anyway, and we have to keep in mind that the 5170 had less than 3 years of official lifetime as a product. Within that period, I could easily imagine various projects going on simultaneously, with all sorts of fun bureaucratic delays to ensure maximum confusion.
I don’t see anything in the ‘rev. 1.5’ BIOS code to suggest that it can’t be an 8 MHz system. In fact nothing seems to indicate any particular clock rate, because it’s missing the CPU speed test which was added in rev. 2. There are a few instances of speed-dependent “busy wait” loops scattered throughout the code, but the counter values used for those weren’t even consistent between the two 6 MHz BIOS editions, so they tell us very little.
Since there’s nothing to absolutely preclude this BIOS from being the “Skyrocket” firmware, and the ‘640/X’ business with RAM capacity sure seems to be a good match, I’ll invoke Occam’s trusty razor and say it’s probably “Skyrocket”... or at least something very closely related to it.
NOT an actual “Skyrocket” 5170 case badge
With that talk about “a metallic emblem shooting star logo” in place of the “AT” one, I couldn’t resist making this little mock-up of what the Skyrocket’s case badge might have looked like. Of course, that’s probably not even close. But this being the internet, where any form of creative license is certain to be misrepresented as gospel truth sooner or later, I’ll just repeat again (very slowly) that this is not the real thing.
Whichever 5170 variant this BIOS came from, it wasn’t a released product, but it makes sense for a prototype that made it as far as “internal use” at IBM: the EPROMs have part numbers (on printed labels and in the data), which seems to hint that this project got to a respectable stage in the development cycle before they canned it.
I’m given to understand that Tony Ingenoso is regrettably no longer with us, so we won’t be able to verify whether the BIOS ROMs in his “Skyrocket” are the same as this ‘rev. 1.5’. But maybe someone else out there will be able to shed some more light here.
Amusingly, it’s a lucky thing that this firmware still retains rev. 1’s pickiness about specific parity error types - it could have been less fussy about it (like the later revisions), and our only clue that there was anything funny about the RAM setup would have gone down the chute. But it didn’t, and now we can also explain that little riddle from the rev. 2 and 3 BIOS listings, where they mention the undocumented “640/X” bit: evidently someone done goofed, and simply forgot to clean up the source files.
“Skyrocket” or not, this was a diverting little game of connect-the-dots. Huge thanks go to GearTechWolf for rescuing and dumping these ROMs!
Notes
- The same model byte was used by other IBM systems, namely the XT-286 and the 7552 “Gearbox” industrial computer, but these only arrived in late 1986. See the model byte table from Ralf Brown’s Interrupt List. [↑]
- This was done to support certain very early, “badly behaved” software, which accessed BIOS functions and structures by calling absolute addresses instead of using the interrupt services - some authors had the PC mixed up with an Apple II, apparently. IBM explains this in the comments for the ‘compatibility’ section, for instance in the second PC AT Technical Reference (Sep. 1985), p.5-182 (which does a great job conveying the intended tone of disapproval and contempt).
This practice was carried on by third-party BIOS vendors: see ROM Address Compatibility Table in System BIOS for IBM PC/XT/AT Computers and Compatibles (Phoenix Technologies, Ltd., 1989), p.58. [↑]
- Immediate multi-bit shifts and rotates were a new feature of the 80286 (or more accurately, of the 80186, but IBM skipped that one). The rev. 1 AT BIOS tends to stick to 8088-compatible instruction forms - this likely has to do with MASM 1.0, which didn’t support any 286 instructions whatsoever, so when they do appear they’re implemented using macros. [↑]
- This too is an artifact of early MASM versions, which used a two-pass assembly process. In the first pass, symbols haven’t been resolved yet, and jumps within a segment are encoded as near (3-byte) jumps. The second pass applies the resolved addresses, and if the target is within range, it’ll go with the short (2-byte) form. That would shift things around, but the two-pass process can’t deal with offsets being changed at this point, so the third byte is simply replaced with a NOP (90h).
To get around this you can explicitly specify “JMP SHORT”, and the first pass will use the two-byte form. Of course, this gets you an error if the target isn’t within the short jump range, but IBM evidently did the legwork for the second AT BIOS revision. For more about this (and other ancient MASM quirks), see this writeup at OS/2 Museum. [↑]
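The resulting byte patterns can be shown in a few lines - the displacements below are arbitrary examples, not taken from the BIOS:

```python
# Illustration of the two-pass MASM artifact: a forward jump is first
# encoded as a 3-byte near JMP (opcode E9h), then patched in pass two
# to a short JMP (EBh) - but since offsets can't shift anymore, the
# leftover third byte becomes a NOP (90h) instead of disappearing.
def near_jmp(disp16: int) -> bytes:
    return bytes([0xE9, disp16 & 0xFF, (disp16 >> 8) & 0xFF])

def patched_short_jmp(disp8: int) -> bytes:
    return bytes([0xEB, disp8 & 0xFF, 0x90])

print(near_jmp(0x0005).hex())         # e90500
print(patched_short_jmp(0x05).hex())  # eb0590 - same 3-byte footprint
```

Those stray 90h bytes after short jumps are a telltale sign of code assembled this way, which is exactly what shows up in the rev. 1-era listings.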
- The Type 2 AT Technical Reference (Sep. 1985) doesn’t mention the Enhanced Keyboard at all, other than the hints for this rudimentary support in the BIOS listing - where the comments refer to it only as “KBX”. Perhaps the notion of a “keyboard X” gave off the right mixture of suspense and enigma, but the eagle-eyed would have noticed references to keys like F11 and F12.
The Enhanced Keyboard was officially introduced with the 7531/7532 Industrial Computer, which used the 5170 AT BIOS. I’m not sure which revision(s), but the 7531/7532 Technical Reference from July ’85 includes the source for rev. 1, even though it fully describes the new keyboard. Makes no sense to me... but St. Augustine would have said, “if you understand it, it is not IBM”. [↑]
- For the sequencing and meaning of POST checkpoint codes on the 5170, see the IBM 5170 - POST Codes page at Minus Zero Degrees. [↑]
- Infamously, the speed test prevented owners of the 6-MHz Model 239 AT (with its rev. 2 BIOS) from overclocking it by replacing the clock oscillator, which was often done with earlier ATs. The common conspiracy theory says that IBM did this purely out of greed, since they didn’t want a “too-fast” 5170 to bite into the sales figures of more expensive systems (which?), or of the “official” faster ATs they were planning to introduce shortly.
I’m not sure I buy that. The fact is that the 286 CPUs supplied with these models were only rated for 6 MHz, and the stability of the rest of the system wasn’t guaranteed beyond that either, since the bus ran off the same clock. The AT had already caused PR problems for IBM, with those early CMI hard drives going teats-up en masse, and I suppose they weren’t going to risk any more reliability issues if they could help it. [↑]
- The original IBM JX BIOS could only handle 40 tracks, so the total usable capacity was 360K; the earliest JX BIOS that could handle the full 720K has a 1986 date code. See “System Specifications” on the IBM JX Information Page. [↑]
- Keyboard Controller: Input Port Bit Definitions, The IBM Personal Computer AT Technical Reference. Rev. 1 (Mar. 1984): p.1-44; rev. 2 (Sep. 19