Motherboard: Biostar M6TBA

Today’s motherboard is from a Taiwanese manufacturer called Biostar Microtech International (known as Biostar for short). They are one of the old guard of PC motherboard manufacturing, having started in 1986, although until finding this board I hadn’t used much of their hardware. In the past Biostar made boards for OEMs as well as for the high, mid and lower ends of the retail market. These days they seem to be focusing on higher-end parts. This particular example is an M6TBA ver 1.3, which from the date codes on the board appears to have been made in 1998.

Here’s an overview of the board.

Biostar M6TBA

It supports slot 1 processors up to 800MHz, surprisingly; this would have been faster than any of the Pentium II chips available at the time. It was probably a forward-looking design, perhaps with the capability to support a Pentium III in a slotket, although this isn’t documented in the manual. Intel introduced slot 1 for a few reasons, but mainly they wanted to move away from the ageing socket 7 standard, partly because it was limiting memory bandwidth and partly because it was used by many of their competitors.

They initially designed another socket, called socket 8, which saw limited production as there was a problem with how the cache was integrated onto the CPU. It actually had two separate dies on the one package, which proved to be a production problem that increased wastage whenever either the cache die or the processor die turned out to be faulty. The slot 1 design solved this problem by putting the CPU and cache in separate packages on an installable module. They did miss an opportunity to put the voltage regulation on the module, which would have made slot 1 more flexible and future-proof.

The board is pretty standard fare for its time, having an Intel chipset (the 82443BX being the north bridge) with PCI, ISA and AGP slots. It supports up to 384MB of RAM in three 128MB SDRAM modules, which was quite a lot of memory at the time; many people had 64MB-128MB.

Here’s a close-up of the front panel header and the CPU configuration jumper block (the ones with yellow jumpers). The front panel header is marked adequately, but not especially clearly. The CPU configuration block has no markings at all, requiring you to look up the manual, which luckily is still available. Looking at this block it is unclear what speed of CPU was installed, as according to the manual these settings don’t match any of the listed speeds. Perhaps the board can detect the speed automatically.

Heatsink footprint


An unusual feature of this board is the unpopulated footprint next to the main power supply connector. It looks like a large heatsink was to be attached to the board along with a linear voltage regulator, something you don’t normally see. It could be left over from the board’s prototype phase, not used in production but still present on the layout. Whatever the reason, the size of the heatsink footprint leads me to believe it was going to be dealing with a significant amount of power.

Speaking of power supply, check out the bulk capacitors next to the CPU slot.

Bulk capacitance

Some of the capacitors are visibly bulging, a sure sign they are failing. Sometimes a board will continue to function like this, but it usually means the board is about to fail. You can replace these components, but the soldering is tricky on these multi-layer boards and it’s easy to damage them. It’s something I haven’t been brave enough to attempt yet.


Another interesting feature is this chip here, an SMC FDC37M602. It’s a Super I/O chip which integrates a floppy disk controller, serial ports, PS/2 ports and a parallel port. It’s quite some distance from the board’s floppy connector, so I’d say the chipset is supporting that; at a guess I’d say this chip is driving all the rear ports with the exception of the USB ports.

The thing I noticed however was the copyright notice on the chip: it reads 1994, from a company called American Megatrends, a software house known for writing BIOS ROMs for PCs. This particular board however uses an Award BIOS, which of course comes from a rival company! Not so much a technical achievement, but interesting to see.

Summing up, this board isn’t really ideal for the technician working on it, mostly due to the CPU configuration jumpers having no markings at all. Otherwise it’s very similar to working on other boards of the same vintage. This would have been a more expensive board at the time, mostly because Intel boards and CPUs tended to be more expensive. It has plenty of standard slots allowing for upgrades and expansion, so it’s not all bad, just a bit inconvenient when changing CPUs.


Huffman Coding – Compression results

Before the Christmas break I started writing an encoding/compression library based on a technique called Huffman coding. I’ve since completed and tested the encoder and decoder, so today we’ll discuss the results of several compression tests in comparison to the RLE encoding technique. Here are links back to the posts for Huffman Coding and Run-Length Encoding.

So I’ve collected a few different types of data files with different characteristics. First up is an MS-DOS executable; these binary files typically don’t contain runs of data and will tend to contain most if not all symbols. Next is an ASCII text file, which will typically only use alphabetic, numeric and punctuation characters. Lastly there are a small number of graphics files in raw bitmap form; these particular images are fairly basic (only a handful of colours).

Here’s a table showing the results.

Compression ratio table

You’ll note that the Huffman encoder achieves better results in every example! Its worst result was the MS-DOS executable, which is of course expected. I was pleasantly surprised that the small graphics files compressed quite well, likely because of how few symbols are in these files. If there had been a large variety of symbols, the dictionary and the encoded data would have been much larger.

Run-length encoding was obviously much worse, and in the case of the ASCII text and executable data it actually increased the size of the data! Knowing how it works, this is hardly surprising. I did expect RLE to perform better than Huffman coding for the graphics data, but I suspect the low colour count in these files is an influencing factor.

So I thought I’d create a graphics data file with a higher colour count that would favour RLE, to see if it can do better under specific conditions. The graphic I created is a 32×32 pixel sprite with horizontal coloured stripes, each a unique colour. The size of the raw file came out at 1026 bytes.

The Huffman encoder produced a file of 766 bytes and the RLE produced one of 68 bytes! Whilst the Huffman encoder still managed to compress the data, it couldn’t match the best case scenario for RLE.
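The RLE figure is easy to sanity-check. My on-disk format isn’t shown here, but a minimal (count, value) byte-pair scheme, sketched below in Python rather than the Pascal from the earlier post, lands almost exactly on the same number:

```python
def rle_encode(data: bytes) -> bytes:
    """Encode data as (run length, value) byte pairs; runs cap at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

# A 32x32 sprite with horizontal stripes, each row a unique colour index.
sprite = bytes(colour for colour in range(32) for _ in range(32))
print(len(sprite), len(rle_encode(sprite)))  # prints: 1024 64
```

Each of the 32 rows collapses to a single two-byte pair, giving 64 bytes of payload; a few bytes of header (say width and height) accounts for the difference between that and the 68-byte file, just as the 1026-byte raw file is the 1024 pixels plus a small header.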

There are good reasons to use both compression techniques. Run-length encoding doesn’t require much CPU power to encode or decode, so it can be done on very weak machines (even old 8-bit machines), but it only compresses well when the data contains many runs. This turns out to be useful for graphics and level data. Huffman coding will pretty much always reduce the size of the data encoded, sometimes significantly, but it is much more complex. It requires more RAM and CPU power to achieve this result, so it can’t be used on every machine, although most modern processors would have no trouble at all.

I’ve made the code for the Huffman encoder/decoder available in Pascal for download. Be aware the encoder still has some debug code in it that will write to the screen.


Xerix for DOS

The past week here the temperature has been hot, very hot. When it’s over 40 degrees every day it’s difficult to do much more than rest under a fan, unless of course you have air conditioning. Not even playing a computer game is on the cards when it’s that hot. Luckily a cool change came through recently providing relief, so today I took the opportunity to play a game called Xerix.

Xerix was made in 1992 by Brenden Reville, who at the time was 15 years old. It is probably one of the better examples of a teenage-made home-brew game. Whilst not up to the standard of commercial or professional shareware releases from the era, it does have some decent features, partly due to the libraries used in its development.

The Xerix Story

The graphics support is VGA in both colour and monochrome (64-level grey-scale). Obviously the colour graphics are better; the grey-scale graphics suffer from objects being difficult to differentiate. I’d say it simply uses a graphics filter rather than having a separate graphics set. In terms of artistry the graphics are quite good, with nicely drawn sprites and animations, although within each level there isn’t a whole heap of variety.

The sound system supports PC Speaker, Ad Lib, and Sound Blaster devices. Sound effects are fairly basic, but the music is much more developed. It’s not amazing, but reasonably good; oddly, it is played during the title and story sequences but not during levels.

The first Boss

The game-play is probably the part that could have used the most work. The controls work quite well, but the enemies are quite simplistic. There are two basic types: a ball that bounces up and down the screen towards you (sometimes at speed) and stationary turrets that shoot along the diagonals.

Yet more story…

The bouncy balls move exceptionally quickly and seem to have a random initial speed and angle. They move so fast that it is often a matter of luck whether you’ll be able to dodge them or not. The turrets behave much more reasonably, their shots being much easier to avoid. However the turret graphic doesn’t obviously look like a turret, and it is used in places where no turret is programmed into the level.



There are only two levels, each ending with a boss, both of which are essentially balls that move extremely quickly in a pattern. The only danger they pose is collision, so once you work out the pattern they aren’t terribly hard.

The Second Boss.

Unfortunately this is another game where one hit results in your death, and it will happen frequently. Enemies that are destroyed remain on screen for a moment whilst they explode, and remain dangerous whilst they do. If you play on the Expert difficulty level you only get a few lives (with no feedback about how many remain), so you don’t last long, but luckily there is a Novice difficulty level with unlimited lives.

I know it sounds like I don’t think the author did a good job, but I’m holding the game to the standards of game-play we’d expect from a professional developer, as most people playing it would. The truth is it’s really quite a good effort for a 15-year-old high school student, someone who would be new to making games.



Something that does shine as quite exceptional is the technical side of this game. The graphics animate and scroll smoothly and the sound system performs quite well. Normally it would take quite some time to write the code needed to drive the graphics and sound, but the author opted to use some external libraries, in this case Fastgraph and the Creative Labs developer kit. I think this was a good idea, as it must have saved development time and allowed more effort to be put into the art for the game. This is generally the norm for games developed with modern tools.



Whilst this isn’t a game you’d play and enjoy on the same level as a commercial or shareware release, it is exceptionally good for a home-brew game made by a student. You have to remember that little information about game programming was readily available at the time, and making this would have taken considerable effort.





Xatax for DOS

Today’s game is called Xatax, a shoot-em-up made in 1994 by a company called Pixel Painters. I hadn’t heard of them before, which is unsurprising as they only made three games during the MS-DOS era.

The story in Xatax is fairly simple, like most shooters. Humanity disarms after many years of peace and is attacked by something not surprisingly called the Xatax. It absorbs the biomass and materials of anything it encounters and adapts based on the knowledge it acquires in the process. Humanity restores a star fighter stored in a museum so you can fight back.

The game supports VGA graphics and has some quite good art, the artist favouring the use of gradients quite a bit. The animation and scrolling are implemented nicely and move smoothly with no flicker. The sound system supports Ad Lib, Sound Blaster and Sound Blaster Pro cards, utilising FM synthesis rather than digitised effects. The music sounds quite decent but is a little repetitive, whilst the sound effects are fairly simple.

Running the defensive gun gauntlet

The controls work quite well, although if you don’t get the rapid-fire upgrade you have to mash the fire button quickly to achieve a good rate of fire. Power-ups come out of enemies when destroyed and come in a few different flavours, picked randomly. This can be a blessing and a curse, as you might get a good power-up early on or ones which don’t help all that much. The missile power-up is probably the most useful.

Missiles FTW!

When you collide with an enemy or the scenery you’re destroyed straight away; if you’re hit by a projectile you lose one of your power-ups, and are destroyed when you have none. This, in combination with the rarity of power-ups, means you’re usually destroyed in one hit, which makes the game quite punishing. It does have the option to change the difficulty level, but that only seems to change how many shots are required to destroy enemies.

Big O’s!

The main obstacles in the levels I’ve played are mostly defensive guns fixed to the scenery. They shoot at regular intervals, so you can get through their line of fire if need be, but it’s usually better to destroy them if you can, especially where there are a number of them. You can use a bomb to destroy everything on screen if you’re in real trouble, but like the power-ups there aren’t that many bombs to pick up.

Xatax is mostly fun, but punishing when you make any kind of mistake. I would have preferred a shield or energy system so I could keep the power-ups. I found it difficult to get through many levels because of how many times I died, and the difficulty of making progress can make the game a little repetitive. It’s not bad by any means, just not as good as it could have been.



Another Belated Xmas Post: Holiday Hare 1995

Christmas time was particularly busy this year for me. Just managing to get all the shopping and family stuff done alone whilst my partner continued to work took up most of my time. Now, finally having some time to rest and take a holiday from the holiday, I can write the first post of the new year.

Over the Christmas break I did manage to find some time to play a Christmas themed DOS game. This time it was Jazz Jackrabbit Holiday Hare 1995. Obviously it will have much in common with the original shareware Jazz that I played last year.

The graphics and sound are of the same level of quality as the shareware game, which is to say the graphics and music are both very well done. The in-game music consists of remixed Christmas tunes such as Carol of the Bells and what I think is Little Drummer Boy.

There are two new worlds to visit, Candion and Bloxonius, each with two levels. Candion is themed with Christmas candy and is filled with blind mice as enemies. Bloxonius is themed around toys, mostly Lego, but has enemies like rag dolls throwing bombs and toy planes. All the levels are reasonably large and reasonably challenging, with the last of the set being the most difficult.

I quite enjoyed playing Holiday Hare 1995, the music probably being the highlight as it definitely gave the game a very Christmas atmosphere. It manages this without losing the spirit of the original game, enhancing it rather than replacing anything. Some of the same annoyances are still here, such as Jazz’s speed making avoiding enemies difficult, but this is easier to deal with as you learn the levels.



Mainboard: Generic Socket 3

Today’s motherboard is a bit of a mystery, as it doesn’t have any obvious markings identifying who made it or a model number. It is a late socket 3 board that supports fast 486 chips and the Pentium Overdrive. Socket 3 is interesting as it is where CPU designs started to really diverge depending on the manufacturer, yet the chips all still ran on the same boards. It didn’t last as long as the later socket 7 standard, as it arrived part way through the life of the 486.

Here is the board in all its glory (or infamy). There are a few things to note about it. Firstly, it must be a later board as it has a PCI bus alongside the ISA slots instead of VLB. PCI wasn’t common on 486 boards, and some had early buggy versions of it (or of the BIOS), but this board could be running a later 2.0 or 2.1 version of PCI which had the kinks worked out. Judging by the date codes it was made in late 1995.

Also you’ll note it has a very integrated chipset made by ALi (Acer Laboratories Incorporated). I’ve usually seen their chipsets on boards in brand-name PCs rather than your usual beige boxes. They also ended up on value boards, as SiS chipsets did, but had better software support when that was required. This particular chipset is more integrated than those on many early Pentium boards, and is much smaller; it could have fitted into a smaller chassis than normal.

Like other socket 3 boards this one has some cache on board in the form of SRAM chips. There were a few dodgy manufacturers that put fake chips on their boards instead, but I don’t think this is one such board. To start with, this one has a genuine ALi chipset, whereas the dodgy brothers’ boards often had a generic chipset that wasn’t much chop. To hide that fact they stuck stickers with the chip markings of other manufacturers over the laser etchings.

These cache chips are marked Writeback, which turns out to be the chip maker rather than the cache type. I couldn’t find a datasheet, but from the basic information available they are basically SRAM chips arranged in two banks. There is an odd number of chips, one of which I’m guessing is used for parity checking.

Ugh, look at the arrangement of the configuration jumpers. From a technician’s standpoint this is horrendous. There is no silk screen for most of the jumpers, so I have no idea how you’d configure this board without the manual. Luckily the front panel jumpers are labelled, as is the voltage selection for the CPU. Like many socket 3 boards, this one will take both 3.3V and 5V CPUs.

This is the only mark on the board that identifies anything about it; apparently it is version 1.2A. I got this board from an uncle who had it in a beige case, which unfortunately was generic, so I can’t even chase down a possible manufacturer that way.

It might have been an OK board from an end user’s perspective. I got it with a 100MHz 486DX4 chip in it, and it ran Doom exceptionally well. I was even running DSL (Damn Small Linux) on it for a time. It had something like 16MB of RAM, but probably could have taken more. It was unfortunately from the end of the DOS era, so how well it fared would have depended on whether Windows 95 was on it or MS-DOS. Windows 95 did run on 486 machines, but not really all that well.

It used to work some time ago, but something went wrong with the chipset: it doesn’t detect its memory properly anymore and it doesn’t boot. I’ve been thinking I’d like to have a go at reflowing some of the solder connections, as I suspect some dry joints. Unfortunately I don’t have a heat gun, so that will have to wait. One thing I really liked is the use of a coin cell for the RTC and CMOS settings, so I might have to attempt a repair one day.


Huffman Coding

Quite some time ago I did a short post about a compression technique called run-length encoding (or RLE), which was a commonly used compression method for graphics and level data. At the time I wrote and implemented it for the graphics and level data of my home-brew game Bobs Fury with quite some success, significantly reducing the disk space required for the base game.

I do however have some files left, largely just ASCII text, that don’t compress well or at all using that technique. I’d like to make them smaller, and of course harder to read, as they contain story text. Not that my game has a brilliant story, but you get the idea. Enter an encoding technique called Huffman coding.

Essentially the algorithm encodes symbols as variable-length streams of bits. The most frequently used symbols are represented by shorter bit streams, whilst the least used have longer ones. Normally in a computer each symbol would be encoded as a fixed number of bits, such as 8 or 16, so this (hopefully) results in shorter encodings for most symbols, and longer ones only for those rarely used.

The tricky part is creating the Huffman tree, which is basically a code book (also commonly known as a dictionary) representing how each symbol is encoded or decoded. Here is a quick tutorial on how they are created, which will also give you a feel for how the encoding works.

A fixed tree can be used for everything, but would not do the best job for every set of data being compressed. So typically a tree is created and stored along with the encoded data to achieve the best compression possible. This of course does add some overhead, which could be a problem if the resulting encoding isn’t much shorter than the original.
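To make the tree-building step concrete, here is a minimal sketch in Python (an illustration of the standard algorithm, not the Pascal encoder from my library). The two lowest-weight nodes are repeatedly merged until one tree remains, and the path taken to reach each leaf becomes that symbol’s bit string:

```python
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table mapping each byte value to a bit string."""
    freq = Counter(data)
    # Heap entries carry a unique sequence number so ties never compare nodes.
    heap = [(weight, seq, sym) for seq, (sym, weight) in enumerate(freq.items())]
    heapq.heapify(heap)
    seq = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # lowest weight
        w2, _, right = heapq.heappop(heap)   # second lowest
        seq += 1
        heapq.heappush(heap, (w1 + w2, seq, (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse both ways
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: record the symbol's code
            codes[node] = prefix or "0"      # "or" handles a one-symbol input
    walk(heap[0][2], "")
    return codes

codes = huffman_codes(b"this is an example of a huffman tree")
```

Running this on that sample string gives the space character, the most frequent symbol, one of the shortest codes, while rare letters get longer ones; the resulting code is prefix-free, which is what lets the decoder read the bit stream without any separators.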

Huffman coding typically works poorly when the symbols all appear in the data at roughly the same frequency, the worst case being everything at exactly the same frequency. Notably it won’t produce an encoding that is longer than the original data, although with the overhead of storing the tree you could end up with a larger data file. In practice this rarely happens.

Other data such as English text stored as ASCII actually compresses quite well. As usually not all 256 possible byte values are used, most encodings will be shorter than 8 bits per symbol. Also, because natural language uses some letters much more than others, the average encoding length across all symbols will be shorter.

Huffman coding was actually invented quite some time ago (1951) by David Huffman, well before it came into common use. Check out the Wikipedia page for more information. It’s a part of many commonly used compression programs and formats such as Zip and Bzip. Older 8-bit machines typically weren’t powerful enough, so it wasn’t commonly used until more powerful machines with more memory became available.

It took me much longer than usual to write this post, primarily because I began writing an encoder and decoder, and the complexity meant it took up much more time than I expected. Currently I have just finished the encoder but have yet to test it. I had hoped to have the code running first, but that will have to wait.
