
The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.

The 486 DX2 66MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, to stay at the top that long.



The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.

Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.

Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.


I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were uncoupled (the /2 is clock doubling). For some applications, a 50 MHz 486 with a 50 MHz bus would beat a DX/2 66 MHz with a 33 MHz bus.

And sometimes the DX/4 100 MHz would be the slowest of all of those, at a 25 MHz bus.


Nearly correct. The DX/4 100 MHz had a 33 MHz bus; the DX/4 75 MHz had the 25 MHz bus. I remember it well because I had both.
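For reference, the bus-vs-core relationships being argued about here work out as follows (a quick illustrative sketch, using the multipliers and nominal bus speeds mentioned in this thread):

```python
# Core clock = bus clock x multiplier for the 486-era parts discussed
# here, using nominal bus speeds (the "33 MHz" bus was really 33.3 MHz,
# which is why 33 x 3 reads as 99 rather than 100).
# Note the "DX4" was a clock *tripler* despite the name.
variants = [
    ("486DX-50",   50, 1),   # fast bus, no multiplier
    ("486DX2-66",  33, 2),
    ("486DX4-75",  25, 3),
    ("486DX4-100", 33, 3),
    ("Am5x86-133", 33, 4),   # the actual clock-quadrupled part
]

for name, bus, mult in variants:
    print(f"{name}: {mult} x {bus} MHz bus = ~{bus * mult} MHz core")
```

The table makes the benchmark arguments above concrete: the DX-50's core is slower than a DX2-66's, but its bus (and thus memory and VLB peripherals) runs 50% faster.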


Now I remember being annoyed that it wasn't the DX/3 as it should have been!


Especially since when actual clock-quadrupled chips eventually came out, they had to call themselves ridiculous things like "5x86" instead of DX/4. (The Am5x86 133 runs at 4x33 MHz.)


I think 5x86 had more to do with marketing than anything else, because the Pentium had already been on the market for a while when the Am5x86 came out.


I think it's a bit of both. It absolutely tried very hard to pretend that it was a "586" (Pentium class), but "5x" is right there and implies that if the DX4 is 4, this is 5.

The full name on some of the chips is "Am5x86-P75 DX5-133", which implies a lot of things, some of which are flat-out misleading (it does not get very close to "P75" performance).


I had one of these back in the day. A very fine 486.


I remember being so excited when I figured out how to jumper my DX/4 100 and operate it with clock doubling and a 50 MHz front side bus speed. Same core speed, faster memory and I/O.

My peripherals seemed to take it. My graphics output showed some slight glitches, which I was OK with for the speed.

However, I think it was a bit unstable and would fail a correctness challenge like compiling XFree86 or the Linux kernel, which were overnight-long runs. There must have been some bit flips in there occasionally. I seem to recall that once that reality settled into my brain, I went back to the clock-tripler config.


I still remember scribbling on Athlons with a pencil to max them out - we probably spent as much on heatsinks as we saved on CPUs.


As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (each bit of a pixel's index stored in a separate area of memory) was a huge boon for 2D capability, since it lowered bandwidth needs to 6/8ths, but it made 3D rendering a major pain in the ass.

The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.

Reading in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they would've seen the need for 3D rendering is tantalizing.

1: https://news.ycombinator.com/item?id=47717334
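To make the "pain in the ass" concrete, here's a sketch of what plotting a single pixel costs in each layout. This is purely illustrative Python of my own (real code of the era would be 680x0 assembly); the function names are mine:

```python
def plot_chunky(fb, x, y, width, color):
    # Chunky: one byte per pixel, so a plot is a single store.
    fb[y * width + x] = color

def plot_planar(planes, x, y, width_bytes, color):
    # Planar: bit i of `color` lives in bitplane i, so every plot is a
    # read-modify-write in each of the planes.
    idx = y * width_bytes + x // 8
    mask = 0x80 >> (x % 8)              # Amiga bit order: MSB = leftmost pixel
    for i, plane in enumerate(planes):
        if color & (1 << i):
            plane[idx] |= mask
        else:
            plane[idx] &= ~mask & 0xFF
```

With 5 bitplanes, every textured pixel a renderer wants to write costs five read-modify-write cycles instead of one store, which is exactly the wrong trade for Doom-style per-pixel rendering.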


> The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.

The intention was good, but the Akiko chip was functionally almost useless. It was soon surpassed by CPU chunky-to-planar algorithms. I don't think it was ever used in any serious way by any released games (though it might have been used to help with FMV).
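For the curious, "CPU chunky-to-planar" means the game renders into a byte-per-pixel buffer and then converts the whole frame to bitplanes. A naive reference version (my own illustrative Python; real Amiga C2P routines used clever 32-bit merge/shift tricks on the 680x0 to process many pixels at once) looks like:

```python
def c2p(chunky, width, height, depth):
    """Naive chunky-to-planar conversion: bit i of each pixel value
    ends up in bitplane i. This is just the reference behaviour,
    not an optimized routine."""
    width_bytes = width // 8
    planes = [bytearray(width_bytes * height) for _ in range(depth)]
    for y in range(height):
        for x in range(width):
            pix = chunky[y * width + x]
            mask = 0x80 >> (x % 8)       # MSB = leftmost pixel
            idx = y * width_bytes + x // 8
            for i in range(depth):
                if pix & (1 << i):
                    planes[i][idx] |= mask
    return planes
```

The win is that the expensive per-pixel rendering happens in a fast chunky buffer, and only this one bulk conversion pays the planar tax each frame.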


Ah, I was under the impression that it had a native chunky mode, but it was really a built-in C2P routine? Anyhow, it seems it was useful (1) when running on stock CD32s, but not in conjunction with faster machines.

1: https://forum.amiga.org/index.php?topic=51616.msg544232#msg5...


Which brings me to my pet peeve: the already slow 68020 (68EC020) at 14 MHz was crippled because, even though it had a 32-bit bus, it was only connected to a 16-bit RAM bus. (Chip RAM.)

This 16-bit memory (2 megs) is also where the framebuffer and audio live, so the stock CPU in the A1200 has to share bandwidth with display signal generation and the graphics and audio processing.

All in all, it meant the Amiga 1200 had only about twice the memory throughput of the Amiga 500 (about 10 megabytes/s vs. about 5 megabytes/s).

If the A1200 had at least some extra 32-bit memory (it existed as a third-party add-on), the CPU could have had its own uncontested memory with a throughput of about 20-40 megabytes/s.

Imagine the difference it would have made if the machine had just a little extra memory.
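The ratios here can be sanity-checked with simple arithmetic. This is a rough sketch; the clocks-per-access figure is my simplifying assumption and it ignores chipset DMA contention, so the absolute numbers are approximate, but the point is the ratios:

```python
# Rough memory-throughput arithmetic for the figures above.
# "clocks_per_access" is a simplifying assumption of mine; real
# 68000/68020 bus cycle counts and DMA contention vary.
def throughput_mb_s(clock_mhz, bus_bytes, clocks_per_access=4):
    return clock_mhz / clocks_per_access * bus_bytes

a500_chip  = throughput_mb_s(7.1,  2)   # 68000, 16-bit chip RAM
a1200_chip = throughput_mb_s(14.2, 2)   # 68EC020, still 16-bit chip RAM
a1200_fast = throughput_mb_s(14.2, 4)   # hypothetical 32-bit fast RAM

print(a500_chip, a1200_chip, a1200_fast)
```

Doubling the clock while keeping the 16-bit bus only doubles throughput; uncontested 32-bit fast RAM would double it again, which is where the 20-40 MB/s figure above comes from once faster bus cycles are factored in.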

That's just a tiny detail. That the chipset wasn't 32-bit was another disappointment.

The bigger problem was that Commodore as a company was aimless.


Yeah, and it took ~7 years to make those marginal improvements over the earlier Amiga chipset! I'm ignoring ECS, since it barely added anything over OCS for the average user.


Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"


IIRC, interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidths available at the time.

Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.

By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory bandwidth gains were gone) were downsides/sins that should've been fixed to stay relevant.


Five sins in 1992:

- 8-bit planar instead of chunky
- interlaced display instead of progressive
- sound that was not 16-bit
- a 68EC020 instead of a 68030 with MMU support
- no hard drive as standard

If they had addressed these, the Doom experience would have run better on the Amiga.


> The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.

I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…


Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade and reinvent x86, which until x64 still retained so much of the original x86 instruction encoding and heritage (heck, even x64 retains some of the encoding characteristics).

Had the Amiga retained relevance for longer, and without a push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA-based Apollo 68080 would've matched late-1990s Pentium IIs, and FPGAs aren't speed monsters to begin with.


The 68060 is pretty good to be fair, but it never ended up being widely used and Motorola definitely saw PPC as the future.

Maybe if these theoretical new 68k Amigas became a huge market hit they could have taken the arch further and it could have remained competitive, but all the other 68k shops had already pretty much given up or moved on (Apple was already going PPC, Sun went SPARC, NeXT gave up on their 68k hardware, Atari was exiting the computer business entirely, etc.), so I don't know that the market would have been there to support development against the vast amount of competition from the huge x86 bastion on one hand and the multitude of RISC newcomers on the other.


Right, and I think that was a junction. Had Motorola, as a chip company, not been enamoured with the new shiny, and instead realized that they already had a huge market that just wanted improved performance for their software, and pushed 68k improvements instead of a new PPC architecture, both Apple and (a better-managed) Commodore could've been competitive with improved 68k designs.

Remember, Intel also barked up the wrong tree with Itanium for 64-bit, and didn't really let go until AMD forced their hand with x64.


The argument is that 68k is "CISCier" than x86, the addressing modes in particular, so making a performant modern out-of-order superscalar core that uses it would be harder than for x86.


I believe that. But Commodore could have plunked a cheap 68020 in their machines for backwards compatibility (like how the MSX2 had an MSX1 SoC inside, the PS2 had a PS1 SoC, the PS3 had a PS2 SoC, and so on) and put another "real" socketed CPU in as a co-processor. Or made big-box machines with CPUs on PCI cards, for infinite expansion options. "True" multitasking, perfect for CAD, 3D rendering and non-linear video editing. It would have been very cool with an architecture where the UI could be rendered in almost hard real time and the heavy processing happened elsewhere.


This is almost exactly what the plan was, until C= went out of business:

https://en.wikipedia.org/wiki/Amiga_Hombre_chipset

It was going to be HP PA-RISC based and have an AGA Amiga SoC, including a 68k core.


How much of Hombre is myth and legend? Given how little progress was made with OCS -> ECS -> AGA, it seems unlikely they could even have built an Amiga SoC, never mind designed a new 64-bit chipset.



Don't agree there, considering x86 has ModRM, size prefixes (16/32 and later 64-bit operand sizes), SIB (with a prefix for 32-bit), segment/selector prefixes, etc.

Perhaps the biggest place where the 68000 is more complicated is post-increment addressing, but considering all the cruft 32-bit x86 already inherited from the 8086, compared to the "clean" 32-bit variations of the 68000, I'd call it a toss-up at best, leaning toward the 68000 being easier (stuff like PC-relative addressing also exists on the RISC-y ARM arch).

Apart from addressing modes, the sheer number of weird x86 instructions and prefixes has always been the bane of low-power x86.


There were no tech problems IMHO; it was all management problems. They could have chosen any of a handful of completely different (edit: even mutually exclusive!) tech paths and still have won, but instead they chose to do almost nothing except bleed the company dry.

Edit: I don't mean that their success was certain if they had executed better. I mean they did almost nothing and got the guaranteed outcome: failure. (And their engineers were brilliant but had very few resources to work with.)


At that point in time I would not have called it Wintel yet. That started after Windows 95, IIRC.


Yep. 486DX/2 was when I started seriously looking at moving on from the Amiga. I wound up with a DX/4 100 sometime in 1994.


My classmate kept his Amiga 1200 a bit longer! ...eventually he got a PC with Pentium 60 MHz.


Yeah, there were holdouts of course but the DX/2 really seems like the breaking point.

(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)


Pentium is a bad processor? It's way faster than the 486; especially on FP it's not even close.


The original Pentiums (Socket 4, 60 or 66 MHz) had the infamous floating-point division bug, had underwhelming performance for anything not FP-bound (most things), ran hot, and were too expensive for what you got. A DX/4 100 was nearly always the more rational choice.

Second-gen Pentiums, starting with the 75 MHz, were great.
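The division bug could be demonstrated with one famous pair of numbers. This is the widely circulated 1994 test case, shown here in Python for illustration:

```python
# The classic FDIV test case. On a correct FPU, x - (x / y) * y is
# (essentially) zero; flawed first-generation Pentiums came out off by
# about 256, because a handful of entries in the divider's lookup
# table were wrong.
x, y = 4195835.0, 3145727.0
residual = x - (x / y) * y
print(residual)   # ~0 on a correct machine, ~256 on a flawed Pentium
```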


I had a P60 that had the F0 0F bug; Windows would crash for weird reasons on it, but Linux ran like a champ because it actually had a workaround. Luckily my chip was already recalled for the FDIV bug so it wasn't a total boat anchor. Loved that machine. I had BeOS, QNX, and one time I made Linux look like Solaris with all the Open Look stuff - really enjoyed that aesthetic.

Now we have these amazing displays and graphics cards and there's literally no way to make my Mac have different window titlebars or anything. So boring


Did you try Linux again recently?


Actually, the first-generation Socket 4 Pentiums (60 and 66 MHz) had the FDIV bug (and yes, they were bad processors), but the overall system architecture (the very first PCI bus implementation, with ISA legacy, rather than ISA plus a single VESA Local Bus expansion) was a huge step forward compared to the 486.

The FDIV bug was actually discovered on the first stepping of the later 90 MHz Pentium (which was released alongside the 100 MHz Pentium, which also suffered from the bug). However, this was corrected with a hardware stepping. The 75 MHz Pentium was actually released as part of this later stepping, as a binned 90/100 MHz part. There were no first-stepping 75 MHz Pentiums.


I don't know if the 75 was really that great though, mostly because it had a 50 MHz FSB rather than the 60 or 66 MHz of most other parts.

Another factor in the later P1s being better, IIRC, was improved chipsets.


We had a 90 overclocked to 100 MHz that served as the family computer. I inherited it when the family computer was upgraded to a K6-II, and it chugged along as my personal computer until ~2001 thanks to Linux, while the GHz barrier had already been broken for a while in the Intel world.

I think my next computer came with an AMD Duron 900 MHz, entry-level at the time, but the jump from the Pentium 100 MHz was such a huge gap it still felt like a Formula 1.


To be more exact, I think the first great Pentium was the 133, but the 75 was the first that was a real, proper jump in performance from a fast 486 and represented decent price/performance.


It didn't help that the earliest P5 Pentiums ran on a 5V rail. Newer revisions, starting with the P54C core, used 3.3V, which helped keep the chips cool.


The Pentium was great, but the 60 and 66MHz versions were not liked, they ran way too hot.


I think given the price, people also expected a performance boost similar to going from 386 to 486. What made the Pentium also confusing is that during this time Intel introduced PCI.

From a 486 with VLB to a Pentium with PCI, everything became a lot nicer.


They ran on 5V supplies, and it was only later that the whole architecture was changed to 3.3V with the 90 and 100 MHz Pentiums (which were then discovered to have the infamous FDIV bug).


Many tasks perhaps, but running Quake was not one of them.


Yeah, it does alright and is a significant difference from a DX/2, but Quake came out in '96, and the P60 came out as a super expensive workstation-class CPU in '93. If you were a gamer in '96, it is unlikely you were rocking a P60, because it was never good value for money.


You could play 320x200 Quake acceptably on a P60. On a DX4 too, though barely - my family had both in the mid 90s. I'd be surprised if Quake is playable on a DX2.


Slightly before DOOM came out, the killer 486 app for me was Fractint (https://en.wikipedia.org/wiki/Fractint)


I distinctly remember having a Strike Commander poster in my bedroom saying “Strike really flies on a 486 DX/2”. Fond memories indeed.


Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.


Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.


My boss then - who's still a very dear friend - purchased a work computer to play Doom. He was already mentally checked out of that job and was looking for his next opportunity. Spent a lot of time at work playing Doom and got quite good at it.

I think it was 1994. It was a loaded 486 with the best 17" CRT monitor money could buy at the time. I think he spent over $7000.


I wonder, I wonder where one could find a good book about the software architecture of that game… oh, well


They need to bring back the turbo button.



How could I possibly forget the lock!


++1



My first Intel-based PC was actually a 486DX/2-66 "Houdini" card for my PowerMac 6100/60 in late 1994. It had an SB16 daughtercard and could either share RAM with the host Mac or use a 32MB dedicated SIMM. I added a dedicated SIMM when its price dropped to $300.


...and with 8 MB (-eight- for the youngsters ;-) RAM you were absolutely the king ruler :-D



