Today marks the 25th Anniversary of the original Apple Macintosh. Yesterday we marked the occasion by looking at 25 milestones of the Mac over the last 25 years.

And while Apple may currently be cooler than a Slush Puppie cocktail in an ice hotel, it hasn't always been that way.

In fact, some of the decisions it has made have been downright disastrous. Here are just seven of them.

1. It used too much proprietary technology

Apple's determination to do things differently has often cost its customers, both in cash and cachet. Early Macs were stuffed full of proprietary connections, formats and programs. For example, it was virtually the only company to adopt NuBus expansion card slots (Steve Jobs' NeXT computer was another user) while the rest of the PC industry stuck with ISA and, later, PCI.

And it ensured floppy disk incompatibility with Windows PCs by using Group Code Recording (GCR) instead of the Modified Frequency Modulation (MFM) scheme PCs used. Other proprietary technologies adopted primarily by Apple include the Apple Desktop Bus (ADB) and LocalTalk networking.

2. It took a big RISC with its processor tech

In the early days of computing, coming up with standards you hoped would become industry practice was commonplace.

When Apple adopted Reduced Instruction Set Computing (RISC) chips supplied by IBM and Motorola for the Mac, a lot was made of their superiority to the Complex Instruction Set Computing (CISC) chips championed by Intel.

RISC architectures can execute simple instructions in a single processor cycle, while CISC architectures carry out complex instructions across multiple cycles.

In other words, a CISC-based PC typically needed a higher clock speed to achieve the same result as a RISC-based Mac.

RISC CPUs had other advantages over CISC too: they consumed less power and ran cooler, so they were better suited to laptops. The downside, of course, was that RISC chips never achieved wide adoption on the desktop, partly because Intel was big and powerful enough to plough on with CISC regardless.

Intel also won the marketing battle between the two architectures: by measuring processor prowess solely in megahertz and gigahertz, Intel chips were always going to sound more powerful than their rivals. Which would you buy: a Mac equipped with a 1GHz PowerPC G4 or a PC with a 1.7GHz Intel Pentium 4? The PC, obviously, even though in practice the two often benchmarked about the same.

By 2005, it had become obvious that Apple simply wasn't big or powerful enough to demand faster, better chips from IBM or Motorola, and neither supplier could match Intel's R&D. The result: Apple jumped on the Intel bandwagon and hasn't looked back since.

3. It lost the plot in the 1990s

For five long years between 1990 and 1995, Apple drifted rudderless while Microsoft and Intel carved up the PC market between them. What went wrong? Everything. Apple's employees seemingly forgot they were working for a company that had to sell products, and the company frittered away its cash pursuing ideas it hardly ever put into practice.

Apple reached its nadir in 1995, when it had over $1 billion worth of orders for the new Power Macintosh and no way of fulfilling them, plus a chronic oversupply of PowerBook laptops without customers to buy them. Apple's problems were so bad that you couldn't mention the company without attaching a 'beleaguered' tag to it. Time summed up Apple's situation best in 1996: "One day Apple was a major technology company with assets to make any self-respecting techno-conglomerate salivate. The next day Apple was a chaotic mess without a strategic vision and certainly no future."

4. It became synonymous with over-priced, under-performing computers

One of the greatest myths about Macs today is that they cost way more than their PC equivalents, when a direct spec-to-spec comparison between the two often proves otherwise. But the myth persists because that was the situation in the 1990s, when Apple churned out a succession of indifferent computers that cost hundreds more than their competitors. The Macintosh IIfx was a prime example: depending on configuration, it would have set you back between $9,000 and $12,000.