Processors are tiny bits of silicon the size of a fingernail that do the grunt work inside your machine.
First came the single-core chip, then the dual-core, and now Intel and AMD are promising four cores on a single chip.
They sparkle like gems and are easily worth as much to mankind: the copper-coloured discs of silicon from which the chips are cut are the platters that modern life is built on.
AMD's first quad-core chip was unveiled in September
Each little square on these wafers, as they are known, is a fully fledged processor chip, and these small chunks of silicon have changed how we live our lives.
We are not just talking personal computers here: think power stations, cars, televisions, hospitals - anything and everything is run by bits of silicon.
The first ever microprocessor drove a rudimentary calculating machine but since then, the technology has grown and the chips have got smaller.
The problem has always been the hunt for performance. Chips need more and more power to keep up with the latest application and operating system developments and it is playing catch-up that causes the headaches.
The faster a chip runs, the hotter it gets, so the aim is to make a chip do more work per clock tick so that it can run more slowly, and therefore cooler. But that makes the chip bigger.
So you need to grow its performance but not its size - and that is a difficult trade-off.
We are now at the stage where shrinking the core of the chip, the part of the chip that does the thinking, is getting less efficient.
"It takes an extraordinary amount of electrical power to eke out even a small increase in the performance of a single core," said Justin Rattner, chief technical officer at Intel. "So as power has become the predominant concern, we've had to step back and rethink this.
"A much more energy efficient strategy is to limit the individual core power, or even reduce it, and increase the number of cores.
"The challenge is the software required to use those large numbers of cores, which is a problem we're facing."
The software problem he is talking about is making applications that run on multi-core processors take real advantage of the power available.
The programs need to distribute tasks effectively amongst the separate cores to get maximum bang for the buck.
But the software issue is not slowing down development of these devices.
Intel have quad-core chips on the market already and are launching more.
Intel's next generation of chips have elements 45 nanometres wide
Paul Otellini, CEO of Intel, said: "This year we have 45 nanometre Core 2 chips with Penryn; next year we'll have a new micro-architecture called Nehalem.
"You'll be able to configure the real-time needs of the system as it relates to the microprocessor: turning cores, caches and threads on and off, enabling and disabling power states and so forth to be able to optimise the performance for the given task at hand."
Nanometres are billionths of a metre. A human hair is roughly 2,000 times thicker than these chip features, so that gives you an idea of the scale involved.
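To put rough numbers on that comparison (the hair width here is an approximate figure, not from the article): a human hair is around 90,000 nanometres across, so a 45 nanometre feature is about 2,000 times smaller:

```python
hair_width_nm = 90_000  # approximate width of a human hair, in nanometres
feature_nm = 45         # width of an element on Intel's 45nm-generation chips

# How many times thicker a hair is than one chip feature
print(hair_width_nm / feature_nm)  # → 2000.0
```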
Intel have been getting a bit of flak recently because their quad-core chip is merely two dual-core chips glued together, whereas AMD have built theirs from the ground up.
Very soon AMD will be delivering its new high-end gaming platform, Spider, which combines some seriously heavy-duty graphics cards with its new Phenom quad-core desktop processor.
"The power of the Spider platform is awesome. In fact it's marking a new era of gaming called 'gaming supercomputing'," said Pat Moorhead, vice president at AMD.
"Essentially that's going to deliver the cinematic quality of movies that were launched this year, rendered in real-time, right on your desktop.
"Just to put the power in perspective, we're providing teraflops of performance."
Mr Moorhead claimed that it would deliver "eighty times the performance of the Sony PS3, and 200 times the performance of IBM's Deep Blue computer".
It will not stop there: once the chip makers start developing, it seems hard for them to stop.
AMD president Dirk Meyer said: "We'll see eight processors from AMD in a single microprocessor socket, roughly seven or eight quarters from now."
One 78-year-old engineer said it would happen this way. Back in 1965 Gordon Moore - co-founder of Intel - wrote a paper predicting the growth of processor development and promptly got a law named after him.
"I plotted this curve showing the complexity, measured by the number of transistors and resistors on a chip, versus time," he said.
"In its first few years we'd gone from one transistor to about 60, essentially doubling every year. So I just continued that, said if we continue to double every year for the next 10 years we'll go from 60 to 60,000.
"It turned out to be amazingly close to what actually happened. One of my colleagues gave that the name Moore's Law."
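Moore's extrapolation is simple compounding: doubling 60 components every year for 10 years gives 60 × 2^10, which lands just over his round figure of 60,000:

```python
components = 60
for year in range(10):
    components *= 2  # double every year, as Moore projected in 1965

print(components)  # → 61440, close to Moore's round figure of 60,000
```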
Moore's Law still stands. As ever with the future of computing, you underestimate it at your peril.