Page last updated at 13:19 GMT, Thursday, 30 October 2008

The Tech Lab: Andrew Herbert

More processors will make computers smarter, says Andrew Herbert

Andrew Herbert is head of Microsoft Research in Cambridge. Here he takes a look at the changes that multi-core computers could usher in.

If you've been looking at buying a new computer lately, you can't fail to have noticed that multi-core technology is the "next big thing".

The latest PCs have suddenly sprouted two cores while servers rejoice in four, and we are promised yet faster computing.

So why the excitement? After all, if a multi-core computer opens a document a millisecond faster than my old PC, it's hardly going to change my life, is it?

I'd like to suggest that the answer to this question is both yes and no. In altering the way in which we interact with computers, technology has the capacity to become ever more helpful and ever less invasive.

List processing

Before multi-core, the PC was a sequential machine that executed a set series of tasks - much like a flow chart.

The thinking behind multi-core is relatively simple, following the principle that two heads (or four, or 32, or 64) are better than one.

The principle might be simple but it offers a significant change in potential - there are now multiple flow charts being executed and the way they interact is far more exciting.

For exciting, also read "complicated"; this presents huge programming challenges as we have to address the immensely complex interplay between multiple processors (think of a juggler riding a unicycle on a high wire, and you're starting to get the idea).

The pay-off is that your PC suddenly has time on its hands, with one processor doing what you've asked it to do and the others free to do other things.
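
To make the idea concrete, here is a minimal sketch in Python of the shift from one flow chart to several - the task names and timings are invented for illustration. The foreground job runs as it always did, while spare cores pick up background work at the same time.

```python
# A minimal sketch of "multiple flow charts": one worker handles the
# user's request while spare cores run background jobs concurrently.
# Task names and timings are hypothetical, for illustration only.
from concurrent.futures import ProcessPoolExecutor
import time

def open_document(name):          # the task the user actually asked for
    time.sleep(0.1)               # stand-in for real work
    return f"{name} opened"

def index_files():                # "other things" a spare core can do
    time.sleep(0.1)
    return "files indexed"

def learn_user_habits():
    time.sleep(0.1)
    return "habits model updated"

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:   # roughly one process per core
        foreground = pool.submit(open_document, "report.doc")
        background = [pool.submit(index_files), pool.submit(learn_user_habits)]
        print(foreground.result())        # the user sees this straight away
        for job in background:
            print(job.result())           # spare cores finished these meanwhile
```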

Exploring what these "other things" could be is now driving software research. The effects are difficult to predict, but let's start by looking at what it might mean for our hardware.

We're thoroughly familiar with the technology that lets us interact with the computer, but we're still not comfortable with it. Voice, gesture and handwriting recognition have been available for a number of years but the keyboard and mouse remain by far the most pervasive tools.

The advent of multi-core, however, means this is set to change.

Handwriting recognition systems, for example, work by either identifying pen movement, or by recognising the written image itself. Currently, each approach works, but each has certain drawbacks that hinder adoption as a serious interface.

Now, with a different processor focusing on each recognition approach, learning our handwriting style and combining results, multi-core PCs will dramatically increase the accuracy of handwriting recognition. So, don't get too attached to your keyboard.
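
As a rough sketch of how that might look in Python - the two recogniser functions and their scores are entirely hypothetical stand-ins - each approach runs on its own worker and their confidence scores are merged:

```python
# Sketch: run two recognition approaches in parallel and combine their
# guesses. The recognisers here are hypothetical stand-ins.
from concurrent.futures import ThreadPoolExecutor

def recognise_by_stroke(pen_strokes):
    # stand-in: score candidate words from pen-movement data
    return {"hello": 0.6, "hullo": 0.4}

def recognise_by_image(bitmap):
    # stand-in: score candidate words from the written image itself
    return {"hello": 0.7, "hells": 0.3}

def combine(a, b):
    # pick the word with the highest combined confidence
    words = set(a) | set(b)
    return max(words, key=lambda w: a.get(w, 0) + b.get(w, 0))

with ThreadPoolExecutor(max_workers=2) as pool:
    strokes = pool.submit(recognise_by_stroke, "pen data")
    image = pool.submit(recognise_by_image, "bitmap data")
    print(combine(strokes.result(), image.result()))  # -> "hello"
```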

Clever tools

A computer's ability to gain this kind of knowledge, and then intelligently apply it, is made possible through the combination of two research techniques: "speculative execution" and "machine learning".

At the moment my computer is just a tool - I tell it what to do and it does it. This means that when using my computer to book regular business trips, I have to enter my preferences each and every time.

Herbert: Computers could get genuinely useful thanks to multicore

A multi-core computer can learn what I'm like - and what I like - and through speculative execution, start making educated guesses about how I want to travel and what I want to do next. Like the perfect PA, the computer will be able to anticipate and know what I'm about to do, even before I do.
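
A minimal sketch of that combination, with hypothetical names throughout: the machine learns which action usually follows the current one, then uses an idle core to compute the likely next result speculatively, throwing it away if the guess turns out wrong.

```python
# Sketch of speculative execution driven by learned habits: predict the
# likely next request from past history and compute it on a spare core
# before the user asks. All names and data are hypothetical.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

history = ["email", "bbc_news", "accounts", "email", "bbc_news", "accounts"]

def predict_next(done):
    # learn: what most often followed this action in the past?
    follows = Counter(b for a, b in zip(history, history[1:]) if a == done)
    return follows.most_common(1)[0][0] if follows else None

def fetch(task):
    return f"{task} ready"        # stand-in for real work

with ThreadPoolExecutor() as pool:
    current = "email"
    guess = predict_next(current)             # e.g. "bbc_news"
    speculative = pool.submit(fetch, guess)   # warm it up on a spare core
    asked_for = "bbc_news"                    # what the user actually wants
    if asked_for == guess:
        print(speculative.result())           # guess was right: instant answer
    else:
        print(fetch(asked_for))               # wrong guess: discard, just do the work
```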

This has important implications in internet search - our window on the world.

At the moment we search based on keywords, so if I want to find a quote that I vaguely remember, which might have been in a book by a particular author, I will enter keywords into the engine - and hope.

With speculative execution I will be able to ask the computer the question directly and it will deliver exactly what I want, not just a list of websites that contain my keywords. Today we set computers tasks; tomorrow we will ask them questions and set them objectives.

Predicting how we will live in the future can be the technologist's graveyard, but I'm going to have a go. In five years' time I will walk into my office (I predict there will still be offices) where there will be multiple screens - all an extension of my PC.

There will be a screen on my desk, a whiteboard style screen on the wall and a screen embedded in the surface of my desk. I will be able to manipulate these through my keyboard (despite what I said earlier, I predict there will still be a keyboard!), my voice and my fingertips.

Smart help

The PC will automatically begin communicating with my personal, portable devices - updating contacts and diary information, and downloading any pictures I have taken.

When I show my computer a letter that arrived in the post, it will immediately scan it into the digital world.

The computer will know that on most mornings I check my e-mail, look at BBC Online and access an Excel spreadsheet to check my lab's accounts - all this will be ready and waiting for me.

We could be spending more time talking to computers
I will deal with my e-mails by voice, with the computer intuitively knowing addressees when I mention their forename. Technology will just "happen".

Sounds good, doesn't it? But what's the catch?

Computers that learn from and anticipate humans may remind some of 2001: A Space Odyssey's HAL, running amok in the depths of space. But my answer is that technology is amoral.

I can use a spade to dig a hole or to hit someone over the head with - the choice is mine (and for those wondering, I would use it to dig a hole, mostly).

Our objective, as computers play a greater role in our lives, is to ensure that they are imbued with human concepts such as ownership, privacy and personal freedom. What is important is that as humans, we are aware of technology's implications and given choices on how we interact with it.

At its worst, technology today can be invasive as well as pervasive - clumsy and unwieldy, it can demand a lot of our time and attention.

My vision of a multi-core future is not some science fiction extravaganza where we use a vast array of gadgets in a world substantially different from our own. In fact, my hope is almost the exact opposite - I see the potential of multi-core computing being the ability to take the hard edges off technology.

In my future, technology will be less visible, more human and simply make our lives easier.

