Working at the nanoscale presents its own problems
Dr John Pethica, chief scientific advisor at the UK's National Physical Laboratory (NPL), writes about the promise, and pitfalls, of nanotech.
As someone who has commercialised new science in nanotechnology to build successful companies, I sometimes get asked what is the next "big thing" to look out for.
The way we think about nanoscience has only really been fully formulated in the last decade, so it is still a bit early to see many nano-related products. There are some fantastic concepts in nanoscience, but many will be dashed on the rocks of reality before becoming working technologies.
Why is this? Well there is an important distinction to make between a conceptual model and what can be made to work in actual, material, biological reality. As Yogi Berra said: "In theory there is no difference between theory and practice. In practice there is."
Look at what happens when speculative theory enters the public realm, such as the furore over grey goo and nanobots - robots on the nanoscale supposedly able to manipulate atoms and molecules and turn them into any structure we like.
Nice idea, but matter may not stay where we put it, and the process can be ludicrously costly in energy to implement.
Experiments have been done in which atoms are pushed into place, but when the pushing stops they immediately fall apart. Reality and stability assert themselves, whatever our imaginations would like.
Furthermore, the energy costs are very high - nowhere near the efficiency achieved by the self-assembly of molecules, which has always been going on in the world and goes by the name of chemistry. Biology does it already!
So nanobots don't work because the concept on which they are based has serious reality limits.
An exciting concept much less in the realms of science fiction than nanobots is quantum computation. If it were fully developed, it would represent a staggering change in data processing.
Quantum computation is a way of handling information in the states of atoms and molecules, where the "bits" are entangled, rather than separate.
If you have 8 bits in a standard computer, the register can be in only one of 256 possible states at any moment. With a quantum computer, every state of each bit can be linked with all the other bits, so the machine can effectively work with all of those states at once. The possible ways in which you can manipulate data are vastly increased, as is the number of states a computer of limited size can calculate with.
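The state-counting above can be made concrete with a minimal sketch (illustrative arithmetic only, not a real quantum simulator): a classical 8-bit register occupies exactly one of 2^8 = 256 states, while an 8-qubit register is described by 256 complex amplitudes, all of which can be non-zero at the same time.

```python
# Illustrative sketch: classical vs quantum register state counts.
# Not a quantum simulator - just the arithmetic from the text.

n = 8

# A classical n-bit register is in exactly one of these states.
classical_states = 2 ** n
print(classical_states)  # 256

# An n-qubit register is described by 2**n amplitudes. Here, an
# equal superposition: every basis state has amplitude 1/sqrt(256).
amplitude = (1 / classical_states) ** 0.5
state_vector = [amplitude] * classical_states

# The squared amplitudes (probabilities) must sum to 1.
total_probability = sum(a ** 2 for a in state_vector)
print(round(total_probability, 10))  # 1.0
```

The point of the sketch is the scaling: each added qubit doubles the number of amplitudes the machine works with, where each added classical bit merely doubles the menu of single states it can sit in.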
Amongst the many calculations that would be staggeringly faster with quantum computation are factorisations for code breaking. This could blow a huge hole in information security, as stored encrypted data from the past would become readily readable.
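A toy illustration of why factorisation matters here (this is classical trial division, not Shor's quantum algorithm): a classical computer must try divisors up to the square root of N, a count that grows exponentially with the number of digits. That exponential wall is what current encryption leans on, and what quantum factoring would undercut.

```python
# Toy classical factoriser - trial division up to sqrt(n).
# The 'tries' count shows the work a classical machine must do;
# a quantum factoring algorithm would avoid this exponential cost.

def trial_division(n):
    """Return a non-trivial factor of n and the divisions tried."""
    tries = 0
    d = 2
    while d * d <= n:
        tries += 1
        if n % d == 0:
            return d, tries
        d += 1
    return n, tries  # no divisor found: n is prime

# 3233 = 53 * 61, a miniature stand-in for an RSA-style modulus.
factor, tries = trial_division(3233)
print(factor, tries)  # 53 52
```

For a real 600-digit RSA modulus the equivalent "tries" count is astronomically large, which is why stored encrypted data is safe today - and why it would not be against a working quantum computer.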
Solid state drives might spell the end of the hard disk
However, at the moment quantum computation remains a serious technical challenge. One problem for quantum entangled states is decoherence, which is where a subtle interaction with the environment breaks or corrupts the quantum states.
This new physics and technology is exciting and challenging, with great useful potential, but we are not yet sure how it will work out in practice.
Another big area for nanotechnology is the interface between biological and physical systems. While direct silicon and biological interfacing has been seen in film and fiction, in reality it is tricky.
One reason is down to simple mechanical effects - the boundary between hard and soft materials produces the wrong kind of stresses. We have to find some way of mimicking biological systems, and also develop computational data handling structures closer to the "soft" logic that goes on in a brain.
Help may be at hand in the new area of plastic electronics, which is based on organic molecules. Whilst some kind of USB stick plugged into the brain is rather improbable, much more interesting are possibilities for health care monitoring and sensing.
The journey from concept to a product that consumers will buy can be long and take unexpected turns. Even when the theory can be shown to work, radical disruptive technology needs to become economically viable to get to market.
Just as likely is for existing, established technology to be enhanced by new concepts and materials, and as a result create radical new applications.
Look at the market for memory hardware. I recall, around 10 years ago, the chief technology officer of a leading company predicting that a one gigabyte memory stick was on the cards. This was regarded as a pretty wild prediction at the time.
Memory capacities are growing all the time
Now of course we have 32GB memory sticks so cheap that you can afford to carry your life's photos and music in your pocket - and for important government databases to be left on a pub table.
Solid state memory is now replacing magnetic hard discs at capacities of 256GB and upwards. Nothing we can see at the moment suggests that this kind of memory device cannot grow to very much larger sizes.
It might eventually mean that the way all the data inside a computer is accessed becomes like RAM, changing the whole serial paradigm for computers. Speeds could be hugely increased for some calculations, including searches, correlations and image handling.
This shift from magnetic to electric storage has major implications but it is not a disruptive innovation "coming in from nowhere". Other nanotechnologies such as nanotubes and graphene could be more disruptive if successfully incorporated into electronics.
The application you first have in mind for a new science result might not be the eventual most important one. On top of that, unexpected but important new ideas can come in from a quite different area. This is why it's hard to predict really radical new products.
The search for new science and technology needs to be broad, and we need to be careful that too much output planning doesn't constrict speculative research - killing the goose that lays the golden eggs.
However, making a real saleable product from an idea is always the tough, expensive part, and is where most of the effort and resource has to be put. The hard work starts after filing the patent.
Finally, the end user has to be sure the product works "as it says on the tin". That kind of assurance, along with new technologies, is what comes from laboratories like NPL.