Star Wars Episode IV may have recently been named the most influential visual effects film of all time, but its director, George Lucas, was unsatisfied with the technology available in the 1970s.
He went on to create his own effects company, Industrial Light and Magic (ILM), which pioneered new ways of getting the shots he wanted.
Star Wars was groundbreaking when first released
In the original films, scenes were formed of several different elements - from painted backdrops, to stop-motion animated models, to good old-fashioned puppets. By the time the sixth film was finished in 2005, computer generated imagery had taken over.
Today, scenes are still composed of many individually created elements, but sometimes it is just the human actors that really exist. Computer generated backdrops and characters are added in later, opening up infinite possibilities for the director.
CG characters need to move in a convincing fashion, so real actors are employed in a special separate motion capture - or MoCap - studio.
This allows for their movements to be captured cleanly on specially positioned cameras and then mapped onto whatever shape and size body the director desires.
If acting inside an unpainted set is difficult, acting opposite a character who is not really there presents actors with even more of a challenge.
"Some are really good at it," said Steve Sullivan from Research and Development at ILM.
Bill Nighy's acting was the basis of the computer generated Davy Jones
"Some don't like it all, don't get it, and feel that it's very unnatural because it is very unnatural. You're acting into space with nothing in the environment supporting you."
ILM's work on the three Pirates of the Caribbean movies has pioneered a different MoCap technique, one which does away with the need for separate performances, and one which allowed the character of Davy Jones to be played by actor Bill Nighy, on set, opposite the rest of the cast.
"The effects supervisor on the film had gone through the first Pirates movie and noticed how the interaction on set was very different from the interaction of the skeletal pirates, who were shot separately on a MoCap stage," Mr Sullivan said.
"He felt like a lot of performance was being lost. So that film really sponsored this technique of doing motion capture on set, having actors working together much more organically and the director can just sum up right there what they want.
"We brought that data here and then wrapped the Davy Jones character around Bill Nighy's actual performance, adding the digital makeup, as we call it," he added.
Spandex and balls
So how does it work? To find out, I donned a spandex suit inside a Lucasfilm studio and was turned into an alien in a computer generated environment. Then I acted live on set for the director to see as he followed me with the camera.
As with traditional MoCap, the secret is in the highly reflective balls stuck to the spandex suit and, as it happened, my head.
Forty infrared cameras were mounted on the walls of the film set, recording the position of the balls from wherever I was in the studio.
The director also needed to see my character from the movie camera's point of view as the action happened, so balls were mounted on top of the camera to tell the studio system where it was.
Combine that with information about where I was, and you have got yourself a virtual space: the actor can do anything, and the camera can move around and capture the performance from any angle.
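The reconstruction step described above - many fixed cameras all seeing the same reflective ball - amounts to triangulation. Here is a minimal sketch, not ILM's actual pipeline, of how a marker's 3D position can be recovered by least squares once each camera's position and the direction of its sight-line to the ball are known:

```python
import numpy as np

def triangulate(origins, directions):
    """Find the 3D point closest (in least squares) to a set of camera rays.

    Ray i is origins[i] + t * directions[i]. Each ray contributes a
    projector onto the plane perpendicular to it; summing and solving
    gives the point that minimises total squared distance to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects onto plane normal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Hypothetical setup: two cameras looking at a marker at (1, 2, 3)
marker = np.array([1.0, 2.0, 3.0])
cams = [np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])]
rays = [marker - c for c in cams]  # in practice, derived from each image
print(np.round(triangulate(cams, rays), 6))  # → [1. 2. 3.]
```

With forty cameras rather than two, the same least-squares solve simply averages over many more rays, which is what makes the capture robust when some cameras lose sight of a marker.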
The technology is built on top of a video game graphics engine.
Of course, what the director sees live is a low resolution approximation of what the digital artists will eventually produce. Months of design, animation and hand painting, frame by frame, all go towards making the final scenes, each frame of which may take hours to render.
Good job they have got 4,400 processors in the basement, working flat out, 24 hours a day, seven days a week. The server room is dubbed the Death Star.
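The scale of those numbers invites a back-of-envelope calculation. Everything below except the 4,400 processors mentioned above is a hypothetical assumption, just to show the order of magnitude involved:

```python
# Rough render-farm throughput estimate (illustrative figures only;
# the 4,400 processors come from the article, the rest are assumptions).
frames = 2 * 60 * 60 * 24       # a two-hour film at 24 frames per second
hours_per_frame = 4             # assumed average render cost per frame
total_core_hours = frames * hours_per_frame
processors = 4400
days = total_core_hours / processors / 24
print(round(days, 1))           # → 6.5 wall-clock days, farm fully busy
```

In reality frames are rendered many times over as shots are revised, which is why the farm runs around the clock for months rather than a week.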
It is not just photo-realistic characters that put a strain on the processors; some of the hardest scenes to create are simulations of Mother Nature.
"The most intensive effects we've seen are fluid dynamics," said Kevin Clark from IT Ops at Lucasfilm.
"You need random number generation training fluid as an actor. It's a very complex scenario from a computational perspective. That takes up the most resources for us."
With each new movie, digital artists attempt more and more complex effects, demanding more processing and more storage - in fact the total number of terabytes needed for an ILM movie has doubled in each of the last three years.
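Doubling compounds quickly: three consecutive doublings is an eightfold increase. A one-liner makes the point, with a purely hypothetical starting figure:

```python
def storage_after(start_tb, years, factor=2.0):
    """Project storage needs under a sustained annual doubling."""
    return start_tb * factor ** years

# If a film had needed 100 TB three years earlier (hypothetical figure):
print(storage_after(100, 3))  # → 800.0
```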
Although the elements and calculations within a movie are increasing, this does not mean that the end result is a higher resolution picture.
Most of ILM's work is done at 2K - around 2,000 pixels across. Amazingly, that is about the same horizontal resolution as a high-definition TV picture.