Meet the image engineers at Framestore

This article was taken from the January 2015 issue of WIRED magazine.

In a screening room in London's Soho, William Bartlett watches as a Roman Holiday-era Audrey Hepburn TM (Framestore is legally required by Hepburn's estate to add a TM after each use of her name) steps out of a bus on the Italian Riviera, flirts her way to a free ride in an open-topped saloon and breaks into a bar of Galaxy chocolate. Roman Holiday was released in 1953, Galaxy was first sold in 1960 and Hepburn died in 1993. But the ad, made in 2013, brings her back to life.

Until 2014, few people outside advertising and film-making had heard of special-effects firm Framestore (Bartlett, 45, is its creative director). Based in central London, with offices in Montreal, New York and Los Angeles, it has, for most of its 28-year existence, operated behind the scenes, providing computer-generated special effects for commercials, movies and TV. At best, it could enjoy a credit in a programme's end titles. But, mostly, as with the Galaxy ad, its work has gone unnoticed except by industry insiders.

However, Framestore was thrust into the limelight in 2014 when Tim Webber, the company's VFX supervisor, won the Academy Award for Best Visual Effects for the company's work on Alfonso Cuarón's Gravity.

In the film's spacewalk scenes, which make up most of the film, the only non-digital elements on screen are the heads of Sandra Bullock and George Clooney. Everything else, from their spacesuits and jet packs to the tools they use and the control panels they operate, has never existed except as 1s and 0s on Framestore's computers. More than 80 per cent of the film is computer-generated imagery (CGI), yet it appears to have been shot in a physical environment. It's hard to believe that Bullock and Clooney are not crashing into solid matter.

As Duncan Kenworthy, producer of Four Weddings and a Funeral, Notting Hill and Love Actually, who has worked with the company on and off for the past 20 years, says: "Framestore is in the business of making the unreal real."

Founded in 1986, with a staff of four, Framestore quickly built a reputation in the advertising world for technically impressive computer graphics. In 1996, however, it was approached by Kenworthy to work on a TV adaptation of Gulliver's Travels, in which Ted Danson, filmed against a blue screen, would be inserted into the imagined world.

Layering (or "compositing") one image into another has been used in film-making since the beginning of the 20th century. There are many ways to do it, from double printing a piece of film ("matting") to shooting a scene with back projection. But today, most compositing is done digitally, by filming against a blue (or green) screen and replacing every blue (or green) pixel in a shot with another image. "The problem was the shadows," says Webber, 49, who worked on the series with Bartlett. "The superimposed character didn't cast the right shadows on to their environment. So what we did was develop a way of also lifting the shadows and incorporating them into the scene." The result, even 20 years on, is impressive.
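The digital keying described here can be sketched in a few lines of Python. This is a toy version that pulls a hard matte with a simple threshold; a production keyer adds soft edges, spill suppression and the recovered shadows Webber describes. All function names here are illustrative:

```python
import numpy as np

def blue_screen_composite(foreground, background, threshold=1.3):
    """Replace strongly blue pixels of `foreground` with `background`.

    A pixel counts as "blue screen" when its blue channel clearly
    dominates red and green. Toy illustration only.
    """
    fg = foreground.astype(float)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Boolean mask of pixels where blue dominates the other channels.
    is_blue = (b > threshold * r) & (b > threshold * g)
    out = foreground.copy()
    out[is_blue] = background[is_blue]
    return out

# Tiny 2x2 example: only the pure-blue pixel is replaced.
fg = np.array([[[255, 0, 0], [0, 0, 255]],
               [[0, 255, 0], [200, 200, 200]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 50, dtype=np.uint8)
result = blue_screen_composite(fg, bg)
```

A hard threshold like this is exactly what makes cheap keying look cheap at hair and shadow edges, which is why Framestore's shadow-lifting work on Gulliver mattered.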

Gulliver's Travels was followed in 1999 by Walking With Dinosaurs, a BBC six-parter in which CGI dinosaurs feed, fight and reproduce in an equally realistic (real-world and CGI) environment. Dinosaurs allowed Webber to build on Gulliver, especially when it came to how the superimposed character interacted with the environment into which they were inserted. "You could make things like leaves move as they pushed past," he says, "or dust comes up off their feet, which made it all look more believable."

Dinosaurs also gave Framestore a way to explore creating creatures in CGI -- everything that appeared on screen had to be built and brought to life digitally. And, most importantly of all, both it and Gulliver gave Webber a breakthrough he had long hoped for: a move into long-form storytelling. "It got us into a whole stream of work that was different from the short-form commercials," Webber says. "Gulliver's Travels involved something like 750 shots, which, for anything back then, was a huge number."

Television miniseries (Merlin in 1998, Alice in Wonderland in 1998, Arabian Nights in 2000) followed and, in 2002, after a merger with The Computer Film Company, which had been using similar approaches, Framestore started work on the Harry Potter films, a relationship that would last until the final movie in 2011. The company was in the movie business.

In a sense, Webber and his team have, at every stage, been riding the wave of technological advances of the past two decades. "When we started there was really very little CGI work in film," he says, "because computers couldn't handle the huge amount of data that was needed. You need a higher resolution to go on the big screen and there were just too many pixels to deal with." But they were also careful to build a reputation beyond simply good use of technology. "The challenge is about using the technology," Webber says, "pushing the technology, but always as part of a creative innovation. You can't have one without the other. Well, you can, but it's not successful."

Kenworthy agrees. "Then and now their knowledge of technology was very important," he says, "but it was only a part. You take it as a given that they know everything that the most up-to-date technology is capable of. The skill is the way they work with it -- the creation of a reality that doesn't exist."

Like real-world film-making, CGI is about building sets, lighting them, filling them with actors and recording the light that bounces back off them into the lens of a camera. The difference is that CGI exists entirely as a series of functions, algorithms and outputs in the processor and memory of (often many) computers. Framestore's software R&D department is on the fourth floor of a nondescript block in a mews just north of London's Oxford Street. Better lit and with a higher average age than the company's darkened animators' studios in Soho -- where black T-shirts appear to be standard work uniform and, when WIRED visited, there was the faint smell of curry and sci-fi-nerd sweat in the air -- R&D is run by Martin Preston, an affable 40-year-old software engineer who has been with the company since 2007. Preston and his team are essentially hackers (though they laugh a little uncertainly when WIRED suggests that), modifying standard CGI packages such as Maya (used for creating sets, objects and characters) and Nuke and Flame (used for compositing) to meet the more elaborate requirements of Framestore's ongoing projects. The company typically works on about 50 projects at any one time.

Although all three packages are feature-rich -- "ready made for a small company to do a spinning logo or something like that", says Preston -- achieving the kind of complexity that Framestore is known for requires modding and some cunning workarounds, provided by Preston's department in the form of C++ plug-ins, sometimes running to 200,000 lines of code. It is these technical patches, fiercely guarded from competitors (getting to Preston's office involves two locked doors and a lift, all of which can only be operated by security card), that give Framestore its edge. "Our job," he says, "is to anticipate what's going to be required in the next project we do and to write code to make the artists' life easier."

Since its academic and mainframe origins in the 60s and 70s, most CGI animation has been cartoony and unrealistic, and for a good reason: it's extremely hard to recreate the real world in digital form. Predicting how cloth or fire will move in the wind, how water flows over pebbles or how fur, hair and skin reflect light are huge challenges involving virtually limitless variables.

Artists have avoided them by dressing characters in tight clothes or giving them firmly moussed coifs. But, in the past 15 years or so, the industry has been solving many of these problems through the application of clever software and increasingly powerful processors. Realistic depiction of elements such as human skin remains an enormous challenge -- but even small advances made by the likes of Preston can give an advantage.

Take, for example, moving water. For a recent project, Framestore was asked to inundate a major European city with a flood. The streets and buildings are easy to build as CGI, but making realistic CGI water is extremely difficult. "Audiences expect more now," Preston says. "So animators have to understand more deeply how fluids work, and we write the code to help them do that."

The code itself was written by Preston's colleague Michael Blain. "An artist," Blain says, "can build a receptacle, fill it with a fluid and throw something into it, and the plug-in will make the fluid flow and splash in a realistic way. That means they can stop worrying about whether it looks real or not and concentrate on the drama of the scene. Maybe this wave needs to be bigger, or arrive at a certain point more quickly, so the initial trigger perhaps needs to have more force."
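The artist-facing knob Blain describes -- "the initial trigger perhaps needs to have more force" -- can be illustrated with a deliberately crude stand-in: one particle under gravity rather than a real fluid solver. Every name and number here is invented for illustration; Framestore's actual plug-in is proprietary C++:

```python
GRAVITY = -9.81   # m/s^2
DT = 0.01         # simulation timestep in seconds

def simulate_splash(trigger_impulse, steps=200):
    """Launch a particle upward at `trigger_impulse` m/s and return
    its peak height -- a crude stand-in for 'how big the wave gets'.
    """
    height, velocity = 0.0, trigger_impulse
    peak = 0.0
    for _ in range(steps):
        velocity += GRAVITY * DT               # gravity pulls it back
        height = max(0.0, height + velocity * DT)
        peak = max(peak, height)
        if height == 0.0 and velocity < 0:
            break                              # particle has landed
    return peak

# The artist turns one dial; the solver handles the physics.
small = simulate_splash(trigger_impulse=2.0)
big = simulate_splash(trigger_impulse=5.0)
```

The point of the design Blain describes is exactly this separation: the artist adjusts the trigger, and the solver guarantees the result still moves plausibly.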

Fluid dynamics involve a huge number of data points, especially when you are flooding a major city, but even that challenge pales into insignificance compared to the data processing required to get the movement of light correct, a set of calculations known as rendering.

At almost the final stage of a CGI animation, characters and scenery -- often until this point little more than wireframe skeletons or very basic figures -- are dressed in their outer "skin" and the complex procedure of calculating how light would reflect off them begins. Using a program called a shader, animators are able to see and adjust how the light they have set up in their scene colours the objects they have placed there (or more specifically how that light bounces off those objects into the virtual camera).
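The shader's core job -- turning a light direction, a surface normal and a material colour into the colour the camera sees -- can be shown with the simplest possible case, a diffuse (Lambertian) shader. Production shaders layer specular lobes, textures and subsurface terms on top; this is only the first term:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert_shade(surface_normal, light_dir, light_colour, albedo):
    """Colour at a surface point: albedo * light * cos(angle to light)."""
    n = normalize(surface_normal)
    l = normalize(light_dir)
    # Facing away from the light contributes nothing (clamp at zero).
    cos_theta = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(a * c * cos_theta for a, c in zip(albedo, light_colour))

# Surface facing straight up, lit from directly above: full brightness.
colour = lambert_shade((0, 1, 0), (0, 1, 0), (1.0, 1.0, 1.0), (0.8, 0.2, 0.2))
# Lit edge-on, the same surface goes black.
grazing = lambert_shade((0, 1, 0), (1, 0, 0), (1.0, 1.0, 1.0), (0.8, 0.2, 0.2))
```

Everything the article says about hair, fur and skin amounts to making that one line -- how much light bounces toward the camera -- vastly more sophisticated.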

Shaders are continually being refined, not only as the industry as a whole comes to understand the complexities of how light reflects off the likes of hair, fur and skin, but also because good shaders make good CGI. "In many ways, shaders define our product," Preston says, "the bit we hand over to the film studios, the bit people see."

Rendering requires a huge amount of computation and Framestore maintains ultra-high-bandwidth (10-20Gbps) pipes to very large server arrays, mostly in London's Docklands, which are able to crunch the numbers. Latency is low -- a packet can travel from an artist's workstation in Soho to the Docklands and back again in about 0.1 milliseconds -- but rendering complex scenes can still take hours or even days to complete due to the large amount of data involved. A three- to four-second explosion can comprise as much as 2TB of information and, as Preston's colleague Mark Hills points out, much of a project's cost can come in terms of energy use and cooling the servers themselves.
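Those figures invite a back-of-envelope check: even over the fast pipe, simply moving one explosion's worth of data takes a surprisingly long time, which hints at why the number-crunching lives next to the storage:

```python
# Back-of-envelope from the figures above: a 2 TB explosion asset
# pulled over a 10 Gbit/s pipe (the lower end of the stated range).
explosion_bytes = 2 * 1024**4        # 2 TB, binary units
pipe_bits_per_second = 10 * 10**9    # 10 Gbit/s
transfer_seconds = explosion_bytes * 8 / pipe_bits_per_second
transfer_minutes = transfer_seconds / 60
# Roughly half an hour just to move the data once, before any
# rendering happens -- latency is tiny, but sheer volume dominates.
```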

Animation also places huge demands on hardware. "We have something in the order of petabytes of storage available to us," Hills says. "But what catches the vendors out when they are pitching to us is the pressure we put on those drives, the access speeds we need. During rendering there's a huge amount of information being churned all at once."

Preston laughs in agreement: "You have a massive file pushing in a way that looks like a denial of service attack."

The better the rendering, the more real the final product will look -- and "real" is a word that comes up a lot in conversations at Framestore. Almost everyone will talk about the quest for verisimilitude at some point or other, whether it's in the context of a cleverly stitched together composite in a 30-second ad or the long shots of a film such as Gravity. And nowhere is the challenge of "real" more palpable than in creating CGI humans -- and especially CGI human faces.

As Webber explains, the problem with humans is threefold. First, we (viewers) know faces. "We spend a lot of time [in real life] looking at faces," he says. "And unconsciously we can see that something is wrong, even if we don't know why. The subconscious mind is incredibly skilled at analysing the human face and working out if it is happy, sad, ill, lying... So if there is something that's just wrong, it will spot it immediately."

Second, faces themselves are hugely complex. An array of 43 elaborately interconnected muscles can produce an almost infinite range of expressions that, consciously or unconsciously, we are alert to. This is why it is easy to tell if someone is unhappy even if they are smiling. And this range of expressions seems to be a core part of a person's identity. "We're constantly looking at and interpreting tiny movements in someone's face and understanding what they mean," Bartlett says. "We're so familiar with them that it makes this a very thorny subject. It's something where, so easily, you can go, 'I don't know why it doesn't look right, but it doesn't look right.' With faces you just know."

What's more, it seems, we all move our faces in a unique way.

Early iterations of Hepburn's face in the Galaxy ad, even though they looked exactly like existing images of Hepburn from her early days, became unconvincing as soon as they were made to move. "Basically they looked like the [lookalike] actor we had employed for the live shoot," Bartlett says.

Finally, there is the problem of depicting skin. Skin reflects light in an unusual way. Some photons bounce straight back off the surface, but others penetrate the epidermis, scattering sideways through the skin's lower layers and exiting somewhere else. This "subsurface scattering" is still not entirely understood in the physical world (materials such as marble, shells and wax do the same) and it is hard to synthesise digitally, which is why many CGI humans' skin seems plasticky or just odd.
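One family of cheap approximations to this effect simply softens the diffuse falloff, letting light "wrap" past the point where hard shading would go black -- a hint of why scattered light makes skin glow rather than look plasticky. This is an illustrative real-time trick, not the far costlier models (diffusion dipoles, path-traced scattering) a film renderer would actually use:

```python
# "Wrap" diffuse: a cheap approximation of subsurface scattering.
# Plain Lambert shading goes fully dark past 90 degrees to the light;
# wrapping lets a little light bleed past that line, mimicking photons
# that scattered sideways under the skin and exited elsewhere.
def wrap_diffuse(cos_theta, wrap):
    """cos_theta: cosine of angle between normal and light (-1..1).
    wrap=0 gives the hard Lambert falloff; wrap near 1 gives the
    soft, skin-like falloff that stays faintly lit past 90 degrees."""
    return max(0.0, (cos_theta + wrap) / (1.0 + wrap))

# A point just past the shadow line (cos_theta = -0.2):
hard = wrap_diffuse(-0.2, 0.0)   # plain Lambert: fully dark
soft = wrap_diffuse(-0.2, 0.5)   # wrapped: still faintly lit
```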

Framestore's skill with subsurface scattering (and, in the final version, Galaxy's Hepburn looks very much the genuine article) comes mainly from work the company did on Cuarón's previous film, Children of Men (2006). Cuarón wanted a key character to give birth on screen -- not in the usual long shot/crying sound/actor-holding-up-a-doll way, but in all its very human detail. "Obviously," says Webber, wryly, "it was going to be hard to film this live. Apart from anything else it wouldn't have been easy to find a pregnant actress whose schedule fitted that of the film. So the decision was made to create a CGI baby and animate its birth."

There were many challenges, he says. "You were creating something that had to be believably human. People had done CGI humans before, but not where they had to feel real. It's different if it's a fantasy movie, for example. You can accept they are not real. This was not a fantasy movie and it had to feel absolutely convincing." Added to this, the shot, as is Cuarón's trademark, was long, giving plenty of opportunities for the magic to be ruined. And, of course, most of what viewers would see would be skin. "Getting the movement right was very important," Webber says, "but it was with Children of Men that I think we really cracked subsurface scattering. The scene is shot in quite a low light, a single paraffin lamp, and that helped a lot, of course. But it was also a four-and-a-half-minute shot, which is a lot of time for a viewer to see the joins. Yet I think we did it. I think it was the first time people came out of the cinema and thought, 'that was a real baby.'"

Similarly, viewers have been coming out of Gravity feeling that they witnessed something that was shot in space. (Buzz Aldrin said he was "very, very impressed" by the film.) Yet almost everything was filmed inside a two-million-LED light box, about ten cubic metres, which the company built at Pinewood Studios outside London. "Space is zero gravity, and simulating zero gravity is unbelievably difficult -- way harder than you realise," Webber says. "There is no up and down, and things like skin and clothes don't hang or crease like they do on Earth. Even when we were just filming their faces, if they lean over you start to see a strain on the neck because [on Earth] there is gravity, and that won't do. Every tiny little thing you do is a challenge."

The light box allowed Cuarón to keep his actors relatively upright (avoiding gravity-induced sagging) while the "set" -- CGI objects pixellated into the LEDs -- and the (actual) camera, on a robot arm, moved around them, correctly lighting their faces and creating what, in the end product, is a powerful sense of movement (especially in the 3D version).

Meanwhile, on the CGI side, Webber had to untrain something that is virtually hardwired in good animators: a sense of weight. "Weight is a big part of what animators learn when they are training," he says. "It's almost in their bones. But in space there is no weight and we had to make them think again. Objects in space behave oddly. Once something starts moving, for example, it doesn't stop until it hits something else, as there is no air resistance. This is something very alien to us on Earth."
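Webber's point about motion in orbit fits in a short sketch: the same push produces very different travel depending on whether anything bleeds the speed away. The drag model here is deliberately simplistic:

```python
# One push, two worlds: with drag, speed decays and the object settles;
# with none (orbit), it coasts until it hits something.
STEP = 0.1  # timestep in seconds

def coast(initial_speed, drag, steps=100):
    """Distance travelled after a single push of `initial_speed` m/s.
    `drag` is the fraction of speed lost per step (0 = vacuum)."""
    speed, distance = initial_speed, 0.0
    for _ in range(steps):
        distance += speed * STEP
        speed *= (1.0 - drag)   # drag = 0 -> speed never decays
    return distance

on_earth = coast(1.0, drag=0.05)   # speed bleeds away; travel is capped
in_space = coast(1.0, drag=0.0)    # full 1 m/s for all 10 seconds
```

An animator's instinct expects the `on_earth` curve; Gravity needed the other one, everywhere, for every object in every shot.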

As part of his research, Webber went up in Nasa's "vomit comet" to experience weightlessness first hand. ("I didn't vomit," he says. "Though others did.") And he spoke to astronauts about their experience on the International Space Station. "They were fascinating," he says. "One of the astronauts said that she could take a single hair, use it to push herself off one wall of the ISS and she wouldn't stop until she reached the other end. All of this we had to replicate in the world we built for Gravity."

Webber likes to say he is not led by the capabilities of technology. "I prefer to take on a project, and then see how we can use technology to respond to it creatively," he says. But 5,500km away, in Framestore's New York offices, Mike Woods is taking the opposite tack, experimenting with new tech and seeing if he can find a client who is risk-friendly enough to take it on.

Although he has been with the company for almost as long -- he joined Framestore in 1996 -- Woods, 40, is younger than most of the veterans. He cut his teeth mashing up home-produced videos and uploading them to the web, and, he says, he is "always on the road with new things. We create things internally, make a prototype and take it out to sell to clients."

One example is the Oculus Rift, which Woods got his hands on at a very early stage and used to develop a Game of Thrones VR experience for this year's SXSW. Ten thousand people used the installation in five days. "The hardware breakthrough was the affordability of the device," he says. "Basically it's just a smartphone in a ski mask. And the 4D was also a real help." (Woods used a wind machine and a Rumble Pak to create a multisensory experience.) "But the real strength came from the work we did on real-time rendering."

Photorealistic real-time rendering requires massive processing power, usually in a single computer, constantly recalculating the way light reflects off objects' surfaces as they are moved around by users. Games consoles use powerful graphics processing units to do the maths, which is how their games achieve such realism in real time. Woods's breakthrough was to integrate a similar real-time rendering engine into Framestore's traditional CGI process. "People still use the same publishing tools [to create objects, light them and move them around]," he says. "But in the end we put in a real-time engine, which opens up the way to live virtual reality of a very high quality."
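The gap Woods is bridging can be put as a frame-budget ratio. The two-hours-per-frame figure is an illustrative assumption for an offline film render, not a number from Framestore:

```python
# Offline film rendering can spend hours on a single frame (two hours
# here is an assumed, illustrative figure); VR has to deliver ~90
# frames every second or the illusion -- and the viewer's stomach --
# breaks.
offline_seconds_per_frame = 2 * 60 * 60       # assumed: 2 hours/frame
vr_frames_per_second = 90                     # typical VR refresh target
vr_seconds_per_frame = 1 / vr_frames_per_second
speedup_needed = offline_seconds_per_frame / vr_seconds_per_frame
# Hundreds of thousands of times faster per frame -- hence the
# approximations Woods accepts in textures, mist and water.
```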

The results aren't perfect. "Asking one machine to do something to the same quality as a huge server farm is a big ask," he says. "But we're getting closer. Textures and things like mist and water are great. Human and animal animation [are next]. If you look at, say, the PS4 trailer for Watch Dogs, it's really close to perfection. Take the humans out and it's very filmic."

It's quite possible, of course, that no company will ever crack what could be called the Turing test of CGI, a computer-generated human who acts, moves and emotes like a real person. "We're a long way from pushing a button and having Audrey Hepburn selling beer," Preston says. "Right now she doesn't even open her mouth. Besides, the artist will still be central to the process. We have e-books, for example, but we still need authors."

And this is probably not the real aim of anyone in Framestore, where human creativity is valued as highly as tech savviness. The company thrives on responding to creative challenges using a mix of science, technology, some sleight of hand and a lot of old-fashioned storytelling. (Webber's next goal is to direct a film himself. "I have some ideas in development," he says.) And should the time come when computers can do creative work as well as people, you get the feeling that Webber, Bartlett and their colleagues would be just a little disappointed. "If that happens," says Woods, "we can make a holodeck and I can walk into the sunset knowing there is nothing more to do. You could walk into an office and be given a headset and be anywhere, and you wouldn't even know you were wearing a headset. Luckily, I think that is a long way off."

This article was originally published by WIRED UK