"Are supercomputers on the verge of creating Matrix-style simulated realities? Michael McGuigan at Brookhaven National Laboratory in Upton, New York, thinks so. He says that virtual worlds realistic enough to be mistaken for the real thing are just a few years away.
In 1950, Alan Turing, the father of modern computer science, proposed the ultimate test of artificial intelligence – a human judge engaging in a three-way conversation with a machine and another human should be unable to reliably distinguish man from machine.
A variant on this "Turing Test" is the "Graphics Turing Test", the twist being that a human judge viewing and interacting with an artificially generated world should be unable to reliably distinguish it from reality.
"By interaction we mean you could control an object – rotate it, for example – and it would render in real-time," McGuigan says.
Photoreal animation
Although existing computers can produce artificial scenes and textures detailed enough to fool the human eye, such scenes typically take several hours to render. The key to passing the Graphics Turing Test, says McGuigan, is to marry that photorealism with software that can render images in real-time – defined as a refresh rate of 30 frames per second.
McGuigan decided to test the ability of one of the world's most powerful supercomputers – Blue Gene/L at Brookhaven National Laboratory in New York – to generate such an artificial world.
Blue Gene/L has 18 racks, each with 2000 standard PC processors that work in parallel to provide a huge amount of processing power – it has a speed of 103 teraflops, or 103 trillion floating point operations per second. By way of comparison, a calculator performs about 10 floating point operations per second.
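As a rough back-of-the-envelope illustration (the figures below simply combine the 30 frames per second and 103 teraflops quoted above, and are not taken from McGuigan's paper), that speed translates into a sizeable computing budget for every single frame:

```python
# Rough illustration using the numbers quoted in the article.
FRAMES_PER_SECOND = 30        # the article's definition of real-time
BLUE_GENE_FLOPS = 103e12      # 103 teraflops
CALCULATOR_FLOPS = 10         # roughly 10 floating point operations per second

frame_budget_ms = 1000 / FRAMES_PER_SECOND             # about 33 ms to render each frame
flops_per_frame = BLUE_GENE_FLOPS / FRAMES_PER_SECOND  # about 3.4 trillion operations per frame
speed_ratio = BLUE_GENE_FLOPS / CALCULATOR_FLOPS       # about 10 trillion times a calculator

print(f"{frame_budget_ms:.1f} ms per frame, "
      f"{flops_per_frame:.2e} operations per frame, "
      f"{speed_ratio:.1e} times faster than a calculator")
```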
In particular, McGuigan studied the supercomputer's ability to mimic the interplay of light with objects – an important component of any virtual world with ambitions to mimic reality.
He found that conventional ray-tracing software could run 822 times faster on the Blue Gene/L than on a standard computer, even though the software was not optimised for the parallel processors of a supercomputer. This allowed it to convincingly mimic natural lighting in real time.
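Ray tracing mimics natural lighting by following rays of light from the viewpoint into the scene and working out where each ray strikes an object. The sketch below is a toy illustration of that core intersection test, not the software McGuigan ported to Blue Gene/L; a full renderer repeats a test like this for billions of rays, then fires secondary rays to capture reflections and shadows.

```python
import math

def ray_sphere_intersection(origin, direction, centre, radius):
    """Return the distance along the ray to the nearest hit on a sphere,
    or None if the ray misses. A toy version of the test a ray tracer
    performs for every ray against every object in the scene."""
    # Vector from the sphere centre to the ray origin
    oc = [o - c for o, c in zip(origin, centre)]
    # Coefficients of the quadratic |origin + t*direction - centre|^2 = radius^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4.0 * a * c
    if discriminant < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / (2.0 * a)
    return t if t > 0 else None

# A ray fired from the origin along the z-axis towards a unit sphere 5 units away
print(ray_sphere_intersection((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # prints 4.0
```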
Not there yet
"The nice thing about this ray tracing is that the human eye can see it as natural," McGuigan says. "There are actually several types of ray-tracing software out there – I chose one that was relatively easy to port to a large number of processors. But others might be faster and even more realistic if they are used in parallel computing."
Although Blue Gene/L can model the path of light in a virtual world both rapidly and realistically, the speed with which it renders high-resolution images still falls short of that required to pass the Graphics Turing Test.
But supercomputers capable of passing the test may be just years away, thinks McGuigan. "You never know for sure until you can actually do it," he says. "But a back-of-the-envelope calculation would suggest it should be possible in the next few years, once supercomputers enter the petaflop range – that's 1000 teraflops."
But others think that passing the Graphics Turing Test requires more than photorealistic graphics moving in real-time. Reality is not "skin deep", says Paul Richmond at the University of Sheffield, UK. An artificial object can appear real, but unless it moves in a realistic way the eye won't be fooled. "The real challenge is providing a real-time simulation that includes realistic simulated behaviour," he says.
Fluid challenge
"I'd like to see a realistic model of the Russian ballet," says Mark Grundland at the University of Cambridge. "That's something a photographer would choose as a subject matter, and that's what we should aim to convey with computers."
Grundland also points out that the Graphics Turing Test does not specify what is conveyed in the virtual world scene. "If all that is there is a diffusely-reflecting sphere sitting on a diffusely-reflecting surface, then we've been able to pass the test for many years now," he says. "But Turing didn't mean for his vision to come true so quickly."
McGuigan agrees that realistic animation poses its own problems. "Modelling that fluidity is difficult," he says. "You have to make sure that when something jumps in the virtual world it appears heavy." But he remains optimistic that animation software will be up to the task. "Physical reality is about animation and lighting," he says. "We've done the lighting now – the animation will follow."
Research available on the arXiv preprint server (pdf)