Posts Tagged ‘New York Times’

According to a recent New York Times article, military pilots who operate unmanned Predator drones from Las Vegas, over 7,500 miles from where the drones are flying, experience more fatigue than pilots flying manned planes. The reason? Sensory isolation.

Since drone pilots operate remotely, they rely entirely upon cameras mounted on the aircraft to guide them through their environment. Unlike pilots of manned planes, who are embodied in the very situation they’re operating within, drone pilots experience “significantly increased fatigue, emotional exhaustion and burnout”, according to the article. This is a striking demonstration of why the embodiment of perception and awareness must be taken into account.

Perhaps due to the predominance of a disembodied paradigm, these results may seem counterintuitive. Shouldn’t the pilots who operate remotely, thousands of miles from harm’s way, have it easy? It turns out that reliable, healthy perception and cognition rely heavily upon sensory cues which can’t be abstracted from the conditions of lived, physical embodiment. Put simply, it’s difficult to perceive an environment unless you’re actually in it.

That’s a powerful lesson. Understanding perception, or building a computer or robot that perceives, requires embodying the subject in its environment.


In this recent NY Times article, the author asks: what happened to all of those early promises of cogent robots, fully or partially integrated into our society, helping us with our daily tasks? Where are our robot maids, like in The Jetsons? Robots to dramatically and obnoxiously warn us of impending dangers, as in Lost in Space? Robot pets? It’s already about 8 years past Stanley Kubrick’s ominous prediction of 2001, so where are they?

‘Artificial intelligence’ has become a radical misnomer, with the emphasis falling more on the ‘artificial’ than on the ‘intelligence’. Cutting-edge roboticists at the MIT robotics laboratory have an astute answer. Quite simply, early models of artificial intelligence took for granted what seem to us to be simple bodily tasks, motor functions and perceptual abilities. It turns out that those are actually the most difficult kinds of abilities to program.

If cognition is fundamentally embodied, then it’s no surprise that intelligence hasn’t emerged in robots. Before you can have real, versatile intelligence, you have to master simple motor tasks in unstructured environments. It turns out that our ability to do things like reach for and grasp objects or walk through a changing environment has more to do with higher cognition than anything else. And so far, we can’t even put together a robot with the motor capabilities of a newborn, or a cockroach, for that matter.

On the bright side, there’s no foreseeable danger of assassin terminators taking over the world, either.
