Saturday, July 9, 2011
Nexus S to serve as brain for 3 robots aboard the ISS
The shuttle Atlantis is set to carry two Nexus S phones into orbit that will turn a trio of floating satellites on the International Space Station into remote-operated robots.
The 135th and last flight of the shuttle program, set for 11:26 a.m. ET, will help advance the cause of robotkind when the Android handsets are attached to the bowling-ball-size orbs.
Propelled by small CO2 thrusters, the Synchronized Position Hold, Engage, Reorient, Experimental Satellites (Spheres) were developed at MIT and have been in use on the ISS since 2006.
As seen in the vid below, they look like the Star Wars lightsaber training droid but are designed to test spacecraft maneuvers, satellite servicing, and flight formation.
Normally, the Spheres orbs carry out preprogrammed commands from a computer aboard the ISS, but the Nexus Android phones will give them increased computing power, cameras, and links to ground crew who will pilot them.
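For the curious, the idea is that the phone acts as a sensing and communications front end while the satellite's own avionics do the actual thruster firing. Here's a rough, purely hypothetical Python sketch of that division of labor; none of these names come from actual NASA or MIT software.

```python
import time
from dataclasses import dataclass

@dataclass
class GroundCommand:
    """A simplified piloting command relayed from ground crew (invented format)."""
    axis: str      # 'x', 'y', or 'z'
    pulse_ms: int  # how long to open the CO2 thruster valve, in milliseconds

def relay(commands):
    """Forward each ground command to the (stubbed) satellite avionics."""
    for cmd in commands:
        # In the real experiment the phone would hand this off to the
        # SPHERES computer over a physical link; here we just print it.
        print(f"fire {cmd.axis}-axis thruster for {cmd.pulse_ms} ms")
        time.sleep(cmd.pulse_ms / 1000)

relay([GroundCommand("x", 200), GroundCommand("y", 150)])
```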
The ISS Nexus-powered robots aren't an entirely new concept, of course. Toy maker Hasbro showed off something similar at Google I/O 2011: conceptual male and female Nexus S robotic docks that can move around, interact with their environment, and even get dizzy when shaken by mischievous handlers.
A robot gets sensitive skin
"The robot has moved a step closer to humanity," concludes a news release put out today by a German research institute on the development of a robotic skin.
The "skin" consists of 2-inch square hexagonal plates packed with sensors for things like touch, acceleration and temperature that are joined together in a honeycomb-like configuration.
"We try to pack many different sensory modalities into the smallest of spaces," said Philip Mettendorfer, who is developing the skin at the Technical University of Munich, in the news release. "In addition, it is easy to expand the circuit boards to later include other sensors, for example, pressure."
The technology, according to the researchers, will provide robots with tactile information to complement their camera eyes, infrared scanners and gripping hands. Tap it on the back, in the dark, and it will know you're there.
In the video above, researchers test the sensors on a robotic arm by brushing it with a piece of tissue paper and touching it with a warm hand, which makes the robot quickly jerk away. In another test, the accelerometer lets the arm keep a cup on a tray steady as the arm is moved around.
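The tray trick boils down to feedback control: the skin's accelerometer reports the lurch, and the arm rotates against it. Here's a minimal sketch of that idea (not the TUM code, and the gain is made up).

```python
def level_tray(accel_x, gain=0.5):
    """Return a wrist-angle correction, in degrees, countering the
    sideways acceleration (m/s^2) the skin cell reports."""
    # Proportional control: push back against the measured disturbance.
    return -gain * accel_x

# Example: the arm lurches at 2.0 m/s^2, so the wrist rotates
# 1 degree the other way to keep the cup from sliding.
print(level_tray(2.0))  # -1.0
```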
For now, the skin consists of just 15 sensors, though the researchers plan to create a prototype robot, completely draped in the skin-like sensors, that can interact with its environment.
The research effort, described in the June issue of IEEE Transactions on Robotics, joins other quests around the world for robotic skin.
Ali Javey's group at the University of California at Berkeley, for example, recently reported on a new material for e-skin that can detect a range of pressures. This could, for example, allow a robot to distinguish between an egg and a frying pan and adjust its grip accordingly.
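Why does sensing a range of pressures matter? Because grip force can then be chosen from the contact reading. This is a hedged sketch of that logic, not Javey's implementation; the thresholds and forces are invented.

```python
def grip_force(sensed_pressure_kpa):
    """Pick a grip force (newtons) from the pressure the e-skin reports."""
    if sensed_pressure_kpa < 5:     # soft, yielding contact, e.g. an eggshell
        return 1.0                  # hold gently
    elif sensed_pressure_kpa < 50:  # firm object, e.g. a frying pan handle
        return 15.0
    return 30.0                     # heavy, rigid object: clamp hard

print(grip_force(3))   # 1.0  -> treat it like an egg
print(grip_force(20))  # 15.0 -> treat it like a pan
```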
NASA scientists have reported developing a skin that would give robots a sense of touch as they move about their environment. Similar to Mittendorfer's concept, this would help robots react when they bump into an object, for example.
Robotic-skin researchers' ambitions don't stop at these sensory accomplishments. "These machines will someday be able to incorporate our fundamental neurobiological capabilities and form a self-impression," according to the Technical University of Munich.