Saturday, July 9, 2011
Nexus S to serve as brain for 3 robots aboard the ISS
The shuttle Atlantis is set to carry two Nexus S phones into orbit, where they will turn a trio of free-floating satellites aboard the International Space Station into remotely operated robots.
The 135th and last flight of the shuttle program, set for liftoff at 11:26 a.m. ET, will help advance the cause of robotkind when the Android handsets are attached to the bowling-ball-size orbs.
Propelled by small CO2 thrusters, the Synchronized Position Hold, Engage, Reorient, Experimental Satellites (Spheres) were developed at MIT and have been in use on the ISS since 2006.
As seen in the video below, they look like the Star Wars lightsaber training droid but are designed to test spacecraft maneuvers, satellite servicing, and formation flight.
Normally, the Spheres orbs carry out preprogrammed commands from a computer aboard the ISS, but the Nexus Android phones will give them increased computing power, cameras, and links to ground crew who will pilot them.
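None of NASA's actual Smart SPHERES software appears below; this is only a rough sketch of the relay idea, with every port, message format and function name invented for illustration, showing how a phone could pass ground-crew commands through to the satellite's thruster controller:

```python
# Hypothetical sketch only -- not NASA/MIT flight code. The phone accepts one
# ground-station connection and forwards maneuver commands to the SPHERES
# controller over an assumed local link.
import json
import socket

THRUSTER_PORT = ("127.0.0.1", 9000)   # assumed link to the satellite's controller

def relay_ground_commands(listen_port=8000):
    """Forward one line of JSON per command, e.g. {"thruster": 3, "ms": 250}."""
    with socket.create_server(("", listen_port)) as server:
        conn, _ = server.accept()
        with conn, socket.create_connection(THRUSTER_PORT) as sphere:
            for line in conn.makefile():
                cmd = json.loads(line)
                sphere.sendall(json.dumps(cmd).encode() + b"\n")
```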
The ISS Nexus-powered robots aren't an entirely new concept, of course. Toy maker Hasbro showed off something similar at Google I/O 2011: conceptual male and female Nexus S robotic docks. The toys can move around, interact with their environment and even get dizzy when shaken by mischievous handlers.
Labels: Android OS, exploration, flying, Google, hardware, space
A robot gets sensitive skin
"The robot has moved a step closer to humanity," concludes a news release put out today by a German research institute on the development of a robotic skin.
The "skin" consists of 2-inch square hexagonal plates packed with sensors for things like touch, acceleration and temperature that are joined together in a honeycomb-like configuration.
"We try to pack many different sensory modalities into the smallest of spaces," said Philip Mettendorfer, who is developing the skin at the Technical University of Munich, in the news release. "In addition, it is easy to expand the circuit boards to later include other sensors, for example, pressure."
The technology, according to the researchers, will provide robots with tactile information to complement their camera eyes, infrared scanners and gripping hands. Tap it on the back, in the dark, and it will know you're there.
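The TUM design itself isn't reproduced here, but as a minimal sketch of the idea (the names and fields below are assumptions), each hexagonal cell can be thought of as a small record carrying several sensing modalities plus a list of its honeycomb neighbours:

```python
# Illustrative sketch only, not the TUM implementation: one hexagonal skin cell
# with several modalities, wired to its neighbours in a honeycomb topology.
from dataclasses import dataclass, field

@dataclass
class SkinCell:
    cell_id: int
    touch: float = 0.0                               # proximity/contact reading
    acceleration: tuple = (0.0, 0.0, 0.0)            # local accelerometer
    temperature: float = 0.0                         # degrees Celsius
    extra: dict = field(default_factory=dict)        # room for added modalities, e.g. pressure
    neighbours: list = field(default_factory=list)   # ids of adjacent cells

def cells_reporting_contact(cells, threshold=0.1):
    """Return the cells whose touch reading exceeds a contact threshold."""
    return [c for c in cells if c.touch > threshold]
```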
In the video above, researchers test the sensors on a robotic arm by doing things such as brushing it with a piece of tissue paper and touching it with a warm hand to show how the robot quickly jerks away. In another test, the accelerometer allows it to keep a cup on a tray steady as the arm is moved around.
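Building on the hypothetical SkinCell above, the reactive behaviours shown in the video could be approximated by a control step like this (the arm object, its methods and the thresholds are all invented for illustration):

```python
# Assumed thresholds and helper names throughout -- a toy reactive step that
# retracts the arm on warm contact and counters lateral acceleration to keep
# a cup on a tray steady.
WARM_CONTACT_C = 35.0

def control_step(cell, arm):
    if cell.touch > 0.1 and cell.temperature > WARM_CONTACT_C:
        arm.retract()                        # jerk away from a warm hand
    ax, ay, _ = cell.acceleration
    arm.tilt_wrist(-ax * 0.5, -ay * 0.5)     # lean the tray against the measured acceleration
```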
For now, the skin consists of just 15 sensors, though the researchers plan to create a prototype robot completely draped in the skin-like sensors that can interact with its environment.
The research effort, described in the June issue of IEEE Transactions on Robotics, joins other quests around the world for robotic skin.
Ali Javey's group at the University of California at Berkeley, for example, recently reported on a new material for e-skin that can detect a range of pressures. This could, for example, allow a robot to distinguish between an egg and a frying pan and adjust its grip accordingly.
NASA scientists reported development of a skin that would give robots a sense of touch as they move about their environment. Similar to Mittendorfer's concept, this would help robots react, for example, when they bump into an object.
The goal for robotic skin experts doesn't stop at the current sensory accomplishments. "These machines will someday be able to incorporate our fundamental neurobiological capabilities and form a self-impression," according to the Technical University of Munich.
Thursday, February 10, 2011
Robots to get their own internet
European scientists have embarked on a project to let robots share and store what they discover about the world.
Called RoboEarth, it will be a place where robots can upload data when they master a task, and ask for help in carrying out new ones.
Researchers behind it hope it will allow robots to come into service more quickly, armed with a growing library of knowledge about their human masters.
The idea behind RoboEarth is to develop methods that help robots encode, exchange and re-use knowledge, said RoboEarth researcher Dr Markus Waibel from the Swiss Federal Institute of Technology in Zurich.
"Most current robots see the world their own way and there's very little standardisation going on," he said. Most researchers using robots typically develop their own way for that machine to build up a corpus of data about the world.
This, said Dr Waibel, made it very difficult for roboticists to share knowledge or for the field to advance rapidly because everyone started off solving the same problems.
By contrast, RoboEarth hopes to start showing how the information that robots discover about the world can be defined so any other robot can find it and use it.
RoboEarth will be a communication system and a database, he said.
In the database will be maps of places that robots work, descriptions of objects they encounter and instructions for how to complete distinct actions.
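RoboEarth's real schema and interfaces aren't reproduced here, but the concept can be sketched as a shared store with those three kinds of entries and a pair of upload/lookup calls (all names below are invented for illustration):

```python
# Hypothetical sketch of the idea, not RoboEarth's actual schema or API: a
# shared store robots can write what they learn into and query it back.
knowledge_base = {
    "maps": {},            # place name -> map / landmarks of that place
    "objects": {},         # object name -> description robots can recognise
    "action_recipes": {},  # task name -> ordered list of primitive actions
}

def upload(category, name, entry):
    """Share something a robot has learned so other robots can reuse it."""
    knowledge_base[category][name] = entry

def lookup(category, name):
    """Fetch shared knowledge by name; returns None if nothing is known yet."""
    return knowledge_base[category].get(name)
```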
The human equivalent would be Wikipedia, said Dr Waibel.
"Wikipedia is something that humans use to share knowledge, that everyone can edit, contribute knowledge to and access," he said. "Something like that does not exist for robots."
It would be great, he said, if a robot could enter a location that it had never visited before, consult RoboEarth to learn about that place and the objects and tasks in it and then quickly get to work.
"The key is allowing robots to share knowledge," said Dr Waibel. "That's really new."
RoboEarth is likely to become a tool for the growing number of service and domestic robots that many expect to become a feature in homes in coming decades.
Dr Waibel said it would be a place that would teach robots about the objects that fill the human world and their relationships to each other.
For instance, he said, RoboEarth could help a robot understand what is meant when it is asked to set the table and what objects are required for that task to be completed.
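Continuing the hypothetical store sketched above, a "set the table" recipe uploaded by one robot could then be retrieved by another robot that has never performed the task:

```python
# Toy usage of the invented upload/lookup helpers above.
upload("action_recipes", "set the table", {
    "objects_needed": ["plate", "fork", "knife", "glass"],
    "steps": ["locate table", "fetch each object", "place object at its setting"],
})

recipe = lookup("action_recipes", "set the table")
for obj in recipe["objects_needed"]:
    print("need:", obj)
```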
The EU-funded project has about 35 researchers working on it and hopes to demonstrate how the system might work by the end of its four-year duration.
Early work has resulted in a way to download descriptions of tasks that are then executed by a robot. Improved maps of locations can also be uploaded.
A system such as RoboEarth was going to be essential, said Dr Waibel, if robots were going to become truly useful to humans.
Labels: A.I., artificial intelligence, exploration, language, networking
Friday, January 28, 2011
Robots learn from rats' brains
Queensland engineers have translated biological findings into probabilistic algorithms that could direct robots through complicated human environments.
While many of today's machines relied on expensive sensors and systems, the researchers hoped their software would improve domestic robots cheaply.
Roboticist Michael Milford worked with neuroscientists to develop algorithms that mimicked three navigational systems in rats' brains: place cells, head direction cells and grid cells.
In an article published in PLoS Computational Biology this week, he described simulating grid cells - recently discovered brain cells that helped rats contextually determine their location.
To explain the function of grid cells, Milford described getting out of a lift on an unknown floor and deducing his location from visual cues like vending machines and photocopiers.
"We take it for granted that we find our way to work ... [but] the problem is extremely challenging," said the Queensland University of Technology researcher.
"Robots are able to navigate to a certain point, but they just get confused and lost in an office building," he told iTnews.
The so-called RatSLAM software was installed in a 20 kg Pioneer 2DXe robot with a forward-facing camera capable of detecting visual cues and their relative bearing and distance.
The robot was placed in a maze similar to those used in experiments with rats, with random goal locations that simulated a rat's collection of randomly thrown pieces of food.
It calibrated itself using visual cues, performing up to 14 iterations per second to determine its location when placed in one of four starting positions.
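The published RatSLAM system is far more sophisticated than this, but as a heavily simplified sketch of the idea (the array shapes, threshold and blending step are all assumptions), each iteration amounts to matching the current camera view against views seen before and pulling the pose estimate toward the best match:

```python
# Greatly simplified illustration, not the published RatSLAM code.
import numpy as np

stored_views = []   # list of (view_vector, pose) pairs learned so far

def localize(current_view, pose_estimate, match_threshold=0.2):
    """current_view and pose_estimate are numpy arrays; returns the updated pose."""
    if not stored_views:
        stored_views.append((current_view, pose_estimate))
        return pose_estimate
    diffs = [np.linalg.norm(current_view - view) for view, _ in stored_views]
    best = int(np.argmin(diffs))
    if diffs[best] < match_threshold:
        _, known_pose = stored_views[best]
        pose_estimate = 0.5 * pose_estimate + 0.5 * known_pose   # pull toward a known place
    else:
        stored_views.append((current_view, pose_estimate))       # remember a new place
    return pose_estimate
```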
Milford explained that environmental changes like lighting, shadows, moving vehicles and people made it difficult for robots to navigate in a human world.
Machines like the Mars Rovers and those competing in the DARPA Challenges tended to use expensive sensors - essentially "throwing a lot of money" at the problem, he said.
But a cheaper solution was needed to direct domestic robots, which were currently still in early stages of development and "very, very, very dumb".
"The only really successful cheap robot that has occurred so far is the [iRobot Roomba] vacuum cleaner," he said. "They don't have any idea where they are; they just move around randomly."
The grid cell project was the latest in almost seven years of Milford's research into applying biological techniques to machines.
The team had been approached "occasionally" by domestic robot manufacturers, he said, but was currently focussed on research, and not commercialisation.
Labels: A.I., animals, artificial intelligence, brains, exploration, navigation, rats