Tuesday, May 15, 2012

Amazon to buy robot maker Kiva




Amazon is buying robot maker Kiva Systems for $US775 million, a deal that will bring more robotic technology to the e-commerce company's giant network of warehouses.

The acquisition, which has been approved by Kiva's stockholders, is expected to close in the second quarter of 2012, Amazon said in a statement. It is Amazon's second-biggest purchase, after the $US1.2 billion it paid for shoe e-tailer Zappos in 2009.

Kiva develops robots that zip around warehouses, grabbing and moving shelves and crates full of products. The technology helps retailers fulfill online orders quickly and with fewer workers. Companies including Gap, Staples and Crate & Barrel have used the technology.

Amazon has traditionally used more employees in its warehouses, or fulfillment centers as they are known. However, Kiva's robots have been used by other e-commerce companies acquired by Amazon in recent years, such as Quidsi and Zappos.

"This is a way to improve efficiency," said Scott Tilghman, an analyst at Caris & Company. "Given the scale of Amazon's operations, it makes sense to have this capability in house."

Fulfillment centers are crucial to Amazon's main online retail business. But the company also offers fulfillment services to other merchants, making the warehouses even more important.

"Amazon has long used automation in its fulfillment centers, and Kiva's technology is another way to improve productivity by bringing the products directly to employees to pick, pack and stow," said Dave Clark, vice president, global customer fulfillment, at Amazon.com.

Amazon has been spending more on fulfillment in recent years as it has opened many new warehouses to handle the rapid growth of its business.

Fulfillment costs as a percentage of revenue rose to more than 9 per cent in 2011, from just over 8 per cent in 2010, according to Aaron Kessler, an analyst at Raymond James.

"That's been a big focus for investors recently," Kessler said. "It's a big cost. They are shipping so much and increasing volume so they need to figure out how to get more leverage out of these fulfillment centers."

Saturday, July 9, 2011

Nexus S to serve as brain for 3 robots aboard the ISS



The shuttle Atlantis is set to carry into orbit two Nexus S phones that will turn a trio of free-floating satellites aboard the International Space Station into remotely operated robots.

The 135th and last flight of the shuttle program, set for 11:26 a.m. ET, will help advance the cause of robotkind when the Android handsets are attached to the bowling-ball-size orbs.

Propelled by small CO2 thrusters, the Synchronized Position Hold, Engage, Reorient, Experimental Satellites (Spheres) were developed at MIT and have been in use on the ISS since 2006.

As seen in the video below, they look like the Star Wars lightsaber training droid but are designed to test spacecraft maneuvers, satellite servicing and formation flight.

Normally, the Spheres orbs carry out preprogrammed commands from a computer aboard the ISS, but the Nexus Android phones will give them increased computing power, cameras, and links to ground crew who will pilot them.
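
To make that division of labor concrete, here is a rough sketch, in Python, of the kind of command relay such a setup implies: the phone buffers steering commands from the ground and hands thruster pulses to the satellite's onboard controller. The class names and message format below are invented for illustration; they are not the actual SPHERES software.

from dataclasses import dataclass
from queue import Queue, Empty


@dataclass
class GroundCommand:
    """A steering request uplinked by the ground crew (assumed format)."""
    axis: str          # e.g. "x", "y" or "z"
    thrust: float      # normalized thruster output, -1.0 .. 1.0
    duration_s: float  # how long to fire


class SphereController:
    """Placeholder for the satellite's existing onboard controller."""

    def fire_thruster(self, axis, thrust, duration_s):
        print(f"CO2 thruster pulse: axis={axis} thrust={thrust:+.2f} for {duration_s}s")


class PhoneRelay:
    """Stands in for the Nexus S: buffers ground commands and hands
    low-level thruster pulses to the satellite controller."""

    def __init__(self, satellite):
        self.satellite = satellite
        self.uplink = Queue()

    def receive_from_ground(self, cmd):
        self.uplink.put(cmd)

    def run_once(self):
        try:
            cmd = self.uplink.get_nowait()
        except Empty:
            return  # no pending command; the orb keeps station-keeping
        self.satellite.fire_thruster(cmd.axis, cmd.thrust, cmd.duration_s)


relay = PhoneRelay(SphereController())
relay.receive_from_ground(GroundCommand(axis="x", thrust=0.4, duration_s=1.5))
relay.run_once()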

The ISS Nexus-powered robots aren't an entirely new concept, of course. Toy maker Hasbro showed off something similar at Google I/O 2011: conceptual male and female Nexus S robotic docks. The toys can move around, interact with their environment and even get dizzy when shaken by mischievous handlers.


A robot gets sensitive skin



"The robot has moved a step closer to humanity," concludes a news release put out today by a German research institute on the development of a robotic skin.

The "skin" consists of hexagonal plates, each about 2 inches across, that are packed with sensors for touch, acceleration and temperature and joined together in a honeycomb-like configuration.

"We try to pack many different sensory modalities into the smallest of spaces," Philip Mittendorfer, who is developing the skin at the Technical University of Munich, said in the news release. "In addition, it is easy to expand the circuit boards to later include other sensors, for example, pressure."

The technology, according to the researchers, will provide robots with tactile information to complement their camera eyes, infrared scanners and gripping hands. Tap it on the back, in the dark, and it will know you're there.
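
To picture how one of those multi-modal cells might be organized in software, here is a minimal, illustrative Python sketch. The field names, thresholds and resting values are assumptions made for the example, not the TUM group's actual design.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SkinCellReading:
    touch: float                               # contact/proximity value, assumed 0..1
    acceleration: Tuple[float, float, float]   # (x, y, z) in m/s^2
    temperature_c: float


@dataclass
class SkinCell:
    """One hexagonal plate; cells keep references to their honeycomb
    neighbours so readings can be passed across the skin."""
    cell_id: int
    neighbours: List["SkinCell"] = field(default_factory=list)

    def read(self):
        # A real cell would query its on-board sensors; this stub returns a
        # resting reading so the sketch stays self-contained. Extra
        # modalities (e.g. pressure) would simply become more fields.
        return SkinCellReading(touch=0.0,
                               acceleration=(0.0, 0.0, 9.81),
                               temperature_c=24.0)


cell = SkinCell(cell_id=0)
print(cell.read())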

In the video above, researchers test the sensors on a robotic arm by doing things such as brushing it with a piece of tissue paper and touching it with a warm hand to show how the robot quickly jerks away. In another test, the accelerometer allows it to keep a cup on a tray steady as the arm is moved around.
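
The reaction logic in those demonstrations can be sketched roughly as follows; the thresholds and the stubbed arm interface are assumptions for illustration, not the researchers' code.

class StubArm:
    """Minimal stand-in for the robot arm's motion interface."""

    def retract(self):
        print("arm jerks away from the contact")

    def tilt_compensate(self, dx, dy):
        print(f"wrist counter-tilt: {dx:+.2f}, {dy:+.2f}")


def react(touch, temperature_c, acceleration, arm,
          contact_threshold=0.2, warm_c=30.0):
    """touch: 0..1 contact value; acceleration: (x, y, z) in m/s^2."""
    if touch > contact_threshold and temperature_c > warm_c:
        arm.retract()                  # warm contact detected: pull away
    else:
        ax, ay, _ = acceleration
        arm.tilt_compensate(-ax, -ay)  # counter lateral acceleration to keep the cup level


react(touch=0.6, temperature_c=34.0, acceleration=(0.0, 0.0, 9.81), arm=StubArm())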

For now, the skin consists of just 15 sensors, though the researchers plan to create a prototype completely draped in the skin-like sensors that can interact with its environment.

The research effort, described in the June issue of IEEE Transactions on Robotics, joins other quests around the world for robotic skin.

Ali Javey's group at the University of California at Berkeley, for example, recently reported on a new material for e-skin that can detect a range of pressures. This could, for example, allow a robot to distinguish between an egg and a frying pan and adjust its grip accordingly.
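
A toy Python illustration of that idea: ramp up grip force until the skin reports enough pressure for a secure hold, with a much lower ceiling for fragile objects. The numbers and the made-up pressure response are assumptions for the sketch, not measurements from the Berkeley work.

def close_gripper(read_pressure_kpa, target_kpa, step=0.5):
    """Increase grip force until the skin reports the target pressure.
    read_pressure_kpa is a callable mapping applied force to sensed pressure."""
    force = 0.0
    while read_pressure_kpa(force) < target_kpa:
        force += step
    return force


# A fragile "egg" profile stops far earlier than a sturdy "frying pan" one.
sensed = lambda force: force * 2.0   # made-up skin response
print("egg grip force:", close_gripper(sensed, target_kpa=5.0))
print("pan grip force:", close_gripper(sensed, target_kpa=40.0))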

NASA scientists have reported the development of a skin that would give robots a sense of touch as they move about their environment. Similar to Mittendorfer's concept, this would help robots react, for example, when they bump into an object.

The ambitions of robotic skin researchers don't stop at these sensory accomplishments. "These machines will someday be able to incorporate our fundamental neurobiological capabilities and form a self-impression," according to the Technical University of Munich.