tag:blogger.com,1999:blog-31077796953757362562024-02-20T06:34:34.828+11:00Robots HomeUnknownnoreply@blogger.comBlogger93125tag:blogger.com,1999:blog-3107779695375736256.post-66091571071393077152012-06-16T00:44:00.004+10:002012-06-16T00:44:42.818+10:00Domesticated robots are at our doorstep ready to serve<div dir="ltr" style="text-align: left;" trbidi="on">
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi1yVXW0m8ktG4Z_8v_U-TgEQJU1zA26dfg9WFEKy4gA5-k1CtPXsU6xFawRf8fDrNI-5Tt7TJdf0c9XBL8gupc8pdn8DIy7UUjLDS7w_n58vAfdk5DpfM2T9YUgPhch7B4fu9rJRagLG9w/s1600/robot.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="231" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi1yVXW0m8ktG4Z_8v_U-TgEQJU1zA26dfg9WFEKy4gA5-k1CtPXsU6xFawRf8fDrNI-5Tt7TJdf0c9XBL8gupc8pdn8DIy7UUjLDS7w_n58vAfdk5DpfM2T9YUgPhch7B4fu9rJRagLG9w/s320/robot.jpg" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="background-color: white; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; text-align: left;">PhD student Shaukat Asidi, Mary Anne Williams and PR2.</span><span style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: left; vertical-align: baseline;"><br /></span></td></tr>
</tbody></table>
<br />
''Think of your most tedious household cleaning task. Now think about never having to do it again,'' exhorts the US robotics company iRobot on its website.<br />
For $US500 you can buy the Roomba 770 vacuum cleaning robot made by the Massachusetts Institute of Technology spin-off company. But why would you do that when you can buy much the same thing locally from Robomaid Australia for $400?<br />
The robotic future is here.<br />
<br />
Online you can order household robots to wash your floors and scoop your gutters and clean your swimming pool. This week, an Israeli company claimed a world first in performing robot-guided brain surgery. Around the world, robots are integrated into factories. In the West Angelas iron mine in the Pilbara, Rio Tinto is running robotic drill rigs remotely from Perth, hauling ore with driverless trucks and soon, with driverless trains. The Port of Brisbane is operated largely by robots, monitored from Sydney.<br />
<br />
''Ten years ago, robots were knocking on our doorstep. Now they have invaded,'' says Professor Mary-Anne Williams, director of the innovation and enterprise research laboratory at the University of Technology, Sydney.<br />
UTS's Magic Lab is the proud owner of a PR2 from Willow Garage in the US. It is using the robot for research into co-robotics, jargon for human-robot interactions, which is the next big thing in robot development. Most industrial robots are not safe to be around, Williams says. Because they have difficulty recognising and responding to the presence of humans, they have to be confined to work cells off limits to people.<br />
<br />
But leaps in sensing and vision technologies are making robots sensitive to co-workers.<br />
''The new vision, the next generation, is for people working side by side with robots,'' Williams says. The PR2, one of about 50 of its kind in the world, can stop and back off if it runs into someone. It can high-five. It can hug.<br />
<br />
Melbourne industrial cutting equipment maker Sutton Tools has been using Japanese-made FANUC robots to move components on and off processing lines around its factories in Australia and New Zealand for the past five years.<br />
The chief engineering executive, Phillip Xuereb, says a machine integrated with the robot system achieves a 40 per cent efficiency gain compared with one without.<br />
<br />
Obvious advantages robots have over humans are that they don't get tired, bored or sick.<br />
''They are very well accepted by the employees because it makes their job more efficient and takes away the dull and boring part of the operations,'' Xuereb says.<br />
The 95-year-old company, which employs 450 people and about 40 robots, has not retrenched anyone as a result of taking on robots, but ''we have been able to add more equipment without having to add more people to the operation as we grow''.<br />
<br />
Manufacturing equipment supplier John Hart Pty Ltd has been the Australian distributor for FANUC for 25 years. The Japanese company has just built a second factory to keep up with the demand, says John Hart's operations manager of automation and robotics, Simon Hales. Because they are made in large volumes, and because they can be adapted for different purposes, robots are ''quite attractive'' in price compared with customised machinery, he says.<br />
<br />
With robotics systems now more intelligent, able to make decisions for themselves, more flexible and more adaptable, most clients are ''quite surprised'' to learn that ''for a lot of applications, you could probably spend $20,000 to $40,000 and have a robot that will do the job for you'', Hales says.<br />
<br />
Barack Obama has his share of nicknames but ''Robama'' can be added to the list since he announced a robot-led renaissance of US manufacturing last June. Of the $US500 million to kick-start smart manufacturing, $70 million would go to robotics projects, the President said after touring a robotics facility in Pittsburgh. Until now, US robotics has been focused on defence industries, Europe has concentrated on manufacturing and Japan has taken the lead in developing humanoid robots for the service industry, particularly aged care.<br />
<br />
Panasonic has prototype hair washing, food serving and dish washing robots.<br />
Australia, though, is a clear world leader in field robotics, says Hugh Durrant-Whyte, the chief executive of National ICT Australia and a leader of Australian robotics research. With its ''big empty spaces'', Australia is ''probably … the best place in the world to do robots'', he says, citing mining, cargo handling, marine and maritime, defence and agricultural applications.<br />
<br />
It is industry wisdom that robots add best value on tasks that are ''dull, dirty and dangerous'', says Salah Sukkarieh, a professor of robotics and intelligent systems at the Australian Centre for Field Robotics at the University of Sydney. But research in which he is involved shows widening applications as robots become better able to interpret and interact with changing environments.<br />
<br />
Unmanned aerial vehicles are being used to map the locations of specific weeds across large land areas and then deliver targeted payloads of herbicide. They are also being used to monitor locust movements and track the endangered swift parrot. Horticulture Australia is funding research into using robots to monitor fruit tree health and count potential yields. Eventually, when they can be taught to recognise ripeness and pick without damaging fruit, robots may be used for harvesting, Sukkarieh says.<br />
<br />
Australia also does world-leading research in the field of compliance, says Dr Matthew Dunbabin, the president of the Australian Robotics and Automation Association and principal research scientist at the CSIRO Information and Communication Technologies Centre.<br />
<br />
Using robots to collect and analyse data for assessing compliance with regulations is ''gaining incredible traction'' among governments and companies, Dunbabin says. His CSIRO work, for example, involves robot measurement of carbon sequestration for use in greenhouse gas accounting. Robots can go around forests and measure tree diameters to provide estimates of how much carbon is captured there.<br />
<br />
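As a rough illustration of how diameter measurements become a carbon figure, a robot's readings could feed a simple allometric power law. The coefficients and carbon fraction below are invented placeholders for illustration, not the CSIRO model:

```python
# Illustrative only: convert tree diameters (cm) measured by a robot
# into an above-ground carbon estimate via a generic allometric power
# law, biomass = a * diameter**b. Coefficients are hypothetical.
A, B = 0.12, 2.5          # placeholder allometric coefficients
CARBON_FRACTION = 0.47    # roughly half of dry biomass is carbon

def stand_carbon_kg(diameters_cm):
    """Sum per-tree carbon estimates over a list of measured diameters."""
    return sum(A * d**B * CARBON_FRACTION for d in diameters_cm)

print(round(stand_carbon_kg([25.0, 31.5, 18.2]), 1))
```

Bigger trees dominate the total, which is why per-tree diameter measurement, rather than a stand-level average, is worth the robot's effort.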
Robotic boats can go around water storage facilities to detect and measure emissions of gases.<br />
Australia's competitive advantage will lie in teaming its world-leading researchers with companies and industries that need the technology and using the results to insert itself in overseas markets, argues Durrant-Whyte. ''The real money is in running the automated terminal in the Port of Brisbane more efficiently than anyone else in the world,'' he says, as an example.<br />
<br />
Professor Roy Green agrees. He is a member of the Prime Minister's manufacturing taskforce, which is preparing a report to the government on the way forward. ''It would be great to think that we were a centre for robotics manufacturing but it is unlikely to be the case,'' he says.<br />
<br />
Australia is more likely to play an important part in the global robot supply chain, in the design and implementation of robotics systems and in the design of components and business models, says Green, who is the dean of the school of business at UTS.<br />
<br />
Xuereb, 67, who has been in tool manufacturing for 32 years, has no qualms about a robot-led future. ''I think they are fantastic,'' he says. ''I think it is just a normal progression of Australian manufacturing.''<br />
</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-71013316777922064592012-05-15T19:55:00.000+10:002012-05-15T19:55:54.339+10:00Amazon to buy robot maker Kiva<div dir="ltr" style="text-align: left;" trbidi="on">
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUmsnkd-n8ert9M3Gmxi8dc-empL_WSHUdYaUEFqAz1H9Q7ny5S3OXmBkk5hR5fWuDwOQ6Ngv71fNTbx80a0xv30aJNhs-2y0gYNeMIan8GrvSgmuygWc2wZ2VrS0a_ibFrX74PN3aLsfP/s1600/Kiva-Systems-robots.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="208" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUmsnkd-n8ert9M3Gmxi8dc-empL_WSHUdYaUEFqAz1H9Q7ny5S3OXmBkk5hR5fWuDwOQ6Ngv71fNTbx80a0xv30aJNhs-2y0gYNeMIan8GrvSgmuygWc2wZ2VrS0a_ibFrX74PN3aLsfP/s320/Kiva-Systems-robots.jpg" width="320" /></a></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
Amazon is buying robot maker Kiva Systems for $US775 million, a deal that will bring more robotic technology to the e-commerce company's giant network of warehouses.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
The acquisition, which has been approved by Kiva's stockholders, is expected to close in the second quarter of 2012, Amazon added in a statement. It is Amazon's second biggest purchase, after it paid $US1.2 billion for shoe e-tailer Zappos in 2009.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
Kiva develops robots that zip around warehouses, grabbing and moving shelves and crates full of products. The technology helps retailers fulfill online orders quickly and with fewer workers. Companies including Gap, Staples and Crate & Barrel have used the technology.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
Amazon has traditionally used more employees in its warehouses, or fulfillment centers as they are known. However, Kiva's robots have been used by other e-commerce companies acquired by Amazon in recent years, such as Quidsi and Zappos.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
"This is a way to improve efficiency," said Scott Tilghman, an analyst at Caris & Company. "Given the scale of Amazon's operations, it makes sense to have this capability in house."<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
Fulfillment centers are crucial to Amazon's main online retail business. But the company also offers fulfillment services to other merchants, making the warehouses even more important.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
"Amazon has long used automation in its fulfillment centers, and Kiva's technology is another way to improve productivity by bringing the products directly to employees to pick, pack and stow," said Dave Clark, vice president, global customer fulfillment, at Amazon.com.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
Amazon has been spending more on fulfillment in recent years as the company opened lots of new warehouses to handle the rapid growth of its business.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
Fulfillment costs as a percentage of revenue rose to more than 9 per cent in 2011, from just over 8 per cent in 2010, according to Aaron Kessler, an analyst at Raymond James.<br />
<br /></div>
<div class="FocusMe" style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">
"That's been a big focus for investors recently," Kessler said. "It's a big cost. They are shipping so much and increasing volume so they need to figure out how to get more leverage out of these fulfillment centers."</div>
<span style="border-bottom-width: 0px; border-color: initial; border-image: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 17px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><br /></span></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-77045925424333105592011-07-09T09:56:00.001+10:002011-07-09T10:00:40.369+10:00Nexus S to serve as brain for 3 robots aboard the ISS<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjfTM0o62ug4XVKgFc6-u72jndOiAt903jvVm4WilARS0wzUIAoirgaUie1Awy9qIC76EEVWCYWuKY4B6553pAF95bRexf0N6jUdgGvEjcpEuvqdWrufEm-r3aouFxsUXJVF4ZXSPCPNN0Y/s1600/nasa-nexus-s-robot-spheres.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjfTM0o62ug4XVKgFc6-u72jndOiAt903jvVm4WilARS0wzUIAoirgaUie1Awy9qIC76EEVWCYWuKY4B6553pAF95bRexf0N6jUdgGvEjcpEuvqdWrufEm-r3aouFxsUXJVF4ZXSPCPNN0Y/s400/nasa-nexus-s-robot-spheres.jpg" width="400" /></a></div><br />
<br />
The shuttle Atlantis is set to carry two Nexus S phones into orbit that will turn a trio of floating satellites on the International Space Station into remote-operated robots.<br />
<br />
The 135th and last flight of the shuttle program, set for 11:26 a.m. ET, will help advance the cause of robotkind when the Android handsets are attached to the bowling ball-size orbs.<br />
<br />
Propelled by small CO2 thrusters, the Synchronized Position Hold, Engage, Reorient, Experimental Satellites (Spheres) were developed at MIT and have been in use on the ISS since 2006.<br />
<br />
As seen in the video below, they look like the Star Wars lightsaber training droid but are designed to test spacecraft maneuvers, satellite servicing, and flight formation.<br />
<br />
Normally, the Spheres orbs carry out preprogrammed commands from a computer aboard the ISS, but the Nexus Android phones will give them increased computing power, cameras, and links to ground crew who will pilot them.<br />
<br />
The ISS Nexus-powered robots aren’t an entirely unique concept, of course. Toy maker Hasbro showed off something similar at Google I/O 2011, its conceptual male and female Nexus S robotic docks. The toys are able to move around and interact with their environment and even get dizzy when shaken by mischievous handlers.<br />
<br />
<br />
<div style="text-align: center;"><object style="height: 288px; width: 440px;"><param name="movie" value="http://www.youtube.com/v/nl6lZbyLkzs?version=3"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed src="http://www.youtube.com/v/nl6lZbyLkzs?version=3" type="application/x-shockwave-flash" allowfullscreen="true" allowScriptAccess="always" width="440" height="288"></object></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-44783077364783536282011-07-09T09:37:00.003+10:002011-07-09T09:47:53.459+10:00A robot gets sensitive skin<object height="268" width="440"><param name="movie" value="http://www.youtube.com/v/5CILOcxjkQY&hl=en_US&feature=player_embedded&version=3">
</param>
<param name="allowFullScreen" value="true">
</param>
<param name="allowScriptAccess" value="always">
</param>
<embed src="http://www.youtube.com/v/5CILOcxjkQY&hl=en_US&feature=player_embedded&version=3" type="application/x-shockwave-flash" allowfullscreen="true" allowScriptAccess="always" width="440" height="268"></embed></object><br />
<br />
"The robot has moved a step closer to humanity," concludes a news release put out today by a German research institute on the development of a robotic skin.<br />
<br />
The "skin" consists of hexagonal plates, each about two inches across, packed with sensors for things like touch, acceleration and temperature; the plates are joined together in a honeycomb-like configuration.<br />
<br />
"We try to pack many different sensory modalities into the smallest of spaces," said Philip Mettendorfer, who is developing the skin at the Technical University of Munich, in the news release. "In addition, it is easy to expand the circuit boards to later include other sensors, for example, pressure."<br />
<br />
The technology, according to the researchers, will provide robots with tactile information to complement their camera eyes, infrared scanners and gripping hands. Tap it on the back, in the dark, and it will know you're there.<br />
<br />
In the video above, researchers test the sensors on a robotic arm by doing things such as brushing it with a piece of tissue paper and touching it with a warm hand to show how the robot quickly jerks away. In another test, the accelerometer allows it to keep a cup on a tray steady as the arm is moved around.<br />
<br />
For now, the skin consists of just 15 sensors, though the researchers plan to create a prototype completely draped in the skin-like sensors that can interact with its environment.<br />
The research effort, described in the June issue of IEEE Transactions on Robotics, joins other quests around the world for robotic skin.<br />
<br />
Ali Javey's group at the University of California at Berkeley, for example, recently reported on a new material for e-skin that can detect a range of pressures. This could, for example, allow a robot to distinguish between an egg and a frying pan and adjust its grip accordingly.<br />
<br />
NASA scientists reported development of a skin that would give robots a sense of touch as they moved about their environment. Similar to Mettendorfer's concept, this would help robots react, for example, when they bump into an object.<br />
<br />
The goal for robotic skin experts doesn't stop at the current sensory accomplishments. "These machines will someday be able to incorporate our fundamental neurobiological capabilities and form a self-impression," according to the Technical University of Munich.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-47284567123529208522011-06-14T22:46:00.000+10:002011-06-14T22:46:11.734+10:00High speed robotic hand<div dir="ltr" style="text-align: left;" trbidi="on"><iframe width="425" height="349" src="http://www.youtube.com/embed/BGgcGF1WAD0" frameborder="0" allowfullscreen></iframe><br />
</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-8724347596552883932011-02-10T07:40:00.002+11:002011-02-12T10:45:24.851+11:00Robots to get their own internet<div dir="ltr" style="text-align: left;" trbidi="on"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjElXz50lAhdRGCJxhOcRMJCKBWCBzhvJwN7IcdrE9wjr4gO7WfRxQRSyGvork9H37DAVj9YL-Na_aCAF1pQWIPMvQFec1JaQ7_ju-k3ApKC4Rif6R8j84oTWTprDJnu18iklIUiKw9xnka/s1600/RoboEarth.org_logo.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="45" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjElXz50lAhdRGCJxhOcRMJCKBWCBzhvJwN7IcdrE9wjr4gO7WfRxQRSyGvork9H37DAVj9YL-Na_aCAF1pQWIPMvQFec1JaQ7_ju-k3ApKC4Rif6R8j84oTWTprDJnu18iklIUiKw9xnka/s320/RoboEarth.org_logo.gif" width="320" /></a></div><br />
<div class="separator" style="clear: both; text-align: center;"></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg80_S7-u8JPNuSVeHMkeFtNagqxnSwkPoETfY3jK_KShXhODYb2isSBiYfRtJxM1Y5Mouw7vlySYpdaBhOUR4dtvi9IxP4vC5O-q9NL1mAyo9TfOKL34FOuub7zVRBmTzpodMWMdGYGYp-/s1600/RoboEarth-Diagram-small.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="286" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg80_S7-u8JPNuSVeHMkeFtNagqxnSwkPoETfY3jK_KShXhODYb2isSBiYfRtJxM1Y5Mouw7vlySYpdaBhOUR4dtvi9IxP4vC5O-q9NL1mAyo9TfOKL34FOuub7zVRBmTzpodMWMdGYGYp-/s320/RoboEarth-Diagram-small.png" width="320" /></a></div><div class="separator" style="clear: both; text-align: center;"><br />
</div><div class="separator" style="clear: both; text-align: center;"><br />
</div><b>Robots could soon have an equivalent of the internet and Wikipedia.</b><br />
<br />
European scientists have embarked on a project to let robots share and store what they discover about the world.<br />
<br />
Called RoboEarth, it will be a place that robots can upload data to when they master a task, and ask for help in carrying out new ones.<br />
<br />
Researchers behind it hope it will allow robots to come into service more quickly, armed with a growing library of knowledge about their human masters.<br />
<br />
<br />
The idea behind RoboEarth is to develop methods that help robots encode, exchange and re-use knowledge, said RoboEarth researcher Dr Markus Waibel from the Swiss Federal Institute of Technology in Zurich.<br />
<br />
"Most current robots see the world their own way and there's very little standardisation going on," he said. Most researchers using robots typically develop their own way for that machine to build up a corpus of data about the world.<br />
<br />
This, said Dr Waibel, made it very difficult for roboticists to share knowledge or for the field to advance rapidly because everyone started off solving the same problems.<br />
<br />
By contrast, RoboEarth hopes to start showing how the information that robots discover about the world can be defined so any other robot can find it and use it.<br />
<br />
RoboEarth will be a communication system and a database, he said.<br />
<br />
In the database will be maps of places that robots work, descriptions of objects they encounter and instructions for how to complete distinct actions.<br />
<br />
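To picture what such a database entry might hold, here is a hypothetical sketch in Python. The field names are invented for illustration; the article only says RoboEarth stores maps, object descriptions and action instructions:

```python
# Hypothetical sketch of a RoboEarth-style knowledge entry: a map of a
# place, the objects found there, and step-by-step action recipes.
kitchen_entry = {
    "environment": {"name": "kitchen_a", "map": "occupancy_grid_v1"},
    "objects": [
        {"label": "mug", "recognition_model": "mug_features_v2"},
        {"label": "plate", "recognition_model": "plate_features_v1"},
    ],
    "actions": {
        "set_table": ["locate plate", "grasp plate", "place plate on table"],
    },
}

def lookup_action(entry, task):
    """A robot entering a new place could fetch instructions for a task."""
    return entry["actions"].get(task, [])

print(lookup_action(kitchen_entry, "set_table"))
```

The point of standardising such records is that any robot, not just the one that learned the task, can query and execute them.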
The human equivalent would be Wikipedia, said Dr Waibel.<br />
<br />
"Wikipedia is something that humans use to share knowledge, that everyone can edit, contribute knowledge to and access," he said. "Something like that does not exist for robots."<br />
<br />
It would be great, he said, if a robot could enter a location that it had never visited before, consult RoboEarth to learn about that place and the objects and tasks in it and then quickly get to work.<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;"><br />
</div><div class="separator" style="clear: both; text-align: center;"><br />
</div>While other projects are working on standardising the way robots sense the world and encode the information they find, RoboEarth tries to go further.<br />
<br />
"The key is allowing robots to share knowledge," said Dr Waibel. "That's really new."<br />
<br />
RoboEarth is likely to become a tool for the growing number of service and domestic robots that many expect to become a feature in homes in coming decades.<br />
<br />
Dr Waibel said it would be a place that would teach robots about the objects that fill the human world and their relationships to each other.<br />
<br />
For instance, he said, RoboEarth could help a robot understand what is meant when it is asked to set the table and what objects are required for that task to be completed.<br />
<br />
The EU-funded project has about 35 researchers working on it and hopes to demonstrate how the system might work by the end of its four-year duration.<br />
<br />
Early work has resulted in a way to download descriptions of tasks that are then executed by a robot. Improved maps of locations can also be uploaded.<br />
<br />
A system such as RoboEarth was going to be essential, said Dr Waibel, if robots were going to become truly useful to humans.<br />
<br />
<br />
</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-66021971544374880832011-01-28T15:49:00.000+11:002011-01-28T15:49:12.050+11:00Robots learn from rats' brains<div dir="ltr" style="text-align: left;" trbidi="on"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7O-CET7VnRtAOY381BwCgLj0aXfMzrzla6E4SGzLv9jEuQ7055VtS092NsQA74HIPJFK-TMVdswESfpkd2hufGpAOPuBUBzfhL8RPbAM6kUDsLz8kT0UqtlHykkP5KA86aXlBGYbZgAg0/s1600/Robot-Rat.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="330" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7O-CET7VnRtAOY381BwCgLj0aXfMzrzla6E4SGzLv9jEuQ7055VtS092NsQA74HIPJFK-TMVdswESfpkd2hufGpAOPuBUBzfhL8RPbAM6kUDsLz8kT0UqtlHykkP5KA86aXlBGYbZgAg0/s400/Robot-Rat.jpg" width="400" /></a></div><br />
<div dir="ltr" style="text-align: left;" trbidi="on"><br />
Queensland engineers have translated biological findings into probabilistic algorithms that could direct robots through complicated human environments.<br />
<br />
While many of today's machines relied on expensive sensors and systems, the researchers hoped their software would improve domestic robots cheaply.<br />
<br />
Roboticist Michael Milford worked with neuroscientists to develop algorithms that mimicked three navigational systems in rats' brains: place cells, head direction cells and grid cells.<br />
<br />
In an article published in PLoS Computational Biology this week, he described simulating grid cells - recently discovered brain cells that helped rats contextually determine their location.<br />
<br />
To explain the function of grid cells, Milford described getting out of a lift at an unknown floor, and deducing his location based on visual cues like vending machines and photocopiers.<br />
<br />
"We take it for granted that we find our way to work ... [but] the problem is extremely challenging," said the Queensland University of Technology researcher.<br />
<br />
"Robots are able to navigate to a certain point, but they just get confused and lost in an office building," he told iTnews.<br />
<br />
The so-called RatSLAM software was installed in a 20kg Pioneer 2DXe robot with a forward facing camera that was capable of detecting visual cues, their relative bearing and distance.<br />
<br />
The robot was placed in a maze similar to those used in experiments with rats, with random goal locations that simulated a rat's collection of randomly thrown pieces of food.<br />
<br />
It calibrated itself using visual cues, performing up to 14 iterations per second to determine its location when placed in one of four initial starting positions.<br />
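The calibration loop described here can be illustrated with a toy Monte Carlo localisation sketch. This is an invented example, not the actual RatSLAM code: candidate poses (including several possible starting positions) are weighted by how well they explain the measured distance to a visual cue at a known map position, and resampling concentrates the estimate.

```python
import random
import math

# Hypothetical landmark map; the cue names and coordinates are invented.
LANDMARKS = {"vending_machine": (0.0, 0.0), "photocopier": (4.0, 3.0)}

def likelihood(particle, cue, measured_dist, noise=0.5):
    """How well a candidate pose explains a measured distance to a cue."""
    lx, ly = LANDMARKS[cue]
    expected = math.hypot(particle[0] - lx, particle[1] - ly)
    err = measured_dist - expected
    return math.exp(-err * err / (2 * noise * noise))

def update(particles, cue, measured_dist):
    """Reweight and resample candidate poses after one cue observation."""
    weights = [likelihood(p, cue, measured_dist) for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    return random.choices(particles, weights=weights, k=len(particles))

# Four candidate starting positions, echoing the maze experiment.
particles = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0), (4.0, 3.0)] * 25
# One observation: the robot measures zero distance to the photocopier.
particles = update(particles, "photocopier", 0.0)
```

After a single confident observation the particle cloud collapses onto the consistent pose; the real system runs such updates many times per second.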
<br />
Milford explained that environmental changes like lighting, shadows, moving vehicles and people made it difficult for robots to navigate in a human world.<br />
<br />
Machines like the Mars Rovers and those competing in the DARPA Challenges tended to use expensive sensors - essentially "throwing a lot of money" at the problem, he said.<br />
<br />
But a cheaper solution was needed to direct domestic robots, which were currently still in early stages of development and "very, very, very, dumb".<br />
<br />
"The only really successful cheap robot that has occurred so far is the [iRobot Roomba] vacuum cleaner," he said. "They don't have any idea where they are; they just move around randomly."<br />
<br />
The grid cell project was the latest in almost seven years of Milford's research into applying biological techniques to machines.<br />
<br />
The team had been approached "occasionally" by domestic robot manufacturers, he said, but was currently focussed on research, not commercialisation.</div></div>
<h3>Google Cars Drive Themselves, in Traffic (2010-10-11)</h3>
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3NHDul-1kqJe4Cah3VtyNPQJPRh-8E8zAu1xgLzQ5E6TSUkNo-hSqTOJGLJQf5ITtmqSaDtERavq0aHI1iz2wUUgScWqlJgLCeuNq5i0rASsVl5zwiKH0NQoyMm8IRVI9aJoY8kWuCY8T/s1600/googlecar.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="237" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3NHDul-1kqJe4Cah3VtyNPQJPRh-8E8zAu1xgLzQ5E6TSUkNo-hSqTOJGLJQf5ITtmqSaDtERavq0aHI1iz2wUUgScWqlJgLCeuNq5i0rASsVl5zwiKH0NQoyMm8IRVI9aJoY8kWuCY8T/s400/googlecar.jpg" width="400" /></a></div><br />
<br />
Anyone driving the twists of Highway 1 between San Francisco and Los Angeles recently may have glimpsed a Toyota Prius with a curious funnel-like cylinder on the roof. Harder to notice was that the person at the wheel was not actually driving.<br />
<br />
The car is a project of Google, which has been working in secret but in plain view on vehicles that can drive themselves, using artificial-intelligence software that can sense anything near the car and mimic the decisions made by a human driver.<br />
<br />
With someone behind the wheel to take control if something goes awry and a technician in the passenger seat to monitor the navigation system, seven test cars have driven 1,000 miles without human intervention and more than 140,000 miles with only occasional human control. One even drove itself down Lombard Street in San Francisco, one of the steepest and curviest streets in the nation. The only accident, engineers said, was when one Google car was rear-ended while stopped at a traffic light.<br />
<br />
Autonomous cars are years from mass production, but technologists who have long dreamed of them believe that they can transform society as profoundly as the Internet has.<br />
<br />
Robot drivers react faster than humans, have 360-degree perception and do not get distracted, sleepy or intoxicated, the engineers argue. They speak in terms of lives saved and injuries avoided — more than 37,000 people died in car accidents in the United States in 2008. The engineers say the technology could double the capacity of roads by allowing cars to drive more safely while closer together. Because the robot cars would eventually be less likely to crash, they could be built lighter, reducing fuel consumption. But of course, to be truly safer, the cars must be far more reliable than, say, today’s personal computers, which crash on occasion and are frequently infected.<br />
<br />
The Google research program using artificial intelligence to revolutionize the automobile is proof that the company’s ambitions reach beyond the search engine business. The program is also a departure from the mainstream of innovation in Silicon Valley, which has veered toward social networks and Hollywood-style digital media.<br />
<br />
During a half-hour drive beginning on Google’s campus 35 miles south of San Francisco last Wednesday, a Prius equipped with a variety of sensors and following a route programmed into the GPS navigation system nimbly accelerated in the entrance lane and merged into fast-moving traffic on Highway 101, the freeway through Silicon Valley.<br />
<br />
It drove at the speed limit, which it knew because the limit for every road is included in its database, and left the freeway several exits later. The device atop the car produced a detailed map of the environment.<br />
<br />
The car then drove in city traffic through Mountain View, stopping for lights and stop signs, as well as making announcements like “approaching a crosswalk” (to warn the human at the wheel) or “turn ahead” in a pleasant female voice. This same pleasant voice would, engineers said, alert the driver if a master control system detected anything amiss with the various sensors.<br />
<br />
The car can be programmed for different driving personalities — from cautious, in which it is more likely to yield to another car, to aggressive, where it is more likely to go first.<br />
<br />
Christopher Urmson, a Carnegie Mellon University robotics scientist, was behind the wheel but not using it. To gain control of the car he has to do one of three things: hit a red button near his right hand, touch the brake or turn the steering wheel. He did so twice, once when a bicyclist ran a red light and again when a car in front stopped and began to back into a parking space. But the car seemed likely to have prevented an accident itself.<br />
<br />
When he returned to automated “cruise” mode, the car gave a little “whir” meant to evoke going into warp drive on “Star Trek,” and Dr. Urmson was able to rest his hands by his sides or gesticulate when talking to a passenger in the back seat. He said the cars did attract attention, but people seem to think they are just the next generation of the Street View cars that Google uses to take photographs and collect data for its maps.<br />
<br />
The project is the brainchild of Sebastian Thrun, the 43-year-old director of the Stanford Artificial Intelligence Laboratory, a Google engineer and the co-inventor of the Street View mapping service.<br />
<br />
In 2005, he led a team of Stanford students and faculty members in designing the Stanley robot car, winning the second Grand Challenge of the Defense Advanced Research Projects Agency, a $2 million Pentagon prize for driving autonomously over 132 miles in the desert.<br />
<br />
Besides the team of 15 engineers working on the current project, Google hired more than a dozen people, each with a spotless driving record, to sit in the driver’s seat, paying $15 an hour or more. Google is using six Priuses and an Audi TT in the project.<br />
<br />
The Google researchers said the company did not yet have a clear plan to create a business from the experiments. Dr. Thrun is known as a passionate promoter of the potential to use robotic vehicles to make highways safer and lower the nation’s energy costs. It is a commitment shared by Larry Page, Google’s co-founder, according to several people familiar with the project.<br />
<br />
The self-driving car initiative is an example of Google’s willingness to gamble on technology that may not pay off for years, Dr. Thrun said. Even the most optimistic predictions put the deployment of the technology more than eight years away.<br />
<br />
One way Google might be able to profit is to provide information and navigation services for makers of autonomous vehicles. Or, it might sell or give away the navigation technology itself, much as it offers its Android smart phone system to cellphone companies.<br />
<br />
But the advent of autonomous vehicles poses thorny legal issues, the Google researchers acknowledged. Under current law, a human must be in control of a car at all times, but what does that mean if the human is not really paying attention as the car crosses through, say, a school zone, figuring that the robot is driving more safely than he would?<br />
<br />
And in the event of an accident, who would be liable — the person behind the wheel or the maker of the software?<br />
<br />
“The technology is ahead of the law in many areas,” said Bernard Lu, senior staff counsel for the California Department of Motor Vehicles. “If you look at the vehicle code, there are dozens of laws pertaining to the driver of a vehicle, and they all presume to have a human being operating the vehicle.”<br />
<br />
The Google researchers said they had carefully examined California’s motor vehicle regulations and determined that because a human driver can override any error, the experimental cars are legal. Mr. Lu agreed.<br />
<br />
Scientists and engineers have been designing autonomous vehicles since the mid-1960s, but crucial innovation happened in 2004 when the Pentagon’s research arm began its Grand Challenge.<br />
<br />
The first contest ended in failure, but in 2005, Dr. Thrun’s Stanford team built the car that won a race with a rival vehicle built by a team from Carnegie Mellon University. Less than two years later, another event proved that autonomous vehicles could drive safely in urban settings.<br />
<br />
Advances have been so encouraging that Dr. Thrun sounds like an evangelist when he speaks of robot cars. There is their potential to reduce fuel consumption by eliminating heavy-footed stop-and-go drivers and, given the reduced possibility of accidents, to ultimately build more lightweight vehicles.<br />
<br />
There is even the farther-off prospect of cars that do not need anyone behind the wheel. That would allow the cars to be summoned electronically, so that people could share them. Fewer cars would then be needed, reducing the need for parking spaces, which consume valuable land.<br />
<br />
And, of course, the cars could save humans from themselves. “Can we text twice as much while driving, without the guilt?” Dr. Thrun said in a recent talk. “Yes, we can, if only cars will drive themselves.”
<h3>Aiming to Learn as We Do, a Machine Teaches Itself: NELL, the Never-Ending Language Learning system (2010-10-10)</h3>
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9icw2nD5nCbca_PaiPsyx1okC5MB0eIG7OYijxHGc89vY4NFcZ_tv0Hm86O65b5rpMCSDROfY8QefqPvWhM4AihaheVBg1zMhyphenhyphenzeU9-ed7KVjZh6TJpjNDgAoABg1q15rseseLGwUdmNu/s1600/nellTeam.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="176" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9icw2nD5nCbca_PaiPsyx1okC5MB0eIG7OYijxHGc89vY4NFcZ_tv0Hm86O65b5rpMCSDROfY8QefqPvWhM4AihaheVBg1zMhyphenhyphenzeU9-ed7KVjZh6TJpjNDgAoABg1q15rseseLGwUdmNu/s320/nellTeam.jpg" width="320" /></a></div>Give a computer a task that can be crisply defined — win at chess, predict the weather — and the machine bests humans nearly every time. Yet when problems are nuanced or ambiguous, or require combining varied sources of information, computers are no match for human intelligence.<br />
<br />
Few challenges in computing loom larger than unraveling semantics, understanding the meaning of language. One reason is that the meaning of words and phrases hinges not only on their context, but also on background knowledge that humans learn over years, day after day.<br />
<br />
Since the start of the year, a team of researchers at Carnegie Mellon University — supported by grants from the Defense Advanced Research Projects Agency and Google, and tapping into a research supercomputing cluster provided by Yahoo — has been fine-tuning a computer system that is trying to master semantics by learning more like a human. Its beating hardware heart is a sleek, silver-gray computer — calculating 24 hours a day, seven days a week — that resides in a basement computer center at the university, in Pittsburgh. The computer was primed by the researchers with some basic knowledge in various categories and set loose on the Web with a mission to teach itself.<br />
<br />
“For all the advances in computer science, we still don’t have a computer that can learn as humans do, cumulatively, over the long term,” said the team’s leader, Tom M. Mitchell, a computer scientist and chairman of the machine learning department.<br />
<br />
The Never-Ending Language Learning system, or NELL, has made an impressive showing so far. NELL scans hundreds of millions of Web pages for text patterns that it uses to learn facts, 390,000 to date, with an estimated accuracy of 87 percent. These facts are grouped into semantic categories — cities, companies, sports teams, actors, universities, plants and 274 others. The category facts are things like “San Francisco is a city” and “sunflower is a plant.”<br />
<br />
NELL also learns facts that are relations between members of two categories. For example, Peyton Manning is a football player (category). The Indianapolis Colts is a football team (category). By scanning text patterns, NELL can infer with a high probability that Peyton Manning plays for the Indianapolis Colts — even if it has never read that Mr. Manning plays for the Colts. “Plays for” is a relation, and there are 280 kinds of relations. The number of categories and relations has more than doubled since earlier this year, and will steadily expand.<br />
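The category-plus-relation idea can be sketched in a few lines. The schema, thresholds and names below are invented for illustration and are not NELL's actual representation: a candidate relation is only accepted when both arguments carry the expected category types and the supporting text patterns occur often enough.

```python
# Toy knowledge base: category facts learned earlier (invented schema).
categories = {"Peyton Manning": "football_player",
              "Indianapolis Colts": "football_team"}
relations = []  # accepted facts: (subject, relation, object, confidence)

def propose_relation(subj, rel, obj, pattern_hits, total_hits):
    """Accept a 'plays_for' candidate only if the arguments have the right
    category types and the supporting text patterns are frequent enough."""
    confidence = pattern_hits / total_hits
    if (categories.get(subj) == "football_player"
            and categories.get(obj) == "football_team"
            and confidence > 0.5):
        relations.append((subj, rel, obj, confidence))

# Suppose 9 of 10 mined sentence patterns support the relation.
propose_relation("Peyton Manning", "plays_for", "Indianapolis Colts", 9, 10)
```

The point of the sketch is that the relation can be inferred with high probability from aggregate pattern statistics, even if no single sentence states it outright.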
<br />
The learned facts are continuously added to NELL’s growing database, which the researchers call a “knowledge base.” A larger pool of facts, Dr. Mitchell says, will help refine NELL’s learning algorithms so that it finds facts on the Web more accurately and more efficiently over time.<br />
<br />
NELL is one project in a widening field of research and investment aimed at enabling computers to better understand the meaning of language. Many of these efforts tap the Web as a rich trove of text to assemble structured ontologies — formal descriptions of concepts and relationships — to help computers mimic human understanding. The ideal has been discussed for years, and more than a decade ago Sir Tim Berners-Lee, who invented the underlying software for the World Wide Web, sketched his vision of a “semantic Web.”<br />
<br />
Today, ever-faster computers, an explosion of Web data and improved software techniques are opening the door to rapid progress. Scientists at universities, government labs, Google, Microsoft, I.B.M. and elsewhere are pursuing breakthroughs, along somewhat different paths.<br />
<br />
For example, I.B.M.’s “question answering” machine, Watson, shows remarkable semantic understanding in fields like history, literature and sports as it plays the quiz show “Jeopardy!” Google Squared, a research project at the Internet search giant, demonstrates ample grasp of semantic categories as it finds and presents information from around the Web on search topics like “U.S. presidents” and “cheeses.”<br />
<br />
Still, artificial intelligence experts agree that the Carnegie Mellon approach is innovative. Many semantic learning systems, they note, are more passive learners, largely hand-crafted by human programmers, while NELL is highly automated. “What’s exciting and significant about it is the continuous learning, as if NELL is exercising curiosity on its own, with little human help,” said Oren Etzioni, a computer scientist at the University of Washington, who leads a project called TextRunner, which reads the Web to extract facts.<br />
<br />
Computers that understand language, experts say, promise a big payoff someday. The potential applications range from smarter search (supplying natural-language answers to search queries, not just links to Web pages) to virtual personal assistants that can reply to questions in specific disciplines or activities like health, education, travel and shopping.<br />
<br />
“The technology is really maturing, and will increasingly be used to gain understanding,” said Alfred Spector, vice president of research for Google. “We’re on the verge now in this semantic world.”<br />
<br />
With NELL, the researchers built a base of knowledge, seeding each kind of category or relation with 10 to 15 examples that are true. In the category for emotions, for example: “Anger is an emotion.” “Bliss is an emotion.” And about a dozen more.<br />
<br />
<br />
Then NELL gets to work. Its tools include programs that extract and classify text phrases from the Web, programs that look for patterns and correlations, and programs that learn rules. For example, when the computer system reads the phrase “Pikes Peak,” it studies the structure — two words, each beginning with a capital letter, and the last word is Peak. That structure alone might make it probable that Pikes Peak is a mountain. But NELL also reads in several ways. It will mine for text phrases that surround Pikes Peak and similar noun phrases repeatedly. For example, “I climbed XXX.”<br />
<br />
NELL, Dr. Mitchell explains, is designed to be able to grapple with words in different contexts, by deploying a hierarchy of rules to resolve ambiguity. This kind of nuanced judgment tends to flummox computers. “But as it turns out, a system like this works much better if you force it to learn many things, hundreds at once,” he said.<br />
<br />
For example, the text-phrase structure “I climbed XXX” very often occurs with a mountain. But when NELL reads, “I climbed stairs,” it has previously learned with great certainty that “stairs” belongs to the category “building part.” “It self-corrects when it has more information, as it learns more,” Dr. Mitchell explained.<br />
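The interplay between a pattern cue and prior knowledge can be sketched as follows. The pattern weights and confidence values are invented, not NELL's: the pattern "I climbed X" votes for "mountain", but a high-confidence fact already in the knowledge base overrides the vote.

```python
# Invented pattern statistics: how often each pattern co-occurs with a category.
PATTERN_VOTES = {"I climbed X": {"mountain": 0.8, "building_part": 0.2}}

# Facts learned earlier with high certainty (hypothetical confidences).
KNOWLEDGE_BASE = {"stairs": ("building_part", 0.99)}

def classify(phrase, pattern):
    """Classify a noun phrase seen in a text pattern, letting confident
    prior knowledge override what the pattern alone would suggest."""
    votes = PATTERN_VOTES[pattern]
    if phrase in KNOWLEDGE_BASE:
        category, confidence = KNOWLEDGE_BASE[phrase]
        if confidence > max(votes.values()):
            return category           # prior knowledge wins
    return max(votes, key=votes.get)  # otherwise trust the pattern

print(classify("Pikes Peak", "I climbed X"))  # mountain
print(classify("stairs", "I climbed X"))      # building_part
```

This mirrors the self-correction described above: learning many categories at once gives the system priors strong enough to veto a misleading pattern.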
<br />
NELL, he says, is just getting under way, and its growing knowledge base of facts and relations is intended as a foundation for improving machine intelligence. Dr. Mitchell offers an example of the kind of knowledge NELL cannot manage today, but may someday. Take two similar sentences, he said. “The girl caught the butterfly with the spots.” And, “The girl caught the butterfly with the net.”<br />
<br />
A human reader, he noted, inherently understands that girls hold nets, and girls are not usually spotted. So, in the first sentence, “spots” is associated with “butterfly,” and in the second, “net” with “girl.”<br />
<br />
“That’s obvious to a person, but it’s not obvious to a computer,” Dr. Mitchell said. “So much of human language is background knowledge, knowledge accumulated over time. That’s where NELL is headed, and the challenge is how to get that knowledge.”<br />
<br />
A helping hand from humans, occasionally, will be part of the answer. For the first six months, NELL ran unassisted. But the research team noticed that while it did well with most categories and relations, its accuracy on about one-fourth of them trailed well behind. Starting in June, the researchers began scanning each category and relation for about five minutes every two weeks. When they find blatant errors, they label and correct them, putting NELL’s learning engine back on track.<br />
<br />
When Dr. Mitchell scanned the “baked goods” category recently, he noticed a clear pattern. NELL was at first quite accurate, easily identifying all kinds of pies, breads, cakes and cookies as baked goods. But things went awry after NELL’s noun-phrase classifier decided “Internet cookies” was a baked good. (Its database related to baked goods or the Internet apparently lacked the knowledge to correct the mistake.)<br />
<br />
NELL had read the sentence “I deleted my Internet cookies.” So when it read “I deleted my files,” it decided “files” was probably a baked good, too. “It started this whole avalanche of mistakes,” Dr. Mitchell said. He corrected the Internet cookies error and restarted NELL’s bakery education.<br />
<br />
His ideal, Dr. Mitchell said, was a computer system that could learn continuously with no need for human assistance. “We’re not there yet,” he said. “But you and I don’t learn in isolation either.”
<h3>EPFL develops Linux-based swarming micro air vehicles (2010-09-28)</h3>
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNZQGvk6C1Fx70fWjQ-Mcz1BIlzAAgOOnMBXXPxr2Wwpkw-lFhHvTBr_yHaV3IaeSRJxV1i3KcEwu4Eahe2dAGzthCDu5llsU_tgXx7Kq-dqRyjxfmPYlU0KJUjT60lOXOTcs4ofUCW6xd/s1600/swarm-01.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNZQGvk6C1Fx70fWjQ-Mcz1BIlzAAgOOnMBXXPxr2Wwpkw-lFhHvTBr_yHaV3IaeSRJxV1i3KcEwu4Eahe2dAGzthCDu5llsU_tgXx7Kq-dqRyjxfmPYlU0KJUjT60lOXOTcs4ofUCW6xd/s400/swarm-01.jpg" width="400" /></a></div><br />
<br />
The good people at Ecole Polytechnique Federale de Lausanne (or EPFL) in Switzerland have been very busy lately, as this video demonstrates. <br />
<br />
Not only have they put together a scalable system that will let any flying robot perch in a tree or similar structure, but now they've gone and developed a platform for swarming air vehicles (with Linux, nonetheless). <br />
<br />
Said to be the largest network of its kind, the ten SMAVNET swarm members control their own altitude, airspeed, and turn rate based on input from the onboard gyroscope and pressure sensors. The goal is to develop low-cost devices that can be deployed in disaster areas to create ad hoc communications networks, although we can't help but think this would make the best Christmas present ever.<br />
<br />
<object height="304" width="500"><param name="movie" value="http://www.youtube.com/v/pfYs5C8D4uk&hl=en_US&feature=player_embedded&version=3"></param><param name="allowFullScreen" value="true"></param><param name="allowScriptAccess" value="always"></param><embed src="http://www.youtube.com/v/pfYs5C8D4uk&hl=en_US&feature=player_embedded&version=3" type="application/x-shockwave-flash" allowfullscreen="true" allowScriptAccess="always" width="500" height="304"></embed></object>
<h3>Future farms to be run by robots (2010-09-12)</h3>
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKSL8QjLEttDUV9oleqVMEPxY8bGFtgSkExABE8jAYxuEC6N31eXJJCRRhL1gRXY38Vo4bQEuuPZGBZ7BHtMBg5Py8Qxf-yOw5t6frg6diSlXezXr2V-7rCw2KK4eEgSBR2ONJLDKiEL8B/s1600/robot02.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="332" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKSL8QjLEttDUV9oleqVMEPxY8bGFtgSkExABE8jAYxuEC6N31eXJJCRRhL1gRXY38Vo4bQEuuPZGBZ7BHtMBg5Py8Qxf-yOw5t6frg6diSlXezXr2V-7rCw2KK4eEgSBR2ONJLDKiEL8B/s400/robot02.jpg" width="400" /></a></div><br />
<br />
Robots from Mechanisation Automation Robotics Remote Sensing (MARRS) technologies could one day run automated farms in Australia, a futuristic researcher from the University of Queensland says.<br />
<br />
Dr Adam Postula says technologies can be used to control unmanned aircraft or unmanned tractors, using detection systems capable of observing environments using visual, infra-red or laser light wavelengths.<br />
<br />
The emerging technologies can also help farmers by detecting and communicating in real-time variable environmental, field, and crop parameters such as moisture content, temperature and humidity.<br />
<br />
Dr Postula and a colleague will speak on the role of smart machines in the future of Australian farming at an industry event in Marburg, west of Brisbane, on Wednesday.<br />
<br />
The workshop will focus on opportunities available to Australian farmers through the introduction of robots and smart machines into their operations.<br />
<br />
The remote-controlled farm could become a reality, just as mines are becoming more automated, he said.<br />
<br />
"That's definitely possible. Look what happens with farms now - how many people we've employed on farms before and how many we have now," Dr Postula said.<br />
<br />
"I've seen a mine in Sweden where there were no people underground, everything was controlled from above ground."<br />
<br />
The future of farming is largely about precision, Dr Postula told AAP on Monday.<br />
<br />
"It's not only pursued in space - where you put your plants in particular locations - but also you know almost everything about your soil, about moisture, stuff that really matters for growing," he said.<br />
<br />
"In order to know that you have to have sensors that are close to the plant."<br />
<br />
That means the sensors must be cheap, he said.<br />
<br />
Dr Postula said any four-wheel drive vehicle can be made autonomous, and unmanned aircraft will be able to scan and estimate the size of crops and the maturity of fruit, or determine the location of cattle.<br />
<br />
"We expect that walking, moving, flying robots will be commonplace on Australian farms in the future," Dr Postula said.
<h3>Robot eats sewage for energy: researchers develop synthetic gut (2010-07-24)</h3>
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSI-T4pJNnRYV7w-393pGl4s_MfdUkwfs_dv-zKy9Ay-vnx34p31PDDmgIH7XGEsh_9SNbR75IEJtRCswPT9rEwW8jd-Q6Bk5mHJ8t9ew9_tT58YBtaj83GQBAXAhLoEY527sLpztpBrut/s1600/EcoBot3.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSI-T4pJNnRYV7w-393pGl4s_MfdUkwfs_dv-zKy9Ay-vnx34p31PDDmgIH7XGEsh_9SNbR75IEJtRCswPT9rEwW8jd-Q6Bk5mHJ8t9ew9_tT58YBtaj83GQBAXAhLoEY527sLpztpBrut/s320/EcoBot3.jpg" width="183" /></a></div><div class="separator" style="clear: both; text-align: center;"><br />
</div>Hot on the heels of the first synthetic cell comes a slightly lower-brow advance: a synthetic gut. The basic function that it provides could be the key to freedom for self-sustaining robots.<br />
<br />
In the bid to create such autonomous robots, researchers turned to biomass as an energy source. By being able to feed themselves, robots could be set to work for long periods without human intervention.<br />
<br />
Such food-munching robots have been demonstrated in the past, often generating power with the help of microbial fuel cells (MFCs) - bio-electrochemical devices that enlist cultures of bacteria to break down food to generate power. Until now, though, no one had tackled the messy but inevitable issue of finding a way to evacuate the waste these bugs produce.<br />
<br />
What was needed was an artificial gut, says Chris Melhuish, director of the Bristol Robotics Lab in the UK. He has spent three years with Ioannis Ieropoulos and colleagues working up the concept. The result: Ecobot III.<br />
<br />
"Diarrhoea-bot would be more appropriate," Melhuish admits. "It's not exactly knocking out rabbit pellets." Even so, he says, it marks the first demonstration of a biomass-powered robot that can operate unaided for some time.<br />
<br />
<br />
Previous incarnations of Ecobot showed that it is possible to generate enough power for the robot to exhibit certain basic, yet intelligent behaviours, such as moving towards a light source. Human intervention was needed to clean up after meals, though.<br />
<br />
Now, by redesigning the robot to include a digestive tract, Ecobot III has shown that it can survive for up to seven days, feeding and "watering" itself unaided. It obediently expels its waste into a litter tray once every 24 hours.<br />
<br />
The key to getting this gut to work, says Ieropoulos, is a recycling system that relies on a gravity-fed peristaltic pump which, like the human colon, applies waves of pressure to squeeze unwanted matter out of a tube.<br />
<br />
At the start of the digestive process the robot feeds itself by moving into contact with a dispenser. This pumps a nutrient-rich solution of partially processed sewage into its "mouth" where it is distributed into 48 separate MFCs within the robot. This fluid is a concoction of minerals, salts, yeast extracts and other nutrients. As unappetising as this mixture sounds, for the culture of microbes in the robot's stomach it is ambrosia itself.<br />
<br />
At the heart of the process is a reduction-oxidation reaction that takes place in the anode chambers of each of the robot's MFCs. As the bacteria metabolise the organic matter, hydrogen atoms are given off. The hydrogen's electrons migrate to the electrode, generating a current, while hydrogen ions pass through a proton-exchange membrane into the cathode chamber of the cell, which contains water. Here, oxygen dissolved in that water combines with the protons to produce additional water. Because this supply of water gradually evaporates, the robot also needs regular drinks, which it gets from a separate spout.<br />
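The electron and proton flow described above is the textbook microbial-fuel-cell redox pair. Glucose is used here as a representative substrate (the robot's actual feed is a mixed sewage broth, so the real anode reaction involves many organic species):

```latex
% Anode: bacteria oxidise organic matter, releasing protons and electrons
\mathrm{C_6H_{12}O_6 + 6\,H_2O \rightarrow 6\,CO_2 + 24\,H^+ + 24\,e^-}
% Cathode: dissolved oxygen recombines with the protons to form water
\mathrm{O_2 + 4\,H^+ + 4\,e^- \rightarrow 2\,H_2O}
```

The electrons travel through the external circuit (the usable current), while the protons cross the exchange membrane; the water produced and consumed at the cathode is why the robot also needs regular drinks.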
<br />
The cells are arranged in a stack of two tiers of 24 (see picture), designed to allow gravity to direct any heavy undigested matter to accumulate in a central trough. The contents are repeatedly re-circulated from the trough into the robot's feeder tanks to extract as much energy as possible, before being excreted.<br />
<br />
Getting rid of this waste not only prevents fuel cells from filling up and becoming clogged, but also removes any acidic waste products from the digester that might poison the bacteria, says Ieropoulos.<br />
<br />
As things stand, the fuel cells are capable of extracting a mere 1 per cent of the chemical energy available in the robot's food, despite the recycling process. The system uses off-the-shelf components, so modifying the anodes to have a larger surface area upon which bacteria can attach themselves should help extract far more energy, says Ieropoulos.<br />
<br />
Robert Finkelstein, who is heading the Energetically Autonomous Tactical Robot (EATR) project at the US's military research agency DARPA, thinks MFC technology is the wrong choice. It is inefficient and too slow to convert energy, he says.<br />
<br />
EATR will derive its energy from burning biomass rather than eating it. Using a novel combustion engine, developed by Cyclone Power Technology of Pompano Beach, Florida, the hope is that when EATR is assembled and tested later this month it will generate enough energy to roll 160 kilometres on 60 kilograms of biomass. In terms of the calorific value of the fuel, that's better than the average car, says Finkelstein.<br />
<br />
One of the advantages of MFCs, though, is that they can consume almost anything, including waste water, a substance that isn't easily burned, says Ieropoulos. The bacteria in Ecobot III's gut are made up of hundreds of different species, allowing it to adapt to different foodstuffs. One of the ideas the group is playing with, and the reason they are using waste water as food, is to see if these fuel cells could be used as part of a filtration system to clean up sewage water.<br />
<br />
The work will be presented at the Artificial Life conference in Odense, Denmark, next month. The next step is to explore how the robot will cope with a heartier meal, namely flies.<br />
<br />
The carnivorous-robot-fearing public need not worry, says Melhuish. Much of the energy generated from flies will go into powering the robot's digestive system. With an average speed of about 21 centimetres a day, it is unlikely to catch you, he says.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-84964421325285269872010-05-01T12:54:00.000+10:002010-05-01T12:54:09.756+10:00Family Nanny robot is just five years and $1,500 away from being your new best friend<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkmdrAx1pReuwajiM5AstRrD-xBfqayppE6ZLt1L0icNCdQDeBFuhEDaPslMzfs6RvFiGo1Pn0rmoN9cC5RsdSVQlelaUdtNQ876QoGdrZ4JGKFNOMjuYBEAZnjrZoYvhglwyofadF9Cxh/s1600/familynannyrobot04292010.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="212" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkmdrAx1pReuwajiM5AstRrD-xBfqayppE6ZLt1L0icNCdQDeBFuhEDaPslMzfs6RvFiGo1Pn0rmoN9cC5RsdSVQlelaUdtNQ876QoGdrZ4JGKFNOMjuYBEAZnjrZoYvhglwyofadF9Cxh/s320/familynannyrobot04292010.jpg" width="320" /></a></div><br />
While Japan's busy preparing its robotic invasion on the moon, China's Siasun Robot & Automation Co., Ltd. has its eyes on Planet Earth instead.<br />
<br />
Meet Family Nanny, a two-foot-seven, 55-pound robot that can talk, email, text, detect gas leaks, and run around on its two wheels for eight hours on a single two-hour charge. <br />
<br />
It'll make great chatty company for the elderly while it relays vital stats back to health monitoring systems. In case of emergencies such as a gas leak, the Family Nanny will alert the owner via text and email. <br />
<br />
Not bad for ¥10,000 ($1,465), we'd say, but we'll remain skeptical on its chatting skills until it launches -- supposedly sometime around 2015.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-66021162594763023032010-04-02T20:45:00.001+11:002010-04-02T20:45:10.749+11:00Robot folds laundry<object height="356" width="440"><param name="movie" value="http://www.youtube.com/v/gy5g33S0Gzo&rel=0&color1=0x5d1719&color2=0xcd311b&hl=en_US&feature=player_embedded&fs=1"></param><param name="allowFullScreen" value="true"></param><param name="allowScriptAccess" value="always"></param><embed src="http://www.youtube.com/v/gy5g33S0Gzo&rel=0&color1=0x5d1719&color2=0xcd311b&hl=en_US&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowScriptAccess="always" width="440" height="356"></embed></object><br />
<br />
UC Berkeley roboticist Pieter Abbeel and his colleagues developed software that enables a robot to fold towels. From the abstract to their scientific paper:<br />
<br />
"The robot begins by picking up a randomly dropped towel from a table, goes through a sequence of vision-based re-grasps and manipulations-- partially in the air, partially on the table--and finally stacks the folded towel in a target location. The reliability and robustness of our algorithm enables for the first time a robot with general purpose manipulators to reliably and fully-autonomously fold previously unseen towels, demonstrating success on all 50 out of 50 single-towel trials as well as on a pile of 5 towels. "Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-69316078308610828282010-03-28T21:31:00.001+11:002010-04-02T20:50:41.487+11:00Virtual pets that can learn<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEsZxbuMVeN_ZjcmMv1eihbH44RTv_TpglNQ7OVg1dOw_ogB41hGvwWYdY8QspAfBvIllmBhA-Xmn6M7ulJIib0Pw7WW8rVIOC_dIHjp7YdlunxoTe_Eb-U6Q9gqOUppuQX9HxJi-Ac_fz/s1600-h/parrot.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><br />
</a></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcDrDMZIRVIrBkqlQbVRTZ4XKZCgPhRvtZ_C3WqztXFALerRa_JSYjiZ8DF4q31VmNp0EZi8tW23cKS50PYBrtgZbFUyMCjS_5B-7HSws5CodGLEiJcsnOnXDAebGZbpPMQLsoL8_ZEqn-/s1600-h/virtual.dog.for.sale.in.second.life.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="264" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcDrDMZIRVIrBkqlQbVRTZ4XKZCgPhRvtZ_C3WqztXFALerRa_JSYjiZ8DF4q31VmNp0EZi8tW23cKS50PYBrtgZbFUyMCjS_5B-7HSws5CodGLEiJcsnOnXDAebGZbpPMQLsoL8_ZEqn-/s320/virtual.dog.for.sale.in.second.life.jpg" width="320" /></a></div><br />
"SIT," says the man. The dog tilts its head but does nothing. "Sit," the man repeats.<br />
<br />
The dog lies down. "No!" the man admonishes.<br />
<br />
Then, unable to get the dog to sit, the man decides to teach it by example. He sits down himself.<br />
<br />
"I'm sitting. Try sitting," he says. The dog cocks its head attentively, folds its hind legs under its body and sits. "Good!" says the man.<br />
<br />
No, it's not a rather bizarre way to teach your pet new tricks. It is a demonstration of a synthetic character in a virtual world being controlled by an autonomous artificial intelligence (AI) program, which will be released to inhabitants of virtual worlds such as Second Life later this year.<br />
<br />
Novamente, a company in Washington DC that built the AI program controlling the dog, says the demonstration is a foretaste not just of future virtual pets but of computer games to come. Their work, along with similar programs from other researchers, was presented at the First Conference on Artificial General Intelligence at the University of Memphis in Tennessee earlier this month.<br />
<br />
If first impressions are anything to go by, synthetic pets like Novamente's dog will be a far cry from today's virtual pets, such as Neopets and Nintendogs, which can only perform pre-programmed moves, such as catching a disc. "The problem with current virtual pets is they are rigidly programmed and lack emotions, responsiveness, individual personality or the ability to learn," says Ben Goertzel of Novamente. "They are pretty much all morons."<br />
<br />
In contrast, Goertzel claims that synthetic characters like his dog can be taught almost anything, even things that their programmers never imagined.<br />
<br />
For instance, owners could train their pets to help win battles in adventure games such as World of Warcraft, says Sibley Verbek of the Electric Sheep Company in New York City, which helped Novamente create the virtual pets. "It is a system that allows the user to teach the virtual character anything they want to," he says.<br />
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiapTmV5ztn7f3docj91Rr1kn895Cb8tNA0JoGcvLYM7vI3ue7AYBkS5oKA7pnSgHpx1J03CmtEmh_L-65k88vqpeYKZ8E1aMK9nxqm0y83i2cXOECe4UIsz78qPxpEGh7QC8_8Xhgzwcsz/s1600-h/prototype.virtual.dog.in.second.life.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiapTmV5ztn7f3docj91Rr1kn895Cb8tNA0JoGcvLYM7vI3ue7AYBkS5oKA7pnSgHpx1J03CmtEmh_L-65k88vqpeYKZ8E1aMK9nxqm0y83i2cXOECe4UIsz78qPxpEGh7QC8_8Xhgzwcsz/s320/prototype.virtual.dog.in.second.life.jpg" width="247" /></a></div><br />
So how do these autonomous programs work? Take Novamente's virtual pet, which is expected to be the first to hit the market. One way that the pets learn is by being taught specific tasks by human-controlled avatars, similar to the way babies are taught by their parents.<br />
<br />
To do this, the humans must tell the pet directly - via Second Life's instant-messaging interface - that they are about to teach it a task. When the pet receives a specific command, such as "I am going to teach you to sit", it works out that it is about to learn something new called "sit". It then watches the human avatar and starts to copy some of the things the teacher does.<br />
<br />
At first it doesn't know which aspects of the task are important. This can lead to mistakes: the dog lying down instead of sitting, for example. But it soon figures out the correct behaviour by trying the task several times in a variety of ways. The key learning mechanism is that the pets are pre-programmed to seek praise from their owners, so they can make increasingly intelligent guesses about what they should copy, repeating adjustments that seem to make the human avatar more likely to say "good dog", and avoiding those that elicit the response "bad dog". Eventually, the pet figures out how to sit.<br />
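For the programming-minded, the praise-guided trial and error described above can be sketched in a few lines of Python. This is a hypothetical toy, not Novamente's actual code: the variant names, the praise function and the explore/exploit split are all illustrative.

```python
import random

# Toy sketch of praise-driven learning: each "variant" is one way of
# attempting the demonstrated task; praise ("good dog" = +1,
# "bad dog" = -1) steers the pet toward the right one.

def learn_trick(variants, praise_for, trials=100, seed=0):
    """Try task variants, favouring those that earn the most praise."""
    rng = random.Random(seed)
    scores = {v: 0.0 for v in variants}   # running average praise per variant
    counts = {v: 0 for v in variants}
    for _ in range(trials):
        # Mostly repeat the best-scoring variant, sometimes experiment.
        if rng.random() < 0.2:
            choice = rng.choice(variants)
        else:
            choice = max(variants, key=lambda v: scores[v])
        reward = praise_for(choice)        # owner says "good" or "bad"
        counts[choice] += 1
        scores[choice] += (reward - scores[choice]) / counts[choice]
    return max(variants, key=lambda v: scores[v])

# The owner praises sitting and scolds everything else.
owner = lambda action: 1 if action == "sit" else -1
best = learn_trick(["sit", "lie_down", "spin"], owner)
print(best)  # the pet settles on "sit"
```

The pet keeps a running average of the praise each variant earns, mostly repeats the current best and occasionally experiments - enough for "sit" to win out.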
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEsZxbuMVeN_ZjcmMv1eihbH44RTv_TpglNQ7OVg1dOw_ogB41hGvwWYdY8QspAfBvIllmBhA-Xmn6M7ulJIib0Pw7WW8rVIOC_dIHjp7YdlunxoTe_Eb-U6Q9gqOUppuQX9HxJi-Ac_fz/s1600-h/parrot.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="267" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEsZxbuMVeN_ZjcmMv1eihbH44RTv_TpglNQ7OVg1dOw_ogB41hGvwWYdY8QspAfBvIllmBhA-Xmn6M7ulJIib0Pw7WW8rVIOC_dIHjp7YdlunxoTe_Eb-U6Q9gqOUppuQX9HxJi-Ac_fz/s320/parrot.png" width="320" /></a></div><br />
Learning by imitation isn't exactly a new idea: robots in the real world are still being trained this way. But it hasn't been easy. For example, a real robot needs sophisticated computer vision to recognise its teacher's legs, so that it can isolate their movement and copy it. But legs vary greatly in apparent size and shape, depending on how they move and the angle they are viewed from, making them hard for a robot to recognise.<br />
<br />
In Second Life, you can get round this problem. Characters don't see objects from a certain angle, nor from a particular distance; all they know is the 3D coordinates of the object, allowing them to recognise legs simply by their geometry. Once the pet can recognise legs, Goertzel then programs it to map the leg movements to the movement of its own legs. Obviously, the pet's own legs are a different size and shape, so the exact same motions wouldn't be appropriate. But the pets experiment with slightly different variations on the theme - and then settle on the set of movements that elicits the most praise from the avatar.<br />
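Because the world hands the pet raw 3D coordinates, the "different-sized legs" problem reduces to simple geometry. A hypothetical sketch - the function and numbers are illustrative, not taken from Novamente's system:

```python
# Toy sketch of retargeting a teacher avatar's leg motion onto a pet
# whose legs are a different length. Joint positions are 3D coordinates
# relative to the hip, which Second Life-style worlds expose directly
# (no computer vision needed).

def retarget(teacher_path, teacher_leg_len, pet_leg_len):
    """Scale a sequence of (x, y, z) joint positions to the pet's leg size."""
    scale = pet_leg_len / teacher_leg_len
    return [(x * scale, y * scale, z * scale) for (x, y, z) in teacher_path]

# A teacher with 0.9 m legs bends the knee through three poses;
# the pet's legs are 0.3 m, so the whole motion shrinks by a factor of 3.
path = [(0.0, 0.0, -0.9), (0.1, 0.0, -0.6), (0.2, 0.0, -0.45)]
print(retarget(path, 0.9, 0.3))
```

The scaled trajectory is only a starting point; as the article notes, the pet then varies it and keeps whichever version earns the most praise.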
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCeAowlHpEU-zIgsVkxppy_qGWDNipdLnMfHYfJjRqjuvTlARdHLBJBddeKP4MosydkT8A_5xmc9-k2ET96mIJH7ycadiXesuzhJetg6Bq6Hvvuu9Kmw6Xo_6kG3JT5B54U3ILVkrV35-Z/s1600-h/Pets-Cue-2-Bunnies_001.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCeAowlHpEU-zIgsVkxppy_qGWDNipdLnMfHYfJjRqjuvTlARdHLBJBddeKP4MosydkT8A_5xmc9-k2ET96mIJH7ycadiXesuzhJetg6Bq6Hvvuu9Kmw6Xo_6kG3JT5B54U3ILVkrV35-Z/s320/Pets-Cue-2-Bunnies_001.jpg" width="320" /></a></div><br />
So far, Goertzel says he has successfully taught his dogs to play fetch, basic soccer skills such as kicking the ball, faking a shot and dribbling, and to dance a simple series of moves, just by showing them how (watch a video of the demo at www.novamente.net/puppy.mov).<br />
<br />
Imitation isn't the only way the pets learn, however. They can also learn things humans may not have intended to teach them. As well as seeking praise, they are programmed with other basic desires, such as hunger and thirst, along with a tendency to move about randomly and explore the virtual environment. As they explore, their "memory" records everything that happens. It then carries out statistical analyses to find combinations of sequences and actions that seem to predict the fulfilment of their goals, such as the appeasement of hunger, and uses that knowledge to guide future behaviour. This can lead to more sophisticated behaviour, such as a dog learning to touch its bowl when a human walks into the room, because that increases the chance of a goal being fulfilled. "It learns that going near the bowl is symbolic for food," says Goertzel. "This is a sort of rudimentary gestural communication."<br />
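The statistical side can be pictured as a frequency table over logged episodes. Another hypothetical sketch, with made-up event names - the real system's representation is surely richer:

```python
from collections import defaultdict

# Toy sketch of the "memory + statistics" learning described above:
# the pet logs (situation, action, goal_fulfilled) episodes and
# estimates which action best predicts a goal such as getting fed.

def best_action(episodes, situation):
    """Pick the action with the highest observed goal-fulfilment rate."""
    hits = defaultdict(int)
    tries = defaultdict(int)
    for (sit, action, fed) in episodes:
        if sit == situation:
            tries[action] += 1
            hits[action] += fed
    return max(tries, key=lambda a: hits[a] / tries[a])

log = [
    ("human_enters", "touch_bowl", 1),
    ("human_enters", "bark", 0),
    ("human_enters", "touch_bowl", 1),
    ("human_enters", "wander", 0),
]
print(best_action(log, "human_enters"))  # touching the bowl predicts food
```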
<br />
Goertzel is aiming even higher. He says learning gestures could eventually form the basis for virtual pets to learn language, just as it does in young children. "Eventually we want to have virtual babies or talking parrots that learn to speak," he says (see "If only they could talk").<br />
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTNY9oFP43uU2QCqaAK2I2zyEe1U4OnMSA4py3m67kSti9DfptMfIYOH-lsbL7WvNy5zRhtdxInCNWzRyeWjR7Hl5qQs61Q76-MURvFEFwqa-SDg06Sf1DuygxDfmlSaISO-qaDfMaTucp/s1600-h/pet.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="290" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTNY9oFP43uU2QCqaAK2I2zyEe1U4OnMSA4py3m67kSti9DfptMfIYOH-lsbL7WvNy5zRhtdxInCNWzRyeWjR7Hl5qQs61Q76-MURvFEFwqa-SDg06Sf1DuygxDfmlSaISO-qaDfMaTucp/s320/pet.jpg" width="320" /></a></div><br />
Deb Roy, an AI researcher at the Massachusetts Institute of Technology, worries that people will tire of training their virtual pets. "Philosophically I am on board. These are lovely and powerful ideas," he says. "But what are the results that show [Goertzel's team] are making progress compared to people who have tried similar things?"<br />
<br />
Novamente has a few tricks up its sleeve to stop people from getting bored. For starters, the synthetic characters will learn quickly as more and more people use them. Although each pet has its own "brain", Novamente's servers will pool knowledge from all the brains. So once one pet has mastered one trick, it will be much easier for another one to master it, too.<br />
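Pooling could be as simple as averaging each pet's per-trick scores on the server. Again a purely hypothetical sketch - Novamente hasn't published how its servers actually merge the "brains":

```python
# Toy sketch of pooling trick knowledge across pets: each pet uploads
# its per-trick scores, and new pets start from the averaged
# server-side prior instead of learning from scratch.

def pool(brains):
    """Average trick scores across many pets' 'brains'."""
    pooled = {}
    for brain in brains:
        for trick, score in brain.items():
            n, total = pooled.get(trick, (0, 0.0))
            pooled[trick] = (n + 1, total + score)
    return {trick: total / n for trick, (n, total) in pooled.items()}

pets = [{"sit": 0.9, "fetch": 0.4}, {"sit": 0.7}]
print(pool(pets))  # "sit" averages the two pets; "fetch" is known to one
```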
<br />
Researchers at Novamente are not the only ones who hope to create compelling synthetic characters. Selmer Bringsjord, Andrew Shilliday and colleagues at Rensselaer Polytechnic Institute in Troy, New York, are working on a character called Eddie, which they hope will reason about another human's state of mind - potentially leading to characters that understand deceit and betrayal - and predict what other characters will do next.<br />
<br />
The fusing of virtual worlds and AI will almost certainly be good for AI. Since the field failed to deliver on its initial promises of machines you can chat to, robotic assistants that do your housework and conscious machines, it has been hard to get funding to build generally intelligent programs. Instead, more specific "narrow AI" approaches, such as computer vision and chess-playing, have flourished. Novamente is planning to make its pets so much fun that people will actually pay money to interact with them. If so, the multibillion-dollar games industry could drive AI towards delivering on its original promise.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-81074206404235338382010-03-28T21:20:00.000+11:002010-03-28T21:20:10.103+11:00Could the fusion of games, virtual worlds and artificial intelligence take us closer to building artificial brains?<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMvzdC9BJo0LkxzBr3wr8LhrD80xmTtLYZPdXndNChQiTkf7lcH55b-SVXwoyAoXCmUUucEi_-GNS70onX4bVHM4845ihP1a5-72wWxpRx7EC96t9vJjIO9K7JlA-llOgpos5ASKbktFgx/s1600-h/secondLifeCat.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="231" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMvzdC9BJo0LkxzBr3wr8LhrD80xmTtLYZPdXndNChQiTkf7lcH55b-SVXwoyAoXCmUUucEi_-GNS70onX4bVHM4845ihP1a5-72wWxpRx7EC96t9vJjIO9K7JlA-llOgpos5ASKbktFgx/s320/secondLifeCat.jpg" width="320" /></a></div><br />
Novamente is a company that creates virtual pets equipped with artificial intelligence. <br />
As they move towards this goal, they hope the pets will learn to make common-sense assumptions the way humans do, which could eventually allow them to understand and produce natural language, for example.<br />
<br />
One of the biggest challenges faced by researchers trying to imbue computers with natural language abilities is getting them to resolve ambiguities. Take this sentence: "I saw the man with a telescope." There are three possible interpretations: I was looking at a man holding a telescope; I saw a man through my telescope; or, more morbidly, I was sawing a man with a telescope. Context would help a human figure out the intended meaning, while a computer might be flummoxed.<br />
<br />
But in an environment like Second Life, a synthetic character endowed with AI could use its immediate experience and interactions with other avatars and objects to make sense of language the way humans might. "The stuff that really excites me is to start teaching [pets] simple language," says Ben Goertzel of Novamente.<br />
<br />
But other AI researchers doubt that virtual environments will be rich enough for synthetic characters to move towards the kind of general intelligence that is required for natural language processing. Stephen Grand, an independent researcher from Baton Rouge, Louisiana, who created the AI game Creatures in the mid-1990s, applauds the Novamente approach, but thinks there are limits to learning inside a virtual world. <br />
<br />
"Just imagine how intelligent you would be if you were born with nothing more than the sensory information available to a Second Life inhabitant," he says. "It's like trying to paint a picture while looking through a drinking straw."Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-90924270938832338612010-03-25T21:08:00.000+11:002010-03-25T21:20:57.653+11:00IBM Simulates a Cat-Like Brain: AI or Shadow Minds for Humans?<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGAwHF0XlV76CGYU1uQ8J3MstuTWtMWYCo7KkSjaUb7OHBIrUF7JqzdkGL5n5pJtSGlLAslvXJxi0nP7TT98DnNujtZt4VvPWBq-GunY5pX9H4UW9IF_toBYaBhlxHSLlxI8-QraWdWW3_/s1600/robotCat.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 267px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGAwHF0XlV76CGYU1uQ8J3MstuTWtMWYCo7KkSjaUb7OHBIrUF7JqzdkGL5n5pJtSGlLAslvXJxi0nP7TT98DnNujtZt4VvPWBq-GunY5pX9H4UW9IF_toBYaBhlxHSLlxI8-QraWdWW3_/s400/robotCat.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5452511567968007938" /></a><br /><br />IBM's Almaden Research Center have announced that they had produced a "cortical simulation" of the scale and complexity of a cat brain. <br /><br />This simulation ran on one of IBM's "Blue Gene" supercomputers, in this case at the Lawrence Livermore National Laboratory (LLNL). <br /><br />This isn't a simulation of a cat brain, it's a simulation of a brain structure that has the scale and connection complexity of a cat brain. <br /><br />It doesn't include the actual structures of a cat brain, nor its actual connections; the various experiments in the project filled the memory of the cortical simulation with a bunch of data, and let the system create its own signals and connections. 
<br /><br />Put simply, it's not an artificial (feline) intelligence, it's a platform upon which an A(F)I could conceivably be built.<br /><br /><br /> Scientists, at IBM Research - Almaden, in collaboration with colleagues from Lawrence Berkeley National Lab, have performed the first near real-time cortical simulation of the brain that exceeds the scale of a cat cortex and contains 1 billion spiking neurons and 10 trillion individual learning synapses. <br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjfQexdjVH7L9RPsmj4fYmDJqcYOodYt-C02KVXDbkwWSIEB7cYQIpoIwuGxVifm7kR3xV_lA54N11SWfCc0LPQW-tMLjpyIrNinWBuV_LvR5PjLtjsicVtumb13zoZObQOtpf-floQHJSL/s1600/AIgraph.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 241px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjfQexdjVH7L9RPsmj4fYmDJqcYOodYt-C02KVXDbkwWSIEB7cYQIpoIwuGxVifm7kR3xV_lA54N11SWfCc0LPQW-tMLjpyIrNinWBuV_LvR5PjLtjsicVtumb13zoZObQOtpf-floQHJSL/s400/AIgraph.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5452511907080456258" /></a><br /><br /><br /><br />Ultimately, this is a very interesting development, both for the obvious reasons (an artificial cat brain!) and because of its associated "Blue Matter" project, which uses supercomputers and magnetic resonance to non-invasively map out brain structures and connections. <br /><br />The cortical sim is intended, in large part, to serve as a test-bed for the maps gleaned by the Blue Matter analysis. 
The combination could mean taking a reading of a brain and running the shadow mind in a box.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-56456712811988783872010-03-10T13:46:00.000+11:002010-03-10T13:47:31.286+11:00Android Phone powered robot<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjpfxP142kDjuklf8XVtSru-9Ci_er0sOsT7djb4JYKymXqSKGSo-sLOq4b2Y-2PG-xjSweHG-eV2D2N34H8Y_hgPsSq8uPGSn6g6ybCgXX1vxw_BA4svXNSWzqXxF7jCirjsty-LnmlVA/s1600-h/500x_androidtank.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 400px; height: 266px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjpfxP142kDjuklf8XVtSru-9Ci_er0sOsT7djb4JYKymXqSKGSo-sLOq4b2Y-2PG-xjSweHG-eV2D2N34H8Y_hgPsSq8uPGSn6g6ybCgXX1vxw_BA4svXNSWzqXxF7jCirjsty-LnmlVA/s400/500x_androidtank.jpg" alt="" id="BLOGGER_PHOTO_ID_5446602096176071170" border="0" /></a><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi-6R6zHaq1MRuG1f4LNF4BERrHl11lXDYvw3qjtRSuIsOv5-8ZzVlQCP5Pytpp6hewJ1SEtWwn7bRNZ9XDRz1d9QoeZ7nfUfJEcyQgAWcjcCJFrefVyV2A56Hh4blCt2AwacZD-wzJdXo/s1600-h/500x_androidtank2.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 400px; height: 266px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi-6R6zHaq1MRuG1f4LNF4BERrHl11lXDYvw3qjtRSuIsOv5-8ZzVlQCP5Pytpp6hewJ1SEtWwn7bRNZ9XDRz1d9QoeZ7nfUfJEcyQgAWcjcCJFrefVyV2A56Hh4blCt2AwacZD-wzJdXo/s400/500x_androidtank2.jpg" alt="" id="BLOGGER_PHOTO_ID_5446602083189053490" border="0" /></a><br />Some clever California hackers, Tim Heath and Ryan Hickman, are building bots that harness Android phones for their robo-brainpower.<br /><br />Their first creation, the TruckBot, uses a HTC G1 as a brain and has a chassis that they made for $30 in parts. 
It's not too advanced yet—it can use the phone's compass to head in a particular direction—but they're working on incorporating the bot more fully with the phone and the Android software.<br /><br />Some ideas they're looking to build in soon are facial and voice recognition and location awareness.<br /><br />If you're interested in putting together a Cellbot of your own the team's <a href="http://www.cellbots.com/">development blog</a> has some more information.<br /><br /><div style="text-align: center;"><br /><object width="400" height="324"><param name="movie" value="http://www.youtube.com/v/fPiQ-Rtcp9k&rel=0&color1=0x6699&color2=0x54abd6&hl=en_US&feature=player_embedded&fs=1"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed src="http://www.youtube.com/v/fPiQ-Rtcp9k&rel=0&color1=0x6699&color2=0x54abd6&hl=en_US&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="400" height="324"></embed></object><br /><br /><object width="400" height="324"><param name="movie" value="http://www.youtube.com/v/sZ3SecMEtnA&rel=0&color1=0x6699&color2=0x54abd6&hl=en_US&feature=player_embedded&fs=1"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed src="http://www.youtube.com/v/sZ3SecMEtnA&rel=0&color1=0x6699&color2=0x54abd6&hl=en_US&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="400" height="324"></embed></object><br /><br /><object width="400" height="324"><param name="movie" value="http://www.youtube.com/v/_Aw76wsj7VQ&color1=0x6699&color2=0x54abd6&hl=en_US&feature=player_embedded&fs=1"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed src="http://www.youtube.com/v/_Aw76wsj7VQ&color1=0x6699&color2=0x54abd6&hl=en_US&feature=player_embedded&fs=1" type="application/x-shockwave-flash" 
allowfullscreen="true" allowscriptaccess="always" width="400" height="324"></embed></object><br /></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-74435548599983894252009-11-18T19:26:00.000+11:002010-03-25T21:21:31.440+11:00Brisbane maps robotic future<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgt91cGwJzhnenVPMG-BgY43_mvU4gsvMS7IF-R47Fgt4dtzFCR8hWdncgiN7H7DssfHrCMGcUHZ4di4z7c0krBexg9d7SXvmis2ZrfkdHcPIJSo6oY_4t_WT81waO7PB8ILXpQ8QhJjfSf/s1600/robots420-420x0.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 306px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgt91cGwJzhnenVPMG-BgY43_mvU4gsvMS7IF-R47Fgt4dtzFCR8hWdncgiN7H7DssfHrCMGcUHZ4di4z7c0krBexg9d7SXvmis2ZrfkdHcPIJSo6oY_4t_WT81waO7PB8ILXpQ8QhJjfSf/s400/robots420-420x0.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5405357851964305186" /></a><br />Scientists in Brisbane are blurring the line between biology and technology and creating a new generation of robot "helpers" more in tune with human needs.<br /><br />The University of Queensland is hosting the the Australian Research Council's Thinking Systems symposium this week, which brings together UQ's robotic navigation project with the University of New South Wales' robotic hands project and a speech and cognition project out of the University of Western Sydney.<br /><br />Scientists are working towards a range of robotic innovations, from the development of navigation and learning robots to the construction of artificial joints and limbs and the creation of a conversational computer program, a la 2001: A Space Odyssey's HAL.<br /><br />UQ's School of Information Technology and Electrical Engineering head, Professor Janet Wiles, said the symposium paired "some very clever engineers...with very clever scientists" to map the future of robotics - and it was 
going to be a very different world.<br /><br />"You're bringing together neuroscience, cognitive science, psychology, behaviour and robotics information systems to look at the cross-disciplinary projects we can do in this space," Professor Wiles said.<br /><br />"We're doing a combination of the fundamental science and the translation into the technology and that's one of the great benefits of our project."<br /><br />The group aims to advance robotic technology by decoding the way human and animal brains work to equip machines with the ability to operate in the real world.<br /><br />"There's a strong connection to cognition - the way the brain works as a whole - and navigation, so what we've been doing is studying the fundamentals of navigation in animals and taking the algorithms we've learnt from those and putting them into robots," Professor Wiles said.<br /><br />Over the next two decades, she sees robots becoming more and more important, expanding from their current roles as cleaners, assemblers and drones into smarter machines more closely integrated with human beings in the form of replacement limbs and joints.<br /><br />"It's not going to be the robots and us. Already a lot of people are incorporating robot components; people who have had a leg amputated may now have a robotic knee. 
It is effectively a fully-articulated robotic knee [with] a lot of the spring in the step that a natural knee has," Professor Wiles said.<br /><br />"The ability of robots to replace component parts is an area which is going to be growing.<br /><br />"This is where you're going to blur the line between technology and biology when you start to interface these two fields."<br /><br />At UQ, the team is working on developing computer codes or algorithms that would enable a robot to "learn" rapidly about its near environment and navigate within it.<br /><br />"Navigation is quite an intriguing skill because it is so intrinsic to what we do and we are really not aware of it unless we have a poor sense of navigation," Professor Wiles said.<br /><br />"The kind of navigation we are dealing with is how you get from one place to another, right across town or from one room in a building to another you can't see."<br /><br />With about four million robots in households right now, performing menial chores such as vacuuming the carpet, improvements in navigation have the potential to widen the scope of these creations to take a larger place in everyday life.<br /><br />According to Professor Wiles, the ability to rapidly process information and apply it to the area they are working in will give robots the edge into the future. 
<br /><br />"Robots need to learn new environments very rapidly and that's what a lot of our work focuses on.<br /><br />"When you take a robot out of the box you don't want to program it with the architecture of your house, you want the robot to explore the house and learn it very quickly," Professor Wiles said.<br /><br />"Household robotics is going to be really big in the next 15 years or so and one of the things you need is for robots to be able to look after themselves in space."<br /><br />But as Australian universities and international research institutes look into replicating the individual parts of biological creatures and mimicking them in machines, the question of intelligence inevitably becomes more important.<br /><br />While the sometimes frightening scenarios played out in science fiction novels and films - where so often robots lay waste to humanity - remain securely in the realm of fantasy, Professor Wiles believes that some day machines will think like us. <br /><br />"There's strong AI [artificial intelligence] and weak AI. Strong AI says there will be artificially intelligent creatures which are not biological. 
Weak AI says they will have a lot of the algorithms and they do already have a lot of those algorithms," she said.<br /><br />"The bee, whose brain is as tiny as a sesame seed, already has better navigation abilities than even our best robots.<br /><br />"So we have a little way to go before robots reach biological intelligence let alone human intelligence but I don't see why we shouldn't take steps towards it."Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-28053863732217407892009-10-21T08:42:00.000+11:002009-10-21T08:45:33.333+11:00Scientists Develop New Material For Robot Muscles<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrvzIzjRVLwYAovoVjOz5j3f8aSJZFAw_d3jzR_OHadBMSMNenK_r1l3WZ3nwC8wMahfl2G9R6dLCai4E2tBICZtW85V1ZNd8EDWVSQNK0YoipPHYC8gFIQlFTK78BIIquL3KKLnGGfLxd/s1600-h/robot-arm-wrestling.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 264px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrvzIzjRVLwYAovoVjOz5j3f8aSJZFAw_d3jzR_OHadBMSMNenK_r1l3WZ3nwC8wMahfl2G9R6dLCai4E2tBICZtW85V1ZNd8EDWVSQNK0YoipPHYC8gFIQlFTK78BIIquL3KKLnGGfLxd/s400/robot-arm-wrestling.jpg" border="0" alt=""id="BLOGGER_PHOTO_ID_5394801382093571522" /></a><br /><br />A group of researchers at the University of Texas have made a breakthrough in their quest to find a stronger, more effective material for use in making artificial muscles for robots. They’ve found a way to create tangled nanotube ribbons that are, relative to weight, stronger than steel and stiffer than diamond. <br /><br />The ribbons have the remarkable ability to expand in width by 220% when a voltage is applied. When the voltage is removed, the ribbons return to their normal state. The shift in state occurs over mere milliseconds. 
Additionally, the nanotube ribbons retain their properties in an extreme range of temperature: between -196 °C and 1538 °C.<br /><br />The scientists construct the nanotube ribbons into an aerogel, which contains primarily air. A cubic centimeter of the stuff weighs just 1.5 milligrams; one gram could cover a space of 30 square meters. Researchers are still experimenting with ribbon lengths. So far, the team has produced ribbons that are 1/50th of a millimeter thick, 16 centimeters wide and several meters long. Larger sheets are in the works.<br /><br />Besides serving as material for artificial muscles, the nanotube ribbons could be used to create sturdy, lightweight structures in space.<br /><br /><object width="425" height="344"><param name="movie" value="http://www.youtube.com/v/e_ZchBesP9I&rel=0&color1=0xb1b1b1&color2=0xcfcfcf&hl=en&feature=player_embedded&fs=1"></param><param name="allowFullScreen" value="true"></param><param name="allowScriptAccess" value="always"></param><embed src="http://www.youtube.com/v/e_ZchBesP9I&rel=0&color1=0xb1b1b1&color2=0xcfcfcf&hl=en&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowScriptAccess="always" width="425" height="344"></embed></object>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-41600473006904257612009-10-21T08:29:00.000+11:002009-10-21T08:41:40.484+11:00Hydrogen muscle silences the domestic robot<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQUspVkPCk0F0WiOD7t7JYzojEUFoPAXbwW0oEupmL3R34Uw0aSijy97F4gbX9ZfxEny6TSYHVuk3YCCoD8WuqWtgLhSs92D9CezCK7Gxn94LCoY-IPsS0JJZOfu_E9pVloTLZgnqfb10z/s1600-h/mg20327223.900-1_300.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 300px; height: 229px;" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQUspVkPCk0F0WiOD7t7JYzojEUFoPAXbwW0oEupmL3R34Uw0aSijy97F4gbX9ZfxEny6TSYHVuk3YCCoD8WuqWtgLhSs92D9CezCK7Gxn94LCoY-IPsS0JJZOfu_E9pVloTLZgnqfb10z/s400/mg20327223.900-1_300.jpg" border="0" alt="" id="BLOGGER_PHOTO_ID_5394800560216803826" /></a><br /><br /><br />If robots are ever going to be welcome in the home, they will need to become a lot quieter. Building them with artificial muscles that run on hydrogen, instead of noisy compressed-air pumps or electric motors, could be the answer.<br /><br />Kwang Kim, a materials engineer at the University of Nevada in Reno, came up with the idea after realising that hydrogen can be supplied silently by metal hydride compounds.<br /><br />Metal hydrides can undergo a process called reversible chemisorption, allowing them to store and release extra hydrogen held by weak chemical bonds. It's this property that has led to the motor industry investigating metal hydrides as hydrogen "tanks" for fuel cells.<br /><br />To make a silent artificial muscle, Kim and his colleague Alexandra Vanderhoff first compressed a copper and nickel-based metal hydride powder into peanut-sized pellets. They then secured them in a reactor vessel and pumped in hydrogen to "charge" the pellets with the gas. A heater coil surrounded the vessel, as heat breaks the weak chemical bonds and releases the stored hydrogen.<br /><br />The next step was to connect the vessel to an off-the-shelf artificial muscle, which comprises an inflatable rubber tube surrounded by Kevlar fibre braiding. Two of these, placed either side of a robotic joint, can mimic the push/pull action of muscles by being alternately inflated and deflated (see diagram).<br /><br />Turning the heater on and off controls the flow of hydrogen into the rubber tube, causing the muscle to move silently. 
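The antagonistic arrangement just described - heat one hydride vessel to release hydrogen and inflate its muscle while the opposite vessel cools and reabsorbs gas, deflating its partner - can be sketched as a toy simulation. Everything below is illustrative: the `HydrideMuscle` class and its release/absorption rate are invented for the sketch, not taken from Kim and Vanderhoff's paper.

```python
# Toy model of a heater-driven antagonistic muscle pair (illustrative only).
# Heating a vessel releases stored hydrogen and inflates its muscle;
# switching the heater off lets the hydride reabsorb the gas and deflate it.

class HydrideMuscle:
    def __init__(self):
        self.pressure = 0.0   # normalised inflation: 0 = deflated, 1 = full

    def step(self, heater_on, rate=0.2):
        # 'rate' is an invented per-step release/absorption rate
        if heater_on:
            self.pressure = min(1.0, self.pressure + rate)
        else:
            self.pressure = max(0.0, self.pressure - rate)

flexor, extensor = HydrideMuscle(), HydrideMuscle()
for t in range(10):
    bend = t < 5                        # bend the joint, then straighten it
    flexor.step(heater_on=bend)
    extensor.step(heater_on=not bend)
    joint_angle = flexor.pressure - extensor.pressure  # -1 .. 1

print(round(joint_angle, 2))  # → -1.0 (joint fully driven the other way)
```

The push/pull action comes entirely from alternating which heater is on, which is why the actuation itself is silent.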
Even better, the pair say, the muscle performs as well as those that run on compressed air systems (Smart Materials and Structures, DOI: 10.1088/0964-1726/18/12/125014). Importantly, the gas didn't leak.<br /><br />"The system has biological muscle-like properties for humanoid robots that need high power, large limb strokes - and no noise," says Kim.<br /><br />Yoseph Bar-Cohen, an engineer specialising in artificial muscle technology at NASA's Jet Propulsion Lab in Pasadena, California, says this is "a novel approach" for controlling artificial muscles. "It is an important contribution and increases the arsenal of potential actuators that may become available in the future," he says.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-48827027983588305512009-10-21T07:43:00.000+11:002009-10-21T07:46:21.488+11:00iRobot Unveils Morphing Blob Robot<object height="344" width="425"><param name="movie" value="http://www.youtube.com/v/SbqHERKdlK8&color1=0xb1b1b1&color2=0xcfcfcf&hl=en&feature=player_embedded&fs=1"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed src="http://www.youtube.com/v/SbqHERKdlK8&color1=0xb1b1b1&color2=0xcfcfcf&hl=en&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" height="344" width="425"></embed></object><br /><br /><span style="font-weight: bold;">iRobot's latest robot is unique on many levels. 
The doughy blob moves by inflating and deflating, using a technique its developers call "jamming."<br /><br />As the researchers explain in the video, the jamming mechanism enables the robot to transition from a liquid-like to a solid-like state.</span><br /><br /><br />Earlier this week, researchers from iRobot and the University of Chicago presented the new "blob bot" at the IEEE/RSJ International Conference on Intelligent Robots and Systems.<br /><br />As a new kind of chemical robot (or chembot), the blob bot has stretchy silicone skin, which is composed of multiple cellular compartments that each contain a "jammable slurry."<br /><br />When some of these cells are unjammed, and an actuator in the center of the robot is inflated, the robot expands in the areas of the unjammed cells. By controlling which cells are unjammed, the researchers can change the shape of the robot and make it roll in a specific direction.<br /><br />The new robot is being funded by DARPA, which gave iRobot $3.3 million to work on the chembot last year. 
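The cell-selection idea - unjam some skin cells, inflate the central actuator, and the blob bulges toward the unjammed side - can be illustrated with a toy geometric model. The ring-of-cells layout and the `roll_direction` helper below are assumptions made for this sketch, not iRobot's actual control scheme:

```python
import math

# Toy model of the jamming skin: the skin is treated as a ring of cells,
# each either jammed (rigid) or unjammed (compliant). Inflating the central
# actuator expands only the unjammed cells, so the blob bulges -- and tends
# to roll -- toward their average direction around the ring.

def roll_direction(unjammed_cells, total_cells=8):
    """Angle (radians) the blob tends to roll toward: the mean
    direction of the unjammed cells around the ring."""
    x = sum(math.cos(2 * math.pi * c / total_cells) for c in unjammed_cells)
    y = sum(math.sin(2 * math.pi * c / total_cells) for c in unjammed_cells)
    return math.atan2(y, x)

# Unjamming only cell 0 makes the blob bulge toward angle 0:
print(roll_direction([0]))     # → 0.0
# Unjamming cells 1 and 3 bulges toward their midpoint (roughly pi/2):
print(roll_direction([1, 3]))
```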
The goal is to build a robot that can squeeze through openings smaller than its own dimensions, which could be valuable in a variety of missions.<br /><br />The video shows the robot from about one year ago, and since then the researchers have been working on adding sensors and connecting multiple blob bots together.Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-49914035769471454172009-09-26T21:01:00.000+10:002009-09-26T21:08:15.048+10:00L300 Automatic Robot Lawn Mower Demo Video<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjP_rJlaQWZoaz8Pm842H84o7rNLUsuKktD4iE5ICBQyul7M2vnxaKCRL27IHS0vZHPkuQo3pxFWfPOh6epn-N0ADEWm30nQKv4mnG5MYBpOiSAa-K0AsyGpEpv7TZRkjDJQw7ptsP6XrAl/s1600-h/4912-sekacka-ambrogio-robot-l-300-2.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 310px; height: 206px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjP_rJlaQWZoaz8Pm842H84o7rNLUsuKktD4iE5ICBQyul7M2vnxaKCRL27IHS0vZHPkuQo3pxFWfPOh6epn-N0ADEWm30nQKv4mnG5MYBpOiSAa-K0AsyGpEpv7TZRkjDJQw7ptsP6XrAl/s400/4912-sekacka-ambrogio-robot-l-300-2.jpg" border="0" alt="" id="BLOGGER_PHOTO_ID_5385730939077308114" /></a><br /><br /><br />Auto Lawn Mow presents the new L300 model in its line of automatic lawn mowers. As the top-of-the-line model, this is a robotic lawn mower that can handle over three acres of grass. Clean, effective and fully automatic, it lets you do whatever you want while the robot mows the grass. To understand the power of our flagship auto lawn mower, explore the range of features and benefits of the L300. This beautiful robotic lawn mower is everything you ever wanted and more!<br /><br />Controlling and programming the new top of the range L300 model couldn't be easier. 
Using a Bluetooth™ mobile phone, set the days and times you want the L300 to cut, or use the simple control panel on the rear of the mower. The heavy-duty wheel motors are ideal for slopes of up to 30°.<br /><br />Intelligent mowing technology means that where the grass is longer, the L300 Robotic Lawn Mower will perform a 'Smart Spiral' function; in shorter grass the L300 will save power by slowing the blade down.<br /><br />This really is the Rolls-Royce of robot lawn mowing. It not only looks good on the outside, it is very high-tech on the inside.<br /><br />Fully Autonomous - The L300 Robotic Lawn Mower returns to its recharge base on its own when the battery gets low, so you can go all season long without worry. Get up to 8 hours of mowing without having to recharge.<br /><br /><div><object width="420" height="339"><param name="movie" value="http://www.dailymotion.com/swf/x4qrjo" /><param name="allowFullScreen" value="true" /><param name="allowScriptAccess" value="always" /><embed src="http://www.dailymotion.com/swf/x4qrjo" type="application/x-shockwave-flash" width="420" height="339" allowFullScreen="true" allowScriptAccess="always"></embed></object><br /><b><a href="http://www.dailymotion.com/swf/x4qrjo">L300 Automatic Robot Lawn Mower Demo Video</a></b><br /><i>by <a href="http://www.dailymotion.com/robotm">robotm</a></i></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-43725700182016242762009-09-17T13:48:00.000+10:002009-09-17T13:57:33.750+10:00Taiwan plans to build robot pandas<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjgIZMiX1cfX3xKLKrCU3mC9AC7uHzg1MJcCRq_I48qreY58HiAWnXjocyItXTq6lZ0K3XCLiZITPj_gNVhGT7WRJzQK8r1EHHBr0tci-OcPuDZqB-GU_3W5QfhR9m-SUk5pxs_T3JIPtcZ/s1600-h/crazy_robot_pandas.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 400px; height: 294px;" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjgIZMiX1cfX3xKLKrCU3mC9AC7uHzg1MJcCRq_I48qreY58HiAWnXjocyItXTq6lZ0K3XCLiZITPj_gNVhGT7WRJzQK8r1EHHBr0tci-OcPuDZqB-GU_3W5QfhR9m-SUk5pxs_T3JIPtcZ/s400/crazy_robot_pandas.jpg" border="0" alt="" id="BLOGGER_PHOTO_ID_5382280532583887234" /></a><br />A cutting-edge lab in Taiwan aims to develop panda robots that are friendlier and more artistically endowed than their endangered real-life counterparts.<br /><br />The Centre for Intelligent Robots Research said its world-first panda robot was taking shape at the hands of an ambitious group of scientists hoping to add new dimensions to the island's reputation as a high-tech power.<br /><br />"The panda robot will be very cute and more attracted to humans. Maybe the panda robot can be made to sing a panda song," the centre's 52-year-old director Jerry Lin said.<br /><br />Day by day, the panda has evolved on the centre's computer screens and, if funding permitted, the robot would take its first steps by the end of the year.<br /><br />"It's the first time we try to construct a quadrupedal robot," Jo Po-chia, a doctoral student who is in charge of the robot's design, said.<br /><br />"We need to consider the balance problem."<br /><br />The robo-panda was just one of many projects on the drawing board at the centre attached to the National Taiwan University of Science and Technology.<br /><br />The Taipei-based centre also aimed to build robots that looked like popular singers, so exact replicas of world stars could perform in the comfort of their fans' homes.<br /><br />"It could be a Madonna robot. 
It will be a completely different experience from just listening to audio," Mr Lin said.<br /><br />Mr Lin and his team were also working on educational robots that acted as private tutors for children, teaching them vocabulary or telling them stories in foreign languages.<br /><br />There is an obvious target market: China, with its tens of millions of middle-class parents doting on the one child they are allowed under strict population policies.<br /><br />"Asian parents are prepared to spend a lot of money to teach their children languages," Mr Lin said.<br /><br />Robots running amok were a fixture of popular literature but parents did not have to worry about leaving their children home alone with their artificial teachers, he said.<br /><br />"A robot may hit you like a car or a motorbike might hit you," he said.<br /><br />"But it won't suddenly lose control and get violent. Humans lose control, not robots. It's not like that."<br /><br />Mr Lin's long-term dream was to create a fully-functioning Robot Theatre of Taiwan, with an ensemble of life-like robots able to sing, dance and entertain.<br /><br />Two robotic pioneers, Thomas and Janet, appeared before an audience in Taiwan in December, performing scenes from the Phantom of the Opera, but that was just the beginning, Mr Lin said.<br /><br />"You can imagine a robot shooting down balloons, like in the wild west, using two revolvers, or three, but much faster than a person," Mr Lin said.<br /><br />"Some things robots can do better than humans with the aid of technologies."Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-3107779695375736256.post-37321617332103932062009-09-17T11:41:00.000+10:002009-09-17T11:42:31.387+10:00Robot prepares tea at CeBIT<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7n_c3AqxLiG6HOqBfxC9xOr99FpAavtsvGhhJsFq8Lom3yvLVygrkLw_4IkodkCf6CV7nanP3P5G8S4-Rx0hyLFkazq37ZrJdg5d1QMJgwfXIJnEXk_AWDgqLyetBZshcnnGEpTjPQ1MT/s1600-h/r20_18154141.jpg"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 400px; height: 262px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7n_c3AqxLiG6HOqBfxC9xOr99FpAavtsvGhhJsFq8Lom3yvLVygrkLw_4IkodkCf6CV7nanP3P5G8S4-Rx0hyLFkazq37ZrJdg5d1QMJgwfXIJnEXk_AWDgqLyetBZshcnnGEpTjPQ1MT/s400/r20_18154141.jpg" alt="" id="BLOGGER_PHOTO_ID_5382245534971797090" border="0" /></a><br /><span class="bpMore">Fair visitors look at the humanoid robotic system "Rollin' Justin" preparing tea on March 2, 2009, at the world's biggest high-tech fair, CeBIT, in Hanover, central Germany.</span>Unknownnoreply@blogger.com0