Advanced Robotics: Robot fliers racing to catch the Zephyr (2010-12-21)<span class="Apple-style-span" style="color: #222222; font-family: georgia, times, serif;"><span class="Apple-style-span" style="font-size: x-small;"></span></span><br />
<div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; line-height: 24px; margin-bottom: 25px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><span class="Apple-style-span" style="font-size: x-small;"><img alt="" height="164" src="http://www.scientificamerican.com/media/inline/blog/Image/unmanned-aircraft-blog.jpg" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; margin-bottom: 8px; margin-left: 0px; margin-right: 10px; margin-top: 0px; max-width: 270px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;" width="600" /></span></div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; line-height: 24px; margin-bottom: 25px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><span class="Apple-style-span" style="font-size: x-small;">The Pentagon's hope of 
having a squadron of unmanned aerial vehicles (UAVs) capable of staying in the air and performing surveillance for <em style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">years</em> rather than hours recently took a small step forward. Working with U.K.-based idea factory QinetiQ Group PLC, researchers from the U.S. Defense Advanced Research Projects Agency (DARPA) managed to keep the solar-powered Zephyr high-altitude, long-endurance aircraft in the air over the Arizona desert for 82 hours, 37 minutes, beating the 54-hour flight completed last year by an earlier version of the aircraft, the <a href="http://www.qinetiq.com/home/newsroom/news_releases_homepage/2008/3rd_quarter/qinetiq_s_zephyr_uav.html" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #19437c; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; vertical-align: baseline;">company reports on its Web site</a>.<br />
<br />
Launched by hand, the 66-pound (30-kilogram) Zephyr is an ultra-lightweight carbon-fiber aircraft that flies during the day on solar power generated by paper-thin silicon solar arrays covering its wings. At night it is powered by <a href="http://www.sionpower.com/index.html" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #19437c; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; vertical-align: baseline;">SION Power Inc.'s</a> rechargeable lithium-sulfur batteries, which are recharged during the day using solar power, according to the company.<br />
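The day-night cycle described above comes down to a simple energy budget: the solar arrays must produce enough surplus power in daylight to recharge the batteries that then carry the aircraft through the night. A minimal sketch of that balance, using entirely hypothetical numbers (none of these figures come from QinetiQ or SION Power):

```python
# Hypothetical day/night energy budget for a solar-powered, long-endurance
# aircraft. All numbers below are illustrative, not Zephyr's actual figures.

def night_survivable(solar_watts, cruise_watts, day_hours, night_hours,
                     battery_wh, charge_efficiency=0.9):
    """Return True if one day of surplus solar power can store enough
    battery energy to cover the following night's consumption."""
    # Energy left over after powering daytime flight
    surplus_wh = (solar_watts - cruise_watts) * day_hours
    # Battery can hold no more than its capacity, minus charging losses
    stored_wh = min(surplus_wh * charge_efficiency, battery_wh)
    # Energy drawn from the battery overnight
    needed_wh = cruise_watts * night_hours
    return stored_wh >= needed_wh

# Example: 600 W of solar generation vs. 150 W cruise power (hypothetical)
print(night_survivable(600, 150, day_hours=14, night_hours=10, battery_wh=2000))
```

Persistent flight requires this inequality to hold every single day, which is why each improvement in solar-cell output, battery capacity, or structural weight extends how long the aircraft can stay aloft.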
<br />
The flight took place between July 28 and 31 as researchers guided the Zephyr by remote control to an operating altitude in excess of 60,000 feet (18 kilometers), according to <a href="http://news.bbc.co.uk/2/hi/science/nature/7577493.stm" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #19437c; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; vertical-align: baseline;"><em style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">BBC News</em></a>. After that, the aircraft, which carried a 4.4-pound (2-kilogram) payload, flew on autopilot and was controlled via satellite communication.<br />
<br />
The Zephyr could be the predecessor of DARPA's proposed Vulture (<a href="http://www.sciam.com/article.cfm?id=pentagon-developing-new-u" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #19437c; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; vertical-align: baseline;">Very high altitude, Ultraendurance, Loitering Theater Unmanned Reconnaissance Element</a>) project, to create an aircraft that can remain above a surveillance target for at least five years. Under the current plans, the Vulture would weigh 1,000 pounds (453.5 kilograms) and be designed to collect its power from its environment—via solar or some other source—to store and use energy efficiently, and include a robotic refueling capability. 
With a wingspan of between 300 and 500 feet (between 91.4 and 152.4 meters), the Vulture would function like a low-orbit satellite as much as like an aircraft, staying aloft far longer than any surveillance plane can today.</span></div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; line-height: 24px; margin-bottom: 25px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><span class="Apple-style-span" style="font-size: x-small;">Government contractors Aurora Flight Sciences, Boeing and Lockheed Martin are working on the first phase of the Vulture project, which began in April. The contractors are studying possible designs and configurations during this year-long first phase, which will conclude with a review of smaller and full-size demo aircraft, says DARPA spokeswoman Jan Walker. The next phase is expected to include an uninterrupted, three-month flight test of a smaller version of the Vulture. In the third and final phase, the program will conduct a flight test of a full-scale Vulture during which the vehicle will be in the air for an entire year without landing.<br />
<br />
"The Zephyr flight demonstrates the continuous improvement of all of the systems--solar cells, battery, light weight structural components, aerodynamic design, power management, and energy harvesting--to make (long-term, unmanned flight) possible," Jamey Jacob, an Oklahoma State University associate professor of mechanical and aerospace engineering, told <em style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">ScientificAmerican.com</em>. "I anticipate that records will begin to fall rather quickly as more platforms make use of the evolving technology."<br />
<br />
One record that has already fallen is the previous official world record for unmanned flight, 30 hours, 24 minutes, set by the U.S. robot plane Global Hawk. (QinetiQ's flight is an unofficial record because it did not involve the Federation Aeronautique Internationale (FAI), the world air sports federation, which sanctions all record attempts.)<br />
<br />
Another budding UAV project is Oklahoma State University's <a href="http://flickr.com/photos/arena5/sets/72157605909981910/detail/" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #19437c; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; vertical-align: baseline;">Pterosoar-B</a>, which on July 1 set two new aviation world records in the FAI category for autonomous aircraft of less than 5 kilograms: duration, at 6 hours, 15 minutes and 54 seconds, and distance over a closed course, at 122 kilometers.</span></div>NASA Recreates Mars Surface to Liberate Rover (2010-12-21)<span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;"><span class="Apple-style-span" style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px;"></span></span><br />
<div id="embed_wide" style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><div id="pic" style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;"><img alt="" src="http://www.wired.com/images_blogs/wiredscience/2009/07/mars_rover_1a.jpg" style="border-bottom-style: none; border-left-style: none; border-right-style: none; border-top-style: none; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span></div></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;">PASADENA – Getting stuck is never fun, especially when you’re over 30 million miles from Earth. NASA’s Spirit rover is mired in dirt on Mars and now scientists at the Jet Propulsion Laboratory are working hard to free the over-worked robot.</span></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;">Spirit first ran afoul of the Martian surface on May 6 when it hit some patches of dirt that made its wheels spin in place. Now the wheels (two of which are not working properly) are sunk in up to their hubcaps.</span></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;">Like a remote Auto Club for robots, JPL engineers have built a sandbox filled with a mixture of materials that closely mimic the consistency of Martian soil as well as a rock to high-center the rover.
They’ve driven a replica of the Spirit into the box and are working diligently to figure out the best way to escape the talcum-like trap – a technique used with Spirit’s twin rover, Opportunity, back in 2005 when it also became stuck.</span></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;">The first Mars Exploration Rover landed on the red planet in January 2004. Initially, the mission was supposed to last 90 Martian days, but Spirit exceeded that by over 20 times. Thanks to a recent dust storm, the fine dust that coated Spirit’s solar panels was blown off and it has been operating at full power for months now. If this latest obstacle can be overcome, Spirit can keep exploring even longer.</span></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;">Read on to see how the JPL scientists created a little piece of Mars on Earth and get up close and personal with Spirit’s predicament.</span></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;"><strong style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Above:</strong> A JPL technician attaches a grounding strap to the rover before measuring the distance it traveled during the previous move. 
Below:</span></div><div id="embed_wide" style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><div id="pic" style="margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;"><img alt="" src="http://www.wired.com/images_blogs/wiredscience/2009/07/mars_rover_1b.jpg" style="border-bottom-style: none; border-left-style: none; border-right-style: none; border-top-style: none; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span></div></div><div style="margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="font-family: Arial, Helvetical, sans-serif;">A Discovery Channel Canada film crew films the engineers as they work to get the rover unstuck.</span></div>Design a New Robotic Muscle Suit (2010-12-21)<div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="image large" style="clear: both; display: block; float: none; font-weight: normal; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 10px; margin-top: 10px; max-width: 606px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; 
padding-top: 0px;"><a href="http://www.pcworld.com/zoom?id=183122&page=1&zoomIdx=1" style="border-bottom-color: rgb(0, 71, 132); border-bottom-style: dotted; border-bottom-width: 1px; clear: none; color: #1c609f; font-weight: normal; list-style-type: none; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;" target="_blank"><img alt="Click image for larger." height="225" src="http://zapp5.staticworld.net/news/graphics/183122-1125-irex-muscle-04_350.jpg" style="border-bottom-style: none; border-color: initial; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; border-width: initial; clear: both; display: block; font-weight: normal; height: auto; list-style-type: none; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 606px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Click image for larger." 
width="400" /></a><span class="artCaption" style="clear: both; color: #404040; float: left; font-size: 11px; font-weight: normal; line-height: 16px; list-style-type: none; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 6px; padding-left: 0px; padding-right: 0px; padding-top: 6px; width: auto;">Click image for larger.</span></span></div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Students at the Tokyo University of Science have developed a new version of their muscle suit, a wearable robotic suit that assists the muscles when carrying out strenuous tasks.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">The original version of the suit, which has been in production for several years, provides assistance to the arms and back, but the new version provides assistance to the back only, making it lighter and more compact than the original model.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">In a demonstration on Wednesday at the International Robot Exhibition in Tokyo, a student wearing the suit was able to bend down and lift 15 kilograms of weights. 
Doing so without assistance would be difficult for many people and could cause injury to some.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">The university is still developing the suit and the model demonstrated on Wednesday was the first prototype. A production version is due some time in 2010.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">With its greater assistance the original version of the suit will remain the most useful for heavier tasks.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">In a demonstration of that model on Wednesday a student was asked to carry 10-kilogram bags of rice. With the suit switched off he could manage up to three bags before they started to get too heavy to carry, but with the suit switched on another two bags could be loaded into his arms. 
He quickly dropped the bags when the suit was switched off as without assistance it was too much weight to carry.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Such suits are being developed with an eye on assisting the physically challenged and workers carrying out physically demanding jobs.</div><div style="font-family: arial, verdana, helvetica, sans-serif; font-size: 14px; font-weight: normal; line-height: 20px; list-style-type: none; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Earlier this year Toyota Motor unveiled similar robot-assisted suits and has been testing them at factories in Japan with workers who have to lift large or heavy sheets of metal or car parts.</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-3223193823699464172010-12-21T06:42:00.001-08:002010-12-21T06:42:35.742-08:00NASA Robot Solves 19-Year-Old Murder Mystery<span class="Apple-style-span" style="font-family: 'Lucida Grande', Helvetica, Arial, sans-serif; font-size: 12px; line-height: 20px;"></span><br />
<div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><a href="http://cache.gawkerassets.com/assets/images/4/2010/07/458547main_max_rover_-_magnetometer.jpg" rel="lytebox" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #dc870e; font-size: 12px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline; vertical-align: baseline;"><img alt="NASA Robot Solves 19-Year-Old Murder Mystery" class="left image500 image_0" src="http://cache.gawkerassets.com/assets/images/4/2010/07/500x_458547main_max_rover_-_magnetometer.jpg" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-color: rgb(179, 179, 179); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(179, 179, 179); border-left-style: solid; border-left-width: 1px; 
border-right-color: rgb(179, 179, 179); border-right-style: solid; border-right-width: 1px; border-style: initial; border-top-color: rgb(179, 179, 179); border-top-style: solid; border-top-width: 1px; clear: left; float: left; font-size: 12px; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: justify; vertical-align: baseline;" width="500" /></a></div><div style="text-align: justify;">Dawn Sanchez was last seen alive when she stepped into Bernado Bass' car in 1991. Her disappearance and death remained unsolved until recently when—thanks to a little NASA robot—her murderer was sentenced to six years in prison.</div><br />
<div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: justify; vertical-align: baseline;">Bass was Sanchez's boyfriend at the time of her disappearance and there were witness reports claiming that he shot the girl "in a vacant lot after the two got into a fight." The only problem was that no evidence to support this explanation was anywhere to be found. No car. No gun. No body.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: justify; vertical-align: baseline;">This meant that Bass got away with the murder until recently when parts from the suspect's car were found buried in a large abandoned lot. They most likely would not have been found without the aid of the NASA equipment borrowed for the investigation. 
Using this equipment, investigators were able to figure out just where they needed to excavate:</div><blockquote style="background-attachment: initial; background-clip: initial; background-color: #eaf2f4; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #51646b; font-size: 12px; line-height: 18px; margin-bottom: 5px; margin-left: 0px; margin-right: 0px; margin-top: 5px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 10px; padding-left: 10px; padding-right: 10px; padding-top: 10px; quotes: none; vertical-align: baseline;"><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 10px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: justify; vertical-align: baseline;"><b>The case was dismissed in 1991 due to lack of evidence. The case was recently reopened, when an informant reported that the car may have been disassembled and buried in a large abandoned lot in Alviso. The exact location in the lot was not specified, and the cost to excavate the entire area was too high. 
Further, the lot contained a substantial amount of buried and surface metallic debris, making a simple survey with metal detectors insufficient.</b></div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 10px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: justify; vertical-align: baseline;"><b>[...]</b></div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 10px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: justify; vertical-align: baseline;"><b>[T]he mixed team of scientists and engineers from CMIL, NASA Ames and the USGS deployed an instrumented Senseta MAX 5.0A rover hosting the research technologies under development, and mapped the magnetic environment of the survey area. The USGS received the processed data set, and after further post-processing, presented the county DA's office with their analysis and possible locations for ex</b>cavation. 
Based on this data, the county excavated the site and retrieved car parts that matched the suspect's car.</div></blockquote>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-38249462568153817872010-12-21T06:39:00.000-08:002010-12-21T06:39:30.374-08:00IMPASS: Intelligent Mobility Platform with Active Spoke System<div style="text-align: justify;"><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;">IMPASS (Intelligent Mobility Platform with Active Spoke System) is a novel high mobility locomotion platform for unmanned systems in unstructured environments. Utilizing rimless wheels with individually actuated spokes, it can follow the contour of uneven surfaces like tracks and step over large obstacles like legged vehicles while retaining the simplicity of wheels. Since it lacks the complexity of legs and has a large effective (wheel) diameter, this highly adaptive system can move over extreme terrain with ease while maintaining respectable travel speeds, and thus has great potential for search-and-rescue missions, scientific exploration, and anti-terror response applications.</span></div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><br />
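The mobility analysis listed among the project's goals applies the Chebychev–Grübler–Kutzbach criterion (in a screw-based modified form). As a minimal sketch of the unmodified criterion only — the link and joint counts below are illustrative, not IMPASS's actual values:

```python
def mobility(num_links, joint_freedoms, spatial=True):
    """Chebychev-Gruebler-Kutzbach mobility criterion.

    num_links: number of links, counting the fixed link (ground).
    joint_freedoms: list of per-joint degrees of freedom.
    spatial: use the 6-DOF spatial form; otherwise the planar (3-DOF) form.
    """
    m = 6 if spatial else 3
    num_joints = len(joint_freedoms)
    return m * (num_links - 1 - num_joints) + sum(joint_freedoms)

# A planar four-bar linkage (4 links, 4 single-DOF revolute joints):
print(mobility(4, [1, 1, 1, 1], spatial=False))  # 1
# The unmodified spatial count gives -2 for the same linkage:
print(mobility(4, [1, 1, 1, 1], spatial=True))   # -2
```

The negative spatial result for a mechanism that plainly moves illustrates why the plain criterion fails on special geometries, and hence why modified, screw-based versions are used for mechanisms like IMPASS.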
</span></div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><span class="Apple-style-span" style="color: white;"><table cellspacing="5" class="infobox" style="background-color: black; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(170, 170, 170); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(170, 170, 170); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(170, 170, 170); border-top-style: solid; border-top-width: 1px; clear: right; color: white; font-size: 11px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 1em; margin-right: 0px; margin-top: 0.5em; padding-bottom: 0.2em; padding-left: 0.2em; padding-right: 0.2em; padding-top: 0.2em; text-align: left; width: 22em;"><tbody>
<tr><td class="" colspan="2" style="text-align: center; vertical-align: top;"><a class="image" href="http://www.romela.org/main/File:IMPASS.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="IMPASS.jpg"><img alt="IMPASS" border="0" height="265" src="http://www.romela.org/wiki/images/thumb/4/42/IMPASS.jpg/300px-IMPASS.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; vertical-align: middle;" width="400" /></a></td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">IMPASS<br />
Intelligent Mobility Platform with Active Spoke System</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Principal Investigator</th><td class="" style="text-align: right; vertical-align: top;">Dr. Dennis Hong</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Researchers</th><td class="" style="text-align: right; vertical-align: top;">Ping Ren<br />
Ya Wang<br />
<a class="mw-redirect" href="http://www.romela.org/main/User:Jbjeans" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="User:Jbjeans">Blake Jeans</a></td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">Mechanical Details</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">DOF</th><td class="" style="text-align: right; vertical-align: top;">9</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Motors</th><td class="" style="text-align: right; vertical-align: top;">Escap<br />
Maxon</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Controllers</th><td class="" style="text-align: right; vertical-align: top;">AllMotion EZSV10<br />
AllMotion EZSV23</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Sensors</th><td class="" style="text-align: right; vertical-align: top;">Foot Touch Sensors (12x)</td></tr>
</tbody></table></span></span></div><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Goals and Objectives</span></h2><ul style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: url(http://www.romela.org/wiki/skins/romela/bullet.gif); list-style-type: square; margin-bottom: 0.5em; margin-left: 1.5em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Classification for topology structures of IMPASS based on different ground contact points</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Mobility analysis for different configuration cases, using both conventional and screw-based modified <a class="external text" href="http://en.wikipedia.org/wiki/Chebychev%E2%80%93Gr%C3%BCbler%E2%80%93Kutzbach_criterion" rel="nofollow" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: url(http://www.romela.org/wiki/skins/romela/external.png); background-origin: initial; background-position: 100% 50%; background-repeat: no-repeat no-repeat; padding-bottom: 0px; padding-left: 0px; padding-right: 13px; padding-top: 0px; text-decoration: underline;" title="http://en.wikipedia.org/wiki/Chebychev–Grübler–Kutzbach_criterion">Grübler</a> and <a class="external text" href="http://en.wikipedia.org/wiki/Chebychev%E2%80%93Gr%C3%BCbler%E2%80%93Kutzbach_criterion" rel="nofollow" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: url(http://www.romela.org/wiki/skins/romela/external.png); background-origin: initial; background-position: 100% 50%; background-repeat: no-repeat no-repeat; padding-bottom: 0px; padding-left: 0px; padding-right: 13px; padding-top: 0px; text-decoration: underline;" title="http://en.wikipedia.org/wiki/Chebychev–Grübler–Kutzbach_criterion">Kutzbach</a> criterion</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Inverse and forward position analysis for the critical topology scheme of IMPASS</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Singularity configuration identification and investigation using screw theory</li>
<li style="margin-bottom: 0.1em; text-align: justify;"><a class="external text" href="http://en.wikipedia.org/wiki/Generalized_Jacobian" rel="nofollow" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: url(http://www.romela.org/wiki/skins/romela/external.png); background-origin: initial; background-position: 100% 50%; background-repeat: no-repeat no-repeat; padding-bottom: 0px; padding-left: 0px; padding-right: 13px; padding-top: 0px; text-decoration: underline;" title="http://en.wikipedia.org/wiki/Generalized_Jacobian"><span class="Apple-style-span" style="color: black;">Screw-based Jacobian analysis</span></a></li>
<li style="margin-bottom: 0.1em; text-align: justify;">Develop 2D and 3D motion planning strategies in unstructured terrain for both terrain sensing and non-terrain sensing configurations</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Verify motion planning strategies in simulation and experimentally</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Advance the capabilities of the hardware platform, including a moving center of gravity, onboard computer and power, and rugged body and components</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Develop accurate and dependable perception units for terrain sensing and object recognition, including laser range finders and cameras</li>
</ul><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Publications" name="Publications" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Publications</span></h2><ol style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Laney, D. and Hong, D.W.,”Kinematic Analysis of a Novel Rimless Wheel with Independently Actuated Spokes”, 29th ASME Mechanisms and Robotics Conference, Long Beach, California, September 24-28, 2005.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Hong, D.W. and Laney, D., “Preliminary Design and Kinematic Analysis of a Mobility Platform with Two Actuated Spoke Wheels”, US-Korea Conference on Science, Technology and Entrepreneurship (UKC 2006), Mechanical Engineering & Robotics Symposium, Teaneck, New Jersey, August 10-13, 2006.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Laney, D. and Hong, D.W., “Three-Dimensional Kinematic Analysis of the Actuated Spoke Wheel Robot”, 30th ASME Mechanisms and Robotics Conference, Philadelphia, Pennsylvania, September 10-13, 2006.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Wang, Y., Ren, P., and Hong, D.W., “Mobility and Geometrical Analysis of a Two Actuated Spoke Wheel Robot Modeled as a Mechanism with Variable Topology”, 32nd ASME Mechanisms and Robotics Conference, Brooklyn, New York, August 6-9, 2008.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Ren, P., Wang, Y., and Hong, D.W., “Three-Dimensional Kinematic Analysis of a Two Actuated Spoke Wheel Robot Based on its Equivalency to a Serial Manipulator”, 32nd ASME Mechanisms and Robotics Conference, Brooklyn, New York, August 6-9, 2008.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Wang, Y., Ren, P., and Hong, D.W., “Gait and Gait Transition for a Robot with Two Actuated Spoke Wheels”, 33rd ASME Mechanisms and Robotics Conference, San Diego, California, August 30-September 2, 2009.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">2005 ASME Freudenstein/General Motors Young Investigator Award</li>
</ol>Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-1198018729130303420.post-65783985560500754492010-12-21T06:37:00.001-08:002010-12-21T06:37:58.970-08:00RAPHaEL: Robotic Air Powered Hand with Elastic Ligaments<div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">RAPHaEL (Robotic-Air Powered Hand with Elastic Ligaments) is a humanoid robotic hand that utilizes corrugated tube actuation with compressed air. Unlike electromechanically actuated hands, thanks to the natural compliance, RAPHaEL can mimic the grasping capability of a human hand more accurately. By changing the pressure of the compressed air, the amount of applied force can also be controlled.</div><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><span class="Apple-style-span" style="font-size: 19px;"><span class="Apple-style-span" style="color: white; font-size: 13px;"></span></span></div><table cellspacing="5" class="infobox" style="background-color: black; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(170, 170, 170); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(170, 170, 170); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(170, 170, 170); border-top-style: solid; border-top-width: 1px; clear: right; color: white; font-size: 11px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 1em; margin-right: 0px; margin-top: 0.5em; padding-bottom: 0.2em; padding-left: 0.2em; padding-right: 0.2em; padding-top: 0.2em; text-align: left; width: 22em;"><tbody>
<tr><td class="" colspan="2" style="text-align: center; vertical-align: top;"><a class="image" href="http://www.romela.org/main/File:RAPHaEL.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="RAPHaEL.jpg"><img alt="RAPHaEL" border="0" height="345" src="http://www.romela.org/wiki/images/thumb/2/24/RAPHaEL.jpg/300px-RAPHaEL.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; vertical-align: middle;" width="400" /></a></td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">RAPHaEL<br />
Robotic Air Powered Hand with Elastic Ligaments</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Current Researchers</th><td class="" style="text-align: right; vertical-align: top;">Kyle Cothern</td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">Details</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Dimensions</th><td class="" style="text-align: right; vertical-align: top;">90x300mm (approx)</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Weight</th><td class="" style="text-align: right; vertical-align: top;">3-5 lb (approx)</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">DOF</th><td class="" style="text-align: right; vertical-align: top;">6</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Actuators</th><td class="" style="text-align: right; vertical-align: top;">Corrugated Tubing</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Computing</th><td class="" style="text-align: right; vertical-align: top;">National Instruments cRIO, custom board for mobile applications</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Sensors</th><td class="" style="text-align: right; vertical-align: top;">Flex sensors for Position, Force Sensitive Resistors for Force</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Power</th><td class="" style="text-align: right; vertical-align: top;">12.0 VDC, 50-120 psig</td></tr>
</tbody></table><br />
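The spec table above lists flex sensors for joint position and force-sensitive resistors (FSRs) for contact force. As a rough sketch of how such raw readings are typically converted to engineering units (linear interpolation between two calibration points for the flex sensor, and the FSR's roughly force-proportional conductance), here in Python with hypothetical calibration constants; the actual RAPHaEL controller runs on the National Instruments cRIO listed above:

```python
def flex_to_angle(adc_counts, adc_flat=512, adc_bent=820, angle_bent=90.0):
    """Linearly map a flex-sensor ADC reading to a bend angle in degrees.

    adc_flat and adc_bent are hypothetical calibration readings taken
    with the finger straight (0 degrees) and fully curled (angle_bent
    degrees).
    """
    return (adc_counts - adc_flat) / (adc_bent - adc_flat) * angle_bent


def fsr_to_force(r_ohms, k=150e3):
    """Estimate the force on an FSR from its measured resistance.

    Over its working range an FSR's conductance grows roughly in
    proportion to applied force, so force ~ k / R; k is a hypothetical
    per-sensor calibration constant.
    """
    return k / r_ohms


# A reading halfway between the calibration points maps to half travel:
print(flex_to_angle(666))   # -> 45.0
print(fsr_to_force(10e3))   # -> 15.0 (arbitrary units until calibrated)
```

In practice each finger's sensors would be calibrated individually, and the force law fitted from the sensor's datasheet curve rather than assumed linear.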
<div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><span class="Apple-style-span" style="font-size: 19px;"><br />
</span></div><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><span class="Apple-style-span" style="font-size: 19px;"><br />
Goals & Objectives</span></div><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Goals_.26_Objectives" name="Goals_.26_Objectives" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">The goal of this project is to accurately emulate the motion and dexterity of a human hand. Once this has been accomplished, it can serve as a platform for research related to sensing and control in humanoid robotics. Future versions of the hand will be fabricated using <a class="external text" href="http://www.cs.cmu.edu/~sdm/" rel="nofollow" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: url(http://www.romela.org/wiki/skins/romela/external.png); background-origin: initial; background-position: 100% 50%; background-repeat: no-repeat no-repeat; padding-bottom: 0px; padding-left: 0px; padding-right: 13px; padding-top: 0px; text-decoration: underline;" title="http://www.cs.cmu.edu/~sdm/">SDM (Shape Deposition Manufacturing)</a> technology, which will enable us to embed all of the components into a single cohesive unit.</div><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Advantages" name="Advantages" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: 
solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Advantages</span></h2><ul style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: url(http://www.romela.org/wiki/skins/romela/bullet.gif); list-style-type: square; margin-bottom: 0.5em; margin-left: 1.5em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Natural compliance</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Low in cost</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Modular design allows for simple repair</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Actuation requires minimal air input (15 mL of air at 60 psig to actuate one finger)</li>
</ul><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Potential_Applications" name="Potential_Applications" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Potential Applications</span></h2><ul style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: url(http://www.romela.org/wiki/skins/romela/bullet.gif); list-style-type: square; margin-bottom: 0.5em; margin-left: 1.5em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Hazardous environment operation</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Industry settings</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Prosthetics</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Telepresence</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Entertainment</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Sign language interpretation</li>
</ul><br /><b>DARwIn: Dynamic Anthropomorphic Robot with Intelligence</b> (posted December 21, 2010)<div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em;"><b><span class="Apple-style-span" style="font-weight: normal;"></span></b></div><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><b>DARwIn (Dynamic Anthropomorphic Robot with Intelligence) is a family of fully autonomous humanoid robots capable of bipedal walking and performing human-like motions. Developed at the Robotics & Mechanisms Laboratory (RoMeLa) at Virginia Tech, DARwIn is a research platform for studying robot locomotion and autonomous behaviors, and also the base platform for Virginia Tech’s entry to the <a href="http://www.romela.org/main/RoboCup" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;" title="RoboCup">RoboCup</a> competition.</b></div><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><b><br />
</b></div><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><b><span class="Apple-style-span" style="color: white;"></span></b></div><table cellspacing="5" class="infobox" style="background-color: black; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(170, 170, 170); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(170, 170, 170); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(170, 170, 170); border-top-style: solid; border-top-width: 1px; clear: right; color: white; font-size: 11px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 1em; margin-right: 0px; margin-top: 0.5em; padding-bottom: 0.2em; padding-left: 0.2em; padding-right: 0.2em; padding-top: 0.2em; text-align: left; width: 22em;"><tbody>
<tr><td class="" colspan="2" style="text-align: center; vertical-align: top;"><a class="image" href="http://www.romela.org/main/File:DARwInIV_main.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="DARwInIV main.jpg"><img alt="DARwIn IV" border="0" height="640" src="http://www.romela.org/wiki/images/thumb/6/63/DARwInIV_main.jpg/300px-DARwInIV_main.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; vertical-align: middle;" width="424" /></a></td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">DARwIn IV<br />
Dynamic Anthropomorphic Robot with Intelligence</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Principal Investigator</th><td class="" style="text-align: right; vertical-align: top;">Dr. Dennis Hong</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Researchers</th><td class="" style="text-align: right; vertical-align: top;">Jeakweon Han<br />
Dr. Bohee Lee<br />
<a href="http://www.romela.org/main/User:SmSong" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="User:SmSong">Seungmoon Song</a><br />
<a href="http://www.romela.org/main/User:Rnguyen" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="User:Rnguyen">Robert Nguyen</a><br />
Michael Hopkins</td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">Details</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Height</th><td class="" style="text-align: right; vertical-align: top;">55 cm</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Weight</th><td class="" style="text-align: right; vertical-align: top;">3.9 kg</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">DOF</th><td class="" style="text-align: right; vertical-align: top;">21</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Motors</th><td class="" style="text-align: right; vertical-align: top;">Robotis Dynamixel RX-28, RX-64, EX-106</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Computing</th><td class="" style="text-align: right; vertical-align: top;">Gumstix Verdex Pro XL6P<br />
Analog Devices Blackfin BF561</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Vision</th><td class="" style="text-align: right; vertical-align: top;">VT-Cam system with dual HDR cameras</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Power</th><td class="" style="text-align: right; vertical-align: top;">Dual 7.4 V 2000 mAh LiPo batteries</td></tr>
</tbody></table><br />
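Among the control problems studied on this platform is zero-moment-point (ZMP) balance. As a minimal sketch of the textbook cart-table ZMP relation (a standard simplification, not DARwIn's actual controller):

```python
G = 9.81  # gravitational acceleration, m/s^2


def zmp_x(x_com, z_com, x_com_ddot, g=G):
    """x coordinate of the zero-moment point under the cart-table model:

        x_zmp = x_com - (z_com / g) * x_com_ddot

    The biped is dynamically balanced as long as the ZMP stays inside
    the support polygon formed by the feet.
    """
    return x_com - (z_com / g) * x_com_ddot


# With no CoM acceleration the ZMP sits directly below the center of mass:
print(zmp_x(0.02, 0.30, 0.0))   # -> 0.02
# Accelerating the CoM forward pushes the ZMP backward:
print(zmp_x(0.02, 0.30, 1.0))   # about -0.0106
```

A gait generator plans CoM trajectories so the resulting ZMP stays inside the planned footstep region; the same relation, run in reverse, turns a desired ZMP path into a CoM motion.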
<div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><b><b><span class="Apple-style-span" style="font-weight: normal;"></span></b></b></div><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; display: inline !important; font-size: 19px; font-weight: normal; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><b><b><span class="mw-headline"><br />
Goals & Objectives</span></b></b></h2><br />
<b><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Goals_.26_Objectives" name="Goals_.26_Objectives" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">DARwIn is a research platform for studying robot locomotion and is also the base platform for Virginia Tech's entry to the RoboCup competition. In this research, we study the issues of mechanical design, kinematics, dynamic bipedal gaits, ZMP control, vision tracking, and complex autonomous behaviors needed for playing soccer.</div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Current_Areas_of_Research" name="Current_Areas_of_Research" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-size: 19px; font-weight: normal; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Current Areas of Research</span></h2><ul style="line-height: 1.5em; list-style-image: url(http://www.romela.org/wiki/skins/romela/bullet.gif); list-style-type: square; margin-bottom: 0.5em; margin-left: 1.5em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; 
padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Particle-based Localization</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Bipedal Locomotion</li>
</ul><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="DARwIn_Family" name="DARwIn_Family" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-size: 19px; font-weight: normal; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">DARwIn Family</span></h2><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">Many versions of DARwIn have been developed, each an improvement on its predecessor.</div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="DARwIn_0" name="DARwIn_0" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">DARwIn 0</span></h3><div class="thumb tleft" style="border-bottom-color: black; 
border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; float: left; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: 13px;"><b><span class="Apple-style-span" style="font-weight: normal;"><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; display: inline !important; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); 
border-top-style: solid; border-top-width: 1px; display: inline !important; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><a class="image" href="http://www.romela.org/main/File:DARwIn0.png" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;" title="DARwIn0.png"><img alt="" border="0" class="thumbimage" height="187" src="http://www.romela.org/wiki/images/thumb/1/1b/DARwIn0.png/100px-DARwIn0.png" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; border-width: initial; vertical-align: middle;" width="100" /></a></div></div></span></b></span></div><div class="thumbcaption" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; font-size: 11px; line-height: 1.4em; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: left;"><div class="magnify" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; background-position: initial initial !important; background-repeat: initial initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; 
border-right-style: none !important; border-top-style: none !important; border-width: initial !important; float: right;"><a class="internal" href="http://www.romela.org/main/File:DARwIn0.png" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; display: block; text-decoration: underline;" title="Enlarge"><img alt="" height="11" src="http://www.romela.org/wiki/skins/common/images/magnify-clip.png" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-color: initial; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; border-width: initial; display: block; text-align: justify; vertical-align: middle;" width="15" /></a></div></div></div></div><div style="text-align: justify;"><b><span class="Apple-style-span" style="font-weight: normal;">To investigate the feasibility of controlling a 21 DOF humanoid robot, the Cycloid robot, designed and fabricated by Robotis, was used as the testing platform. Since this was not an original DARwIn design, this testing iteration is called DARwIn 0. 
The robot's joints were driven by Robotis Dynamixel DX-117 motors.</span></b></div><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">DARwIn 0 proved to be a success, demonstrating that the core software could command the robot to stand up and walk.</div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="DARwIn_I" name="DARwIn_I" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">DARwIn I</span></h3><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; float: left; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: 
rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: 13px;"><b><span class="Apple-style-span" style="font-weight: normal;"><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; display: inline !important; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; display: inline !important; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><a class="image" href="http://www.romela.org/main/File:DARwInI.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;" title="DARwInI.jpg"><img alt="" border="0" class="thumbimage" height="134" src="http://www.romela.org/wiki/images/thumb/5/5a/DARwInI.jpg/100px-DARwInI.jpg" 
style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; border-width: initial; vertical-align: middle;" width="100" /></a></div></div></span></b></span></div><div class="thumbcaption" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; font-size: 11px; line-height: 1.4em; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: left;"><div class="magnify" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; background-position: initial initial !important; background-repeat: initial initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; float: right;"><a class="internal" href="http://www.romela.org/main/File:DARwInI.jpg" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; display: block; text-decoration: underline;" title="Enlarge"><img alt="" height="11" 
src="http://www.romela.org/wiki/skins/common/images/magnify-clip.png" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-color: initial; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; border-width: initial; display: block; text-align: justify; vertical-align: middle;" width="15" /></a></div></div></div></div><div style="text-align: justify;"><b><span class="Apple-style-span" style="font-weight: normal;">DARwIn I was the first humanoid robot created by a senior design project at RoMeLa. DARwIn I has 21 degrees of freedom, four force sensors on each foot (which were never used), a 3-axis rate gyro, a 3-axis accelerometer, and space to house a computer and batteries for powering the motors, sensors, and computing equipment. DARwIn I's links are fabricated out of bent sheet aluminum. This robot also uses Robotis Dynamixel DX-117 motors for the joints.</span></b></div><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">No sensor feedback was used for stability control, and the gaits were not generated from mathematical functions; the robot could therefore walk successfully only by replaying a sufficiently fine prerecorded sequence of stances. 
Additionally, the robot would fall over in the presence of any external disturbances.</div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="DARwIn_II" name="DARwIn_II" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">DARwIn II</span></h3><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; float: left; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; 
width: 102px;"><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: 13px;"><b><span class="Apple-style-span" style="font-weight: normal;"><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; display: inline !important; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; display: inline !important; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><a class="image" href="http://www.romela.org/main/File:DARwInIIa.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;" title="DARwInIIa.jpg"><img alt="" border="0" class="thumbimage" height="134" src="http://www.romela.org/wiki/images/thumb/5/54/DARwInIIa.jpg/100px-DARwInIIa.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 
1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; border-width: initial; vertical-align: middle;" width="100" /></a></div></div></span></b></span></div><div class="thumbcaption" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; font-size: 11px; line-height: 1.4em; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: left;"><div class="magnify" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; background-position: initial initial !important; background-repeat: initial initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; float: right;"><a class="internal" href="http://www.romela.org/main/File:DARwInIIa.jpg" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; display: block; text-decoration: underline;" title="Enlarge"><img alt="" height="11" src="http://www.romela.org/wiki/skins/common/images/magnify-clip.png" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial 
!important; border-color: initial; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; border-width: initial; display: block; text-align: justify; vertical-align: middle;" width="15" /></a></div></div></div></div><div style="text-align: justify;"><b><span class="Apple-style-span" style="font-weight: normal;">DARwIn II was designed and fabricated by the 2006-2007 senior design team. Two versions were created: DARwIn IIa in the fall and DARwIn IIb in the spring. DARwIn IIa builds on its predecessor with improved mechanical design, more sensors, and added intelligence. The links were cut out of solid blocks of aluminum on a CNC mill to maximize stiffness and reduce weight.</span></b></div><div style="line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">DARwIn IIb is based on the design of DARwIn IIa, but with improvements in all categories. The motors used for articulating DARwIn's joints were replaced with motors offering twice the torque. DARwIn's link design was further refined to create even lighter parts. The entire computer, sensors, electronics package, and computer ports were mounted to a custom-designed heat sink as a single module. 
This module is attached to the robot body using shock mounts, which allows easy access and removal while protecting the equipment from shock when falling.</div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="DARwIn_III" name="DARwIn_III" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">DARwIn III</span></h3><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; float: left; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; 
padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: 13px;"><b><span class="Apple-style-span" style="font-weight: normal;"><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; display: inline !important; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; display: inline !important; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><a class="image" href="http://www.romela.org/main/File:DARwInIII.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;" title="DARwInIII.jpg"><img alt="" border="0" class="thumbimage" height="206" src="http://www.romela.org/wiki/images/thumb/3/34/DARwInIII.jpg/100px-DARwInIII.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; 
border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; border-width: initial; vertical-align: middle;" width="100" /></a></div></div></span></b></span></div><div class="thumbcaption" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; font-size: 11px; line-height: 1.4em; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: left;"><div class="magnify" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; background-position: initial initial !important; background-repeat: initial initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; float: right;"><a class="internal" href="http://www.romela.org/main/File:DARwInIII.jpg" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; display: block; text-decoration: underline;" title="Enlarge"><img alt="" height="11" src="http://www.romela.org/wiki/skins/common/images/magnify-clip.png" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; 
background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-color: initial; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; border-width: initial; display: block; text-align: justify; vertical-align: middle;" width="15" /></a></div></div></div></div><div style="text-align: justify;"><b><span class="Apple-style-span" style="font-weight: normal;">DARwIn III was an attempt to perfect the designs that came before it. Successful in many ways, DARwIn III improved on its predecessors in a few key areas. RX-64 motors were used for almost the entire lower body in order to gain more power. RX-28 motors were used for the hip yaw joints to reduce the height of the robot. An RX-64 motor was used in the waist to add strength.</span></b></div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="DARwIn_IV" name="DARwIn_IV" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">DARwIn IV</span></h3><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: 
black; border-top-style: solid; border-top-width: 0.5em; clear: left; float: left; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: center; width: 102px;"><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: 13px;"><b><span class="Apple-style-span" style="font-weight: normal;"><div class="thumb tleft" style="border-bottom-color: black; border-bottom-style: solid; border-bottom-width: 0.8em; border-left-color: black; border-left-style: solid; border-left-width: 0px; border-right-color: black; border-right-style: solid; border-right-width: 1.4em; border-top-color: black; border-top-style: solid; border-top-width: 0.5em; clear: left; display: inline !important; margin-bottom: 0.5em; margin-right: 0.5em; width: auto;"><div class="thumbinner" style="background-color: black; border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; display: inline !important; font-size: 12px; min-width: 100px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px 
!important; padding-top: 3px !important; text-align: center; width: 102px;"><a class="image" href="http://www.romela.org/main/File:D4.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;" title="D4.jpg"><img alt="" border="0" class="thumbimage" height="151" src="http://www.romela.org/wiki/images/thumb/9/99/D4.jpg/100px-D4.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; border-width: initial; vertical-align: middle;" width="100" /></a></div></div></span></b></span></div><div class="thumbcaption" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; font-size: 11px; line-height: 1.4em; padding-bottom: 3px !important; padding-left: 3px !important; padding-right: 3px !important; padding-top: 3px !important; text-align: left;"><div class="magnify" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; background-position: initial initial !important; background-repeat: initial initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; float: right;"><a class="internal" href="http://www.romela.org/main/File:D4.jpg" style="background-attachment: initial !important; background-clip: initial 
!important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; display: block; text-decoration: underline;" title="Enlarge"><img alt="" height="11" src="http://www.romela.org/wiki/skins/common/images/magnify-clip.png" style="background-attachment: initial !important; background-clip: initial !important; background-color: initial !important; background-image: none !important; background-origin: initial !important; border-bottom-style: none !important; border-color: initial !important; border-color: initial; border-left-style: none !important; border-right-style: none !important; border-top-style: none !important; border-width: initial !important; border-width: initial; display: block; text-align: justify; vertical-align: middle;" width="15" /></a></div></div></div></div><div style="text-align: justify;"><b><span class="Apple-style-span" style="font-weight: normal;">DARwIn IV was a radical change from DARwIn III. The team moved away from the PC/104-Plus platform to a Gumstix computer, reducing the size and weight of the robot. 
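Across every generation, DARwIn's joints are Robotis Dynamixel serial-bus servos (DX-117, RX-28, RX-64), so whichever computer is on board, PC/104-Plus or Gumstix, it drives the joints by writing position registers over a serial link. As a rough illustration only (this is not RoMeLa's actual code, and the helper names are ours), a Dynamixel Protocol 1.0 goal-position packet for the RX/DX series can be composed like this:

```python
def dynamixel_write_packet(servo_id, address, data):
    """Compose a Dynamixel Protocol 1.0 WRITE_DATA packet.

    Layout: 0xFF 0xFF header, servo ID, length, instruction, parameters,
    then a one-byte ones'-complement checksum over everything after the
    two-byte header.
    """
    INSTR_WRITE = 0x03
    params = [address] + list(data)
    body = [servo_id, len(params) + 2, INSTR_WRITE] + params
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])


def goal_position_packet(servo_id, position):
    # Goal Position sits at register 0x1E on RX/DX servos and is a
    # 10-bit count (0-1023 over the ~300 degree range), little-endian.
    position = max(0, min(1023, int(position)))
    return dynamixel_write_packet(servo_id, 0x1E,
                                  [position & 0xFF, position >> 8])


# Centering servo 1 is then a single write to the serial port, e.g.:
# serial_port.write(goal_position_packet(1, 512))
```

On the robot, such packets go out over the half-duplex serial bus that daisy-chains the servos; because the command format is the same across the DX and RX families, swapping in higher-torque motors between generations did not change the control interface.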
</span></b></div><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Publications" name="Publications" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-size: 19px; font-weight: normal; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Publications</span></h2><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Journal_Papers" name="Journal_Papers" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Journal Papers</span></h3><ol style="line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: 
justify;">Hurdus, J., Hong, D., “The Use of Hierarchical State Machines for Behavioral Programming in the DARPA Urban Challenge and RoboCup”, Springer-Verlag Lecture Notes in Electrical Engineering (LNEE), 2009 (in print)</li>
</ol><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Refereed_Conference_Papers" name="Refereed_Conference_Papers" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Refereed Conference Papers</span></h3><ol style="line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Muecke, M. and Hong, D. W., “Investigation of an Analytical Motion Filter for Humanoid Robots”, 5th International Conference on Ubiquitous Robots and Ambient Intelligence, Seoul, S. Korea, November 20-22, 2008</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Hurdus, J., Hong, D., “The Use of Hierarchical State Machines for Behavioral Programming in the DARPA Urban Challenge and RoboCup”, IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Seoul, Korea, August 20-22, 2008</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, M. and Hong, D. W., “The Synergistic Combination of Research, Education, and International Robot Competitions Through the Development of a Humanoid Robot”, 32nd ASME Mechanisms and Robotics Conference, New York City, NY, August 3-6, 2008</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, M., Hong, D. W., and Lim, S., “Precision Circular Walking of Bipedal Robots”, 32nd ASME Mechanisms and Robotics Conference, New York City, NY, August 3-6, 2008</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K., and Hong, D. W., “DARwIn’s Evolution: Development of a Humanoid Robot”, 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, October 29-November 2, 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Terpenny, J., Dancey, C., Goff, R., Nelson, D., Ellis, M., and Hong, D. W., “Success Strategies for Capstone Design Courses with Large Classes, Diverse Project Types, Small to Large Student Teams, and Varied Faculty Interests and Approaches”, 2007 ASEE Annual Conference & Exposition, Honolulu, Hawaii, June 24-27, 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Hong, D. W., “Biologically Inspired Locomotion Strategies: Novel Ground Mobile Robots at RoMeLa”, 3rd International Conference on Ubiquitous Robots and Ambient Intelligence, Seoul, S. Korea, October 15-17, 2006</li>
</ol><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Non-Refereed_Conference_Papers" name="Non-Refereed_Conference_Papers" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Non-Refereed Conference Papers</span></h3><ol style="line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K. and Hong, D. W., “Development of a Fully Autonomous Humanoid Robot for Novel Locomotion Research and as the First US Humanoid Entry to Robocup”, NI Week, Worldwide Virtual Instrumentation Conference, Austin, Texas, August 7-9, 2007 (Most Outstanding Application of Virtual Instrumentation, Editor’s Choice Award Winner for Best Application of Virtual Instrumentation, Best Application of Virtual Instrumentation, Mechatronics Category Winner)</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K. and Hong, D. W., “Development of an Open Humanoid Robot Platform for Research and Autonomous Soccer Playing”, 22nd AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, July 2007 (Technical Innovation Award, Judges’ Award for Mechanism Design)</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K. and Hong, D., “A Reactive Approach to Behavior Based Control of a Soccer Playing Humanoid Robot”, US-Korea Conference on Science, Technology and Entrepreneurship (UKC2007), Mechanical Engineering & Robotics Symposium, Washington DC, August 9-12, 2007</li>
</ol><span class="Apple-style-span" style="background-attachment: initial; background-clip: initial; background-color: initial; background-origin: initial;"><a href="" id="Other_Selected_Publications" name="Other_Selected_Publications" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-size: 17px; font-weight: bold; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Other Selected Publications</span></h3><ol style="line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K. and Hong, D. W., “PC/104-Plus: The Brains Behind the DARwIn Humanoid Robot”, PC/104 and Small Form Factors, Journal of Modular Embedded Design, Vol. 12, No. 3, Summer 2008, pp. 26-30</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K. and Hong, D. W., “DARwIn’s Evolution: Development of a Humanoid Robot for Research and Education”, Industrial Embedded Systems, OpenSystems Publishing, December 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Hong, D., Muecke, K., Mayo, R., Hurdus, J., and Pullins, B., “DARwIn’s First Soccer Tournament: America’s First Entry to the Humanoid Division of RoboCup”, Servo Magazine, Vol. 5, No. 9, September, 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K., Mayo, R., Hong, D. W., “DARwIn: Dynamic Anthropomorphic Robot with Intelligence, Part 3 – DARwIn 2.0: The Next Generation”, Servo Magazine, Vol. 5, No. 2, February, 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K., Cox, P., Hong, D. W., “DARwIn: Dynamic Anthropomorphic Robot with Intelligence, Part 2 – Parts, Wires and Motors”, Servo Magazine, Vol. 5, No. 1, January, 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Muecke, K., Cox, P., Hong, D. W., “DARwIn: Dynamic Anthropomorphic Robot with Intelligence, Part 1 – Concept & General Overview”, Servo Magazine, Vol. 4, No. 12, December, 2006</li>
</ol></b>STriDER: Self-excited Tripedal Dynamic Experimental Robot (posted 2010-12-21)<div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">STriDER (Self-excited Tripedal Dynamic Experimental Robot) is a novel three-legged walking machine that exploits the concept of actuated passive dynamic locomotion to dynamically walk with high energy efficiency and minimal control. Unlike other passive dynamic walking machines, this unique tripedal locomotion robot is inherently stable with its tripod stance and can change directions while walking.</div><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;"><span class="Apple-style-span" style="color: white;"></span></div><table cellspacing="5" class="infobox" style="background-color: black; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(170, 170, 170); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(170, 170, 170); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(170, 170, 170); border-top-style: solid; border-top-width: 1px; clear: right; color: white; font-size: 11px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 1em; margin-right: 0px; margin-top: 0.5em; padding-bottom: 0.2em; padding-left: 0.2em; padding-right: 0.2em; padding-top: 0.2em; text-align: left; width: 22em;"><tbody>
<tr><td class="" colspan="2" style="text-align: center; vertical-align: top;"><a class="image" href="http://www.romela.org/main/File:STriDER_1and2.jpg" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="STriDER 1and2.jpg"><img alt="STriDER" border="0" height="298" src="http://www.romela.org/wiki/images/thumb/5/5c/STriDER_1and2.jpg/300px-STriDER_1and2.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; vertical-align: middle;" width="400" /></a></td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">STriDER<br />
Self-excited Tripedal Dynamic Experimental Robot</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Principal Investigator</th><td class="" style="text-align: right; vertical-align: top;">Dr. Dennis Hong</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Researchers</th><td class="" style="text-align: right; vertical-align: top;">Ping Ren<br />
<a href="http://www.romela.org/main/User:Joehays" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; background-position: initial initial; background-repeat: initial initial; color: white; text-decoration: underline;" title="User:Joehays">Joe Hays</a></td></tr>
<tr><th colspan="2" style="background-attachment: initial; background-clip: initial; background-color: #404040; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: center; vertical-align: top;">Mechanical Details</th></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">DOF</th><td class="" style="text-align: right; vertical-align: top;">12</td></tr>
<tr class=""><th style="background-attachment: initial; background-clip: initial; background-color: #272727; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; text-align: left; vertical-align: top;">Motors</th><td class="" style="text-align: right; vertical-align: top;">Robotis Dynamixel RX-28</td></tr>
</tbody></table><br />
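The direction-changing gait described under Operational Concept below (pick a swing leg, fall away from it, catch the fall) can be sketched in a few lines. This is a loose illustration under invented assumptions, not RoMeLa's model: the `step_headings` function and its leg-azimuth bookkeeping are made up here purely to show why swing-leg sequencing yields headings at 60° intervals.

```python
# Illustrative model only: three legs spaced 120 degrees apart around the
# body. Choosing leg i as the swing leg makes the robot fall toward the gap
# between the other two legs, i.e. the step heading is leg i's azimuth
# rotated by 180 degrees. After the step, the swing leg has passed between
# the stance legs, so its azimuth flips to the new heading.

def step_headings(leg_azimuths, swing_sequence):
    """Heading (degrees) of each step for a sequence of swing-leg choices."""
    az = dict(leg_azimuths)                   # leg name -> azimuth in degrees
    headings = []
    for leg in swing_sequence:
        heading = (az[leg] + 180.0) % 360.0   # fall away from the swing leg
        headings.append(heading)
        az[leg] = heading                     # swing leg lands ahead
    return headings

# Legs A, B, C start 120 degrees apart; every reachable heading stays a
# multiple of 60 degrees, matching the 60-degree-interval gait description.
print(step_headings({"A": 0.0, "B": 120.0, "C": 240.0}, ["A", "B", "A"]))
# -> [180.0, 300.0, 0.0]
```

Changing the choice sequence (e.g. `["A", "C", "B"]`) steers the robot along different 60°-spaced headings; the real robot additionally rotates its body 180° during each step, which this sketch ignores.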
<div style="text-align: center;"><span class="Apple-style-span" style="font-family: sans-serif;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; font-size: 12px; line-height: 19px;"><b><br />
</b></span></span></div><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Operational Concept</span></h2><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">During a step, two legs act as stance legs while the other acts as a swing leg. The legs are oriented to push the center of gravity outside of the stance legs to initiate a step. As the body of the robot falls forward, the swing leg naturally swings in between the two stance legs and catches the fall. The body also rotates 180 degrees, preventing the legs from tangling up. Once all three legs are in contact with the ground, the robot regains its stability and the posture of the robot is then reset in preparation for the next step. 
Gaits for changing directions are implemented in a rather interesting way: by changing which leg is chosen as the swing leg, the tripedal gait can step in directions spaced at 60° intervals.</div><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Advantages" name="Advantages" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Advantages</span></h2><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">The simple tripod configuration and tripedal gait of STriDER offer many advantages over other legged robots: it has a simple kinematic structure; it is inherently stable, like a camera tripod; it is simple to control, since each step is simply a fall in a predetermined direction followed by a catch; it is energy efficient, exploiting the actuated passive dynamic locomotion concept and its built-in dynamics; it is lightweight, enabling it to be launched into difficult-to-access areas; and it is tall, making it ideal for deploying and positioning sensors at a high vantage point (for surveillance, for example).</div><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Goals_and_Objectives"
name="Goals_and_Objectives" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Goals and Objectives</span></h2><div style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.4em; text-align: justify;">In this research, we study the issues of actuated passive dynamic locomotion, optimizing physical design parameters for dynamically walking robots, and the design of this novel locomotion system by means of a combination of theoretical analysis, computer simulation, and the design and construction of prototypes for experimentation. The overall research objectives are:</div><ul style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: url(http://www.romela.org/wiki/skins/romela/bullet.gif); list-style-type: square; margin-bottom: 0.5em; margin-left: 1.5em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Analyze and synthesize various gait strategies for changing directions and path planning</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Study the three-dimensional kinematics and dynamics of STriDER</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Improve understanding of the effects of design parameters on the quality of gaits and find optimal mechanical design parameters with dynamic considerations</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Design and fabricate a working robot prototype to verify the analytical model and evaluate the concept.</li>
</ul><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Publications" name="Publications" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Publications</span></h2><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Journal_Papers" name="Journal_Papers" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-family: sans-serif; font-size: 17px; font-weight: bold; line-height: 19px; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Journal Papers</span></h3><ol style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; 
text-align: justify;">Ren, P., Hong, D.W., and Morazzani, I., “Forward and Inverse Displacement Analysis of A Novel Three-Legged Mobile Robot Based on the Kinematics of In-parallel Manipulators,” ASME Journal of Mechanisms and Robotics, submitted</li>
</ol><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Book_Chapters" name="Book_Chapters" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-family: sans-serif; font-size: 17px; font-weight: bold; line-height: 19px; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Book Chapters</span></h3><ol style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Morazzani, I., Lahr, D., Hong, D.W., Ren, P., “Novel Tripedal Mobile Robot and Considerations for Gait Planning Strategies Based on Kinematics,” Recent Progress in Robotics: Viable Robotic Service to Human, pp. 35-48, Springer-Verlag Berlin Heidelberg, 2008</li>
</ol><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Conference_Papers" name="Conference_Papers" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h3 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: initial; border-bottom-style: none; border-bottom-width: initial; font-family: sans-serif; font-size: 17px; font-weight: bold; line-height: 19px; margin-bottom: 0.3em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Conference Papers</span></h3><ol style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Ren, P., Hong, D.W., “Instantaneous Kinematics and Singularity Analysis of a Novel Three-Legged Mobile Robot with Active S-R-R-R Legs,” 32nd ASME Mechanisms and Robotics Conference, August 3-6, 2008, Brooklyn, New York, USA</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Ren, P., Morazzani, I., and Hong, D.W., “Forward and Inverse Displacement Analysis of a Novel Three-legged Mobile Robot Based on the Kinematics of In-parallel Manipulators,” 31st ASME Mechanisms and Robotics Conference, September 4-7, 2007, Las Vegas, Nevada, USA</li>
<li style="margin-bottom: 0.1em; text-align: justify;">J.R. Heaston and D.W. Hong, "Design of a novel tripedal locomotion robot and simulation of a dynamic gait for a single step," ASME Mechanisms and Robotics Conference, September 2007.</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Hong, D. W., “Biologically Inspired Locomotion Strategies: Novel Ground Mobile Robots at RoMeLa”, The 3rd International Conference on Ubiquitous Robots and Ambient Intelligence, Seoul, S. Korea, October 15-17, 2006</li>
<li style="margin-bottom: 0.1em; text-align: justify;">Heaston, J. R., Hong, D. W., Morazzani, I., Ren, P., Goldman, G., “STriDER: Self-Excited Tripedal Dynamic Experimental Robot”, 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, April 10-14, 2007</li>
<li style="margin-bottom: 0.1em; text-align: justify;">D.W. Hong and D.F. Lahr, "Synthesis of the body swing rotator joint aligning mechanism for the abductor joint of a novel tripedal locomotion robot," ASME Mechanisms and Robotics Conference, September 2007.</li>
</ol><span class="Apple-style-span" style="font-family: sans-serif; font-size: 13px; line-height: 19px;"><a href="" id="Awards" name="Awards" style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; text-decoration: underline;"></a></span><h2 style="background-attachment: initial; background-clip: initial; background-color: initial; background-image: none; background-origin: initial; border-bottom-color: rgb(170, 170, 170); border-bottom-style: solid; border-bottom-width: 1px; font-family: sans-serif; font-size: 19px; font-weight: normal; line-height: 19px; margin-bottom: 0.6em; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 0.17em; padding-top: 0.5em; text-align: justify;"><span class="mw-headline">Awards</span></h2><ol style="font-family: sans-serif; font-size: 13px; line-height: 1.5em; list-style-image: none; margin-bottom: 0.5em; margin-left: 3.2em; margin-right: 0px; margin-top: 0.3em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><li style="margin-bottom: 0.1em; text-align: justify;">Best Paper Award, The 13th International Conference on Advanced Robotics, 2007</li>
</ol>a full-sized humanoid robot (posted 2010-12-21)<div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">As CHARLI takes his first steps, anxious onlookers stand ready to catch him if he falls. His stride is short, but upright, as one foot is placed in front of the other in the basement of Virginia Tech’s Randolph Hall. </div><div class="vt_img_caption_left vt_medium_img" style="display: block; float: left; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 15px; margin-right: 10px; margin-top: 5px;"><div style="background-attachment: initial; background-clip: initial; background-color: #fbfbf8; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; clear: none; display: block; float: none; margin-bottom: 13px; margin-top: 3px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px; padding-left: 3px; padding-right: 3px; padding-top: 3px; width: 500px;"></div><div style="text-align: justify;"><img alt="Students prepare CHARLI for his walk at the group's Randolph Hall basement facility" height="320px" src="http://www.vt.edu/spotlight/innovation/2010-04-26-charli/M_charli-apart.jpg" style="background-attachment: initial; background-clip: initial; background-color: white;
background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" width="490px" /></div><span style="background-color: transparent; color: #666666; display: block; float: left; font-size: 0.9em; line-height: 1.2em; padding-bottom: 10px; padding-left: 10px; padding-right: 10px; padding-top: 10px; text-align: justify; width: 485px;">(From left) Robotics and Mechanisms Laboratory students Taylor Young, a senior; Jeakweon Han, a Ph.D. student; Rob Nguyen, a master’s student; and Mike Stevens, a senior; prepare CHARLI for his walk at the group's Randolph Hall basement facility.</span></div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">But CHARLI is no toddler.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">He is a 5-foot tall humanoid robot built by graduate and undergraduate students with the Virginia Tech <a href="http://www.eng.vt.edu/main/index.php" style="border-bottom-color: rgb(153, 153, 153); border-bottom-style: dotted; border-bottom-width: 1px; color: #333333; text-decoration: none;">College of Engineering</a>’s <a 
href="http://www.romela.org/main/Robotics_and_Mechanisms_Laboratory" style="border-bottom-color: rgb(153, 153, 153); border-bottom-style: dotted; border-bottom-width: 1px; color: #333333; text-decoration: none;" target="_blank">Robotics and Mechanisms Laboratory</a> (RoMeLa).</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">After a long moment, CHARLI comes to a rest. An audible “Whew!” is heard from CHARLI’s main architect, doctoral student Jeakweon (“J.K.”) Han. Dennis Hong, associate professor of <a href="http://www.me.vt.edu/" style="border-bottom-color: rgb(153, 153, 153); border-bottom-style: dotted; border-bottom-width: 1px; color: #333333; text-decoration: none;">mechanical engineering</a> and director of RoMeLa, can’t resist a joke. </div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">“One small step for a robot, one giant leap for robotics,” he shouted.</div><div class="vt_img_caption_left vt_medium_img" style="display: block; float: left; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 15px; margin-right: 10px; margin-top: 5px;"><div style="background-attachment: initial; background-clip: initial; background-color: #fbfbf8; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 
1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; clear: none; display: block; float: none; margin-bottom: 13px; margin-top: 3px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px; padding-left: 3px; padding-right: 3px; padding-top: 3px; width: 500px;"></div><div style="text-align: justify;"><img alt="Dennis Hong, associate professor of mechanical engineering and director of RoMeLa, is the faculty adviser on the project to build CHARLIs L (Lightweight) and H (Heavy)." height="320px" src="http://www.vt.edu/spotlight/innovation/2010-04-26-charli/M_charli-dennis.jpg" style="background-attachment: initial; background-clip: initial; background-color: white; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" width="490px" /></div><span style="background-color: transparent; color: #666666; display: block; float: left; font-size: 0.9em; line-height: 1.2em; padding-bottom: 10px; padding-left: 10px; padding-right: 10px; padding-top: 10px; text-align: justify; width: 485px;">Dennis Hong, associate professor of mechanical engineering and director of RoMeLa, is the faculty adviser on the project to build CHARLIs L (Lightweight) and H (Heavy). 
To build a human-sized bipedal humanoid robot that can walk upright, traipse stairs, and cover uneven ground is the “holy grail” of robotics research, Hong said.</span></div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Hong isn’t entirely facetious, though. CHARLI (that’s for Cognitive Humanoid Autonomous Robot with Learning Intelligence) is historic. CHARLI is the first untethered, autonomous, full-sized, walking, humanoid robot with four moving limbs and a head, built in the United States. His two long legs and arms can move and gesture thanks to a combination of pulleys, springs, carbon fiber rods, and actuators. CHARLI soon will be able to talk as well.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Fans of robotics are taking note. </div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">“This is a significant milestone in robotics engineering and is a testament to the technological leadership of Virginia Tech’s RoMeLa lab,” said Tom Atwood, editor of “Robot” magazine.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Of course it will be many years before CHARLI or his incarnations will be seen walking around campus or even in homes across America. 
Hong refers to the latter placement as his “Jetsons Goal” (named for the popular 1960s cartoon that featured Rosie, a robotic maid for the futuristic family).</div><div class="vt_img_caption_right vt_medium_img" style="display: block; float: right; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 15px; margin-left: 10px; margin-top: 5px;"><div style="background-attachment: initial; background-clip: initial; background-color: #fbfbf8; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; clear: none; display: block; float: none; margin-bottom: 13px; margin-top: 3px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px; padding-left: 3px; padding-right: 3px; padding-top: 3px; width: 250px;"></div><div style="text-align: justify;"><img alt="CHARLI takes a few steps while Seungmoon Song, a master’s student in electrical and computer engineering from Busan, South Korea, observes." 
height="360px" src="http://www.vt.edu/spotlight/innovation/2010-04-26-charli/M_charli-walk.jpg" style="background-attachment: initial; background-clip: initial; background-color: white; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" width="240px" /></div><span style="background-color: transparent; color: #666666; display: block; float: left; font-size: 0.9em; line-height: 1.2em; padding-bottom: 10px; padding-left: 10px; padding-right: 10px; padding-top: 10px; text-align: justify; width: 235px;">CHARLI takes a few steps while Seungmoon Song, a master’s student in electrical and computer engineering, observes.</span></div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">“The environment we live in is designed for humans: The step size of stairs, the height of door handles, etc., are designed by humans for humans,” Hong said. “Thus for a robot to live among us and to serve us, it needs human size and form. Thus humanoids. 
But manipulation with hands, perception, intelligence, and autonomy are all important and difficult research problems that need to be addressed.”</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">That research is under way. There are two CHARLIs.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">The one now walking across floors and motioning his arms is CHARLI L, as in Lightweight. He’s meant to walk indoors on known flat surfaces, but not run or jump. Eventually, he will be able to kick soccer balls. The robot is expected to debut at this year’s RoboCup tournament in Singapore.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Then there is CHARLI H (for Heavy). This bulkier robot will utilize custom-designed actuators and other technologies that one day will allow it to walk on the sloping, rising ground that comprises Virginia Tech’s campus. He also will be able to run, jump, kick, open doors, pick up objects, and do just about anything a real person can do.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">“CHARLI H will be a fully functioning robot,” said Derek Lahr, a Ph.D. student from Charleston, S.C., who is spearheading the “H” project.
For now, only one leg of CHARLI H is complete.</div><div class="vt_img_caption_left vt_medium_img" style="display: block; float: left; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 15px; margin-right: 10px; margin-top: 5px;"><div style="background-attachment: initial; background-clip: initial; background-color: #fbfbf8; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; clear: none; display: block; float: none; margin-bottom: 13px; margin-top: 3px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 3px; padding-left: 3px; padding-right: 3px; padding-top: 3px; width: 500px;"></div><div style="text-align: justify;"><img alt="Derek Lahr, a Ph.D. 
student at RoMeLa, holds the leg of the humanoid robot, CHARLI H" height="367px" src="http://www.vt.edu/spotlight/innovation/2010-04-26-charli/M_charli-derek.jpg" style="background-attachment: initial; background-clip: initial; background-color: white; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(187, 186, 176); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(187, 186, 176); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(187, 186, 176); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(187, 186, 176); border-top-style: solid; border-top-width: 1px; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" width="490px" /></div><span style="background-color: transparent; color: #666666; display: block; float: left; font-size: 0.9em; line-height: 1.2em; padding-bottom: 10px; padding-left: 10px; padding-right: 10px; padding-top: 10px; text-align: justify; width: 485px;">Derek Lahr, a Ph.D. student at RoMeLa, holds the leg of the humanoid robot, CHARLI H. This bulky, stronger robot will utilize various technologies that may allow it to walk outside.</span></div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">The students built CHARLI L with $20,000 in seed money from the Virginia Tech Student Engineers’ Council and donated equipment from National Instruments and Maxon Precision Motors. “The budget constraints actually inspired us to think of different solutions, ‘How can we actually make this happen with a small budget?' 
And that actually led us to new types of mechanical solutions,” Hong said.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Work on the robot began in 2008 with 13 undergraduate and graduate students working on the project at any given time. Inspiration came from science-fiction films and spouses.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">“I hope CHARLI could help physically challenged people to cook, clean, and carry items like the NS-5,” said Han, referring to the humanoid robot at the center of the 2004 film “I, Robot.” Han’s design concept was assisted by his wife, Younseal Eum, who is an artist. </div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Atwood said he is excited for the future. 
“Eventually, there will be a robot in every home assisting families and individuals, and walking robots will find work in all kinds of places, from warehouses to manufacturing centers,” he said.</div><div style="clear: none; display: block; float: none; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; margin-bottom: 13px; margin-top: 3px; padding-bottom: 0px; padding-top: 0px; text-align: justify; width: auto;">Indeed, Hong will have seen his “Jetsons Goal” come true.</div><ul style="background-color: transparent; border-bottom-color: rgb(227, 226, 214); border-bottom-style: solid; border-bottom-width: 0px; border-left-color: rgb(227, 226, 214); border-left-style: solid; border-left-width: 0px; border-right-color: rgb(227, 226, 214); border-right-style: solid; border-right-width: 0px; border-top-color: rgb(227, 226, 214); border-top-style: solid; border-top-width: 0px; font-family: Arial, Helvetica, sans-serif; font-size: 13px; line-height: 18px; list-style-image: initial; list-style-position: initial; list-style-type: none; margin-bottom: 0px; margin-left: 0px; margin-top: 10px; padding-bottom: 0px; padding-left: 0px; padding-right: 10px; padding-top: 0px; width: auto;"><li style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: url(http://www.vt.edu/global_assets/images/li_black.gif); background-origin: initial; background-position: 0% 4px; background-repeat: no-repeat no-repeat; font-size: 1em; font-style: normal; line-height: 1.4em; margin-bottom: 0px; margin-top: 0px; padding-bottom: 5px; padding-left: 15px; padding-right: 10px; padding-top: 0px; text-align: justify;"><i>For more information on this topic, contact Steven Mackay at</i> <a href="mailto:smackay@vt.edu" style="color: #7b4342; margin-bottom: 0px; margin-top: 0px; padding-bottom: 0px; padding-top: 0px; text-decoration: none;"><u><i>smackay@vt.edu</i></u></a><i>or (540) 231-4787.</i></li>
</ul>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-11437808022389744052010-12-21T06:32:00.001-08:002010-12-21T06:32:52.049-08:0040 Years Ago, Robots Started Doing Our Work For Us In Space<span class="Apple-style-span" style="font-family: 'Lucida Grande', Helvetica, Arial, sans-serif; font-size: 12px; line-height: 20px;"></span><br />
<div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><img alt="40 Years Ago, Robots Started Doing Our Dirty Work For Us In Space" class="left image500 image_0" src="http://cache.gawkerassets.com/assets/images/4/2010/09/500x_annexe15_luna_16c.jpg" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(179, 179, 179); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(179, 179, 179); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(179, 179, 179); border-right-style: solid; border-right-width: 1px; border-style: initial; border-top-color: rgb(179, 179, 179); border-top-style: solid; border-top-width: 1px; clear: left; float: left; font-size: 12px; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;" width="500" />Earthlings had scored moon rocks before 1970. NASA's Apollo 11 and 12 missions successfully hauled them back to study—immense scientific accomplishments, of course. One problem. It cost $142 billion in today's dollars. 
Russia's solution? Send a robot instead.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">The Soviets were the first to realize that sending a human to pick up dust is incredibly expensive, and, if you're the guy in the pod—maybe just a bit too dangerous to be worth it. Their response was the Luna 16 space probe, history's first robotic craft to successfully bring back an extraterrestrial sample.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">The stakes were high. To call the space race anything but a, well, race, is to state the obvious. 
But Luna 16's predecessor had lost a literal race, in an almost comically embarrassing moment of Space Age failure—the Luna 15 unmanned probe crashed itself into lunar oblivion, smashed on the surface of the moon, just hours before Apollo 11 began the trip back to Earth.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">So for Russia, it was time to get serious—and fast.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">A successful robo-drilling effort would require a craft that could land itself gently without damaging the sensitive drilling equipment inside. 
The craft itself—a byzantine mound of pods and landing apparatuses that still look like science fiction compared to today's sterilized space affairs—gracefully maneuvered onto the surface of the moon's Sea of Fertility, using a delicate pair of descent engines that stopped firing only 20 meters above the surface, before it clamped down and wasted no time drilling. Luna 16, unlike her older sister, landed softly. Seven minutes later, the Russian digger was done, having sucked up a sample of basalt rock 35 mm deep into the moon's surface, where it would be stored and analyzed back on earth.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;"><img alt="40 Years Ago, Robots Started Doing Our Dirty Work For Us In Space" class="left image500 image_1" src="http://cache.gawkerassets.com/assets/images/4/2010/09/500x_h_lunik16_02.jpg" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-color: rgb(179, 179, 179); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-color: rgb(179, 179, 179); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(179, 179, 179); border-right-style: solid; border-right-width: 1px; border-style: initial; border-top-color: 
rgb(179, 179, 179); border-top-style: solid; border-top-width: 1px; clear: left; float: left; font-size: 12px; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;" width="500" /></div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">A day later, its top section blasted off, dusty cargo inside, leaving the other half behind to record and broadcast data from the moon's surface. Luna 16's cap, carrying a scant 101 grams of lunar rock, crashed safely into the Kazakhstani steppes, carrying what was described as a "grayish brown" dust. 
That might sound dull, but this dust was later sold for $442,500 at auction (though not before a bit was exchanged for some NASA moon rock as a gesture of goodwill).</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">The mission was modest, perhaps, as was the cargo. But Luna 16's legacy isn't just its onetime ability to scrape powder off the moon and scoot back to Earth, but its proof that robots can do some seriously cool stuff for us. Sending robots to work in space in the '70s tempered the romanticized (and, during that era, heavily politicized) notion of the Man In Space.
Where our hands (and heads) were increasingly needed was on the ground, where brilliant engineers and scientists of all shades on both sides of the Iron Curtain came to realize that sometimes it's a matter of finding the right bot for the job.</div><div style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 12px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; vertical-align: baseline;">The <a href="http://gizmodo.com/225070/mars-rovers-three-years-running-and-smarter-too" style="background-attachment: initial; background-clip: initial; background-color: transparent; background-image: initial; background-origin: initial; background-position: initial initial; background-repeat: initial initial; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; color: #dc870e; font-size: 12px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; vertical-align: baseline;">incredible success of the Mars Rovers</a>?
At least in part (and in principle), thanks Luna 16—and happy 40th birthday.</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-47094359952731227572010-12-21T06:31:00.001-08:002010-12-21T06:31:49.566-08:00Robotic Spider Melds Legos and 3-D Printing<div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><a href="http://www.wired.com/images_blogs/gadgetlab/2010/02/ks01_display_medium.jpg" style="background-color: #e5f8ff; color: #238db1; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: none; outline-width: initial; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><img alt="ks01_display_medium" class="alignnone size-large wp-image-33265" height="495" src="http://www.wired.com/images_blogs/gadgetlab/2010/02/ks01_display_medium-660x495.jpg" style="background-color: white; border-bottom-style: none; border-bottom-width: 0px; border-color: initial; border-color: initial; border-left-style: none; border-left-width: 0px; border-right-style: none; border-right-width: 0px; border-style: initial; border-top-style: none; border-top-width: 0px; border-width: initial; display: block; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 5px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="ks01_display_medium" width="660" /></a></div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Lego’s programmable robotics set Mindstorms is a fun toy for computing enthusiasts but if you 
really want to take it to the next level, check out Mark Weller’s project.</div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Weller, a machinist and technician at the McCoy School of Engineering at Midwestern State University, combined milled plastic pieces with the basic Lego Mindstorms set to create a robotic spider that can crawl and turn.</div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">“I wanted to open students’ minds to go beyond ‘let’s put the parts together and program the robot,’” he says. “This project is more than sticking the wheels on a Lego set.” The school uses Lego Mindstorms to introduce freshman students to robotics.</div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">The spider robot’s legs are based on a concept called the <a href="http://www.mechanicalspider.com/concept.html" style="color: #007ca5; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: none; outline-width: initial; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">Klann linkage</a>. 
A single leg has a <a href="http://en.wikipedia.org/wiki/Klann_Linkage" style="color: #007ca5; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: none; outline-width: initial; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">six-bar linkage</a> with a frame, crank, two rockers and two couplers connected with pivot joints. This transforms rotating motion into linear motion.</div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Weller says he created the spider’s legs from 3/8-inch plastic sheet stock on a 3-axis CNC mill. But it can also be made by a 3-D printer such as <a href="http://www.wired.com/gadgetlab/2009/08/makerbot/" style="color: #007ca5; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: none; outline-width: initial; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">Makerbot</a> and RepRap.</div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><a href="http://www.wired.com/images_blogs/gadgetlab/2010/02/ks05legasm_display_medium.jpg" style="color: #007ca5; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: none; outline-width: initial; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><img alt="ks05legasm_display_medium" class="alignnone size-large wp-image-33271" height="495" 
src="http://www.wired.com/images_blogs/gadgetlab/2010/02/ks05legasm_display_medium-660x495.jpg" style="border-bottom-style: none; border-bottom-width: 0px; border-color: initial; border-color: initial; border-left-style: none; border-left-width: 0px; border-right-style: none; border-right-width: 0px; border-style: initial; border-top-style: none; border-top-width: 0px; border-width: initial; display: block; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; padding-bottom: 5px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="ks05legasm_display_medium" width="660" /></a></div><div style="color: #333333; font-family: Arial, Verdana, sans-serif; font-size: 14px; line-height: 20px; margin-bottom: 20px; margin-left: 0px; margin-right: 0px; margin-top: 20px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">As the video shows, the robotic spider moves with grace and turns around with flair, even on a smooth surface. Weller has posted the details of his <a href="http://www.thingiverse.com/thing:1643" style="color: #007ca5; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: none; outline-width: initial; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">Lego spider project</a> and says he hopes 3-D printing enthusiasts will try it out.</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-3760192651512548722010-12-21T06:29:00.002-08:002010-12-21T06:29:33.069-08:00The ATHLETE Rover<div class="newsText" style="color: black; font-family: arial, helvetica, sans-serif; font-size: 12px;">The All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) vehicle concept is based on six 6 DoF (Degrees-of-Freedom) limbs, each with a 1 DoF wheel attached. 
ATHLETE uses its wheels for efficient driving over stable, gently rolling terrain, but each limb can also be used as a general-purpose leg. In the latter case, the wheels can be locked and used as feet to walk out of excessively soft, obstacle-laden, steep, or otherwise extreme terrain. ATHLETE is envisioned as a heavy-lift utility vehicle to support human exploration of the lunar surface, useful for unloading bulky cargo from stationary landers and transporting it long distances.<br />
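The drive-or-walk choice described above can be sketched as a simple rule. The metric names and thresholds below are invented for illustration; JPL's actual terrain-assessment logic is not described in this article.

```python
# Illustrative sketch of ATHLETE's two mobility modes: drive on wheels
# over benign terrain, lock the wheels and walk over extreme terrain.
# All field names and threshold values are hypothetical.
from dataclasses import dataclass

@dataclass
class Terrain:
    slope_deg: float           # local slope of the path
    sinkage_cm: float          # how far a wheel sinks into the regolith
    obstacle_height_m: float   # tallest obstacle along the path

def choose_mode(t: Terrain) -> str:
    """Return 'drive' for stable, gently rolling terrain, else 'walk'."""
    if t.slope_deg < 15 and t.sinkage_cm < 5 and t.obstacle_height_m < 0.3:
        return "drive"   # efficient rolling on the wheels
    return "walk"        # wheels locked and used as feet

print(choose_mode(Terrain(5, 1, 0.1)))    # gently rolling terrain
print(choose_mode(Terrain(25, 12, 0.5)))  # steep, soft, obstacle-laden
```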
To demonstrate this concept, several prototype vehicles have been developed for testing at JPL. The first generation ATHLETE prototype is 2.75m wide, has a maximum standing height of just over 2m, a mass of approximately 850 kg, and maximum payload carrying capacity of 300 kg in Earth gravity. Two identical prototypes were constructed in 2005 and one of these is still operational.<br />
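Because the payload ratings above are quoted "in Earth gravity," the same structure could support roughly six times as much mass under lunar gravity. A quick sketch of that conversion (the gravity constants are standard values; the scaling is simply their ratio, ignoring any structural margins):

```python
# Convert a payload rating quoted "in Earth gravity" to the equivalent
# mass the same structure could support under lunar gravity.
G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2

def lunar_equivalent_mass(earth_rated_kg: float) -> float:
    # The structure is rated for a fixed force (weight); on the Moon
    # that same force corresponds to a mass larger by g_earth / g_moon.
    return earth_rated_kg * G_EARTH / G_MOON

print(round(lunar_equivalent_mass(300)))  # first-generation rating
print(round(lunar_equivalent_mass(450)))  # second-generation rating
```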
The second generation ATHLETE prototype was constructed in 2009 and is implemented as a coordinated system of two Tri-ATHLETEs, fully independent three-limbed robots. This innovation allows a straightforward cargo handling strategy: two Tri-ATHLETEs dock to opposite sides of a cargo pallet to form a six-limbed symmetrical vehicle, work together to move and emplace the cargo, then undock and depart. This strategy provides all the advantages of the six-limbed concept for cargo or habitat transport with the additional benefits of flexibility and modularity. The second generation prototype is designed to demonstrate cargo handling at one half the anticipated lunar scale. The robot stands to a maximum height of just over 4m, and has a payload capacity of 450 kg in Earth gravity.<br />
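The dock-move-undock strategy can be sketched as a toy model in which two three-limbed robots combine into one six-limbed vehicle. The class and method names are invented for illustration; the real coordination software is of course far more involved.

```python
# Toy sketch of the Tri-ATHLETE cargo-handling strategy: two independent
# three-limbed robots dock to opposite sides of a cargo pallet and act
# as a single six-limbed vehicle.  All names here are illustrative.
class TriAthlete:
    def __init__(self, name: str):
        self.name = name
        self.limbs = 3
        self.docked_to = None

    def dock(self, pallet: "Pallet", side: str) -> None:
        pallet.sides[side] = self
        self.docked_to = pallet

class Pallet:
    def __init__(self):
        self.sides = {"left": None, "right": None}

    def limb_count(self) -> int:
        # Fully docked, the combined vehicle has six limbs.
        return sum(r.limbs for r in self.sides.values() if r is not None)

a, b = TriAthlete("A"), TriAthlete("B")
pallet = Pallet()
a.dock(pallet, "left")
b.dock(pallet, "right")
print(pallet.limb_count())  # symmetric six-limbed vehicle
```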
A side benefit of the wheel-on-limb approach is that each limb has sufficient degrees-of-freedom for use as a general-purpose manipulator (hence the name "limb" instead of "leg"). The prototype ATHLETE vehicles have quick-disconnect end effector adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. Mechanical action of the wheel rotation also actuates the tools, so that they can take advantage of the one horsepower motor usually used for driving to instead enable drilling, gripping or other power-tool functions.<br />
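For a rough feel of what the repurposed drive motor can deliver to a tool, steady-state torque is power divided by angular velocity. The one-horsepower figure comes from the text; the tool speed below is an arbitrary illustrative value, and drivetrain losses are ignored.

```python
import math

HP_TO_WATTS = 745.7  # one mechanical horsepower in watts

def tool_torque_nm(power_hp: float, tool_rpm: float) -> float:
    """Torque (N*m) available at a tool spinning at tool_rpm,
    fed by power_hp of motor power, ignoring losses."""
    omega = tool_rpm * 2 * math.pi / 60  # rad/s
    return power_hp * HP_TO_WATTS / omega

# Hypothetical drill speed of 120 rpm from the 1 hp wheel motor:
print(round(tool_torque_nm(1.0, 120), 1))
```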
Since the vehicle has an alternative walking mode for traversing extreme terrain, the wheels and wheel actuators can be sized for nominal rather than worst-case obstacle climbing. Designing for nominal instead of extreme terrain yields substantial mass savings in the wheels and wheel actuators, and these savings are greater than the extra mass of the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be lighter than a conventional mobility chassis for planetary exploration.<br />
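The mass trade works out whenever the wheel downsizing saves more than the limb articulation adds, which can be written as a one-line budget check. All masses below are invented for illustration; the article gives no actual figures.

```python
# Mass-budget sketch: wheels sized for nominal terrain save more mass
# than the articulated limbs add.  The numbers are hypothetical.
def net_mobility_savings(wheel_savings_kg: float, limb_extra_kg: float) -> float:
    """Positive result means the wheel-on-limb system is lighter overall
    than a conventional chassis sized for worst-case obstacles."""
    return wheel_savings_kg - limb_extra_kg

# e.g. downsizing six wheels and actuators saves 90 kg while the limb
# articulation adds 60 kg, leaving the mobility system 30 kg lighter:
print(net_mobility_savings(90, 60))
```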
ATHLETE is being developed by JPL as part of the Human-Robot Systems (HRS) Project managed by the Johnson Space Center (NASA JSC). HRS is one of several projects funded by the NASA Exploration Technology Development Program (ETDP) that is developing new technology in support of human exploration.<br />
<br />
<table border="0" cellpadding="0" cellspacing="0" class="capText" style="color: black; font-family: arial, helvetica, verdana, sans-serif; font-size: 13px; font-style: italic;"><tbody>
<tr><td><img alt="Fig 2: 1st generation ATHLETEs demonstrating cargo transport" border="0" height="179" src="http://www-robotics.jpl.nasa.gov/images/athleteFig2.jpg" width="271" /><br />
Fig 2: 1st generation ATHLETEs demonstrating cargo transport</td><td><img alt="Fig 3: Model of a Tri-ATHLETE prototype" border="0" height="158" src="http://www-robotics.jpl.nasa.gov/images/athleteFig3.jpg" width="250" /><br />
Fig 3: Model of a Tri-ATHLETE prototype</td></tr>
<tr><td><img alt="Fig 4: ATHLETE positioning a box with a gripper attachment" border="0" height="221" src="http://www-robotics.jpl.nasa.gov/images/athleteFig4.jpg" width="292" /><br />
Fig 4: ATHLETE positioning a box with a gripper attachment</td><td style="padding-left: 10px;"><img alt="Fig 5: ATHLETE deploying a drill attachment on a cliff face" border="0" height="325" src="http://www-robotics.jpl.nasa.gov/images/athleteFig5.jpg" width="240" /><br />
Fig 5: ATHLETE deploying a drill attachment on a cliff face</td></tr>
<tr><td colspan="2"><img alt="" height="6" src="http://www-robotics.jpl.nasa.gov/images/spacer.gif" width="10" /></td></tr>
<tr><td align="center" colspan="2"><img alt="Fig 6: ATHLETE digging with a scoop attachment" border="0" height="250" src="http://www-robotics.jpl.nasa.gov/images/athleteFig6.jpg" width="392" /><br />
Fig 6: ATHLETE digging with a scoop attachment<br />
</td></tr>
</tbody></table></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-15661439820129050012010-12-21T06:29:00.000-08:002010-12-21T06:29:05.555-08:00The RAMS Arms Robot<div style="font-family: arial, helvetica, sans-serif; font-size: 12px;">The RAMS (Robot-Assisted Micro-Surgery) system has been developed to utilize NASA telerobotics technology in a beneficial commercial application. JPL has developed a precision cable-driven master-and-slave telerobotic system for eye surgery and teamed with an industrial partner to test and commercialize the technology. The system provides scaled-down human-input motions, tremor filtering to improve precision, amplified forces fed back to the human operator, and programmable constrained motion of the instrument in the eye to minimize surgical impacts. The slave robot, which manipulates a tool in the eye, has 6 actuated degrees of freedom (DOFs), 6-DOF tip-force sensing, and 15-micron positioning accuracy. The master robot, held by the surgeon, has 6 position and 6 force-sensed DOFs, 3 actuated DOFs, and 25-micron tip-position measurement accuracy.</div><div style="font-family: arial, helvetica, sans-serif; font-size: 12px;"></div><div align="center" style="font-family: arial, helvetica, sans-serif; font-size: 12px;"><table border="0" cellpadding="4" cellspacing="0"><tbody>
<tr><td align="center"><a href="http://www-robotics.jpl.nasa.gov/systems/system.cfm?System=9" style="color: black; font-family: Arial, Helvetica, sans-serif; text-decoration: underline;"><img alt="Fig. 1: RAMS master and slave arms." border="0" height="208" src="http://www-robotics.jpl.nasa.gov/images/RAMSArms1-260.jpg" width="260" /><br />
<img align="right" alt="Click here for a larger image" border="0" height="21" src="http://www-robotics.jpl.nasa.gov/images/larger_image.gif" width="100" /></a></td><td align="center"><a href="http://www-robotics.jpl.nasa.gov/systems/system.cfm?System=9" style="color: black; font-family: Arial, Helvetica, sans-serif; text-decoration: underline;"><img alt="Fig. 1: RAMS master and slave arms." border="0" height="208" src="http://www-robotics.jpl.nasa.gov/images/RAMSArms2-260.jpg" width="260" /><br />
<img align="right" alt="Click here for a larger image" border="0" height="21" src="http://www-robotics.jpl.nasa.gov/images/larger_image.gif" width="100" /></a></td></tr>
<tr><td align="center" colspan="2"><img alt="Blue Line" height="2" src="http://www-robotics.jpl.nasa.gov/images/blue_dot.gif" width="530" /></td></tr>
<tr><td align="center" colspan="2"><div class="capText" style="color: black; font-family: arial, helvetica, verdana, sans-serif; font-size: 13px; font-style: italic;">Fig. 1: RAMS master and slave arms.</div></td></tr>
</tbody></table></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-3014919944823575432010-12-21T06:27:00.001-08:002010-12-21T06:27:12.390-08:00Cornell robot<div class="headline" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Verdana, Helvetica, Geneva, sans-serif; font-size: 20px; font-weight: bold; line-height: 26px; margin-bottom: 10px; margin-left: 4px; text-align: left;">Cornell robot discovers itself and adapts to injury when it loses one of its limbs</div><div class="byline" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #444444; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: bold; line-height: 14px; margin-bottom: 6px; margin-left: 4px; text-align: left;">By <a href="mailto:ws21@cornell.edu" style="color: #444444; font-family: Arial, verdana, Helvetica, sans-serif; font-size: 12px; font-weight: bold; line-height: 16px; text-decoration: none;">Bill Steele</a></div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">Nothing can possibly go wrong ... go wrong ... go wrong ...</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">The truth behind the old joke is that most robots are programmed with a fairly rigid "model" of what they and the world around them are like. 
If a robot is damaged or its environment changes unexpectedly, it can't adapt.</div><table align="left" border="0" cellpadding="0" class="photoleft" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: verdana, arial, helvetica, sans-serif; font-size: 10px; margin-right: 10px; width: 216px;"><tbody>
<tr><td><a href="http://www.news.cornell.edu/stories/Nov06/ResilientRobot.mov" style="color: #b31b1b; font-family: Arial, verdana, Helvetica, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; text-decoration: none;" target="_blank"><img alt="Closeup of resilient robot" border="0" height="144" src="http://www.news.cornell.edu/stories/Nov06/ResilientRobotClose.jpg" width="216" /></a><br />
<div class="credit" style="font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 11px; font-style: italic; font-weight: normal; text-align: right; vertical-align: text-top;">Lindsay France/University Photography</div></td></tr>
<tr><td class="caption" style="font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 11px; font-weight: bold; line-height: 12px; text-align: left; vertical-align: text-top;">This four-legged robot is not preprogrammed to walk. Like a newborn animal it explores itself and learns to use its limbs to move. When a leg is damaged, it repeats the process and works out a new method of locomotion. <a href="http://www.news.cornell.edu/stories/Nov06/ResilientRobot.mov" style="color: #b31b1b; font-family: Arial, verdana, Helvetica, sans-serif; font-size: 11px; font-weight: normal; line-height: 16px; text-decoration: none;" target="_blank">Watch a short QuickTime movie of the robot (1.8M; 1 min.)</a></td></tr>
</tbody></table><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">So Cornell researchers have built a robot that works out its own model of itself and can revise the model to adapt to injury. First, it teaches itself to walk. Then, when damaged, it teaches itself to limp.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">Although the test robot is a simple four-legged device, the researchers say the underlying algorithm could be used to build more complex robots that can deal with uncertain situations, like space exploration, and may help in understanding human and animal behavior.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">The research, reported in the latest issue (Nov. 
17) of the journal Science, was carried out in the Cornell Computational Synthesis Lab under Hod Lipson, assistant professor of mechanical and aerospace engineering, with Josh Bongard, a former Cornell postdoctoral researcher now on the faculty at the University of Vermont, and Cornell graduate student Viktor Zykov.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">Instead of giving the robot a rigid set of instructions, the researchers let it discover its own nature and work out how to control itself, a process that seems to resemble the way human and animal babies discover and manipulate their bodies. The ability to build this "self-model" is what makes it able to adapt to injury.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">"Most robots have a fixed model laboriously designed by human engineers," Lipson explained. "We showed, for the first time, how the model can emerge within the robot. It makes robots adaptive at a new level, because they can be given a task without requiring a model. 
It opens the door to a new level of machine cognition and sheds light on the age-old question of machine consciousness, which is all about internal models."</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">The robot, which looks like a four-armed starfish, starts out knowing only what its parts are, not how they are arranged or how to use them to fulfill its prime directive to move forward. To find out, it applies what amounts to the scientific method: theory followed by experiment followed by refined theory.</div><table align="right" border="0" cellpadding="0" class="photoright" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: verdana, arial, helvetica, sans-serif; font-size: 10px; margin-left: 10px; margin-right: 10px; width: 324px;"><tbody>
<tr><td><img alt="Watching the robot move" border="0" height="232" src="http://www.news.cornell.edu/stories/Nov06/ResilientRobot.jpg" width="324" /><br />
<div class="credit" style="font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 11px; font-style: italic; font-weight: normal; text-align: right; vertical-align: text-top;">Lindsay France/University Photography</div></td></tr>
<tr><td class="caption" style="font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 11px; font-weight: bold; line-height: 12px; text-align: left; vertical-align: text-top;">Graduate student Viktor Zykov, former student Josh Bongard, now a professor at the University of Vermont, and Hod Lipson, Cornell assistant professor of mechanical and aerospace engineering, watch as a starfish-like robot pulls itself forward, using a gait it developed for itself. The robot's ability to figure out how it is put together, and from that to learn to walk, enables it to adapt and find a new gait when it is damaged. <a class="credit" href="http://www.news.cornell.edu/Utilities/Copyright.html" style="color: #b31b1b; font-family: Arial, verdana, Helvetica, sans-serif; font-size: 11px; font-style: italic; font-weight: normal; line-height: 16px; text-align: right; text-decoration: none; vertical-align: text-top;">Copyright © Cornell University</a></td></tr>
</tbody></table><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">It begins by building a series of computer models of how its parts might be arranged, at first just putting them together in random arrangements. Then it develops commands it might send to its motors to test the models. A key step, the researchers said, is that it selects the commands most likely to produce different results depending on which model is correct. It executes the commands and revises its models based on the results. It repeats this cycle 15 times, then attempts to move forward.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">"The machine does not have a single model of itself -- it has many, simultaneous, competing, different, candidate models. 
The models compete over which can best explain the past experiences of the robot," Lipson said.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">The result is usually an ungainly but functional gait; the most effective so far is a sort of inchworm motion in which the robot alternately moves its legs and body forward.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">Once the robot reaches that point, the experimenters remove part of one leg. When the robot can't move forward, it again builds and tests 16 simulations to develop a new gait.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">The researchers limited the robot to 16 test cycles with space exploration in mind. "You don't want a robot on Mars thrashing around in the sand too much and possibly causing more damage," Bongard explained.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">The underlying algorithm, the researchers said, could be applied to much more complex machines and also could allow robots to adapt to changes in environment and repair themselves by replacing parts. 
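The loop described above (generate competing candidate self-models, pick the command they disagree on most, execute it, and re-score the models against the observation) can be sketched as follows. This is a toy illustration of the estimation-exploration idea, not the Cornell team's actual algorithm: the "models" here are simple numeric gains, and the robot is a stand-in function with hidden dynamics.

```python
import random

# Toy sketch of the self-modeling loop described above: keep several
# competing candidate models, choose the action whose predicted outcomes
# differ most across models, observe the result, and re-score the models.
# Illustration only; not the published Cornell implementation.

random.seed(0)

def true_robot(action):
    """Stand-in for the physical robot: hidden dynamics plus sensor noise."""
    return 2.5 * action + random.gauss(0, 0.05)

# Candidate self-models: hypothesized gains relating command to sensed motion.
models = [1.0, 2.0, 2.5, 3.0, 4.0]
scores = {m: 0.0 for m in models}  # accumulated prediction error

actions = [0.2, 0.5, 1.0, 1.5]
for cycle in range(15):  # the article's robot runs 15 such cycles
    # Key step: pick the command most likely to discriminate between models.
    def disagreement(a):
        preds = [m * a for m in models]
        return max(preds) - min(preds)
    a = max(actions, key=disagreement)

    observed = true_robot(a)       # "execute the command"
    for m in models:               # revise: penalize bad predictors
        scores[m] += abs(m * a - observed)

best = min(models, key=lambda m: scores[m])
print(f"best self-model gain: {best}")
```

The surviving model is the one that best explains the accumulated observations, mirroring Lipson's description of many simultaneous candidate models competing to explain the robot's past experience.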
The work also could have other applications in computing and could lead to better understanding of animal cognition. In a way, Bongard said, the robot is "conscious" on a primitive level, because it thinks to itself, "What would happen if I do this?"</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #222222; font-family: Arial, Helvetica, Geneva, sans-serif; font-size: 12px; font-weight: normal; line-height: 16px; margin-left: 4px; margin-right: 10px; text-align: left;">"Whether humans or animals are conscious in a similar way -- do we also think in terms of a self-image, and rehearse actions in our head before trying them out -- is still an open question," he said.</div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-44655893572393892742010-12-21T06:26:00.001-08:002010-12-21T06:26:25.381-08:00robot designed at MIT<div id="Layer1" style="height: 451px; left: 1px; position: absolute; top: 0px; width: 200px; z-index: 6;"><img height="450" src="http://people.csail.mit.edu/edsinger/image/domo/domo_white_mid.jpg" width="338" /></div><div id="Layer2" style="height: 447px; left: 350px; position: absolute; top: 0px; width: 362px; z-index: 7;"><br />
<ul><li><span style="font-family: 'Times New Roman', Times, serif;">29 active degrees of freedom (DOF)</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Two 6 DOF force controlled arms using <a href="http://groups.csail.mit.edu/lbr/hrg/1995/mattw_ms_thesis.pdf">Series Elastic Actuators</a> (SEA)</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Two 6 DOF force controlled hands using SEAs</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">A 2 DOF force controlled neck using SEAs</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Stereo pair of <a href="http://www.ptgrey.com/">Point Grey</a> Firewire CCD cameras</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Stereo Videre STH-DCSG-VAR-C Firewire cameras</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Intersense 3 axis gyroscope</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Two 4 DOF hands using Force Sensing Compliant (FSC) actuators</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Embedded brushless and brushed DC motor drivers</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">5 embedded <a href="http://www.freescale.com/files/dsp/doc/prod_brief/DSP56F807PB.pdf">Motorola 56F807</a> DSPs running a 1 kHz control loop</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">4 CANBus channels providing 100 Hz communication to external computation</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">49 potentiometers, 7 encoders, 24 tactile sensors, 12 brushless amplifiers, 17 brushed amplifiers, 12 sensor conditioners embedded on-board</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">An estimated 500 fabricated mechanical components and 60 electronics PCBs</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">15 node [and growing] <a href="http://www.debian.org/">Debian Linux</a> cluster running a mixture of C/C++/<a href="http://www.python.org/">Python</a> and utilizing the <a href="http://yarp0.sourceforge.net/">Yarp</a> and pysense robot libraries.</span></li>
<li><span style="font-family: 'Times New Roman', Times, serif;">Weight: 42 lbs. Height: 34". Arm span: 5' 6"</span></li>
</ul></div><br />
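The Series Elastic Actuators listed above sense force by measuring the deflection of a spring placed between the motor and the joint. The sketch below shows the generic textbook SEA idea with made-up constants; it is not Domo's actual firmware, whose DSP control loops run at roughly the 1 kHz rate listed above.

```python
# Minimal sketch of Series Elastic Actuator (SEA) force sensing and control,
# the principle behind Domo's force-controlled arms, hands, and neck.
# Generic textbook idea with hypothetical constants, not Domo's real code.

SPRING_K = 300.0  # N·m/rad, hypothetical spring stiffness

def sensed_torque(motor_angle, joint_angle):
    """An SEA infers output torque from the deflection of its series spring."""
    return SPRING_K * (motor_angle - joint_angle)

def force_control_step(desired_torque, motor_angle, joint_angle, kp=0.003):
    """One iteration of a proportional force loop: command the motor to
    change spring deflection until the torque error goes to zero."""
    error = desired_torque - sensed_torque(motor_angle, joint_angle)
    return motor_angle + kp * error  # new motor angle command

# Example: servo toward 3 N·m of output torque with the joint held still.
motor, joint = 0.0, 0.0
for _ in range(1000):
    motor = force_control_step(3.0, motor, joint)
print(f"torque: {sensed_torque(motor, joint):.2f} N·m")
```

Because force is read from spring deflection rather than a rigid load cell, the joint stays compliant to unexpected contact, which is why SEAs suit manipulation around people.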
<div id="Layer3" style="height: 195px; left: 2px; position: absolute; top: 540px; width: 711px; z-index: 6;"><span style="font-family: 'Times New Roman', Times, serif;">NEW: Our work with Domo has led to the creation of the robotics companies <a href="http://hee3.com/">HeeHeeHee Labs</a> and <a href="http://mekabot.com/">Meka Robotics</a>.</span><br />
<span style="font-family: 'Times New Roman', Times, serif;">The Ph.D. dissertation, "Robot Manipulation in Human Environments," is now available as a <a href="http://people.csail.mit.edu/edsinger/doc/edsinger_phdthesis_final.pdf">PDF</a>, along with the slides from the dissertation talk in <a href="http://people.csail.mit.edu/edsinger/doc/edsinger_defense_html/edsinger_thesis_defense_2006.htm">HTML</a> and <a href="http://people.csail.mit.edu/edsinger/doc/edsinger_thesis_defense_2006.pdf">PDF</a>.</span><br />
<span style="font-family: 'Times New Roman', Times, serif;">Domo is a new upper-torso humanoid robot at the <a href="http://web.mit.edu/">MIT</a> <a href="http://csail.mit.edu/">CSAIL</a> <a href="http://www.ai.mit.edu/projects/humanoid-robotics-group/">Humanoid Robotics Lab</a>. It is the doctoral work of <a href="http://csail.mit.edu/~edsinger">Aaron Edsinger</a>. The goal of Domo is to contribute a novel approach to robot manipulation in unstructured environments. The approach is centered on integrating compliant and force sensitive manipulators into a behavior based architecture that accomplishes useful manipulation tasks in human environments.</span><br />
<span style="font-family: 'Times New Roman', Times, serif;">The mechanical design is the work of <a href="http://csail.mit.edu/~edsinger">Aaron Edsinger</a> and long-time collaborator <a href="http://csail.mit.edu/~jaweber">Jeff Weber</a>. Be sure to visit his site for technical details on the mechanics. The head is a copy of a design done by Weber for the robot <a href="http://csail.mit.edu/~lijin">Mertz</a>. The visual system builds on the work of <a href="http://people.csail.mit.edu/cckemp/">Charlie Kemp</a> with the wearable <a href="http://people.csail.mit.edu/cckemp/">Duo</a>. The design and construction took approximately one year and many sleepless nights.</span><br />
<span style="font-family: 'Times New Roman', Times, serif;">This research is advised by <a href="http://csail.mit.edu/~brooks">Professor Rodney Brooks</a>, Director of the MIT CSAIL, and is sponsored by <a href="http://www.toyota.co.jp/en/index.html">Toyota Motor Corporation</a> and <a href="http://www.jsc.nasa.gov/">NASA</a>.</span></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-42187301959418203792010-12-21T06:24:00.000-08:002010-12-21T06:24:18.085-08:00Story of robotics innovation<table align="center" border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-family: Arial, Helvetica, sans-serif; font-size: 1em; text-align: justify; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;"><img height="598" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="725" /></span></td></tr>
<tr><td><table align="center" border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 601px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;"><em>The first industrial robot was created in a small Connecticut machine shop and brought to life by a handful of ingenious, persistent young men led by patent-holder George Devol. Their efforts lead to the founding of Unimation, the first and, for years, the largest robotics company in the world—the pioneering company whose innovations were the basis for the growth of industrial robotics worldwide. But in less than 30 years it was all over, ironically brought down by its own effective promotion, Japan’s adaptability, and something as mundane as the hydraulic power system. This account was condensed and edited by Leslie Ballard from George Munson’s book, “Pity the Pioneer: The Rise and Fall of Unimation, Inc.,” now being readied for publication.</em></span></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;">In the spring of 1951 the Korean War was in full swing, and I was sure I would be drafted. I saw no point in interviewing for employment, despite my newly awarded degree in physics from the University of Connecticut.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">When I heard about a starting position for a physicist at Manning, Maxwell and Moore (MM&M) in Bridgeport, Conn., I figured I had nothing to lose and made an appointment. A young engineer, sporting a bowtie, by the name of Joseph Engelberger, interviewed me—he hired me on the spot. Little did I suspect that this decision sealed my fate, as our association would change manufacturing the world over. Nor could I know that with his combination of entrepreneurship, marketing, and natural affinity for promotion he would become the “Father of Robotics.”</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">MM&M specialized in a variety of electronic devices. Joe had established their Aircraft Products Division. Military spending was up and we landed some lucrative subcontracts for jet engine controls. With brisk business that included the U.S. Air Force, we outgrew the company headquarters in Stratford, Conn. and in 1954 moved into our own plant in Danbury, Conn.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">The war wound down and so did our business. MM&M ordered Joe to liquidate the division, but instead, he began looking for a stable line of work to keep his workforce together. He bought five books on finance, sat down and read them all. In his words, “I got my MBA in one weekend.”</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;"><br />
</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>GEORGE DEVOL, ROBOTICS GENIUS</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In 1957, Joe met the creative genius George Devol. Born in Louisville, Kentucky in 1912, Devol circumvented formal education and, at age 20, formed United Cinephone, producing recording equipment using photocells. He developed the revolutionary barcode and patented hundreds of inventions, including digital magnetic recording.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Devol had observed mountains of scrap tooling, created by product design changes. It inspired his revolutionary idea—universal automation; automation that would not become obsolete, but would adapt to product changes. He also patented a device that could perform repeated tasks with greater precision and endurance than the human worker, at less cost, and be retrainable for new tasks. He called it Programmed Article Transfer, and later “Robot.”</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Devol facilitated the sale of the Aircraft Products Division to Consolidated Electric Corp. (later Condec) and I stayed on with their new company in Bethel, Conn.—Consolidated Control Corp.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Engelberger the entrepreneur and Devol the inventor now began collaborating. Soon, Joe convinced Condec’s CEO, Norman Schafler to finance Devol’s brainchild, the industrial robot. If Joseph Engelberger was the Father of Robotics, George Devol was the grandfather. Interestingly, the word robotics was coined by one of Joe’s fellow Columbia University alumni, Isaac Asimov. Joe received his master’s in physics in 1949 and Asimov received his Ph.D. 
in chemistry in 1948. Joe found Asimov’s books about robots inspiring. In his foreword to Engelberger’s book, “Robots In Practice,” Asimov includes his Three Laws of Robotics:</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.<br />
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.<br />
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.</span></div>
</tbody></table></td></tr>
<tr><td><div style="text-align: center;"><span style="font-size: x-small;"><img height="329" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--2.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="479" /></span></div></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>2,700 POUND BEHEMOTH</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Our beginnings were modest, only six of us were assigned to the project. We knew the robot had to be anthropomorphic, but which configuration would provide the greatest flexibility for the applications we foresaw?</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">First, we conducted market surveys to determine the parameters, considering four basic configurations: polar, cylindrical, Cartesian, and revolute coordinates. Our decision was to proceed with a polar coordinate design, and while a 6-axis machine would have best emulated the flexibility of the human arm and wrist, the expense and complication forced us to build to a 5-degree of freedom machine, having just two rather than three wrist axes.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">With the robot’s configuration determined we began to develop the prototype. A self-contained hydraulic supply operating at about 1,000 psi would provide sufficient power and require fewer gears, thus, less backlash, than an electric motor. However, hydraulic power technology was not advanced enough, and the demands for speed, stability and accuracy challenged every design aspect of the 2,700-pound behemoth. Our engineering design tasks included:</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">1. A digitally controlled system based on the binary system. (Remember, this was in 1956!)<br />
2. A nonvolatile solid-state memory system, which didn’t yet exist.<br />
3. Shaft position optical digital encoders for high-speed performance, which also didn’t exist.<br />
4. A high-performance digital servo controller capable of dynamic control with a wide range of payloads.<br />
5. High-performance hydraulic servo valves.<br />
6. Self-contained electrical and hydraulic power supplies.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>DYNASTAT MEMORY</strong></span> <span style="font-size: x-small;"> </span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Under Devol, we developed a ferroresonant sensor, the basis for a self-styled memory system, patented as “Dynastat.” We also needed an optical shaft position encoder to provide the necessary position feedback to close the loop between the robot arm’s actual position and its command positions. By 1965 we had perfected an optical Gray code encoder we called “Spirodisk.”</span></div></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;"><img height="171" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--4_1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="284" /><br />
<span style="font-size: xx-small;"><strong>The 1900 was the first Unimate series. Photo circa 1961.</strong></span></span></td><td valign="top"><span style="font-size: x-small;">We put together a hydraulically driven programmable arm that could pick up </span><span style="font-size: x-small;">metal letters and spell out short phrases, and in 1961 we introduced our robot at a trade show at Chicago’s Cow Palace. Nobody knew what we were displaying or why, and the hydraulic system leaked like a sieve, but we were on our way! We needed a company and product name. “Universal automation” contracted to Unimation and the industrial robot was born. We called it the Unimate®. </span><br />
<div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>THE JAPANESE SEE THE WAY</strong></span> <span style="font-size: x-small;"> </span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Engelberger and Devol now approached the Ford Motor Co. The Unimate got the attention of the VP of Production, who proclaimed that he could use “thousands of them.” A manufacturing engineer was assigned to “do something” with the specification, but he passed it on to suppliers who might be interested—FoMoCo did not have the vision to recognize the robot’s potential in assembling car bodies until much later.</span></div></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td valign="top"><span style="font-size: x-small;">WWII and the Korean War stimulated many new products and manufacturing technologies in the U.S., which led to a large dose of complacency. Thus, America’s successes gave way to international competition – notably from Japan – that was unforeseen and, eventually, unstoppable. Contrary to popular belief even now, robotics is not a Japanese-founded technology. It was exported from Versatran and Unimation in the U.S.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In the early 1960s, when we began our mission to revolutionize American manufacturing, labor was abundant and competition from abroad was not yet threatening. American manufacturing knew no real competition. It wasn’t until the 1970s that the rivalry of the Japanese awakened American industry to its vulnerability. As the highest paid workers in the world, Americans were competing with workers in every other nation, who were paid far less. This was not a favorable atmosphere for our product, which many viewed as frivolous.</span></div></td><td><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;"><img height="219" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--3_1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="310" /></span><br />
<strong><span style="font-size: xx-small;">In the mid-60s, the 2000 series Unimate was designed and built, initially produced in groups of three. This was replaced by a vastly improved machine.</span></strong></div></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;">Japan was fighting the perception that “Made in Japan” meant shoddy goods. In response, they set out to produce quality goods, heeding W. Edwards Deming’s Total Quality Management philosophy: Quality carries with it reduced costs and improved competitiveness. Deming’s fellow Americans did not grasp his philosophy until much later, to their detriment. With the pressure on, American industrial leaders had to rethink their position.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">While justification for the robot, both economically and socially, seemed obvious to us, we were not surprised to find resistance in the workforce, particularly in mass production industries—the social threat was seen as devastating. But economic justification was far from obvious for even the most forward thinking accountants. We had our work cut out for us.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>HUMAN RESOURCE IMPLICATIONS OF THE ROBOTICS REVOLUTION</strong></span> <span style="font-size: x-small;"> </span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Interestingly, a 1983 study of human resource implications of robotics concluded, “The most remarkable thing about job displacement and job creation impacts of robots is the skill-twist [sic] that emerges so clearly when the jobs eliminated are compared to the jobs created. The jobs eliminated are semiskilled or unskilled, while the jobs created require significant technical background. 
We submit this is the true meaning of the robotics revolution.” It appears that we were prophetic, as this study revealed what we had earnestly stated since the mid-60s.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>DIECASTING – THE 1ST KILLER APP</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In 1961 we got our opportunity to put our innovation to the test at G.M.’s diecasting plant in Trenton, NJ. In wild anticipation, we readied Serial Number 001 for shipment. Naturally, we were concerned about how the diecast machine operators would react to this man replacement. In fact, their consensus was that our machine was a curiosity destined to fail. However, until the application of the robot to spot welding automobile bodies came along in the late 1960s, no other industry encouraged the proliferation of the industrial robot like diecasting. It inherently required all of the attributes the robot had to offer. Eventually some 450 Unimate robots were employed in diecasting.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In 1962, Pullman Inc., of railroad car fame, became a silent partner in Unimation, investing $3M to buy the high-tech element they desired for their corporate structure. The Pullman people, from the top down, were good people and it was a pleasure to be associated with them, but in the mid-70s new management lost interest. 
Condec bought Pullman’s 51% interest in the company and remained sole owner until Unimation went public in 1981.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>INTERNATIONAL PARTNERS</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In our efforts to broaden our customer base, it was natural that we should strike out internationally. Since our cash flow was entirely negative, we needed a partner. We chose the largest manufacturing business in the world—Guest, Keen & Nettlefolds in Wales. GKN was enthusiastic and committed. We offered strong support, educating their engineers in robot “lore” and guiding the sales force. But during my 1966 visit I observed that every plant had old machines and old methods. Good applications were few and far between, so the relationship was dissolved. Joe then set up Unimation Ltd. in Telford, England, which led to considerable business throughout Europe and Scandinavia. Then, in 1966 he licensed Finland’s Nokia Ab to market robots in Scandinavia and Eastern Europe.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"></div></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td valign="top"><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>KAWASAKI LICENSES THE UNIMATE</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">While Unimation was establishing the robot’s credentials, Japan was enjoying economic prosperity but anticipating a labor shortage. Thus, in 1967 Joe was invited to give a lecture in Tokyo to a large group of engineers. This interaction culminated in a 1969 licensing agreement with Kawasaki Heavy Industries to manufacture and market Unimate robots for the Asian market—a good marriage that endured and prospered for 15 years. By 1983 they had shipped over 2,400 Japan-made Kawasaki Unimate robots.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Joe wrote to several manufacturers suggesting that they might want to look closely at their technology and our patents. As a result, ASEA, Cincinnati Milacron, and IBM became licensees, from which we derived royalties on their robot sales. As George Devol had learned long ago, ownership of patents is a valuable asset, from which we benefited handsomely. They protected our intellectual properties and helped us develop a strong licensing position that lasted years. It was a gratifying position, but the licensees were nonetheless eroding our market share and making our technology obsolete.</span></div></td><td><span style="font-size: x-small;"><img height="365" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--5_1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="452" /><br />
<strong><span style="font-size: xx-small;">Unimation president Joe Engelberger in his trademark bowtie, development engineer George Munson, and Unimation chief engineer Maurice Dunne prep Unimate serial #001 for shipment to the first installation: GM’s diecasting plant in Trenton, NJ. Unimate #001 is now on display at the Ford Museum at Greenfield Village in Michigan. Photo taken in 1961.</span></strong></span></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>GM REVOLUTIONIZES AUTO MAKING</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">The automotive industry was the engine that drove the American economy, so we concentrated our energies there. Until the end of the 1960s, the auto body assembly line was a moving conveyor on which major body subassemblies were hung. Our nemesis was that it used a level of skill and intelligence the robot didn’t have. Yet we knew that the big payoff was in applying the robot to the body assembly lines. This required a product designed for both automation and indexed conveyors. These measures would open up many other applications, yielding a superior product while reducing cost. We now needed a champion at a high corporate level with insight, foresight, and guts. GM’s plant manager Les Richards had all three.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">GM had rebuilt its plant in Lordstown, Ohio in 1969, making it the most automated automotive plant in the world, building 110 cars per hour, twice the rate of any plant then in existence. Lordstown was to be the answer to Japan’s onslaught. It was to produce a high-quality small car that would satisfy the American public at a competitive cost, putting GM back on top.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">The technological impact of the Lordstown experiment revolutionized automobile making and secured the robot’s place. It wasn’t long before other companies turned to robotics and indexing systems for a more disciplined approach to manufacturing. At the same time, the European market came alive with Unimates at Fiat, Volvo, Mercedes Benz, British Leyland, BMW. 
Their unions welcomed robots performing all of the dangerous jobs.</span></div></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;"><img height="232" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--6_1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="325" /><br />
<strong><span style="font-size: xx-small;">The three models of the Programmable Universal Machine for Assembly, or “PUMA,” family are shown here: the 260, 600 and 750. The small 6-axis 260 (similar to the Stanford Arm); the human-sized 5- or 6-axis 550; and the larger 6-axis machine.</span></strong></span></td><td valign="top"><span style="font-size: x-small;">The activity in the auto industry created many nonautomotive employment opportunities, as diverse industries sought to improve their position through the application of technology: Bendix, Pratt & Whitney, Dupont, Whirlpool, GE, and many others. For some time our only competitor was Cincinnati Milacron of Ohio. This changed radically in the late 1970s when Japanese conglomerates began producing industrial robots.</span><br />
<div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>UNIMATE FAMILY EVOLVED</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Our robot “family” grew with each new application’s demands. The original 1900 Series developed into other series with extended reach; increased repeatability and lift capacity; 6 degrees of freedom; stronger wrists; Univision. New technical innovations were incorporated: solid-state memory; microprocessors; transistor controls; new programming languages; and high-performance electric motors that replaced hydraulic and pneumatic ones.</span></div></td></tr>
</tbody></table></td></tr>
<tr><td><div style="text-align: center;"><span style="font-size: x-small;"><img height="377" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--7_2.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="560" /><br />
<span style="font-size: xx-small;"><strong>“Lordstown spot welding”: The technological impact of the Lordstown experiment<br />
revolutionized automobile making and secured the robot’s place in manufacturing.<br />
The plant built 110 cars per hour, twice the rate of any plant then in existence. 1969.</strong></span></span></div></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td valign="top"><span style="font-size: x-small;">In 1977, Joe shrewdly bought Victor Scheinman’s company, Vicarm, and renamed it Unimation West. We further developed Scheinman’s robot into the Programmable Universal Machine for Assembly (PUMA). We also acquired his VAL language, which was cutting edge.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Unimation also attempted to augment its line of machines by entering into marketing license agreements with other manufacturers, such as Trallfa of Norway and Electrolux of Sweden, but all were only mildly successful. The bottom line was that we were not very good at marketing someone else’s product.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><strong>ELECTRIC DRIVE ROBOTS</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">What happened next marked the beginning of the end. Joe’s innate business sense failed him, and his rigidity skewed his judgment in a critical decision. In 1981, the 9000 with VAL was Unimation’s answer to all the competitors who were seriously eroding its customer base. But it could not overcome one great</span></div></td><td><span style="font-size: x-small;"><img height="201" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--8_1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="335" /><br />
<strong><span style="font-size: xx-small;">December 1982. Isaac Asimov; Bernard Sallott, Director of Society of Manufacturing Engineers; and George Munson, Unimation V.P. of Systems at the SME annual awards.</span></strong></span></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td><span style="font-size: x-small;"><img height="258" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--9.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="259" /></span></td><td valign="top"><span style="font-size: x-small;">deficiency—hydraulics. The auto industry wanted electric-driven robots. Joe balked at this, convinced that the muscular robots the industry required had to be hydraulically driven. He met with GM’s CEO to argue his position, but lost. The partnership that was struck in 1961 virtually ended at that meeting. To rub salt in the wound, GM then announced its partnership with Fujitsu/Fanuc to market Fanuc’s line of electric robots.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">By 1981 Unimation was carrying long-term debt of $19M, owed to Condec for development of the robot. Condec’s Schafler, pressured to generate more cash, restructured the company. Paul Allegretto, a Condec executive, was made Executive VP and I was named VP and General Manager of the newly created Systems Division.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Allegretto took the company public to pay off Condec, and Joe had no choice but to go along. The proceeds paid all indebtedness and the remaining $6M provided working capital for Unimation—not much for a pioneer whose technology was becoming obsolete, and with significant competition.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">With the success of taking the company public, Allegretto told Joe that if he weren’t made CEO he would quit. Joe accepted Allegretto’s offer to quit and then reassumed the reins.</span></div></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td valign="top"><span style="font-size: x-small;">But things did not go well for the company. Sales dropped as the competition gained momentum. GM was leading the charge for electric-driven robots, which Unimation still did not have. Joe’s previous confrontation over hydraulics with GM’s CEO didn’t help, and between Cincinnati Milacron, Asea, and GMF Robotics, Unimation’s position was seriously undermined.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Yet Unimation had a number of suitors, all with a desire to gain a foothold in robotics. In December 1982 Westinghouse paid $107M, buying its way to the top of the domestic robot industry. The merger of the two companies moved rapidly. Almost immediately, Unimation’s fortunes plummeted and its market share eroded. The general economic recession caused a drop in sales, but industry competition contributed greatly to Unimation’s financial downward trend. The largest segment of our business, auto manufacturing, was shifting to electric-driven robots.</span></div></td><td><span style="font-size: x-small;"><img height="150" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--11.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="205" /></span></td></tr>
</tbody></table></td></tr>
<tr><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 725px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td valign="top"><span style="font-size: x-small;">Those of us who grew up with the entrepreneurial spirit of Unimation could not readily adjust to the ponderous ways of giant Westinghouse. I was the first to go among the executives. In July 1983 I became SVP of Robot Systems Inc. in Georgia. Joe continued as president of Unimation, though his was an association of oil and water. Unimation West eventually broke away; Kawasaki and Unimation Europe ground to a halt under the new management.</span><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Finally, in 1984 Joe threw in the towel, saying, “I resigned in dismay. I was heartbroken because this was my baby, and it was crumbling before my eyes.” In 1985 he founded Transition Research Corp., which later became HelpMate Robotics.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In March 1985 I returned to Unimation as Manager of Distributor Sales. As I became familiar with the “new” Danbury operation, I felt that it was only a matter of time before Unimation would be absorbed into Westinghouse’s Pittsburgh operation. Sure enough, my duties were suspended as operations in Danbury wound down. Westinghouse’s disclosure that it was setting up a joint venture with Matsushita Electric seemed the final move in a series of strategic shifts since the merger that were “essential to improving competitiveness in the factory automation market.” Time would tell otherwise.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Before shutting the doors at Shelter Rock Lane—where it all started back in 1954—and moving what little remained of Unimation to Pittsburgh, those eligible for early retirement were so advised by Westinghouse. 
I was one of them.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">— In 1988 Westinghouse sold Unimation to Stäubli of France.<br />
— In 2003 the Unimate was inducted into the Robot Hall of Fame.<br />
— In 2003 I was honored with the Robotics Industries Association’s Joseph F. Engelberger Award.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;"><br />
Please <a href="http://find.botmag.com/121192" style="color: #0a10fd; font-weight: bold; text-decoration: none;">CLICK HERE</a> for a gallery of photos on the history of Unimation’s robots and a link to the Johnny Carson show video. Photos and video courtesy of George Munson. —<em>the editors</em></span></div></td><td><table border="0" cellpadding="5" cellspacing="0" style="border-collapse: collapse; font-size: 1em; width: 402px;"><tbody style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-color: rgb(204, 204, 204); border-top-style: none; border-top-width: 1px; border-width: initial;">
<tr><td bgcolor="#cccc66" width="392"><div style="margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="font-size: x-small;"><img height="219" src="http://find.botmag.com/sites/upload_files/botmag/files/unimation--10.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="392" /><br />
<strong>UNIMATE STARS ON THE TONIGHT SHOW</strong></span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">In those early years we took on any challenge that would earn us some attention, including trade shows and TV appearances. The Unimate was even invited to appear live on The Tonight Show Starring Johnny Carson—that’s how much of a novelty we were. Joe and I took our 2,700-pound baby to the NBC studios in Burbank for two weeks of rehearsal. We decided to perform three acts: putt a golf ball, take Ed McMahon’s place in a beer commercial, and conduct the band and play Milton DeLugg’s accordion. Simple enough, eh?</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">For the putting act, all we had to do was hit the ball consistently from a known distance on a relatively flat surface. But there was one variable: the hydraulic power supply operated on the principle of charging an accumulator from a positive-displacement pump, controlled by an “unloading” valve that cycled between 950 and 1,000 psi, so the speed of the robot’s motions varied depending upon the pressure level. So, with a fine job of programming and a prayer that the cycling of the unloading valve would be consistent, we couldn’t miss.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">We were the first act that night. But not 30 seconds before going live on-air, a stagehand inadvertently kicked the ball off the mark. I spotted it and was about to replace it, but realizing the opportunity for a “save,” I handed it to Joe on stage, who repositioned it with dramatic flair. To Johnny’s relief, and the delight of the audience, the Unimate took its putter in “hand,” approached the ball and tapped it into the cup.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Next, the Unimate was to open and pour a beer. 
The pneumatically operated “fingers” of the robot’s hand were either open or closed without any tactile sensing or cushioning—not only was this a very good way to crush a can but also to burst it in a spray of beer. To control this, we adjusted the air pressure and stroke of the fingers to treat the can more kindly. But even with reduced pressure, the violence with which it grabbed was still enough to rile up the effervescent contents and send them spewing into the air. We finally discovered that if we nearly froze the beer we could pull the stunt off without soaking everyone. Of course, not wishing to be wasteful, the process of discovery required us to consume the opened beer during our investigations. Well, that’s show biz.</span></div><div style="margin-bottom: 0.9em; margin-top: 0.5em;"><span style="font-size: x-small;">Finally, Joe was to program the machine to lead the band, after which it was to drop the baton, reach for Milton’s accordion and pretend to play it. All went well with the band picking up the beat in time. Suddenly, the Unimate dropped the baton, lunged for the accordion and, rather than playing it, proceeded to thrash it about mercilessly! The audience loved it. Milton was astounded. Johnny, of course, thought it was great fun.</span></div></td></tr>
</tbody></table></td></tr>
</tbody></table></td></tr>
<tr><td><span style="font-size: x-small;"> </span></td></tr>
</tbody></table>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-49096217978253694312010-12-21T06:22:00.001-08:002010-12-21T06:22:53.293-08:00DSP ROBOTICS Flowstone program<div style="border-collapse: collapse; font-family: Arial, Helvetica, sans-serif; font-size: 10px; margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><span style="color: maroon;"><strong><span style="font-size: large;">Flowstone program</span></strong></span></div><div style="border-collapse: collapse; font-family: Arial, Helvetica, sans-serif; font-size: 10px; margin-bottom: 0.9em; margin-top: 0.5em; text-align: center;"><img height="314" src="http://find.botmag.com/sites/upload_files/botmag/files/bot3.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px;" width="350" /></div><div style="border-collapse: collapse; font-family: Arial, Helvetica, sans-serif; font-size: 10px; margin-bottom: 0.9em; margin-top: 0.5em;">Flowstone is a graphical programming environment that can control many kinds of robotic components, such as the Lynxmotion SSC-32 servo controller. A great example of what a Flowstone program can do is an application called Simple Lynxmotion Arm Control. By manipulating vertical sliders, users can control any Lynxmotion SSC-32 arm, select the COM port from a drop-down box, change the servo each slider controls, and even set the minimum and maximum range of each slider independently. All this functionality was created without typing a single line of code. Lynxmotion is the official U.S. distributor of Flowstone and offers a huge selection of robots; visit <a href="http://www.lynxmotion.com/" style="color: #0a10fd; font-weight: bold; text-decoration: none;" title="www.lynxmotion.com">www.lynxmotion.com</a>.</div><div style="border-collapse: collapse; font-family: Arial, Helvetica, sans-serif; font-size: 10px; margin-bottom: 0.9em; margin-top: 0.5em;"><span style="color: maroon;"><strong>DSP ROBOTICS</strong></span><br />
<a href="http://www.dsprobotics.com/" style="color: #0a10fd; font-weight: bold; text-decoration: none;" target="_blank">www.dsprobotics.com</a></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-50875382220492627422010-12-21T06:21:00.001-08:002010-12-21T06:21:58.519-08:00On Jan. 1, 2010, the first advanced humanoid robot, the ATOM-7xp<span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #000fff;"><b></b></span><br />
<center><center><b><span style="font-family: arial; font-size: x-small;"><center><h2>On Jan. 1, 2010, the first advanced humanoid robot, the ATOM-7xp <img src="http://www.futurebots.com/made.gif" /></h2></center><hr /><center><img alt=" FUTURE-BOT COMPONENTS " src="http://www.futurebots.com/atom1.jpg" /><h3><img src="http://www.futurebots.com/ball.gif" /> On January 1, 2010, the FutureBots labs unveiled the ATOM-7xp, an eight-year secret project</h3></center></span></b></center></center>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-12450846666438447122010-12-21T06:20:00.000-08:002010-12-21T06:20:06.125-08:00MicroRaptor Vision Robot<a href="http://www.cns.atr.jp/hrcn/DB/home.html" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: justify; text-decoration: underline;" target="_blank"><span class="Apple-style-span" style="font-size: x-small;"><br />
</span></a><a href="http://www.cns.atr.jp/hrcn/DB/home.html" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: justify; text-decoration: underline;" target="_blank"><span class="Apple-style-span" style="font-size: x-small;"><br />
</span></a><a href="http://www.cns.atr.jp/hrcn/DB/home.html" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: justify;" target="_blank"><span class="Apple-style-span" style="font-size: x-small;"><img alt="Figure: Head of a robot endowed with vision." height="320" hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/robot_head.jpg" vspace="2" width="288" /></span></a><a href="http://www.cns.atr.jp/hrcn/DB/home.html" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;" target="_blank"></a><span class="Apple-style-span" style="font-size: x-small;"></span><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">Figure: Head of a robot endowed with vision.<br /><span class="Apple-style-span" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;">I'm a lay artificial intelligence researcher. I occasionally get interested in machine vision. At various times, I've searched online for information about the subject, but good information can be difficult to find. There are quite a few sites that have links to other sites. And there are conferences, commercial products, and other such things that can be found. But it's hard to find much to bring it all together.</span></span></div><span class="Apple-style-span" style="font-size: x-small;"><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">This page sets out to bring some of these resources together. I don't have the time to create a truly exhaustive resource. My hope, then, is to make this a decent starting point for others looking for background information. To that end, I'm organizing information by category and trying to provide summaries, speculations, and other opinions.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One other goal I have for this page is to demystify machine vision. The popular press has a habit of making the products of machine vision research look far more impressive than they often actually are. Even someone like me with a little knowledge of how a lot of the techniques work can easily be fooled by a new trick into thinking the field is much farther along than it really is. And for various reasons, the web sites of research projects or commercial products don't often reveal much about the techniques used. 
Admittedly, some of my explanations are speculations based on what I find online and sometimes just reckoned by asking myself, "how would I do that?"</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">I invite you to let me know of your own work. I especially welcome information about current and historically significant research projects, but I also welcome information from the private sector about new or significant products under development or in use today. And feel free to let me know if you find any of my explanations are inaccurate or incomplete. <a href="mailto:jvc_ai@carnell.org?subject=Machine%20Vision%20Introduction" style="text-decoration: underline;">Send me email</a> about your projects, products, and thoughts.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">What is Machine Vision?</h2><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.securityatwork.org.uk/Main/patternCS.htm" style="text-decoration: underline;" target="_blank"><img alt="Figure: Facial measures used
in a biometrics vision system." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/facial_metrics.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: Facial measures used<br />
in a biometrics vision system.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">"Machine vision" is a field of study and technology whose goal is to endow machines with the ability to perceive selective aspects of the world using visual means.</div></span></span><div><span class="Apple-style-span" style="font-size: x-small;"><span class="Apple-style-span" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><div style="text-align: justify;">I apologize if this sounds like a circular definition. One can easily get lost in a particular concept or technology when trying to define machine vision. Perhaps it's best to start with a more ostensive definition, then. Those of us fortunate enough to have functional eyes have an incredible ability to perceive and understand the world through them. Engineers have long sought to endow machines with this same capability. It's easy to assume that this just means duplicating in machines the mechanisms people use, but that's not all there is to it. Some techniques involve projecting and reflecting laser beams off distant targets, for example, which is very different from how you and I work. Some systems can read and understand information in bar codes or other special constructs that are difficult for humans to deal with.</div><div style="text-align: justify;"><br />
</div><div style="text-align: justify;">Most importantly, few techniques being researched or in use today really resemble the awesome complexity and flexibility available to humans. We MV researchers have our own bag of tricks. It may be that someday we bring all those tricks together and find we can make machines "see" as well as or even better than humans do, but we're nowhere near there yet.</div><div style="text-align: justify;"><br />
</div><div style="text-align: justify;">All practical machine vision systems in use today exist for their own specific purposes. Some are used to ensure that parts coming off assembly lines are manufactured correctly. Some are used to detect the lines in a road for the benefit of cars that drive themselves. Though some interested parties claim otherwise, there are no general purpose vision systems, either in laboratories or on the market.</div><div style="text-align: justify;"><br />
</div><div style="text-align: justify;">If it sounds like it's difficult to define machine vision, don't fret. The point is that the field of machine vision is not simply interested in duplicating human vision. What is essential is the basic goal of visual perception: the ability to "understand" the world visually well enough to move about in and interact with a complex, ever-changing environment, and to discern the information in it that matters to whatever system consumes these faculties.</div><div style="text-align: justify;"><br />
</div><br />
<h2 style="color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">General-Purpose Vision</h2><div style="text-align: justify;"><br />
</div><div style="text-align: justify;">As mentioned above, all practical machine vision end products available now are for specific purposes. I contrasted that with general purpose machine vision. Let me define what that means, then.</div></span></span><div><span class="Apple-style-span" style="font-size: x-small;"><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">I'll start, again, with an ostensive model. Human vision is general purpose. In our everyday experiences, we see a rich panoply of things in all sorts of lighting conditions. We are able to operate well in almost any circumstance in which even a modest amount of light enters our eyes without damaging them.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Merely being able to see light is nearly useless, though. The best video cameras today are still just recording or transmission devices; they don't do anything else practical with it. By contrast, we are poor recording and transmission devices. It's our faculties for visual perception that distinguish us. So let's talk about what we do with the visual information we can see.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">We can recognize the boundaries between objects. We can recognize objects. We can recognize the repetitions that compose both simple and rich textures. We can intuit the nature and location of light sources without seeing them directly. We can recognize the three dimensional nature of the things we see. We can see how things are connected together and how larger objects are subdivided into smaller ones. We can recognize that the two halves of a car on either side of a telephone pole are actually parts of a single car that is behind the pole. We can tell how far away things are. We can detect the motion of objects we see. We can recognize complex mechanisms with lots of moving parts as components of single larger objects and distinguish them from the backdrop of the rest of the world. We can even recognize a silver pitcher amidst a noisy background as a thing unto itself, even though we only see the reflections of that background.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Perhaps the most interesting feature of human vision that distinguishes it from most machine vision techniques crafted to date is that we can deal very well with novel situations. A new car you've never seen before is still obviously a car because it looks like a car. You instantly catalog novel objects and register essential differences.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">How does one distill all this down into a clear definition, then? What is general purpose machine vision? I think it's best to define it in terms of a set of core goals. A machine can be said to have general purpose machine vision if it can:</div><ol style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><li style="text-align: justify;">Construct a 3D model of the open space within its visual field sufficient for movement within that space and interaction with the objects within it</li>
<li style="text-align: justify;">Distinguish most any whole object, especially a complex moving one, from the rest of a visual field</li>
<li style="text-align: justify;">Recognize arbitrarily complex textures as continuous surfaces and objects</li>
<li style="text-align: justify;">Have a hierarchical way of characterizing all the objects within a scene and their relative positional and connectivity relationships to one another</li>
<li style="text-align: justify;">Characterize a novel object using a three-dimensional animated model composed of simpler primitives and be able to recognize that object in most any orientation</li>
<li style="text-align: justify;">Recognize and separate objects in a wide variety of lighting conditions, including complex arrangements of shadows</li>
<li style="text-align: justify;">Separate and recognize objects that are transparent, translucent, or reflective, given sufficient visual cues</li>
</ol><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">There are probably other milestones one could add to this, but it seems a pretty lofty set for now.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h1 style="-webkit-text-decorations-in-effect: none; color: #6e3133; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Sensors Used</h1><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Spot Sources</h2><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: A photoresistor
for detecting light." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/light_sensor.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: A photoresistor<br />
for detecting light.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">The most basic kind of sensor that can be used for vision is one that sees only a single "pixel". A photoelectric cell -- in this case, a photoresistor -- like the one in the figure at right is an example. Note that while you probably think of a pixel as a very small part of a larger picture, I don't necessarily mean it in that sense. An entire picture can be composed of a single pixel. What matters here is the field of view of the imaging sensor. In the case of many photoelectric cells, for instance, the field of view might include up to half the full sphere of the view around it. Narrowing the field of view of a photocell like this is a simple matter. One could, for example, put a small box over the cell and drill a small hole in it so only light coming from a source in the direction of that hole can get to the sensor.</div></span></div><div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">In keeping with the idea that an electronic eye need not be limited to working like our eyes, let's consider some other kinds of spot-source sensors. One technique involves a speaker outputting a very high frequency tone and using a microphone to pick it up. The closer or larger a nearby object is, the more it will reflect that sound and hence the stronger will be the signal to the microphone. A laser beam and a photoelectric cell can serve a similar purpose. In addition to sensing differences in intensity, they can also be used to determine how long it takes for the signal to get from emitter to detector and thus determine the distance to one or more objects.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
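Whether the pulse is sound or laser light, the timing idea above reduces to the same arithmetic: the pulse travels out and back, so the one-way distance is the wave's speed times half the round-trip time. A minimal sketch, assuming an ultrasonic sensor in air (the function name and the sample timing are illustrative, not from any particular device):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def echo_distance(round_trip_s, wave_speed=SPEED_OF_SOUND):
    """One-way distance to a reflector, given the round-trip echo time.

    The pulse covers the distance twice (out and back), hence the
    division by two.
    """
    return wave_speed * round_trip_s / 2.0

# A 10 ms round trip at the speed of sound is 1.715 m one way
print(echo_distance(0.010))
```

The same function works for a laser rangefinder by passing the speed of light as `wave_speed`; only the clock resolution needed changes.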
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Cameras</h2><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.medicaldesign.com/articles/ID/11358" style="text-decoration: underline;" target="_blank"><img alt="Figure: A charge-coupled device
(CCD) used in digital cameras." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/ccd.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: A charge-coupled device<br />
(CCD) used in digital cameras.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Most digital cameras use the same basic approach to imaging. At their heart is a device called a charge-coupled device, or "CCD", which serves the same purpose as a piece of film. A set of one or more lenses focuses light onto the CCD, which is made up of a rectangular grid of individual light sensors similar to the photoelectric cell pictured in the previous section. The figure at right shows an example of a CCD that has a grid of 1,024 sensors across by 1,280 sensors down and is used in medical X-ray imaging devices. A digital camera outputs information in a form that can easily be interpreted by a computer as a grid of levels of light in one or more discrete electromagnetic wave bands (e.g., red, green, and blue, or X-ray and infrared frequencies).</div></div><div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Technically, a computer can use an analog camera as its input, but in any case, a digital computer ultimately must use digital information. The continuous stream of signals from an analog camera, then, must be converted into a stream of digital information that can be interpreted in the same way as a digital camera's output.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
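The digitizing step can be pictured as sampling the analog signal and quantizing each sample into one of a fixed number of discrete levels, just as each pixel in a digital camera's output is a discrete light level. A rough sketch of that idea, with an arbitrary 8-bit range and made-up sample values:

```python
def quantize(sample, levels=256, v_min=0.0, v_max=1.0):
    """Map a continuous reading into one of `levels` discrete values,
    as an analog-to-digital converter does for each sample."""
    clamped = min(max(sample, v_min), v_max)  # guard against out-of-range input
    return round((clamped - v_min) / (v_max - v_min) * (levels - 1))

# One scan line of analog brightness readings becomes 8-bit pixel values
analog_scanline = [0.0, 0.2, 0.6, 1.0]
print([quantize(v) for v in analog_scanline])  # [0, 51, 153, 255]
```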
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Laser Scanners</h2><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://graphics.stanford.edu/projects/mich/more-david/more-david.html" style="text-decoration: underline;" target="_blank"><img alt="Figure: A laser imager scanning a statue." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/laser_scanner.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: A laser imager scanning a statue.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">Some machine vision systems use lasers to directly sense the three dimensional shapes of their immediate surroundings.</div></span></div><div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The basic idea behind this is to exploit the fact that light travels at a known and thus predictable velocity. A laser pulse is sent out in some direction and may be detected by a sensor like the photoelectric cell described earlier if it hits an object that is relatively nearby or highly prone to reflect light back in the direction it came from. Using a very fast clock, the electronics that coordinate the laser and light sensor measure how long it took for the light pulse to be detected and hence calculate how far away the reflective surface is. Because a laser beam can be made very fine-pointed, it is generally reasonable to assume that it will only hit a single surface and so only a single response will come back to the sensor.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">By gradually pointing the laser at different places -- usually within a rectangular grid pattern -- in the system's field of view, sending pulses of light at each, and taking measurements of the time each pulse takes to be reflected, one can gradually build an image. The image formed is not like the one you are used to. Instead of representing levels of light, each pixel in such an image represents a distance to the surface that the laser hit when it was aimed in that direction.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
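Put another way, the "range image" the scanner builds is just the grid of round-trip times scaled by half the speed of light. A sketch of that conversion, using hypothetical timings (no particular scanner's data format is assumed):

```python
C = 299_792_458.0  # speed of light in m/s

def depth_image(round_trip_times):
    """Turn a grid of round-trip pulse times (seconds) into a grid of
    one-way distances (meters): each 'pixel' is how far away the surface
    was in the direction the laser pointed."""
    return [[C * t / 2.0 for t in row] for row in round_trip_times]

# Two scan lines of made-up timings: a 20 ns round trip is roughly 3 m
times = [[20e-9, 20e-9],
         [40e-9, 20e-9]]
print(depth_image(times))
```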
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Echoic Triangulation</h2><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.ims.forth.gr/rg_geophysics.html" style="text-decoration: underline;" target="_blank"><img alt="Figure: Output from a ground-penetrating radar." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/ground_penetrating.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: Output from a ground-penetrating radar.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">One particularly interesting idea that has found many expressions is the idea of using echoes to detect objects that are not otherwise visible.</div></span></div><div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The laser scanners described above use echoes, but rely on the object being detected to be fairly solid and the space between the emitter and the subject being imaged to be fairly empty, relative to the much higher density of the subject.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Imaging objects underground is a great example: the goal is to "see" objects amid surroundings whose densities vary far less than, say, air versus rock. One key technique is to project some wave of energy -- perhaps sound waves or microwave energy -- down into the ground and detect the energy reflected off of layers and objects there. Because those layers and objects do differ in density or in other properties that affect the projected energy, they reflect it to varying degrees. As each pulse of energy is sent out, the detector continually measures the energy coming back over time. The intensity of the returning signal and the time elapsed since the original pulse are typically used to create a linear gradient. By moving the device along a linear path on the ground and sending pulses at each point, a two-dimensional image can be composed by taking each linear gradient as a vertical column of pixels and each position along the ground as that column's horizontal position.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
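The column-stacking step described above is purely mechanical. Assuming each ground position's echo trace is a list of amplitudes sampled over time (a simplification of real radar data), the 2D image is just those traces turned sideways into columns:

```python
def radargram(traces):
    """Assemble a 2D image from per-position echo traces.

    traces[i] holds the echo amplitudes sampled over time at ground
    position i; it becomes column i of the image, so image[row][col]
    is the amplitude at time-sample `row`, position `col`.  Later time
    samples correspond to deeper reflectors."""
    depth_samples = len(traces[0])
    return [[trace[row] for trace in traces] for row in range(depth_samples)]

traces = [
    [0.0, 0.9, 0.1],  # strong echo early: a shallow reflector here
    [0.0, 0.1, 0.8],  # strong echo later: a deeper reflector here
]
print(radargram(traces))  # [[0.0, 0.0], [0.9, 0.1], [0.1, 0.8]]
```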
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The result of most ground-penetrating radar systems like the one in the figure at right, while often appearing in 2D images like the figure, could not look more alien to our own sense of how vision works. But it's important to recognize that there is information in a visual system like this. With training, anyone can learn to recognize the significance of what's in such images. And so can a machine. And while we don't have the capacity to deal easily with it, one could also take many slices in a grid drawn on the ground and put them together like pages in a book to form a three dimensional picture. To be useful to a human, it would probably be necessary to delete some of the resulting three-dimensional pixels -- also known as "voxels" -- so one can see the other parts as "solid" objects.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One fascinating extension of this same concept is to generate an image of what's below the ground using sound. In one arrangement, two or more microphones are placed on the ground around an area to be imaged. A person with a sledgehammer moves from point to point on a grid and strikes the ground. A computer records the echoes and times when sounds arrive. Again, it may be that more than one pulse is heard by a given microphone, because different objects underground may reflect sound in different ways. Sound waves may even separate and take different pathways to a given microphone. The end result, again, is either an image representing a sort of 2D slice through the ground or a 3D image representing a volume of ground.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">This same sort of concept is also used in familiar medical imaging technologies like MRI and PET scanners, not to mention the ubiquitous ultrasound equipment.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h1 style="-webkit-text-decorations-in-effect: none; color: #6e3133; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Primitive Visual Features</h1><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">It's natural to want to dive right into the high level techniques and goals of machine vision, but it's important to understand some of the lower level features that we use to characterize images. Most higher level vision approaches involve particular solutions to the problems of how to recognize them or build larger structures based on them.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="text-align: justify;"><nobr> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#Edges" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Edge Detection</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#Regions" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Regions and Flood-Fill</a></nobr> </div><nobr><div style="text-align: justify;"> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#Textures" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Texture Analysis</a></div></nobr><a href="" name="Edges"></a><br />
<h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;"><a href="" name="Edges"></a><a href="http://www.alexandria.nu/ai/machine_vision/introduction/#PrimitiveFeatures" style="text-decoration: underline;"><img alt="Back up to Primitive Visual Features" border="0" src="http://www.alexandria.nu/images/to_top_h2.gif" /></a> Edge Detection</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">One of the oldest concepts in machine vision, edge detection is also one of the most enduring. The essence of the technique is to scan an image, pixel by pixel, in search of strong contrasts. For each pixel, the pixels around it are examined as well; the more variation there is among them, the more strongly that pixel is judged to be part of an "edge", presumably of some surface or object. Typically, the contrast sought is of brightness. This is a function of what can be thought of as the black and white representation of an image, but sometimes hue, multiple color channels, or other pixel-level features are considered.</div></span></div><div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">This pixel-level edge detection operation is so simple and common that it can be found in many ordinary paint programs. The figure below illustrates one use of edge enhancement in the popular Photoshop program:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Using Photoshop to &quot;detect&quot; and enhance edges." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/edge_detection.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Using Photoshop to "detect" and enhance edges.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The idea of doing contrast-based edge detection had a lot of momentum in the early days when scientists studying human vision determined that our own visual systems use this technique. Once replicated in machines, it seemed like we were just a short way off from having general purpose vision. But early successes in the ability to find edges at the pixel level did not quickly translate into successes in higher level vision goals. We'll explore this more in coming sections.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One of the challenges in translating edges based on contrast into edges of objects is that contrasts can be caused by factors other than the obvious. For instance, a "specular" reflection of light as off a shiny surface can cause the appearance of a sharp edge around the reflection. Similarly, a shadow cast upon a surface can create a strong contrast at the boundary between the shadowed and lighted portions of that surface. These artifacts tend to lead edge detection algorithms to get "false positive" results. Following is an illustration of how shadows can create false positives:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: The effects of partial shadows on edge detection with an image of leaves." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/shadow_edges.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: The effects of partial shadows on edge detection with an image of leaves.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">On the other hand, "false negative" results can be caused by something as simple as a blurry edge. Consider the figure below:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Unexpected results can come from edge detection with blurry edges." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/bad_edges.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Unexpected results can come from edge detection with blurry edges.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Note how the woman's nose is completely invisible to this edge detection approach because the edges we perceive are actually very soft and subtle in terms of contrasts. Other edges we infer, like the one at the top of her hair or on her left shoulder, are also missing because of weak contrasts. The shine off her forehead and chin also clearly creates strong enough contrasts to result in false positive edge matches.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The above figures also illustrate how easily edges we perceive as continuous get broken up in pixel-level edge detection algorithms. The messiness of having lots of neighboring and intersecting edges packed into small spaces also really complicates things.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Consider one of the central issues with simple edge-finding algorithms. We'll call it the "threshold problem". It can be expressed simply as "how strong a contrast is strong enough for a place in an image to count as an edge?" If one chooses a threshold that's too low, there will be too many edges to be useful. If the threshold is too high, too few edges will be found. The following illustrates the problem:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Edge detection using different thresholds: a.) source image; b.) high threshold; c.) medium threshold; d.) low threshold." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/edge_thresholds.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Edge detection using different thresholds: a.) source image; b.) high threshold; c.) medium threshold; d.) low threshold.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The sad truth is that there is no "right" answer when it comes to choosing a threshold value. What most researchers don't want to admit is that they do not rely on automation to decide what threshold value to use. They choose a value based on the particular application, lighting conditions, and other finer details. This raises the classic AI problem of the "brain inside the brain". That is, it frequently takes an intelligent agent -- the researcher -- to determine a key factor in proper edge detection before it can be "automated". In some situations, like in a factory, the conditions can be controlled. General-purpose vision cannot assume such controlled conditions, though. Your eyes certainly don't.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
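The threshold problem can be seen in miniature with a few lines of Python. The contrast values below are made up for illustration; the point is only that the same contrast map yields wildly different edge maps depending on the cutoff chosen.

```python
# Apply a cutoff to a map of contrast strengths: a pixel counts as an
# edge only if its contrast meets or exceeds the threshold.

def threshold_edges(strengths, cutoff):
    return [[1 if v >= cutoff else 0 for v in row] for row in strengths]

def count_edges(mask):
    return sum(sum(row) for row in mask)

# Hypothetical contrast strengths from some edge-strength scan.
strengths = [
    [5, 12, 180, 11],
    [8, 90, 200, 15],
    [3, 85, 175, 9],
]

low = count_edges(threshold_edges(strengths, 10))    # nearly everything
mid = count_edges(threshold_edges(strengths, 80))    # the plausible edges
high = count_edges(threshold_edges(strengths, 190))  # almost nothing
```

There is no automated rule here for picking 10, 80, or 190; that choice is exactly the judgment call the text attributes to the researcher.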
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Despite the shortcomings, edge detection has found much expression in very practical industrial and research systems. The figure below illustrates a sample use of a simple sort of edge detection algorithm in inspection of a manufactured part:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: An inspection system detects that one of four expected cables is missing." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/edge_assembly_inspection.gif" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: An inspection system detects that one of four expected cables is missing.</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">In this case, a linear slice one pixel wide is taken where cables are expected to lie in the image. The number of sharp edges (six here) is counted and divided by two edges per cable, revealing that there are only three of four expected cables. Linear slices like this can be used to spot-check object widths, rotational alignments, and other useful metrics that are helpful in inspection systems. Full-image scans are likewise used to detect the edges of roads and other features in more sophisticated applications.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
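The linear-slice idea is simple enough to sketch directly. The intensity values and the contrast cutoff below are invented for illustration; a real inspection system would calibrate both against its camera and lighting.

```python
# Cable inspection sketch: take a one-pixel slice across where the cables
# should lie, count the sharp transitions, and divide by two edges per
# cable to get a cable count.

def count_transitions(scanline, contrast=100):
    return sum(1 for a, b in zip(scanline, scanline[1:])
               if abs(a - b) >= contrast)

# Background ~230, cables ~40; only three of the four expected cables
# are present in this hypothetical slice.
scanline = [230, 230, 40, 40, 230, 230, 40, 40,
            230, 230, 40, 40, 230, 230, 230, 230]

edges = count_transitions(scanline)   # six sharp edges
cables_found = edges // 2             # three cables
```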
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Regions and Flood-Fill</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">Finding the edges of objects may seem like the basis of finding objects in an image, but it's only the beginning. The edges found on a picture of a human ear, for example, will be far more complicated than the overall shape of the ear. Edges provide one means of finding objects, but finding regions within an image can be thought of as one step higher in abstraction.</div></span></div><div><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One of the most basic means of finding regions in an image is to use a "flood-fill" algorithm. This term comes from the similarity of the algorithm to the basic flood-fill operation most paint programs have. To the program, it's as though a region is a flat plain into which color can be poured, but which has "sharp" edges beyond which the color won't spread. Those edges are usually defined in exactly the same way as we considered above with regards to edge detection.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">It's helpful to use the paint program's flood-fill analogy because of its intuitive nature. The following figure shows a picture with some areas sectioned off using a flood-fill algorithm. Each distinct region found gets its own unique color.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Using flood-fill to isolate major regions of an image." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/flood_fill_beach.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Using flood-fill to isolate major regions of an image.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The limits of flood fill start becoming pretty obvious with the above image. For example, notice how the ocean is divided into two parts by the large rock? The left side of the ocean (blue) and the right (green) are obviously part of the same object, to you and me, but not to an algorithm simply seeking out unique regions using a basic flood-fill algorithm.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Another issue is that a flood-fill operation can "spill out" of one region to another. See how the white region includes the nearby rock, part of the cliff on the left side, most of the farther-off rock, the white foam where the ocean meets the beach, and so on? Few of us would assume that all of these separate objects are really part of the same object, yet the flood-fill algorithm doesn't see these distinctions.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One way in which a typical paint program's flood-fill algorithm differs from one used for machine vision is in how they deal with gradients. The following figure illustrates the distinction.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Two ways to interpret a smooth gradient using a flood-fill algorithm." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/flood_fill_gradient.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Two ways to interpret a smooth gradient using a flood-fill algorithm.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">A typical paint program will take note of the color of the pixel you first clicked on and seek all contiguous pixels that are similar to that one. Hence the separate bands above in the middle image. A typical edge detection algorithm as described above would not find any edges within the gradient; only around the circle and square. A more appropriate flood-fill algorithm for machine vision, then, will fill smooth regions, even where the color subtly changes from pixel to pixel. The filling stops wherever there are harder edges.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
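The vision-friendly variant of flood fill described above can be sketched as follows. Rather than comparing every pixel to the seed color (the paint-program behavior), each pixel is compared to the neighbor it was reached from, so smooth gradients stay one region and only hard edges stop the spread. The grayscale toy image and the contrast cutoff `step` are assumptions for illustration.

```python
# Gradient-tolerant flood fill: grow a region from a seed pixel,
# crossing any neighbor-to-neighbor difference smaller than `step`.
# Hard edges (big jumps) stop the fill; smooth gradients do not.

def flood_fill(img, seed, step=20):
    h, w = len(img), len(img[0])
    region = {seed}
    stack = [seed]
    while stack:
        y, x = stack.pop()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in region:
                if abs(img[ny][nx] - img[y][x]) < step:
                    region.add((ny, nx))
                    stack.append((ny, nx))
    return region

# A smooth horizontal gradient (steps of 10) with one hard edge before
# the last two columns (a jump from 40 to 200).
img = [[10, 20, 30, 40, 200, 210] for _ in range(3)]

region = flood_fill(img, (0, 0))
```

A paint-program fill anchored to the seed value 10 would have stopped partway through the gradient; this version fills the whole smooth ramp and stops only at the hard edge.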
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Texture Analysis</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">One of the more interesting primitive features that can be dealt with in machine vision is repeating and quasi-repeating patterns. Strong textures can confound simple object detection because they can trigger spurious edge detections and halt flood-fill operations. Following are some images that include strong textures that can easily foil such simple operations.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Samples of textures, such as grass, bricks, leopard spots, marble, and water." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/texture_samples.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Samples of textures, such as grass, bricks, leopard spots, marble, and water.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Texture recognition is such a challenge to deal with in large part because it's difficult even to define the concept of textures formally. Even dictionaries don't seem to do it much justice. Here are some examples:</div><ul style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><li style="text-align: justify;"><a href="http://www.cogsci.princeton.edu/cgi-bin/webwn2.1?s=texture" style="text-decoration: underline;" target="_blank">The characteristic appearance of a surface having a tactile quality</a></li>
<li style="text-align: justify;"><a href="http://www.ackland.org/tours/classes/glossary.html" style="text-decoration: underline;" target="_blank">The tactile quality of a surface or the representation or invention of the appearance of such a surface</a></li>
<li style="text-align: justify;"><a href="http://www.google.com/url?sa=X&start=12&oi=define&q=http://www.gaf.de/presshelp/glossary/p81.htm" style="text-decoration: underline;" target="_blank">In a photographic image the frequency of change and arrangement of tones</a></li>
</ul><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">What the above sample images illustrate, though, is how obvious the notion of texture seems to our visual systems, even if it's difficult to formally define.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One characteristic that seems somewhat consistent about textures is what can be called a "color scheme". In the first image, the grass is heavy in the greens and blacks. The bricks are heavy in reds and blacks. The water is heavy in blues.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">How can we use this in automation? Here's a simple illustration. Imagine taking a sampling of many or all of the colors in a patch of some texture of interest. We'll call that collection of colors the color scheme. Now for each color, we find all pixels in the source image that have that same color and add them to a total selection. Following is an illustration of this using the above images as sources:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: The same images above with some textures selected based purely on color schemes." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/texture_color_schemes.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: The same images above with some textures selected based purely on color schemes.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">It should be fairly apparent that in most of the cases above, the color scheme-based selections seem very strongly biased towards highlighting just the textures of interest. The leopard one seems a poor example, to be sure. That seems to be because the black spots themselves are very similar in color to the black in the tree branches, leaves, and so forth. Whatever its power, this neat trick is surely not sufficient for recognizing textures.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
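The color-scheme selection just discussed can be sketched in a few lines. The grayscale values standing in for "colors" and the patch coordinates are hypothetical; a real implementation would sample quantized RGB triples from an actual image patch.

```python
# Color-scheme texture selection: sample the colors in a small patch of
# the texture of interest, then select every pixel in the image whose
# color appears in that sampled set.

def color_scheme(img, patch):
    (y0, x0), (y1, x1) = patch
    return {img[y][x] for y in range(y0, y1) for x in range(x0, x1)}

def select_by_scheme(img, scheme):
    return [[1 if v in scheme else 0 for v in row] for row in img]

# Top half "grass" (values 50/60), bottom half "sky" (200/210).
img = [
    [50, 60, 50, 60],
    [60, 50, 60, 50],
    [200, 210, 200, 210],
    [210, 200, 210, 200],
]

scheme = color_scheme(img, ((0, 0), (2, 2)))   # sample a 2x2 grass patch
mask = select_by_scheme(img, scheme)           # 1 = matches the scheme
```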
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">What we could do with the selections made, then, is to start by removing the "noise" pixels. That is, we can find places in an image -- like with the grass -- where there are small, stray islands of pixels not in the selection and add them to it. Likewise, we can find stray islands of selected pixels among non-selected ones and remove them from the selection. Next, we could segment an entire image into large blocks -- perhaps squares -- and, for each, see if a large percentage of the pixels in it are among the selection. The resulting "block map" can be used to pick out the rough shape or shapes of items with the given texture. And so on.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
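The "block map" step can be sketched like this. The block size and the 50% cutoff are assumed tuning values, and the tiny mask is invented; the point is just that blockwise voting smooths away stray pixels that per-pixel selection leaves behind.

```python
# Block map: divide a selection mask into square blocks and keep a block
# only if at least `frac` of its pixels are selected. Stray noise pixels
# get outvoted by their block.

def block_map(mask, block=2, frac=0.5):
    h, w = len(mask), len(mask[0])
    out = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            cells = [mask[y][x]
                     for y in range(by, min(by + block, h))
                     for x in range(bx, min(bx + block, w))]
            row.append(1 if sum(cells) / len(cells) >= frac else 0)
        out.append(row)
    return out

# A 4x4 selection mask: texture fills the left half, plus one stray
# "noise" pixel on the right that the block map should discard.
mask = [
    [1, 1, 0, 0],
    [1, 1, 0, 1],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
]

blocks = block_map(mask)
```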
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The above thought experiment assumes that we have "intelligently" picked out some patch of an image as a candidate for a texture. What, alternatively, would stop us from picking a patch that contains both some of the water and some of the hills on the shore in the right-hand image, for example?</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One other issue this dodges is changes in illumination, such as from shadows.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Besides the notion of color schemes applying to textures, there does tend to be genuine structure. The grass texture, for example, has edges that favor up and down orientations. The bricks are definitively ordered from top to bottom in a zig-zag pattern. The leopard's spots are definitively spots with semi-regular spacing, if no obvious ordering. This facet seems to require some more sophisticated processing to deal with.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">One interesting approach to texture analysis involves taking a large number of samples of pairs of nearby pixels. For each pixel in the source image, we look around at those pixels within a fixed radius of it. For each pair of pixels, we note the brightness of each pixel. Let's say instead of recognizing 256 shades of gray (brightness), we recognize only 8. We then create a matrix (grid) that's 8 columns wide and 8 rows tall, where columns represent the first pixel's brightness and the rows represent the second's. For each pair we find, we look in the matrix for the place that represents that pair's combination of brightness levels. Each place in the matrix starts out as zero, so each time we find a match for a combination, we add one to it.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">With a little extra math, we can boil the resulting matrix down to a set of simplified characteristics called "energy", "inertia", "correlation", and "entropy". These can further simplify the task of recognizing a texture using a neural network or classifier system.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
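The pair-sampling idea above amounts to building a co-occurrence matrix, and two of the four named characteristics are easy to show. As simplifying assumptions, this sketch samples only each pixel's immediate right-hand neighbor rather than all pairs within a radius, and computes just "energy" and "entropy"; the inertia and correlation measures are omitted.

```python
import math

LEVELS = 8  # quantize 256 shades of gray down to 8, as in the text

def cooccurrence(img):
    """Tally brightness pairs of horizontally adjacent pixels into 8x8."""
    m = [[0] * LEVELS for _ in range(LEVELS)]
    for row in img:
        q = [v * LEVELS // 256 for v in row]   # 0-255 -> 0-7
        for a, b in zip(q, q[1:]):
            m[a][b] += 1
    return m

def normalize(m):
    total = sum(sum(r) for r in m)
    return [[v / total for v in r] for r in m]

def energy(p):
    """High when a few pair-combinations dominate (regular texture)."""
    return sum(v * v for r in p for v in r)

def entropy(p):
    """High when pair-combinations are spread out (varied texture)."""
    return -sum(v * math.log(v) for r in p for v in r if v > 0)

# A perfectly regular stripe texture versus a more varied one.
stripes = [[0, 255] * 4 for _ in range(4)]
varied  = [[0, 96, 255, 160, 32, 224, 64, 192] for _ in range(4)]

p_stripes = normalize(cooccurrence(stripes))
p_varied  = normalize(cooccurrence(varied))
```

The regular stripes concentrate their counts into just two cells of the matrix, giving high energy and low entropy; the varied row spreads its counts out, reversing both measures.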
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">When we're done, we have a matrix that could be used by, say, a neural network to recognize textures. With a little more math, we can improve the ability to deal with some different orthogonal (90°) rotations of a given texture. One downside to this concept, however, is that it doesn't directly address finding edges of textures. Much of the literature seems to focus on cases where the entire image is of a homogeneous texture and nothing else. Another limitation is that if one zooms in or out of a given texture, the resulting matrices will probably differ, even though the texture itself is the same.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">There are other variants of this sort of concept that involve different mathematical complexities. They generally seem to suffer some of the same limitations, though. If anything, they seem more like exercises in fascinating mathematics than in practical vision systems. It seems so much easier to pick out textures in color images than in black and white, yet these techniques focus naively on black and white for mathematical elegance. Despite these sorts of shortcomings, their conceptual basis seems to have merit.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">As a side note, there is a related but separate field of study into what is called "texture synthesis", which is about using a sample image texture to generate extensions of that texture or new textures altogether based on multiple source images. Following is an illustration of some examples of this concept. Each real image is paired with a new texture programmatically generated based on it.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Above are source images and below are new textures synthesized based on them." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/texture_synthesis.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Above are source images and below are new textures synthesized based on them.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Although synthesis is not the same thing as analysis, there does seem to be a useful symmetry here. The ability to recall a texture from memory is essentially an ability to synthesize it using some set of rules. These rules should be simpler than the original image, in a sense, and be more generic than what one would expect from just tiling an image to create a repeating texture.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h1 style="-webkit-text-decorations-in-effect: none; color: #6e3133; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Two Dimensional Perception</h1><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">Taking a step above the primitive features of images discussed above, we can start to talk more about the substantive content in images. We'll focus in this section on two-dimensional features. That is, we'll limit ourselves to images that don't have intrinsic depth, as though we were considering a bulletin board with flat things pinned to it.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Following are some examples of images that we can process in a two-dimensional context.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Some images that are good candidates for 2D perceptual processing." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/2d_examples.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Some images that are good candidates for 2D perceptual processing.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Not surprisingly, there are many ways to approach analyzing such images. Since there's still no such thing as a general-purpose machine vision system, deciding which approach to use is often a matter of what one is trying to accomplish.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Pixel Pattern Matching</h2><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Images of bathroom tiles in our illustration." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/bathroom_tiles.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Images of bathroom tiles in our illustration.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">As stated above, the goal of a machine vision system often determines the method chosen. Let's say our goal was to identify the contents of whole images against known images.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Let's say we have a set of images of bathroom tiles that we manufacture. In our application, we will be fed images of whole, single tiles. The images are always of the same width and height. We also have a finite set of images of the tiles we manufacture. As we're fed new images to identify, then, we want to identify which known tile the new image is most like.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Since we have whole images, we decide to do best-fit matching of the whole images. Looking at the figure at right, it seems the main distinguishing feature among the four sample tiles is their overall color. That suggests one simple approach might be to find the average color of each tile. Our database of known tile models would simply have the same average color calculated on one or perhaps several samples of each tile model. So as a new tile image comes past our analyzer, it takes the average color and finds the one in the database that has the shortest "distance" from the sampled color to each archetype's average color. To avoid problems that might arise from the white-space surrounding each tile, we might ignore the outer 10% margins of each image when finding the average color.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
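The average-color approach above fits in a few lines of code. Here is a minimal sketch, assuming an image is simply a list of rows of (r, g, b) tuples; the function names and the database layout are illustrative, not from any real library:

```python
# Sketch of the average-color matcher described above. The outer 10%
# margin of the image is ignored when averaging, as suggested in the text.

def average_color(image, margin=0.1):
    h, w = len(image), len(image[0])
    top, left = int(h * margin), int(w * margin)
    total = [0, 0, 0]
    count = 0
    for row in image[top:h - top]:
        for (r, g, b) in row[left:w - left]:
            total[0] += r; total[1] += g; total[2] += b
            count += 1
    return tuple(t / count for t in total)

def closest_tile(sample_color, database):
    """database maps tile-model name -> archetype average color."""
    def dist2(c1, c2):
        return sum((a - b) ** 2 for a, b in zip(c1, c2))
    return min(database, key=lambda name: dist2(sample_color, database[name]))
```

Squared Euclidean distance in RGB is the simplest possible color "distance"; a perceptually uniform color space would match human judgment better, but for distinguishing a handful of tile models this is usually enough.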
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Let's say that we found that there are tiles that had the same average color but which are different in shape. Some are square, some hexagonal, and some triangular. The above color-based algorithm would probably not be sufficient. We might modify our algorithm to include shapes. To deal with shape, we'll opt to create simple masks for each known shape. Each mask is just a two-color image, as illustrated by the following figure.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center><table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Sample masks for recognizing squares, triangles, and hexagons." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/pixel_shape_masks.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Sample masks for recognizing squares, triangles, and hexagons.</td><td></td></tr>
</tbody></table><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The shapes don't have to be perfectly clean or straight. Each tile model, then, is associated with one of the known shapes. So when we see a new image, we compare the shape of the tile within it against the known shapes. To do this, we might first use a flood-fill starting from one corner of the image to select the white margin around the tile. From this selection we create a new image that has the same two colors as our shape masks. Next, we compare the two images, pixel by pixel. For each pixel that doesn't match, we add one to a count of mismatched pixels. In the end, whichever mask has the lowest mismatch count is the one we choose as best representing the shape of the tile in our test image.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
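The pixel-by-pixel mask comparison can be sketched just as simply. In this illustration, both the shape masks and the flood-filled test image are boolean grids of the same size (True for tile, False for margin); all names are hypothetical:

```python
# Sketch of the mask-comparison step described above: count mismatched
# pixels against each known shape mask and pick the mask with the fewest.

def mismatch_count(mask, test):
    return sum(1
               for mrow, trow in zip(mask, test)
               for m, t in zip(mrow, trow)
               if m != t)

def best_shape(test, masks):
    """masks maps shape name -> boolean grid of the same size as test."""
    return min(masks, key=lambda name: mismatch_count(masks[name], test))
```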
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">To make our algorithm a little better, we also reuse the flood-fill selection when finding the average color, looking only at pixels that fall outside the outer-margin selection. Armed with the known shape and average color of the tile, we again find the best match for these two properties in our database and thus identify our image.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">This thought experiment illustrates how straightforward some vision applications can be when they are defined carefully to reduce their potential complexity. What if we increased the complexity of our present problem? Let's say one series of tiles is white and square, but each has a different large letter (e.g., "A", "B", "C") on it. The above algorithm is no longer sufficient.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">To solve this problem, we decide to first identify the model or model series of tile and, if a tile is identified to be in the "letter" series, we'll use a new algorithm to identify which letter it is. We could use the masking approach described above, but let's be creative and say we want to use a neural network. We buy an off-the-shelf neural network software package and train it to recognize each of the letters that we might find on the tiles in the letter series. Training done, we switch the neural net into its regular behavior mode and go from there. With each tile put before it, the neural net will output which model (letter) it thinks the tile represents.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">In each case in the above example, we've considered what might loosely be called pixel patterns. We considered the average color of a textured object, the overall shape in terms of a mask, and the shape of some bitmapped feature (letters) within such a shape. We never resorted to trying to find lines or corners or other more abstract features. We didn't even need to deal with images being at different scales or rotations, let alone in varying lighting conditions.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Big Blobs</h2><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">One practical technique available for use in 2D perception applications is the isolation of objects of interest into "blobs" that can be counted, characterized, or have their positional relationships considered.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Insects "thresholded" to isolate
them from a fairly plain background." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/blobs_insects.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Insects "thresholded" to isolate<br />
them from a fairly plain background.<br />
</td><td></td></tr>
</tbody></table><br />
<div style="text-align: justify;">The figure at right illustrates a typical example. The technique used to isolate the insects in the image from the background is trivial. The brightness of each pixel is measured and, if it is above a certain threshold value, it is painted white and otherwise black. Each insect, in the thresholded image, can be thought of as a "blob" in the image. We'll call them "blob objects", or "blobjects".</div><br />
<div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">It's easy for us to perceive the individual blobjects, but it can be quite a challenge to get a piece of software to do as well. The simplest approach would be to consider every black pixel in the image and, for each, perform a flood-fill. The flood-filling would continue until one of the following conditions is met:</div><ol style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><li style="text-align: justify;">The width of a bounding box gets larger than some constant W.</li>
<li style="text-align: justify;">The height of a bounding box gets larger than some constant H.</li>
<li style="text-align: justify;">The area (number of pixels) of the region gets larger than some constant A<sub>max</sub>.</li>
<li style="text-align: justify;">The region gets fully filled and the area of the region is larger than some constant A<sub>min</sub>.</li>
</ol><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Only in this last condition would we conclude that we've found an insect blob. To help speed up execution a bit, we would keep track of all the pixels we've already tested so that, as the scan continues, we don't consider the same blobject twice.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
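The thresholding step and the bounded flood-fill with the four stopping conditions might look like the following sketch. An image here is a grid of brightness values, blobs are the dark pixels, and the limits (W, H, A<sub>max</sub>, A<sub>min</sub>) are arbitrary illustrative constants:

```python
# Sketch of thresholding plus the bounded flood-fill described above.

def threshold(image, level):
    """True marks a 'blob' pixel (darker than the threshold)."""
    return [[pix < level for pix in row] for row in image]

def find_blobs(binary, W=50, H=50, a_max=2000, a_min=4):
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y0 in range(h):
        for x0 in range(w):
            if not binary[y0][x0] or seen[y0][x0]:
                continue
            # Flood-fill from (x0, y0), tracking area and bounding box.
            stack, region = [(x0, y0)], []
            seen[y0][x0] = True
            xmin = xmax = x0
            ymin = ymax = y0
            ok = True
            while stack:
                x, y = stack.pop()
                region.append((x, y))
                xmin, xmax = min(xmin, x), max(xmax, x)
                ymin, ymax = min(ymin, y), max(ymax, y)
                # Failure conditions 1-3: box too wide/tall, or area too big.
                if xmax - xmin + 1 > W or ymax - ymin + 1 > H or len(region) > a_max:
                    ok = False
                    break
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if 0 <= nx < w and 0 <= ny < h and binary[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        stack.append((nx, ny))
            # Condition 4: region fully filled and big enough to count.
            if ok and len(region) >= a_min:
                blobs.append(region)
    return blobs
```

The `seen` grid is the speed-up mentioned above: once a pixel has been visited, it never starts or joins a second fill.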
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">It should be apparent from this example, however, that blob detection is not going to be a clean process using the above algorithm. Wherever insects touch or overlap one another, it's likely we will meet one of the above failure conditions. The bounding box might get too big or the total area filled by a region might be exceeded. It becomes necessary to introduce more sophisticated techniques to get more accurate information.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Admittedly, this technique, while practical in controlled circumstances and quite useful for certain classes of tasks, is actually very limited. The need to manually set a usable threshold value for separating blob from background means it's usually necessary to ensure that the background against which blobjects are placed is in high contrast to the blobjects. And what makes this fundamentally a two dimensional perception problem is that the blobjects need to be guaranteed to be generally non-touching and non-overlapping, a guarantee that is much harder to come by in a three dimensional environment.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Point Orientation</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: 'Times New Roman';"><span class="Apple-style-span" style="font-family: Tahoma, Arial;">One somewhat simplified version of blob detection that has found practical application is navigation based on known, fixed points that can be perceived. "Astral navigation" is common fare in popular science fiction, describing how a space vessel gets its bearings by observing the positions of the stars around it. We now actually have vessels able to do this, including NASA's </span><a href="http://nmp.jpl.nasa.gov/ds1/" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-decoration: underline;" target="_blank">Deep Space One</a><span class="Apple-style-span" style="font-family: Tahoma, Arial;">.</span></span></div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The concept of orienting based on the positions of points is fairly straightforward. First, one takes an image of the stars in the current field of view. In deep space, most of the possible field of view is black or very nearly so. Most visible objects appear as small dots perhaps one or a few pixels in size. It's easy to isolate these blobs from the black of space. Their positions in the image are recorded as a list of points. The goal is to be able to identify which known star each of the given points represents. Once one knows with certainty which stars any two of the points in the image are, it's then easy to figure out which way the spacecraft is facing.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Using the relative distances between stars as a
way of identifying stars for use in self orientation." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/astral_navigation.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Using the relative distances between stars as a<br />
way of identifying stars for use in self orientation.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Although there are plenty of ways to use point position information in the source image to figure out which stars one is seeing, let me describe one very simplistic way to illustrate how easy it can be. First, assume that our camera cannot change its zoom level. We know that our spacecraft will stay within our own solar system, which means that no matter where we are within the solar system, the positions of luminous objects (stars, galaxies, etc.) far outside our solar system will not appear significantly different from how they would appear anywhere else in the solar system. So in any picture our spacecraft takes of any two known luminous bodies outside this system, the distance measured between them will be the same. Our solution, then, begins with a database containing two basic kinds of information. The first is a list of known luminous bodies. The second is a list of distances between any two known luminous bodies. So we take a picture of the sky and separate all the bright blobs into their own point positions. For each pair of points, we measure the distance and go to our database of body-to-body distances to find candidates. As we do this, we'll have multiple possible alternatives, but the more distances we measure and correlate, the further we can narrow the interpretations down toward exactly one, and the greater our certainty in the interpretation.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
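The distance-lookup step above can be sketched as follows. All the data here is made up for illustration, and the function names are hypothetical; a real system would match measured angular separations against a star catalog:

```python
import math

# Toy sketch of the pairwise-distance lookup: measure the distance between
# every pair of imaged points and collect the known star pairs in the
# database that are consistent with it, within a tolerance.

def candidate_pairs(points, star_pairs, tolerance=0.5):
    """star_pairs maps (star_a, star_b) -> known inter-star distance.
    Returns, for each pair of imaged points, the matching star pairs."""
    matches = {}
    pts = list(points)
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = math.dist(pts[i], pts[j])
            matches[(i, j)] = [names for names, known in star_pairs.items()
                               if abs(known - d) <= tolerance]
    return matches
```

With more points, the candidate lists for overlapping pairs can be cross-checked against each other, which is what narrows the interpretations down to one.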
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">It's important to note that this technique works so well in the context of astral navigation because we can count on the apparent star field to vary minimally within the distances that we care to work with. This is what makes it a two dimensional perceptual problem. If we were talking about a spacecraft that traversed many light years, the apparent positions of the stars would change enough that we would have to change our approach, because it would now be a three dimensional perceptual problem.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">2D Feature Networks</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">Given a complex two dimensional scene and a goal of being able to identify all the objects within it, one general approach is to identify a variety of easily isolated primitive features and to attempt to match the combinations of such features against a database of known objects. There are many ways to go about this, and no way seems to fit all needs. Still, we'll consider a few here.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The astral navigation technique described earlier can be a good starting point for identifying objects in a 2D scene. The first thing to do is identify important points. This can be done by identifying exceptionally bright or dark points, blobs of a significant color, and so forth and calculating distances among the points to see if there are known configurations in the scene. Another primitive feature that some researchers have had success in isolating is sharp corners and junctions where three or more lines meet. In an image of a "pac man", for example, there are three sharp corners that form the pie wedge of a mouth in the circular body. A picture of a stick man would have lots of corners and junctions. Once the raw image is processed to find such corners and junctions, they too become points whose relative positions can be measured and compared to known proportions. When a significant percentage of the components that define some object are matched, we can isolate that portion of the image as a single instance of that kind of object.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Another interesting technique that has been tried is to study the outline of a shape. A shape can be isolated using edge detection or flood filling, for example. A secondary image that includes only the outline of a single shape can then be extracted. It's not hard, then, to find the smallest possible circle that can fit around that shape and identify its center point. Then the code traverses from one point to the next in the outline. For each point, the distance from the center and the angle are measured. Because the object can be rotated at any angle, one goal is to "rotate" the image until it fits a "standard" orientation. One way to do so would be to find the three or more points that are farthest from the center; i.e., those that touch the outer circumference. Using one of a variety of techniques, one of these points can be identified as the "first" one. The whole image would effectively be rotated around the center point so that that first point is straight above the center point. The image wouldn't literally be rotated, of course. What would actually happen is that an angular offset would be added to each angle-plus-distance measured point so that the "first" point would actually be the first one in the list of such points. Then the distances-from-center would be normalized so that the farthest-out ones would be exactly one distance unit from the center.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The result of all this processing, then, can be graphed as a linear profile, with the X axis going from zero to the full 360 degrees and the Y axis measuring the normalized height from zero to one. This graph can be further analyzed to find known patterns. One simple way to do this is to reduce the resolution of the graph so that the X and Y values range from, say, zero to fifteen and to create a 16 x 16 matrix with true and false values. There would be a true value at any point in the matrix where at least one point is found at that combination of X and Y values. That matrix can then be compared against a database full of such matrices for known shapes.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
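The whole outline-signature pipeline can be condensed into one sketch. It assumes the outline points and the enclosing circle's center are already known; the function name and the 16-bin resolution are illustrative choices, not from any real library:

```python
import math

# Sketch of the outline signature described above: each outline point
# becomes an (angle, distance-from-center) pair; angles are offset so the
# farthest-out point sits at angle zero, distances are normalized to
# [0, 1], and the profile is quantized into a small boolean matrix.

def shape_signature(outline, center, bins=16):
    cx, cy = center
    polar = [(math.atan2(y - cy, x - cx), math.hypot(x - cx, y - cy))
             for (x, y) in outline]
    r_max = max(r for _, r in polar)
    # "Rotate" by subtracting the angle of the farthest-out point.
    offset = max(polar, key=lambda p: p[1])[0]
    matrix = [[False] * bins for _ in range(bins)]
    for angle, r in polar:
        a = (angle - offset) % (2 * math.pi)
        col = min(int(a / (2 * math.pi) * bins), bins - 1)
        row = min(int(r / r_max * bins), bins - 1)
        matrix[row][col] = True
    return matrix
```

Two signatures can then be compared by counting mismatched cells, just as with the tile masks earlier.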
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h1 style="-webkit-text-decorations-in-effect: none; color: #6e3133; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Three Dimensional Perception</h1><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">In contrast to two dimensional perception, three dimensional perception is about processing information in all three spatial dimensions, not just in flat or virtually flat worlds; usually, it means detecting where some or all objects within a visual field are in space. Although there are many interesting experiments and products that deal in 3D perception, this area is much less well developed. Let me introduce some broad areas of interest.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="text-align: justify;"><nobr> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#Direct3DPerception" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Lasers and Direct 3D Perception</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#Binocular" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Binocular Vision</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#Geometric" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Geometric Perception</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#GenericViews" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Generic Views</a></nobr> 
</div><nobr><div style="text-align: justify;"> <a href="http://www.alexandria.nu/ai/machine_vision/introduction/#LightShadow" style="text-decoration: underline;"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Intuiting Shape from Light and Shadow</a></div></nobr><a href="" name="#Direct3DPerception"></a><br />
<h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;"><a href="" name="#Direct3DPerception"></a><a href="http://www.alexandria.nu/ai/machine_vision/introduction/##3D" style="text-decoration: underline;"><img alt="Back up to Three Dimensional Perception" border="0" src="http://www.alexandria.nu/images/to_top_h2.gif" /></a> Lasers and Direct 3D Perception</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">No discussion of 3D perception could be complete without consideration of the most obvious technique of perceiving objects in space: directly. Human eyes are presented with flat images that we have to work with to guess at how far away things are in space. Certain kinds of devices, though, actually "see" how far away things are.</div></span><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Emitted and reflected
microwave pulse." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/radar.jpg" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Emitted and reflected<br />
microwave pulse.<br />
</td><td></td></tr>
</tbody></table><br />
<div style="text-align: justify;">It all started with radar, developed in the years between the First and Second World Wars, most famously in Britain. Scientists found that radio waves would reflect off some kinds of objects, sometimes back toward their original sources. Since we know how fast a radio wave travels -- the same speed as light -- it became possible to measure how far away the reflector was based on how long it took for the reflection to be received after the wave was transmitted. It is true of an ordinary transmitter, like a radio broadcasting tower, that its signals get reflected back toward it, and you could probably measure the time differences, but you'd have two problems. First, you'd have to send out very short pulses instead of a continuous broadcast: you need a distinct "beginning" for your signal so you can time when that beginning returns and calculate the difference. Second, you wouldn't know where in space the reflector -- a car, for example -- is. To figure that out, you need to focus the radio beam so it mainly travels in a single direction, then sweep your transmitter / receiver combination back and forth, or around in a full circle, so you cover a wide field of view. While radar technology began with radio waves, most radar systems today use shorter-wavelength microwaves, which can be focused into narrower beams and used to create finer images.</div><br />
<div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">What is true of radio and microwave waves in this regard is also true of light waves. You could, technically, have a friend stand miles away with a mirror and shine a flashlight in his direction and measure how long it takes before you see the reflection in order to calculate how far away he is, but the time delay would be so small that you probably wouldn't notice it. It's been estimated that an object traveling as fast as light could circle Earth about seven times in a single second. Still, we have long had electronics that operate fast enough to detect such small time delays.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
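The timing arithmetic behind radar and laser ranging is simple enough to sketch directly. The pulse covers the distance twice (out and back), so the range is half the delay times the speed of light; the function name here is illustrative:

```python
# Sketch of round-trip time-of-flight ranging.

C = 299_792_458.0  # speed of light, meters per second

def range_from_delay(delay_seconds):
    """Distance to the reflector, given the round-trip delay of a pulse."""
    return C * delay_seconds / 2.0
```

A reflector one kilometer away returns the pulse after roughly 6.7 microseconds, which is why the electronics, not the math, are the hard part.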
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">The gold standard today in direct perception of 3D spaces is to use lasers. Using the same concepts described above for radar, a scanner makes a laser beam scan left to right, top to bottom in the same sort of way you typically read a page of text in a book. In each direction the scanner is aimed, a laser pulse is fired and a light detector determines how long it takes for a reflection to be measured. Since we have a direction and a distance, we can plot a point in 3D space where the reflection occurred and hence where some part of a physical object is. And since a laser beam can be made to stay very sharp over large distances, it's possible to get very precise 3D coordinates using a laser scanner. Following is an example of a machine for surveying using a laser scanner.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.histru.auckland.ac.nz/Scanner.html" style="text-decoration: underline;" target="_blank"><img alt="Figure: Laser scanner used in high definition surveying." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/laser_scanner_field.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: Laser scanner used in high definition surveying.</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">3D points do not a 3D picture make, though. Usually, the next step is to connect the points together into 3D surfaces. One could simply do this by assuming every reflection point measured is connected to the ones to its left, right, top, and bottom. The problem with this is that one ends up seeing the entire world as one single, solid object. We know, of course, that the 3D world is composed of many separate objects, and we know that some objects sit in front of others, hiding parts of the scene from view.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">If I were trying to endow a robot with the ability to see using laser scanning, I would probably want it to understand this idea of one object occluding another and the idea that there may be space between them that can't be seen yet. One very simple way to do this is to modify the above mesh-building algorithm slightly. For each pair of neighboring points, I would calculate the difference between their depths. Above a certain threshold, I would declare the two points part of separate surfaces; below it, I would assume they belong to the same surface. The following figure illustrates this:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: One way to determine when surfaces are connected or disconnected." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/laser_scanner_separate_surfaces.gif" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: One way to determine when surfaces are connected or disconnected.</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">How would I set the threshold? The broken record comes around again here to sing the refrain that there is no universal answer. Perhaps our goal is to make it so our robot can move around in space, and so we might arbitrarily choose a threshold of, say, 3 feet if our robot can move within a 3-foot-wide space. We could also use the "tears" in the 3D mesh to segment out distinct objects using familiar techniques like our basic flood-fill algorithm.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Before you decide that in laser scanners we have finally found the ultimate solution to the 3D perception problem, let me throw water on that fire. If the goal is to get a 3D image of the world, laser scanning is an excellent solution. If the goal is to get a machine to understand the world, laser scanners do nothing more than measure distances to points. They don't "understand" the world any better than digital cameras do. And they tend not to see light levels or colors the way a camera does; only points in space. As you'll see later, though, laser scanning can be used in conjunction with other techniques as "cheats" to work around certain problems that are easily solved by our own visual systems.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
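The two steps just described -- plotting a point from a direction and a distance, then tearing the mesh wherever neighboring depths jump -- can be sketched in a few lines of Python. The function names, angle conventions, and sample numbers are my own:

```python
import math

def scan_to_points(samples):
    """Convert (azimuth, elevation, range) triples, angles in radians,
    into (x, y, z) coordinates -- one point per laser pulse."""
    pts = []
    for az, el, r in samples:
        x = r * math.cos(el) * math.sin(az)
        y = r * math.cos(el) * math.cos(az)
        z = r * math.sin(el)
        pts.append((x, y, z))
    return pts

def split_surfaces(depths, threshold):
    """Break one scan line into separate surfaces wherever neighboring
    depths differ by more than the threshold."""
    surfaces = [[depths[0]]]
    for prev, cur in zip(depths, depths[1:]):
        if abs(cur - prev) > threshold:
            surfaces.append([])   # a "tear": start a new surface
        surfaces[-1].append(cur)
    return surfaces

# Depths along one scan line: a near object in front of a far wall.
print(split_surfaces([2.0, 2.1, 2.0, 9.8, 9.9, 10.0], 3.0))
# → [[2.0, 2.1, 2.0], [9.8, 9.9, 10.0]]
```

The threshold is in the same units as the measured depths, echoing the arbitrary "3 feet" example above.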
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h2 style="-webkit-text-decorations-in-effect: none; color: #7c4815; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Binocular Vision</h2><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.fuzzgun.btinternet.co.uk/rodney/rodney.htm" style="text-decoration: underline;" target="_blank"><img alt="Figure: Two cameras in a stereo (binocular) arrangement." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/stereo_eyes.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: Two cameras in a stereo (binocular) arrangement.</td><td></td></tr>
</tbody></table><br />
<div style="text-align: justify;">We have two eyes. And while it's true they give us a sharper image than a single eye would, the most interesting benefit of having two eyes is that we can use them to help us judge distances. We do so using "binocular" vision techniques.</div><br />
<div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">To understand what this means, try a simple experiment. Look at a corner or other vertical edge on a distant wall. Close your left eye. Stick your finger up at arm's length so the tip is just to the left of that vertical edge. Now open your right eye and close your left. You should see that the edge is now to the right of the edge. Try alternating between having just your left and right eyes open and you'll see that your finger appears to move between being to the left and to the right of that vertical edge. The reason for this is fairly obvious: your eyes are in two different places in the world and so see different views of it. Your brain makes use of these differences to tell you useful things, like the fact that your finger is closer to you than that vertical edge is.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">In theory, binocular vision makes perfect sense and is pretty easy to imagine. In practice, though, making software to line up objects seen by two cameras in a binocular arrangement is not so easy to do. One way that's been explored by some researchers is to find "interesting" points in a stereo pair of images and to measure how different those points are from one another in the horizontal direction. Provided one can tell when two points of interest represent the same point in 3D space, one can build up what Hans Moravec calls an "evidence grid". The horizontal offset of each point pair provides "evidence" of a real point in space where that point exists. 
The following illustrates this idea:</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><center style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.frc.ri.cmu.edu/~hpm/project.archive/robot.papers/1996/9609.stereo.paper/SGabstract.html" style="text-decoration: underline;" target="_blank"><img alt="Figure: Representation of 3D points discovered as "evidence of occupancy" in three dimensions of certain points in a stereo pair of images." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/stereo_occupancy_grid.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: Representation of 3D points discovered as "evidence of occupancy" in three dimensions of certain points in a stereo pair of images.<br />
</td><td></td></tr>
</tbody></table></center><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Though the image on the right looks like a cartoonish version of the original on the left, it is actually a 2D projection of a 3D evidence grid built up from a stereo pair of images like the original one seen here. One could take that 3D image and rotate it to project a view from any place within or "outside" the room imaged. And with a bit more processing, one can make some guesses about how points in the evidence grid are related to form meshes of the same sort described earlier for use in 3D laser scanning.</div><br />
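A minimal sketch of the evidence-accumulation idea, assuming we already have candidate 3D points recovered from matched image features (the voxel size and names here are my own):

```python
from collections import defaultdict

def accumulate_evidence(points, cell_size=0.1):
    """Quantize 3D points into voxel cells and count how many
    observations ("evidence") fall into each cell."""
    grid = defaultdict(int)
    for x, y, z in points:
        cell = (int(round(x / cell_size)),
                int(round(y / cell_size)),
                int(round(z / cell_size)))
        grid[cell] += 1
    return grid

# Two nearby observations reinforce the same cell; a stray one does not.
grid = accumulate_evidence([(0.51, 0.0, 2.0), (0.52, 0.01, 2.01), (3.0, 0.0, 1.0)])
```

Cells with high counts are strong evidence of occupancy; cells with low counts can be discarded as noise.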
<table border="0" cellpadding="0" cellspacing="0" style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.ces.clemson.edu/~stb/research/stereo_p2p/" style="text-decoration: underline;" target="_blank"><img alt="Figure: Depth discontinuity segmentation." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/stereo_discontinuity.jpg" vspace="2" /></a></td></tr>
<tr><td style="text-align: center;">Figure: Depth discontinuity segmentation.</td><td></td></tr>
</tbody></table><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><br />
</span></div><span class="Apple-style-span" style="font-family: Tahoma, Arial;"><div style="text-align: justify;">Another technique involves trying to match pieces of the left image with pieces of the right image using literal bitmap matching. Because an aligned stereo pair should differ only horizontally, not vertically, we start by breaking up the original bitmaps into separate horizontal slices from top to bottom in each image and comparing corresponding slices. Each slice is a 1D bitmap, which is much easier to process. The next step is to break this 1D image up into segments using edge detection. Each span of pixels between two bounding edges is treated as a single, solid "object". The average color of each object can easily be calculated, and that color can be searched for in the opposite image's corresponding slice. With some brute-force computation, we should be able to come up with a reasonably good interpretation of which objects in the left image's slice line up with which in the right image's. There will often be miscellaneous bits that don't match, of course; they make the computation a little more complicated. But we can improve the quality of our guesses by comparing the results of one horizontal slice pair against the ones directly above and below it to see if there are correlations. The end result is yet another set of points in space defined by the edges found earlier, together with bitmaps that nicely fill the spaces between those points.</div></span><br />
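The slice-segmentation-and-matching recipe above can be sketched in Python. The edge threshold, the names, and the nearest-mean matching rule are my own simplifications of what a real implementation would do:

```python
def segment_row(row, edge_threshold):
    """Split one horizontal slice (a 1D list of gray values) into
    segments at edges, i.e. wherever adjacent pixels differ sharply.
    Returns (start, end, mean) for each segment."""
    segments, start = [], 0
    for i in range(1, len(row)):
        if abs(row[i] - row[i - 1]) > edge_threshold:
            seg = row[start:i]
            segments.append((start, i, sum(seg) / len(seg)))
            start = i
    seg = row[start:]
    segments.append((start, len(row), sum(seg) / len(seg)))
    return segments

def match_segments(left, right):
    """Pair each left segment with the right segment whose mean
    color is closest -- a crude stand-in for real correspondence."""
    return [(ls, min(right, key=lambda rs: abs(rs[2] - ls[2]))) for ls in left]

left  = segment_row([10, 10, 10, 200, 200, 50, 50, 50], 40)
right = segment_row([10, 10, 200, 200, 200, 50, 50, 50], 40)
matches = match_segments(left, right)
# The difference in start positions of each matched pair is the
# horizontal offset that hints at depth.
```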
<div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><table border="0" cellpadding="0" cellspacing="0" style="text-align: justify;"><tbody>
<tr><td style="text-align: center;"><img alt="Figure: Calculating distance using stereo disparity." hspace="2" src="http://www.alexandria.nu/ai/machine_vision/introduction/stereo_disparity.gif" vspace="2" /></td></tr>
<tr><td style="text-align: center;">Figure: Calculating distance using stereo disparity.</td><td></td></tr>
</tbody></table><br />
<div style="text-align: justify;">One somewhat low-complexity technique for determining distance to objects in a scene involves taking a relatively small portion of what the left eye sees -- roughly around the center of its field of view -- and finding where its best match lies in the right eye's view. This requires moving a frame of the same size as the left eye's from left to right across the right eye's field of view. At each position, the absolute differences of the left/right pixel pairs are summed. Once this survey is done, the position with the lowest sum of differences is considered the best match. The horizontal pixel offset between that frame's position in the right camera and the corresponding frame in the left camera is then used to calculate how far away the subject matter is. This works fairly well when what the frames contain is fairly homogeneous in terms of distance, or when parts of the background -- perhaps a wall behind a person -- that do creep into the frame are relatively flat in texture. This technique is analogous to how your own eyes work, but it only gives distance for whatever is in the frame, rather than building a complete 3D scene.</div><br />
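Here is a compact Python sketch of that frame-matching survey plus the standard disparity-to-depth relation. The focal length and baseline values would come from camera calibration; all the names, and the restriction to a maximum plausible offset, are my own:

```python
def best_offset(left_row, right_row, window_start, window_size, max_offset):
    """Slide a window across the right row and return the offset with
    the lowest sum of absolute differences against the left window."""
    window = left_row[window_start:window_start + window_size]
    best, best_sad = 0, float("inf")
    for offset in range(max_offset + 1):
        start = window_start - offset  # nearer objects shift left in the right view
        if start < 0:
            break
        sad = sum(abs(a - b) for a, b in
                  zip(window, right_row[start:start + window_size]))
        if sad < best_sad:
            best, best_sad = offset, sad
    return best

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Standard pinhole relation: depth = focal length * baseline / disparity."""
    return focal_px * baseline_m / disparity_px
```

For example, a 3-pixel bright patch that appears two pixels further left in the right image than in the left has a disparity of 2; with an assumed 700-pixel focal length and a 6 cm baseline that works out to about 21 m of depth.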
<div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;"><br />
</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><h1 style="-webkit-text-decorations-in-effect: none; color: #6e3133; font-family: 'Bookman Old Style', 'Book Antiqua', 'Times New Roman'; margin-bottom: 0px; text-align: justify;">Final Thoughts</h1><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">I hope this brief introduction to machine vision has been helpful to you. As I stated in the beginning, it is by no means complete, but it's not a bad intro if you are just getting started or are just curious. I also hope it has successfully given you the sense that a lot of the stuff being done today is not as complicated -- or competent -- as it is often portrayed in the popular media and technical literature. There's a lot one can do with just some simple tricks.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Moreover, we're clearly nowhere near achieving the ultimate goal of general purpose vision in machines. There's plenty of room for aspiring AI researchers to get in the game, even today. The road ahead is long and the prospects are great.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Incidentally, I have made a point of not making reference to my own machine vision research projects because I didn't want this to be primarily about my work. 
I invite you, however, to check out my <a href="http://www.alexandria.nu/ai/machine_vision/" style="text-decoration: underline;">machine vision site</a> for more about what I'm working on.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial; text-align: justify;">Following are some other sites I found of interest in the subject of machine vision.</div><div style="-webkit-text-decorations-in-effect: none; color: black; font-family: Tahoma, Arial;"></div><div style="text-align: justify;"><nobr><a href="http://www.ecs.soton.ac.uk/publications/rj/1995-1996/isis/ndm/journal.htm" style="text-decoration: underline;" target="_blank"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Vehicle Detection and Recognition for Autonomous Intelligent Cruise Control</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr><a href="http://www.alexandria.nu/ai/machine_vision/introduction/Test%20&%20Measurement%20World%20-%20Algorithm%20Choices%20Give%20Pattern%20Matching%20an%20Edge" style="text-decoration: underline;" target="_blank"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />http://www.reed-electronics.com/tmworld/article/CA187424.html</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr><a href="http://vision.gel.ulaval.ca/~bilodeau/vi99.pdf" style="text-decoration: underline;" target="_blank"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Outline-Based Part Segmentation Using Intermediate-Level Symmetries (PDF)</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr><a href="http://journalofvision.org/4/7/3/article.aspx" style="text-decoration: underline;" target="_blank"><img border="0" 
src="http://www.alexandria.nu/images/bullet.gif" />Journal of Vision - Junctions and cost functions in motion interpretation</a></nobr> </div><nobr></nobr><div style="text-align: justify;"><nobr><span class="Apple-style-span" style="white-space: normal;"></span></nobr><nobr><a href="http://www3.sympatico.ca/vpaquin/tutorial/tutorial1.htm" style="text-decoration: underline;" target="_blank"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Point Pattern Matching</a></nobr> </div><nobr><div style="text-align: justify;"><a href="http://www.sci.utah.edu/~cscheid/spr05/imageprocessing/project4/" style="text-decoration: underline;" target="_blank"><img border="0" src="http://www.alexandria.nu/images/bullet.gif" />Project 4: Feature Detection</a></div></nobr><br />
</div></span><div style="text-align: justify;"><br />
</div><a href="http://www.cns.atr.jp/hrcn/DB/home.html" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: justify; text-decoration: underline;" target="_blank"><br />
</a></div></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-52316294234979364522010-12-21T06:09:00.001-08:002010-12-21T06:09:17.253-08:00Quadrupedal Hybrid Robot<div style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px; text-align: justify;"><span style="font-family: arial; font-size: 17px;"><strong>Project Snow: Introduction</strong></span><br />
<strong style="font-family: 'trebuchet ms';"></strong><br />
<span style="font-family: 'trebuchet ms'; font-style: italic;">Project Snow </span><span style="font-family: 'trebuchet ms';">is my next robotics venture. After being fascinated by a number of hybrid wheeled-legged robots, I decided to design my own Quadrupedal Hybrid robot.</span><br />
<br />
<span style="font-family: 'trebuchet ms';">My <a href="http://aaqilkhan.blogspot.com/2007/08/autonomous-robotic-rover.html" style="color: #5588aa; text-decoration: none;">previous robot</a> was based on a </span><em style="font-family: 'trebuchet ms';">PIC16F877A</em><span style="font-family: 'trebuchet ms';"> microcontroller and had limited environmental sensing capabilities. Its autonomous capabilities relied on an infrared ranging sensor to detect objects and avoid collisions. Its arm was retrofitted with an end-effector to pick up objects. However, the arm was teleoperated through a graphical user interface on a remote computer over a wireless link built on peer-to-peer RF transceivers.</span><br />
<br />
<span style="font-family: 'trebuchet ms';">After being inspired by famous robots like the </span><a href="http://www.bostondynamics.com/content/sec.php?section=BigDog" style="color: #5588aa; font-family: 'trebuchet ms'; text-decoration: none;">Big Dog</a><span style="font-family: 'trebuchet ms';"> from Boston Dynamics and </span><a href="http://gizmodo.com/gadgets/halluc-ii/" style="color: #5588aa; font-family: 'trebuchet ms'; text-decoration: none;">Halluc-II</a><span style="font-family: 'trebuchet ms';"> - a Japanese robot, which boast advanced quadrupedal and octopedal locomotion respectively, I</span><br />
<span style="font-family: 'trebuchet ms';">started the design work on my next robot during the winter of 2007-2008, when we had record snowfall in Toronto, ON - hence the name for this project.</span><br />
</div><span class="Apple-style-span" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/SBehr4H1b_I/AAAAAAAAAU0/FZJyChk_ka0/s1600-h/bigdog.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5194798470366851058" src="http://2.bp.blogspot.com/_fTdr602nenQ/SBehr4H1b_I/AAAAAAAAAU0/FZJyChk_ka0/s320/bigdog.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span><span class="Apple-style-span" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/SBeiC4H1cBI/AAAAAAAAAVE/IdEfdYsqHpM/s1600-h/H2A.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5194798865503842322" src="http://2.bp.blogspot.com/_fTdr602nenQ/SBeiC4H1cBI/AAAAAAAAAVE/IdEfdYsqHpM/s320/H2A.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; height: 228px; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; 
padding-right: 4px; padding-top: 4px; text-align: center; width: 269px;" /></a></span><span class="Apple-style-span" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px;"><br />
</span><span class="Apple-style-span" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px;"><br />
</span><div align="justify" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px;"><span style="font-family: 'trebuchet ms';"><strong><span style="font-family: arial; font-size: 17px;">Highlights of Snow</span></strong></span></div><div align="justify" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 20px;"><br />
<span style="font-family: 'trebuchet ms';"><strong>System Overview</strong><br />
This robot will have an onboard ASUS motherboard with an Intel Pentium processor at its core, running custom software written in Visual Basic that uses the image processing capabilities of <a href="http://www.roborealm.com/" style="color: #5588aa; text-decoration: none;">RoboRealm</a> embedded in the VB application. The OS will be a stripped-down version of Windows XP, which will happily run on 1 GB of RAM.<br />
<br />
<strong>Sensors</strong><br />
The robot will be endowed with fixed infrared sensors on its front, rear and sides to sense its environment. These sensors will act more like ears, reacting in real time to intruders and obstacles in its path; they will let the robot "listen" to its surroundings so the software can compensate to avoid collisions.<br />
<br />
For its eyes, a 3-megapixel camera will give it vision capabilities, and with RoboRealm the robot can be programmed to react to objects of different shapes and colors, as well as to aid navigation. The camera will be mounted on a turret allowing a 180-degree pan angle and a 90-degree tilt angle.<br />
<br />
<strong>Mechanics</strong><br />
This robot's design will be of the quadrupedal hybrid type. 'Quadrupedal' implies that the robot will “walk” on four legs like any quadruped animal. 'Hybrid' means that it will combine legs and wheels. While the legs will give it the maneuverability to climb low-profile objects or traverse uneven terrain, the wheels will give it the agility to react to sudden threats. While this may not exactly allow it to dodge bullets, it can, for example, move back quickly to protect itself from a slamming door or a charging dog, and it can turn in place with a zero turning radius.</span></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-34008970805212419062010-12-21T06:08:00.000-08:002010-12-21T06:08:25.645-08:00Zigbee + solar + robot<div class="post-body entry-content" style="color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 1.6em; margin-bottom: 0.75em; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><span style="font-weight: bold;"><span style="color: black;"><span class="Apple-style-span" style="color: #333333; font-weight: normal; line-height: normal;"></span></span></span><br />
<div class="post hentry" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: dotted; border-bottom-width: 1px; margin-bottom: 1.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.5em; padding-bottom: 1.5em;"><div class="post-body entry-content" style="line-height: 1.6em; margin-bottom: 0.75em; margin-left: 0px; margin-right: 0px; margin-top: 0px;"><a href="http://3.bp.blogspot.com/_fTdr602nenQ/RyPzFz4yCbI/AAAAAAAAAJg/1oPxy-y9K0I/s1600-h/Aug5_2007+015.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126208082031020466" src="http://3.bp.blogspot.com/_fTdr602nenQ/RyPzFz4yCbI/AAAAAAAAAJg/1oPxy-y9K0I/s400/Aug5_2007+015.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><br />
<span style="font-size: 13px;"></span><br />
<div style="text-align: justify;"><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><span style="font-weight: bold;"><span style="font-size: 17px;">INTRODUCTION</span></span></span></span><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The aim of this project was to design and build a </span></span><span style="font-family: 'trebuchet ms'; font-size: 13px; font-weight: bold;">4WD differential drive rover </span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">with </span></span><span style="font-family: 'trebuchet ms'; font-size: 13px; font-weight: bold;">integrated robotic arm </span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">capable of <span style="color: #3333ff; font-style: italic;">autonomous </span>operation with <span style="font-weight: bold;">obstacle avoidance </span>and <span style="font-weight: bold;">object detection</span>. It was also designed to have a manual-override <span style="color: #3333ff; font-style: italic;">teleoperated </span>mode for wireless remote manipulation, using a graphical user interface on a PC for both rover and arm control.</span></span><br />
<br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><span style="font-size: 17px; font-weight: bold;">3D DESIGN & MODELING </span></span></span><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">Below are the 3D CAD models of various components of the robot designed in Solidworks.</span></span></div><span style="font-size: 13px;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/Rtmy_Yt991I/AAAAAAAAAFI/k3BiNISLB3g/s1600-h/cover_btm_4.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105308454637926226" src="http://2.bp.blogspot.com/_fTdr602nenQ/Rtmy_Yt991I/AAAAAAAAAFI/k3BiNISLB3g/s400/cover_btm_4.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://3.bp.blogspot.com/_fTdr602nenQ/Rtmy_ot992I/AAAAAAAAAFQ/zC9gJt7Zs_s/s1600-h/cover_top.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105308458932893538" src="http://3.bp.blogspot.com/_fTdr602nenQ/Rtmy_ot992I/AAAAAAAAAFQ/zC9gJt7Zs_s/s400/cover_top.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; 
text-align: center;" /></a><a href="http://3.bp.blogspot.com/_fTdr602nenQ/Rtmy_ot993I/AAAAAAAAAFY/Xwtmb8EZfUo/s1600-h/sharp+-+gp2d12+-+bracket.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105308458932893554" src="http://3.bp.blogspot.com/_fTdr602nenQ/Rtmy_ot993I/AAAAAAAAAFY/Xwtmb8EZfUo/s400/sharp+-+gp2d12+-+bracket.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://4.bp.blogspot.com/_fTdr602nenQ/Rtmy_4t994I/AAAAAAAAAFg/PARtL5rcl0Q/s1600-h/parts+list.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105308463227860866" src="http://4.bp.blogspot.com/_fTdr602nenQ/Rtmy_4t994I/AAAAAAAAAFg/PARtL5rcl0Q/s400/parts+list.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><br />
<a href="http://1.bp.blogspot.com/_fTdr602nenQ/Rtm2EIt996I/AAAAAAAAAFw/Dv8x_-9uXxM/s1600-h/robotarm_part1.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105311834777188258" src="http://1.bp.blogspot.com/_fTdr602nenQ/Rtm2EIt996I/AAAAAAAAAFw/Dv8x_-9uXxM/s400/robotarm_part1.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://2.bp.blogspot.com/_fTdr602nenQ/Rtm2EYt997I/AAAAAAAAAF4/tYSqOY-l7Os/s1600-h/robotarm_part2.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105311839072155570" src="http://2.bp.blogspot.com/_fTdr602nenQ/Rtm2EYt997I/AAAAAAAAAF4/tYSqOY-l7Os/s400/robotarm_part2.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://2.bp.blogspot.com/_fTdr602nenQ/Rtm2EYt999I/AAAAAAAAAGI/Sxaj6BuREQQ/s1600-h/asm+-+robot+assy.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" 
border="0" id="BLOGGER_PHOTO_ID_5105311839072155602" src="http://2.bp.blogspot.com/_fTdr602nenQ/Rtm2EYt999I/AAAAAAAAAGI/Sxaj6BuREQQ/s400/asm+-+robot+assy.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://4.bp.blogspot.com/_fTdr602nenQ/Rtmy_4t995I/AAAAAAAAAFo/qLu_ETCzY30/s1600-h/rover1.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105308463227860882" src="http://4.bp.blogspot.com/_fTdr602nenQ/Rtmy_4t995I/AAAAAAAAAFo/qLu_ETCzY30/s400/rover1.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><span style="font-weight: bold;"><br />
</span></span></span><div style="text-align: justify;"><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><span style="font-weight: bold;"><span style="font-size: 17px;">SOFTWARE & HARDWARE OVERVIEW</span></span></span></span><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The graphical user interface for remote operation was created in <span style="font-style: italic;">Visual Basic 6</span>. The initial concept for software control of the rover and arm required knowledge of <span style="font-weight: bold;">inverse kinematics</span> for precise real-time control of the arm. Below are sketches from the initial conceptualization.</span></span></div><span style="font-size: 13px;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/RtoLGIt99-I/AAAAAAAAAGQ/5t-v3VQOCyc/s1600-h/scan1.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105405327625287650" src="http://2.bp.blogspot.com/_fTdr602nenQ/RtoLGIt99-I/AAAAAAAAAGQ/5t-v3VQOCyc/s400/scan1.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><br />
<a href="http://2.bp.blogspot.com/_fTdr602nenQ/RtoLHIt99_I/AAAAAAAAAGY/cntS7GdgPh4/s1600-h/scan2.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105405344805156850" src="http://2.bp.blogspot.com/_fTdr602nenQ/RtoLHIt99_I/AAAAAAAAAGY/cntS7GdgPh4/s400/scan2.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://2.bp.blogspot.com/_fTdr602nenQ/RtoLHIt9-AI/AAAAAAAAAGg/S483jdA8hjc/s1600-h/scan3.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105405344805156866" src="http://2.bp.blogspot.com/_fTdr602nenQ/RtoLHIt9-AI/AAAAAAAAAGg/S483jdA8hjc/s400/scan3.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span><a href="http://3.bp.blogspot.com/_fTdr602nenQ/RyQBNj4yChI/AAAAAAAAAKQ/rj30uCMRt_Q/s1600-h/Pololu+servo+controller.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" 
id="BLOGGER_PHOTO_ID_5126223608337795602" src="http://3.bp.blogspot.com/_fTdr602nenQ/RyQBNj4yChI/AAAAAAAAAKQ/rj30uCMRt_Q/s200/Pololu+servo+controller.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><br />
<div style="text-align: justify;"><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">These concepts were incorporated into software and tested on servos that would later be part of the robotic arm. I decided on the <a href="http://www.pololu.com/products/pololu/0207/" style="color: #5588aa; text-decoration: none;">Pololu Micro Serial Servo Controller</a> that accepts serial data using <span style="font-style: italic;">RS232 protocol </span>and can drive up to 8 servos. After some mathematical calculation involving arm lengths, weights and torque requirements, I selected Hitec <a href="http://www.hitecrcd.com/servos/show?name=HS-805BB" style="color: #5588aa; text-decoration: none;">HS-805BB</a> and <a href="http://www.hitecrcd.com/servos/show?name=HS-755HB" style="color: #5588aa; text-decoration: none;">HS-755HB</a> servos.</span></span></div><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><br />
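For anyone curious about the wire format, the Pololu-mode framing for an absolute servo position can be sketched as follows. This is a Python illustration (the original control software was VB6), with command and byte layout as I understand them from Pololu's user guide; treat the details as an assumption to be checked against the guide.

```python
def servo_position_packet(servo: int, position: int) -> bytes:
    """Build a Pololu-mode 'set absolute position' command (command 4).

    Position is in 0.5 us units, roughly 500-5500 for the Micro Serial
    Servo Controller, and is split into two 7-bit data bytes.
    """
    if not 500 <= position <= 5500:
        raise ValueError("position out of range (0.5 us units)")
    return bytes([
        0x80,                    # start byte
        0x01,                    # device type: servo controller
        0x04,                    # command 4: set absolute position
        servo & 0x07,            # servo number 0-7
        (position >> 7) & 0x7F,  # upper 7 bits of position
        position & 0x7F,         # lower 7 bits of position
    ])
```

The six bytes are then simply written out over the RS232 link to move that servo.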
The images below show the servo calibration data and the inverse kinematic equations that I derived:<br />
</span></span><a href="http://3.bp.blogspot.com/_fTdr602nenQ/RyP--z4yCcI/AAAAAAAAAJo/iQ49vlYDTrY/s1600-h/Calib_Pololu.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126221155911469506" src="http://3.bp.blogspot.com/_fTdr602nenQ/RyP--z4yCcI/AAAAAAAAAJo/iQ49vlYDTrY/s400/Calib_Pololu.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><span style="font-size: 13px;"><a href="http://3.bp.blogspot.com/_fTdr602nenQ/RtoPCYt9-DI/AAAAAAAAAG4/kert7UjuhL0/s1600-h/asm+-+robot+assy1.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5105409661247289394" src="http://3.bp.blogspot.com/_fTdr602nenQ/RtoPCYt9-DI/AAAAAAAAAG4/kert7UjuhL0/s400/asm+-+robot+assy1.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span><a href="http://3.bp.blogspot.com/_fTdr602nenQ/R0pA_b2t1KI/AAAAAAAAANc/oigjpVlHqYs/s1600-h/calib_gp2d12.jpg" style="color: 
#5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5136989783522661538" src="http://3.bp.blogspot.com/_fTdr602nenQ/R0pA_b2t1KI/AAAAAAAAANc/oigjpVlHqYs/s400/calib_gp2d12.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><div style="text-align: justify;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/RyQACT4yCdI/AAAAAAAAAJw/1nHfM-T5eNU/s1600-h/Tamiya+-+Gearhead+motor+and+wheel2.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126222315552639442" src="http://2.bp.blogspot.com/_fTdr602nenQ/RyQACT4yCdI/AAAAAAAAAJw/1nHfM-T5eNU/s400/Tamiya+-+Gearhead+motor+and+wheel2.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; height: 224px; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; width: 256px;" /></a><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The rover component of the robot consisted of four 6V DC gearhead motors from<a href="http://www.tamiyausa.com/product/item.php?product-id=72102" 
style="color: #5588aa; text-decoration: none;"> Tamiya</a></span></span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><a href="http://www.tamiyausa.com/product/item.php?product-id=72102" style="color: #5588aa; text-decoration: none;"> </a>powered by a <a href="http://dimensionengineering.com/Sabertooth2X10.htm" style="color: #5588aa; text-decoration: none;">Sabertooth 2x10</a> motor controller from Dimension Engineering. The Sabertooth was set to operate in serial mode, where speed and direction data are sent as serial data packets. It provides a continuous current of 10A per channel, which is sufficient to drive two motors per channel. Thus, the two left motors and the two right motors are paralleled, one pair per channel of the motor controller.</span></span></div><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><br />
Below is the Sabertooth 2x10 motor controller.</span></span><a href="http://2.bp.blogspot.com/_fTdr602nenQ/RyQAbT4yCeI/AAAAAAAAAJ4/tH-uCnmxQWc/s1600-h/Sabertooth2X10big.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126222745049369058" src="http://2.bp.blogspot.com/_fTdr602nenQ/RyQAbT4yCeI/AAAAAAAAAJ4/tH-uCnmxQWc/s400/Sabertooth2X10big.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><br />
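To illustrate the serial mode mentioned above, here is a Python sketch of the Sabertooth's packetized-serial framing, with command numbers and the 7-bit checksum rule as given in Dimension Engineering's documentation (the paralleled left/right pairs map naturally onto the two channels):

```python
def sabertooth_packet(address: int, command: int, data: int) -> bytes:
    """One packetized-serial frame: address, command, data, checksum.

    Commands: 0 = motor 1 forward, 1 = motor 1 reverse,
              4 = motor 2 forward, 5 = motor 2 reverse; data is 0-127.
    """
    checksum = (address + command + data) & 0x7F  # 7-bit checksum
    return bytes([address, command, data, checksum])

def tank_drive(left: int, right: int, address: int = 128) -> bytes:
    """Map signed speeds (-127..127) for the paralleled left and right
    motor pairs onto the Sabertooth's two channels."""
    def channel(cmd_fwd: int, cmd_rev: int, speed: int) -> bytes:
        cmd = cmd_fwd if speed >= 0 else cmd_rev
        return sabertooth_packet(address, cmd, min(abs(speed), 127))
    return channel(0, 1, left) + channel(4, 5, right)
```

Each four-byte frame is sent over the same serial line; the address byte lets several Sabertooths share one bus.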
<div style="text-align: justify;"><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">Both the motor controller and servo controller slave devices are driven by a master <a href="http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1335&dDocName=en010242" style="color: #5588aa; text-decoration: none;">PIC16F877A </a>microcontroller. In manual control override mode, the PIC communicates with the software on a PC via a bidirectional wireless UART link and receives instructions that drive both controllers and other devices such as LEDs and sensors; it also transmits sensor data back to the software.</span></span></div><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><br />
</span></span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><span style="font-weight: bold;"><span style="font-size: 17px;">ROBOT TELE-CONTROL APPLICATION</span></span></span></span><a href="http://4.bp.blogspot.com/_fTdr602nenQ/RyTCXz4yClI/AAAAAAAAAKw/LxR0P1SnXIw/s1600-h/Robot+GUI_Oct07.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126435990175615570" src="http://4.bp.blogspot.com/_fTdr602nenQ/RyTCXz4yClI/AAAAAAAAAKw/LxR0P1SnXIw/s400/Robot+GUI_Oct07.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><a href="http://3.bp.blogspot.com/_fTdr602nenQ/R0pA-b2t1JI/AAAAAAAAANU/sJTZf-1rfR4/s1600-h/Robot+GUI_Oct07.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5136989766342792338" src="http://3.bp.blogspot.com/_fTdr602nenQ/R0pA-b2t1JI/AAAAAAAAANU/sJTZf-1rfR4/s400/Robot+GUI_Oct07.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; 
padding-top: 4px; text-align: center;" /></a><div style="text-align: justify;"><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><br />
I </span></span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">developed the software in VB6 with the purpose of implementing real-time control of the rover and arm without having to write scripts to pre-plan the robot's movements. The graphical user interface is divided into four quadrants. The top-left quadrant shows the graphical position of the robotic arm. The arm itself is made of four segments, including the wrist and gripper. The user can drag the end-effector to any location (Xp, Yp) in Cartesian space, with the base of the arm at (0,0), and the software calculates the angles A, B and C using inverse kinematic equations. The user can also move each arm segment independently without moving the other segments. Alternatively, each angle can be changed by editing it directly in the appropriate text box, and the end-effector position (X, Y) is then computed using forward kinematics.</span></span><br />
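The core of this inverse-kinematics math is the classic law-of-cosines solution for a planar arm. The sketch below is in Python rather than the original VB6, and is reduced to two links (the real arm adds the wrist segment), so take it as a minimal illustration of the drag-to-(Xp, Yp) calculation:

```python
import math

def ik_2link(xp: float, yp: float, l1: float, l2: float):
    """Inverse kinematics for a 2-link planar arm with its base at
    (0, 0): returns (shoulder, elbow) angles in radians, elbow-up."""
    d2 = xp * xp + yp * yp
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = -math.acos(cos_elbow)  # elbow-up branch
    shoulder = math.atan2(yp, xp) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def fk_2link(shoulder: float, elbow: float, l1: float, l2: float):
    """Forward kinematics: joint angles back to end-effector (x, y),
    used when the user edits an angle text box directly."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Running the forward kinematics on the angles returned by `ik_2link` should land back on the dragged point, which is a handy self-test for the derivation.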
<br />
<a href="http://3.bp.blogspot.com/_fTdr602nenQ/RyTGoj4yCmI/AAAAAAAAAK4/6_qjPdg37E8/s1600-h/VB_Code.JPG" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126440675984935522" src="http://3.bp.blogspot.com/_fTdr602nenQ/RyTGoj4yCmI/AAAAAAAAAK4/6_qjPdg37E8/s400/VB_Code.JPG" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The top-right quadrant has controls for the rover. Basic controls for the differential drive are forward, backward, turn left, turn right and rotate in place in either direction. The navigation frame in the sidebar shows the speed of the rover as a percentage of full speed. Other rover features, such as wrist rotation, the gripper and various LED lights, can also be controlled from here.</span></span><br />
<br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The bottom-left quadrant displays options for the infrared sensor. The <span style="font-weight: bold;">Scan180</span> option sweeps the sensor through 180 degrees and displays the result graphically, like a radar. The <span style="font-weight: bold;">Follow Target</span> option locks the rover onto a moving object in front of it and follows it while maintaining its distance.</span></span><br />
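A sketch of what Scan180 does behind the scenes: convert the GP2D12's nonlinear voltage output to distance, then sweep and plot in polar form. The power-law fit and its constants below are illustrative placeholders (a commonly quoted fit for this sensor), not the calibration curve shown earlier, and `read_volts` stands in for the servo-positioning and ADC-sampling plumbing:

```python
import math

def gp2d12_cm(volts: float) -> float:
    """Approximate GP2D12 voltage-to-distance conversion.
    The 27.86 * V^-1.15 fit is an illustrative model, valid only in
    the sensor's roughly 10-80 cm working range."""
    return 27.86 * volts ** -1.15

def scan180(read_volts, steps: int = 19):
    """Sweep 0-180 degrees and return (angle_deg, x, y) radar points.
    `read_volts(angle)` is a placeholder for aiming the sensor servo
    and reading the ADC at that angle."""
    points = []
    for i in range(steps):
        angle = 180.0 * i / (steps - 1)
        d = gp2d12_cm(read_volts(angle))
        rad = math.radians(angle)
        points.append((angle, d * math.cos(rad), d * math.sin(rad)))
    return points
```

The same distance conversion underlies Follow Target: compare the measured distance to a setpoint and drive forward or backward to hold it.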
</div><div style="text-align: justify;"><a href="http://4.bp.blogspot.com/_fTdr602nenQ/RyQE-z4yCkI/AAAAAAAAAKo/EOzrBcJEzu4/s1600-h/Logitech+Game+Controller.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126227752981236290" src="http://4.bp.blogspot.com/_fTdr602nenQ/RyQE-z4yCkI/AAAAAAAAAKo/EOzrBcJEzu4/s200/Logitech+Game+Controller.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The robot can also be operated by a PlayStation 2-style USB game controller using DirectX routines. One advantage of this game controller is that it has a lot of digital buttons for activating various controls on the robot. The two analog sticks worked very well for manipulating the robotic arm as well as the rover. When the keyboard control option is enabled, all of the features mentioned above for the rover and the arm can be controlled directly from the keyboard via various shortcuts.</span></span></div><div style="text-align: justify;"><div style="text-align: justify;"><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">Other software features include graphical enhancements such as grids (polar and Cartesian), arm segment loci, inside and outside borders for the arm, the ability to create macros and scripts, and real-time <span style="font-weight: bold;">battery power </span>and <span style="font-weight: bold;">wireless signal strength </span><span>indicators</span>.</span></span><br />
<br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">When the wireless link is enabled from the application, any changes in rover or arm positions are transmitted to the robot in real time. If the sensor is enabled, the software requests the current sensor data from the robot. The progress bars for the signal strength and battery voltage indicators are also updated periodically.</span></span></div><br />
<br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><span style="font-weight: bold;"><span style="font-size: 17px;">ELECTRONICS AND CIRCUIT DESIGN</span></span></span></span><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The figure below shows the block diagram of the electronics involved:</span></span></div><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';"><br />
</span></span><div style="text-align: justify;"><span style="font-size: 13px;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/Ru8u-ub-FyI/AAAAAAAAAHY/j6Wb0fKIG88/s1600-h/Robot+Block+Diagram+-+Sep17.bmp" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5111355757240915746" src="http://2.bp.blogspot.com/_fTdr602nenQ/Ru8u-ub-FyI/AAAAAAAAAHY/j6Wb0fKIG88/s400/Robot+Block+Diagram+-+Sep17.bmp" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span></div><a href="http://4.bp.blogspot.com/_fTdr602nenQ/RyT5gz4yCpI/AAAAAAAAALM/6bSRxJTVlZQ/s1600-h/PIC16F877+IC.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126496617933965970" src="http://4.bp.blogspot.com/_fTdr602nenQ/RyT5gz4yCpI/AAAAAAAAALM/6bSRxJTVlZQ/s200/PIC16F877+IC.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">At the heart 
of the electronics is the PIC16F877A microcontroller, which has sufficient GPIO pins and all the peripherals needed to interface with the rest of the electronics. The system derives its power from a 6V SLA battery that sits conveniently in the rover's belly, which was designed to accommodate it. The electronics is divided into 5V and 3.3V subsystems, each powered by its own regulator and filtering circuits. The Sabertooth 2x10 motor controller provides a regulated 5V at 100mA, which powers the 5V subsystem, and I built an onboard 3.3V regulator for the 3.3V subsystem, which includes the wireless <a href="http://www.maxstream.net/products/xbee/xbee-pro-oem-rf-module-zigbee.php" style="color: #5588aa; text-decoration: none;">XBee Pro</a> module from Maxstream.</span></span><br />
<div style="text-align: justify;"><br />
<a href="http://2.bp.blogspot.com/_fTdr602nenQ/Ry1i1D4yCqI/AAAAAAAAALY/0LlaIlkAsMk/s1600-h/XBee+Modules.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5128864214360918690" src="http://2.bp.blogspot.com/_fTdr602nenQ/Ry1i1D4yCqI/AAAAAAAAALY/0LlaIlkAsMk/s200/XBee+Modules.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The PIC communicates serially with the XBee Pro module using the PIC's onboard hardware USART. Since the XBee Pro and the PICmicro constitute a mixed 5V/3.3V system, they cannot be interfaced directly with each other. Instead, an intermediate transceiver or buffer is required to prevent damaging the XBee through overvoltage. The circuit below was designed to do just that.</span></span><br />
<br />
<span style="font-size: 13px;"><a href="http://2.bp.blogspot.com/_fTdr602nenQ/Ru82fub-FzI/AAAAAAAAAHg/o-QB5HdBueU/s1600-h/Xbee+Interface.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5111364020757993266" src="http://2.bp.blogspot.com/_fTdr602nenQ/Ru82fub-FzI/AAAAAAAAAHg/o-QB5HdBueU/s400/Xbee+Interface.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span><a href="http://1.bp.blogspot.com/_fTdr602nenQ/R0pB372t1MI/AAAAAAAAANs/rhUY3-dsABI/s1600-h/Aug5_2007+006.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5136990754185270466" src="http://1.bp.blogspot.com/_fTdr602nenQ/R0pB372t1MI/AAAAAAAAANs/rhUY3-dsABI/s400/Aug5_2007+006.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><span style="font-size: 13px;"><a href="http://3.bp.blogspot.com/_fTdr602nenQ/Rx08S8w4poI/AAAAAAAAAIo/IvYZw5QvGUc/s1600-h/Aug5_2007+004.jpg" 
style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5124318247263970946" src="http://3.bp.blogspot.com/_fTdr602nenQ/Rx08S8w4poI/AAAAAAAAAIo/IvYZw5QvGUc/s400/Aug5_2007+004.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span></div><span style="font-size: 13px;"><a href="http://3.bp.blogspot.com/_fTdr602nenQ/Rx08f8w4ppI/AAAAAAAAAIw/SdHWK0Zve3Q/s1600-h/Aug16_2007+008.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5124318470602270354" src="http://3.bp.blogspot.com/_fTdr602nenQ/Rx08f8w4ppI/AAAAAAAAAIw/SdHWK0Zve3Q/s400/Aug16_2007+008.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a></span><a href="http://4.bp.blogspot.com/_fTdr602nenQ/R0pBkr2t1LI/AAAAAAAAANk/5z0SRnDXOQU/s1600-h/Aug5_2007+010.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5136990423472788658" 
src="http://4.bp.blogspot.com/_fTdr602nenQ/R0pBkr2t1LI/AAAAAAAAANk/5z0SRnDXOQU/s400/Aug5_2007+010.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; display: block; margin-bottom: 10px; margin-left: auto; margin-right: auto; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px; text-align: center;" /></a><div style="text-align: justify;"><br />
<a href="http://1.bp.blogspot.com/_fTdr602nenQ/RyPweT4yCYI/AAAAAAAAAJI/8hiYS6tXrQ0/s1600-h/Crystal+Oscillator.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126205204402932098" src="http://1.bp.blogspot.com/_fTdr602nenQ/RyPweT4yCYI/AAAAAAAAAJI/8hiYS6tXrQ0/s200/Crystal+Oscillator.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">Data transfer between the microcontroller and the servo and motor controllers is also done serially. However, since the hardware USART pins were already taken up by PIC-XBee communication, I had to implement the UART (Universal Asynchronous Receiver Transmitter) function in software. Because a software UART relies on very precise timing routines to hit a predetermined baud rate, the choice of system clock becomes crucial. A 4.00 MHz or 8.00 MHz crystal, for example, does not divide evenly by standard baud rates:</span></span></div><div style="text-align: center;"><span style="color: #000099; font-size: 13px; font-style: italic;"><span style="font-family: 'trebuchet ms';">4.000 MHz / 9600 baud = 416.6666... (not an integer)</span></span></div><br />
<div style="text-align: justify;"><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">For serial communication, it is important to select a crystal frequency that divides evenly by standard baud rates. </span></span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The wrong crystal causes data loss, which in turn degrades performance in teleoperated mode. </span></span><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">In my case, I decided to use <span style="font-weight: bold;">18.432 MHz</span>:<br />
</span></span></div><div style="text-align: center;"><span style="color: #000099; font-size: 13px; font-style: italic;"><span style="font-family: 'trebuchet ms';">18.432 MHz / 9600 = 1920 (an integer)</span></span></div><div style="text-align: justify;"><br />
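The divisibility requirement above can be checked for any candidate crystal with a few lines of Python (an illustrative sketch, not firmware that runs on the PIC):

```python
# A crystal is "UART friendly" when every standard baud rate fits a
# whole number of clock cycles into one bit time.
STANDARD_BAUDS = (9600, 19200, 38400, 57600, 115200)

def cycles_per_bit(crystal_hz, baud):
    """Clock cycles available for one bit time at the given baud rate."""
    return crystal_hz / baud

def is_uart_friendly(crystal_hz, bauds=STANDARD_BAUDS):
    """True if every baud rate gets a whole number of cycles per bit."""
    return all(crystal_hz % baud == 0 for baud in bauds)

for xtal in (4_000_000, 8_000_000, 18_432_000):
    print(xtal, cycles_per_bit(xtal, 9600), is_uart_friendly(xtal))
```

Running this confirms the choice: 18.432 MHz gives exactly 1920 cycles per bit at 9600 baud, while 4 MHz and 8 MHz do not divide evenly.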
<a href="http://1.bp.blogspot.com/_fTdr602nenQ/RyTLtD4yCoI/AAAAAAAAALE/ehqwylvDSOI/s1600-h/FTDI+-+FT232RL.jpg" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5126446250852485762" src="http://1.bp.blogspot.com/_fTdr602nenQ/RyTLtD4yCoI/AAAAAAAAALE/ehqwylvDSOI/s200/FTDI+-+FT232RL.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: right; margin-bottom: 10px; margin-left: 10px; margin-right: 0pt; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">The microcontroller communicates with the radio transceiver (XBee) using the PIC's built-in hardware USART peripheral at 38.4 kbps. The XBee at the user's end is programmed to operate at the same baud rate and communicates with the computer through a serial port. For computers without an RS232 serial port, as in my case, a USB port can be made to emulate one using the <a href="http://www.ftdichip.com/Products/FT232R.htm" style="color: #5588aa; text-decoration: none;">FT232R USB UART IC</a> from FTDI Ltd. To simplify interfacing the XBee through the computer's USB port, <a href="http://www.sparkfun.com/commerce/product_info.php?products_id=718" style="color: #5588aa; text-decoration: none;">Sparkfun Electronics</a> offers a breakout board for the FT232R with an internal oscillator, EEPROM, and a 3.3V regulator on board.</span></span><br />
<br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms'; font-size: 17px;"><span style="font-weight: bold;">PROTOTYPE DEMONSTRATION</span></span></span><br />
<span style="font-size: 13px;"><span style="font-family: 'trebuchet ms';">Slideshow of the ongoing autonomous rover prototype with integrated robotic arm.</span><embed height="350" src="http://www.youtube.com/v/Buk58yXyyvI" type="application/x-shockwave-flash" width="425" wmode="transparent"></embed><br /><span style="font-family: 'trebuchet ms';">My robotic arm being controlled manually from a graphical user interface (GUI) application that I developed in VB6.</span><embed height="350" src="http://www.youtube.com/v/d9fc5kzCai0" type="application/x-shockwave-flash" width="425" wmode="transparent"></embed><br /><span style="font-family: 'trebuchet ms';">Earlier 3D model and simulation of the robot arm (concept stage).</span></span></div><span style="font-size: 13px;"><embed height="350" src="http://www.youtube.com/v/OYughELI-hk" type="application/x-shockwave-flash" width="425" wmode="transparent"></embed></span><br />
<div style="clear: both;"></div></div><div class="post-footer" style="color: #999999; font: normal normal normal 78%/normal 'Trebuchet MS', Trebuchet, Arial, Verdana, sans-serif; letter-spacing: 0.1em; line-height: 1.4em; margin-bottom: 0.75em; margin-left: 0px; margin-right: 0px; margin-top: 0.75em; text-transform: uppercase;"><div class="post-footer-line post-footer-line-1"><br />
</div><div class="post-footer-line post-footer-line-2"><span class="post-labels"></span></div><div class="post-footer-line post-footer-line-3"></div></div></div><div class="comments" id="comments"><a href="" name="comments"></a></div><br />
<span style="font-weight: bold;"><span style="color: black;"><br />
</span></span><br />
<span style="font-weight: bold;"><span style="color: black;">Click photo to go to a Robotics Project</span><br />
<span style="font-size: 17px;"><br />
</span></span><span style="font-family: 'times new roman'; font-size: 17px;"><span style="color: #000099;"><span style="font-size: 17px;">Original Teleoperated Robot With 5 DOF Arm</span></span></span><br />
<a href="http://aaqilkhan.blogspot.com/2007/08/autonomous-robotic-rover.html" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5113800276082992706" src="http://3.bp.blogspot.com/_fTdr602nenQ/RvfeQcw4pkI/AAAAAAAAAHw/mFp33c6XXTI/s320/Aug6_2007.jpg" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; float: left; margin-bottom: 10px; margin-left: 0px; margin-right: 10px; margin-top: 0px; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<span style="font-size: 11px;"><br />
</span><span style="font-family: 'times new roman'; font-size: 11px;"><span style="color: #000099;"><span style="color: red;"><span style="font-size: 14px;"></span></span></span></span><br />
<span style="font-family: 'times new roman'; font-size: 11px;"><span style="color: #000099;"><span style="color: red;"><span style="font-size: 14px;">Current Robotics Project: </span><span style="font-size: 11px;"><span style="font-weight: bold;"><span style="font-size: 14px;">Snow</span> </span></span>(work in progress)</span></span></span><a href="http://aaqilkhan.blogspot.com/2008/04/project-snow-quadrupedal-hybrid-wheeled.html" style="color: #5588aa; text-decoration: none;"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5194498488376061890" src="http://4.bp.blogspot.com/_fTdr602nenQ/SBaQ2oH1b8I/AAAAAAAAAUc/qPLN3DbrYUQ/s320/Snow.JPG" style="border-bottom-color: rgb(204, 204, 204); border-bottom-style: solid; border-bottom-width: 1px; border-left-color: rgb(204, 204, 204); border-left-style: solid; border-left-width: 1px; border-right-color: rgb(204, 204, 204); border-right-style: solid; border-right-width: 1px; border-top-color: rgb(204, 204, 204); border-top-style: solid; border-top-width: 1px; cursor: pointer; float: left; margin-bottom: 10px; margin-left: 0pt; margin-right: 10px; margin-top: 0pt; padding-bottom: 4px; padding-left: 4px; padding-right: 4px; padding-top: 4px;" /></a><br />
<div style="clear: both;"></div></div><div class="post-footer" style="color: #999999; font-family: Georgia, serif; font-size: 13px; font: normal normal normal 78%/normal 'Trebuchet MS', Trebuchet, Arial, Verdana, sans-serif; letter-spacing: 0.1em; line-height: 1.4em; margin-bottom: 0.75em; margin-left: 0px; margin-right: 0px; margin-top: 0.75em; text-transform: uppercase;"><div class="post-footer-line post-footer-line-1"><br />
</div></div>Unknownnoreply@blogger.com1tag:blogger.com,1999:blog-1198018729130303420.post-56853105447551564922010-12-21T06:03:00.001-08:002010-12-21T06:03:53.862-08:00Controlling IR devices from Roborealm<div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">One of the latest exciting additions to the <a href="http://www.roborealm.com/" style="color: #667755;">Roborealm software</a> is the inclusion of a driver for the <a href="http://usbuirt.com/" style="color: #667755;">USB-UIRT</a>. This device is a USB-based programmable IR emitter. You can use it in conjunction with Roborealm to quickly and easily control a variety of consumer-grade robots and toys. In this tutorial I will show you how to control your <a href="http://www.isobotrobot.com/eng/index.html" style="color: #667755;">ISOBOT robot</a> with Roborealm.</div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><br />
</div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">The requirements for this tutorial are:</div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><br />
</div><li style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><strong>Roborealm</strong> software <a href="http://roborealm.com/" style="color: #667755;">available here</a>.</li><br />
<li style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><strong>USB-UIRT</strong> <a href="http://usbuirt.com/" style="color: #667755;">hardware</a> ($50)</li><br />
<li style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><strong>TOMY ISOBOT</strong> ($100)</li><br />
<li style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><strong>Webcam </strong>(optional)</li><br />
<li style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">PC Joystick</li><br />
<div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;"><br />
</div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px;"></div><div style="text-align: justify;">First we will look at using a joystick on your PC to control the ISOBOT. The basics: be sure to download Roborealm from www.roborealm.com. It is a free download and installs into a folder that you can copy into your Program Files. </div><div style="text-align: justify;">Open up the Roborealm software and select Interface -> Joystick. You should see a window like the one below. </div><br />
<div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px;"><img alt="" class="aligncenter size-medium wp-image-905" height="300" src="http://profmason.com/wp-content/uploads/2009/03/joystickmapper-252x300.jpg" style="display: block; margin-left: auto; margin-right: auto; text-align: justify;" title="joystickmapper" width="252" /></div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">For our purposes, we will use the joypad to control the ISOBOT. I notice that by moving the joypad, the values in the View Switch box change. I will define a variable called “joypad” by typing that into the drop-down box under View Switch. This variable will change as I move the joypad. I notice that in my case joypad takes the following values:</div><ol style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"><li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Forward 0</li>
<li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Right 9000</li>
<li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Backward 18000</li>
<li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Left 27000</li>
</ol><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px;"></div><div style="text-align: justify;">I will use these values to control the ISOBOT. Now go ahead and close the joystick mapper by selecting OK.</div><div style="text-align: justify;">Next we need to set up the USB-UIRT device. Select Control -> Other -> USB_HID. When you double-click on the item you should see the window below:</div><br />
<div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px;"><a href="http://profmason.com/wp-content/uploads/2009/03/usbuirt.jpg" style="color: #667755;"><img alt="" class="aligncenter size-medium wp-image-907" height="229" src="http://profmason.com/wp-content/uploads/2009/03/usbuirt-300x229.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; display: block; margin-left: auto; margin-right: auto; text-align: justify;" title="usbuirt" width="300" /></a></div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">To start off we will just define four basic actions for the robot:</div><ul style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"><li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Move Forward (Corresponding to pushing forward on the ISOBOT remote)</li>
<li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Turn Right (Corresponding to pressing diagonally Right on the remote)</li>
<li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Turn Left (Corresponding to pressing diagonally Left on the remote)</li>
<li style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">Move Backward (Corresponding to pulling backward on the remote)</li>
</ul><div style="font: normal normal normal 90%/175% 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"></div><div style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif; letter-spacing: -1px; text-align: justify;">We will need to complete the following four steps to program the USB-UIRT with a command for the ISOBOT.</div><span class="Apple-style-span" style="letter-spacing: -1px;"><img alt="" class="alignright size-full wp-image-908" height="158" src="http://profmason.com/wp-content/uploads/2009/03/usbuirtforward.jpg" style="float: right; text-align: justify;" title="usbuirtforward" width="223" /></span><div style="text-align: justify;"><span class="Apple-style-span" style="letter-spacing: -1px;"><br />
</span></div><span class="Apple-style-span" style="letter-spacing: -1px;"><div style="text-align: justify;">1. First we need to give the command a name and then learn it. In the dialog box, type in a name for the command (such as "forward"), then press the LEARN button. </div></span><span class="Apple-style-span" style="letter-spacing: -1px;"><div style="text-align: justify;">2. When you press LEARN, a dialog will pop up. Point the ISOBOT remote at the USB-UIRT. Press and hold the stick forward until the light on the USB-UIRT blinks three times and the dialog disappears. You should now see a code in the IR Code window. You can test this code by turning on your ISOBOT and pressing the Test Transmit button: if everything is working correctly, your ISOBOT should move forward. </div></span><span class="Apple-style-span" style="letter-spacing: -1px;"><img alt="" class="alignright size-full wp-image-909" height="139" src="http://profmason.com/wp-content/uploads/2009/03/usbuirtvariable.jpg" style="float: right; text-align: justify;" title="usbuirtvariable" width="215" /></span><div style="text-align: justify;"><br />
</div><span class="Apple-style-span" style="letter-spacing: -1px;"><div style="text-align: justify;">3. Now we need to attach the command you just learned to the joystick. In the variable dropdown box, select the variable that you created in the Joystick dialog (joypad). Since we know that the joypad reports a value of 0 when pushed forward, we will select Set and then Equal in the next box, and enter 0 in the last box. Now click the New button on the right-hand side, and your action will be saved. You may also choose whether the USB-UIRT lights up when transmitting, and set the number of repetitions. If you are having trouble, try decreasing the number of repetitions so that subsequent transmissions do not interfere with one another. </div></span><span class="Apple-style-span" style="letter-spacing: -1px;"><div style="text-align: justify;">4. Now repeat this process for each of the four signals that you are interested in. </div></span><br />
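Steps 1–4 boil down to a lookup from joypad value to learned IR code. This sketch mimics what the Roborealm binding does internally; `send_ir()` and the placeholder codes are hypothetical stand-ins, not Roborealm's API:

```python
# Each learned IR code fires when the watched joypad variable equals
# its trigger value (POV-hat directions from the joystick list above).
TRIGGERS = {0: "forward", 9000: "right", 18000: "backward", 27000: "left"}

# Placeholder codes; the real ones are captured during the LEARN step.
LEARNED_CODES = {name: "<learned IR code for %s>" % name for name in TRIGGERS.values()}

sent = []
def send_ir(code):
    # Hypothetical stand-in: record the code instead of transmitting
    # through the USB-UIRT (Roborealm does the real transmission).
    sent.append(code)

def dispatch(joypad_value):
    """Fire the IR command bound to this joypad value, if any."""
    name = TRIGGERS.get(joypad_value)
    if name is not None:
        send_ir(LEARNED_CODES[name])
    return name
```

Centered or diagonal joypad readings fall through with no transmission, which matches binding each command to one exact value.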
<div><a href="http://profmason.com/wp-content/uploads/2009/03/usbuirt2.jpg" style="color: #667755; font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"><img alt="" class="aligncenter size-full wp-image-910" height="382" src="http://profmason.com/wp-content/uploads/2009/03/usbuirt2.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; display: block; margin-left: auto; margin-right: auto; text-align: justify;" title="usbuirt2" width="500" /></a><div style="text-align: justify;"><span class="Apple-style-span" style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"><br />
</span></div><span class="Apple-style-span" style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"><div style="text-align: justify;">Now play with your robot! You can watch a movie of my ISOBOT running around under Computer Control below. I show the joystick briefly during the video.</div></span><div style="text-align: justify;"><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" height="344" style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;" width="425"><embed type="application/x-shockwave-flash" width="425" height="344" src="http://www.youtube.com/v/lDc6qMx9bCs&hl=en&fs=1" allowscriptaccess="always" allowfullscreen="true"></object></div><span class="Apple-style-span" style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"><div style="text-align: justify;">You might want to try the <a href="http://www.roborealm.com/tutorial/Path_Planning/slide010.php" style="color: #667755; font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;">path planning tutorial</a><span class="Apple-style-span" style="font-family: 'Lucida Grande', 'Lucida Sans Unicode', Verdana, sans-serif;"> using your new Roborealm controlled ISOBOT!</span></div></span></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-52107306933270117762010-12-21T06:02:00.000-08:002010-12-21T06:02:02.155-08:00Lego Head Tracking Robot<div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">Do you have a lego mindstorm NXT?</span></div><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; font-size: x-small; line-height: 16px;"><br />
</span></div><span class="Apple-style-span" style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; font-size: x-small; line-height: 16px;"><div style="text-align: justify;">Do you want to play video games using optical motion tracking?</div><div style="text-align: justify;">This is the idea I had just yesterday while playing with my robots.</div></span><br />
<div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">What you need:</span></div><ul style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; list-style-image: initial; list-style-position: initial; list-style-type: none; margin-left: 0px; padding-bottom: 0px; padding-left: 10px; padding-right: 0px; padding-top: 0px; text-indent: -10px;"><li style="margin-bottom: 8px; margin-left: 10px; margin-right: 0px; margin-top: 7px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">3 lego nxt light sensors</span></li>
<li style="margin-bottom: 8px; margin-left: 10px; margin-right: 0px; margin-top: 7px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">3 lego nxt cables</span></li>
<li style="margin-bottom: 8px; margin-left: 10px; margin-right: 0px; margin-top: 7px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">lego nxt core</span></li>
<li style="margin-bottom: 8px; margin-left: 10px; margin-right: 0px; margin-top: 7px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">a webcam which doesn’t have an IR optical filter OR a nintendo wiimote</span></li>
<li style="margin-bottom: 8px; margin-left: 10px; margin-right: 0px; margin-top: 7px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">a floppy disk</span></li>
<li style="margin-bottom: 8px; margin-left: 10px; margin-right: 0px; margin-top: 7px; text-align: justify;"><a href="http://www.free-track.net/english/" style="color: #0066cc; text-decoration: none;"><span class="Apple-style-span" style="font-size: x-small;">free-track software for windows</span></a></li>
</ul><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">How it works:</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px;"></div><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">the lego light sensors emit light in the visible and infrared spectrum</span></div><span class="Apple-style-span" style="font-size: x-small;"><div style="text-align: justify;">these 3 beams are captured by your camera</div><div style="text-align: justify;">the coordinates of the 3 dots are used by the Free-track software to estimate the player's position</div><div style="text-align: justify;">the head position is used in the game to look around</div></span><br />
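Free-track performs a full pose estimation from the three dots; as a toy illustration of the geometry (not Free-track's actual algorithm), the centroid of the dots tracks head translation, while the tilt of the line through the two side dots gives the roll angle:

```python
import math

# Toy sketch: pixel coordinates of the left, right and top cap dots.
# Real head trackers solve for all six degrees of freedom; here we
# recover only translation (centroid) and roll (side-dot tilt).
def head_estimate(left, right, top):
    cx = (left[0] + right[0] + top[0]) / 3.0
    cy = (left[1] + right[1] + top[1]) / 3.0
    roll = math.degrees(math.atan2(right[1] - left[1], right[0] - left[0]))
    return (cx, cy), roll
```

With the cap held level, the two side dots share the same image row and the roll comes out as zero.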
<div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">How to build it:</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">I used a 3 point cap model: build a triangle structure as in figure 1. I did it in such a way that you can attach it to a sport cap.</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;"><a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/frontcap.jpg" style="color: #0066cc; text-decoration: none;"><img alt="" class="alignnone size-medium wp-image-181" height="225" src="http://www.epokh.org/blog/wp-content/uploads/2009/04/frontcap-300x225.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; max-width: 100%; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Front picture of the model" width="300" /></a><a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/sidecap2.jpg" style="color: #0066cc; text-decoration: none;"><img alt="" class="alignnone size-medium wp-image-183" height="225" src="http://www.epokh.org/blog/wp-content/uploads/2009/04/sidecap2-300x225.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; max-width: 100%; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Side picture of the model" width="300" /></a></span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; 
text-align: justify;"><a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/sidecap1.jpg" style="color: #0066cc; text-decoration: none;"><span class="Apple-style-span" style="font-size: x-small;"><img alt="" class="alignnone size-medium wp-image-182" height="225" src="http://www.epokh.org/blog/wp-content/uploads/2009/04/sidecap1-300x225.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; max-width: 100%; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Side picture of the model" width="300" /></span></a></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px;"></div><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">If your camera has an IR filter you are in trouble, but you can still remove it as <a href="http://www.free-track.net/english/hardware/webcam_filter_removal.php" style="color: #0066cc; text-decoration: none;">described here.</a></span></div><span class="Apple-style-span" style="font-size: x-small;"><div style="text-align: justify;">If you have a camera without an IR filter you are very lucky: open a floppy disk and remove the magnetic disk. Cut it and place it in front of the camera lens as in figure 2.</div></span><br />
<div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/floppyfilter.jpg" style="color: #0066cc; text-decoration: none;"><span class="Apple-style-span" style="font-size: x-small;"><img alt="" class="alignnone size-medium wp-image-185" height="225" src="http://www.epokh.org/blog/wp-content/uploads/2009/04/floppyfilter-300x225.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; max-width: 100%; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Floppy disk used to filter only sensor ir" width="300" /></span></a></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">Measure the distances as in figure 3 and insert them in the free-track configuration.</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/legotracking.jpg" style="color: #0066cc; text-decoration: none;"><span class="Apple-style-span" style="font-size: x-small;"><img alt="" class="alignnone size-medium wp-image-186" height="211" src="http://www.epokh.org/blog/wp-content/uploads/2009/04/legotracking-300x211.jpg" style="border-bottom-style: none; border-color: initial; border-left-style: none; border-right-style: none; border-top-style: none; border-width: initial; max-width: 100%; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Free-Track configuration" width="300" /></span></a></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span 
class="Apple-style-span" style="font-size: x-small;">The Free-Track configuration can be downloaded from <a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/legohead.ftp" style="color: #0066cc; text-decoration: none;">Free-Track configuration to use with Lego Head Tracker</a>.</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">Write a simple program for the Lego NXT that switches the sensors on and off while the user holds down the contact sensor. The simple program in NXT-G can be downloaded from <a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/modelcap.rbt" style="color: #0066cc; text-decoration: none;">Lego Nxt program</a>.</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px;"></div><div style="text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">You can also use RoboRealm to track them:</span></div><span class="Apple-style-span" style="font-size: x-small;"><div style="text-align: justify;"><embed allowfullscreen="true" allowscriptaccess="always" height="344" src="http://www.youtube.com/v/K-WHJHmluME&hl=en&fs=1&rel=0&color1=0x006699&color2=0x54abd6" type="application/x-shockwave-flash" width="425"></embed></div><div style="text-align: justify;">The RoboRealm configuration file is the <a href="http://www.epokh.org/blog/wp-content/uploads/2009/04/legotrack.robo" style="color: #0066cc; text-decoration: none;">Robo Realm script file</a>.</div><div style="text-align: justify;">And now enjoy your game!</div></span><br />
<div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" height="344" width="425"><embed type="application/x-shockwave-flash" width="425" height="344" src="http://www.youtube.com/v/LTa2WCYhDz8&hl=en&fs=1&color1=0x006699&color2=0x54abd6" allowscriptaccess="always" allowfullscreen="true"></object></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><object classid="clsid:d27cdb6e-ae6d-11cf-96b8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=6,0,40,0" height="344" width="425"><embed type="application/x-shockwave-flash" width="425" height="344" src="http://www.youtube.com/v/UAjvcoLkDSc&hl=en&fs=1&color1=0x006699&color2=0x54abd6" allowscriptaccess="always" allowfullscreen="true"></object></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">If you would rather not make one yourself, you can buy one from my website:</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><a href="http://shop.robomotic.com/" style="color: #0066cc; text-decoration: none;"><span class="Apple-style-span" style="font-size: x-small;">http://shop.robomotic.com/</span></a></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><span class="Apple-style-span" style="font-size: x-small;">To find other distributors go here:</span></div><div style="color: #333333; font-family: 'Lucida Grande', Verdana, Arial, sans-serif; line-height: 16px; text-align: justify;"><a href="http://www.pixelpartner.de/openKMQen.htm" style="color: #0066cc; text-decoration: none;"><span class="Apple-style-span" style="font-size: x-small;">http://www.pixelpartner.de/openKMQen.htm</span></a></div>Unknownnoreply@blogger.com0tag:blogger.com,1999:blog-1198018729130303420.post-5230567266005281042010-12-21T05:58:00.000-08:002010-12-21T05:58:16.101-08:00RC truck robot conversion<span class="Apple-style-span" style="background-color: white; border-collapse: collapse; color: #536482; font-family: Arial, Helvetica, sans-serif; font-size: 10px; line-height: 15px;"></span><br />
<div class="post bg2" id="p8" style="background-position: 100% 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 4px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 10px; padding-right: 10px; padding-top: 0px;"><div class="inner" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><div class="postbody" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; clear: both; float: left; font-size: 10px; line-height: 1.48em; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; width: 536px;"><div class="content" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: 'Lucida Grande', 'Trebuchet MS', Verdana, Helvetica, Arial, sans-serif; font-size: 1.3em; line-height: 1.4em; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; min-height: 3em; outline-color: initial; outline-style: initial; outline-width: 0px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span style="border-bottom-width: 0px; border-color: 
initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">RC truck robot conversion</span></span><br />
<br />
<br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1143s.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<br />
This project covers the conversion of a cheap off-the-shelf RC truck into a powerful robot vision platform capable of ball following, etc.<br />
<br />
I have always liked seeing projects around the world that use expensive robots running complex vision-processing software, and I dreamt of the day I would have my own to play with.<br />
<br />
Win the lottery or build one on the cheap?<br />
<br />
Cheap wins every time.<br />
<br />
I say cheap, but what I mean is cheap-ish. It depends on how complicated you want it to be and how much you have lying around. Maybe it should be "cheap compared to a Corrobot or Whitebox robot" (although they have much more functionality).<br />
<br />
Anyway.<br />
<br />
<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Parts required</span></span><br />
<br />
Toyabi Skullcrusher RC monster truck<br />
SSC-32 for webcam tilt<br />
Logitech Pro 9000 webcam<br />
Sabertooth 2x10a speed controller<br />
Dell C610 laptop system board + proc + memory + wireless<br />
12v - 19v converter to run laptop from 12v SLA battery<br />
12V battery (I used a 12V 7Ah SLA, but it's a bit too heavy; maybe a LiPo?)<br />
RS232 - TTL converter (homemade or Ebay) for Sabertooth<br />
USB - RS232 converter for SSC-32<br />
Remote control unit and keyfob (homemade or Ebay) - this is a failsafe so I can cut power to the Sabertooth<br />
USB hub<br />
12V fan<br />
Old satellite set-top box - emptied this and used it as a box to house it all in.<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Software</span></span><br />
<br />
Windows Xp<br />
Roborealm<br />
UltraVNC<br />
<br />
<br />
<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Sourcing the RC truck</span></span><br />
<br />
Whilst surfing Ebay one day I came across new RC monster trucks selling very cheaply. The interesting thing about them was that they had tank-style steering instead of the Ackermann steering found on most RC trucks.<br />
<br />
They can be bought in Europe from Seben racing and in the USA from Amazon and are called "Skull Crusher" from Toyabi.<br />
<br />
<br />
<a class="postlink" href="http://www.amazon.com/Remote-Control-Scale-Monster-Yellow/dp/B000ODT7RK/" style="border-bottom-color: rgb(54, 138, 210); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">http://www.amazon.com/Remote-Control-Scale-Monster-Yellow/dp/B000ODT7RK/</a><br />
<br />
Here are some videos of the trucks in action<br />
<br />
<a class="postlink" href="http://www.wilhelmy-it.de/seben/pictures/racing_king/king.html" style="border-bottom-color: rgb(54, 138, 210); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">http://www.wilhelmy-it.de/seben/pictures/racing_king/king.html</a><br />
<br />
<embed flashvars="" id="VideoPlayback" src="http://www.youtube.com/v/GFLU0xfkD3s" style="height: 326px; width: 400px;" type="application/x-shockwave-flash" wmode="transparent"></embed><br />
<br />
I had a good robotty feeling about these, so I asked my better half to get me one for my birthday.<br />
The truck is HUGE and comes with a simple speed controller that is on/off rather than proportional; still, it was amazing what could be achieved with such a cheap model.<br />
Stock, it will spin on the spot and climb all sorts of objects and gradients.<br />
<br />
It has independent suspension and gearing to each wheel and runs from two small-ish motors. It is VERY big and came in an enormous box.<br />
Construction is better than expected for such a cheap model, but the tyres are some kind of foamy PVC moulding. Apart from that, there is lots of space on board, with springs at each corner. The transmission is by gears down 4 articulated arms to the wheels.<br />
<br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1095-1.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<br />
<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Stripping down</span></span><br />
<br />
The first job was to strip it down, remove the old speed controller and replace it with a Sabertooth from Dimension Engineering. You need to cut the electrolytic filter capacitors off the motors, as they will blow under PWM motor control. Leave the ceramic capacitors in place to filter out some noise.<br />
<br />
The body shell comes off easily and the lights disconnect via a small connector. I cut out the rest of the controller and receiver circuitry.<br />
I fitted the Sabertooth temporarily, set it up for RC mode and tested it with an old 27MHz proportional set I had lying around. All OK, and I had even more fun driving it around under proper proportional control.<br />
<br />
It can move very quickly<br />
<br />
<embed flashvars="" id="VideoPlayback" src="http://www.youtube.com/v/Qdr3425YmsA" style="height: 326px; width: 400px;" type="application/x-shockwave-flash" wmode="transparent"></embed><br />
<br />
And with the new speed controller, very slowly too !!<br />
<br />
<embed flashvars="" id="VideoPlayback" src="http://www.youtube.com/v/U5EEBoXVQ-Y" style="height: 326px; width: 400px;" type="application/x-shockwave-flash" wmode="transparent"></embed><br />
<br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1100.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1102.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1108.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1113.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1115.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<br />
<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Adding the brains</span></span><br />
<br />
Next step was to add a brain to the system.<br />
<br />
An old Dell C610 laptop I had lying around was dismembered, and the system board, memory and processor were saved for the robot.<br />
<br />
An old satellite set-top box served as the case for the project and mounted pretty easily to the top of the truck. I then mounted the system board and Sabertooth controller inside, along with a 12V 7AH SLA battery (slung under the casing), DC-DC converter to power the laptop from 12V and a failsafe.<br />
<br />
The failsafe is a small 433MHz key fob transmitter used to cut the power to the motors should the robot decide to make a break for freedom. It's a simple RC switch, bought very cheaply on Ebay, that toggles power to the Sabertooth.<br />
<br />
I also added a small 12v fusebox and wired it all up with some cable from an old PC PSU.<br />
My 8-year-old webcam was rubbish, so I went out and bought a Logitech 9000 PRO, which is very good and came on a tilting base. That started me thinking: I could drive a servo from the laptop to tilt the camera as the robot adjusted its distance from an object.<br />
<br />
I used an SSC-32 servo controller from Lynxmotion to drive the servo and connected the controller to the PC via a USB-to-serial converter (I had already used up the laptop's COM1 serial port on the motor controller).<br />
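For reference, the SSC-32 is driven with simple ASCII commands over serial, of the form <code>#&lt;channel&gt;P&lt;pulse&gt;T&lt;time&gt;</code>. Here is a minimal sketch of a helper that builds the tilt command; the channel number and pulse widths are illustrative assumptions, not the values wired on this robot:

```python
def ssc32_move(channel, pulse_us, time_ms=None):
    """Build an SSC-32 servo move command string.

    channel: servo output 0-31; pulse_us: pulse width in microseconds
    (roughly 500-2500, with 1500 as centre); time_ms: optional time
    in milliseconds for the move to take. Terminated by a carriage
    return, as the SSC-32 expects.
    """
    if not 0 <= channel <= 31:
        raise ValueError("SSC-32 has channels 0-31")
    cmd = "#%dP%d" % (channel, pulse_us)
    if time_ms is not None:
        cmd += "T%d" % time_ms
    return cmd + "\r"

# Hypothetical usage: centre the tilt servo (assumed on channel 0)
# over half a second, via a pyserial port opened on the USB adapter:
#   port.write(ssc32_move(0, 1500, 500).encode("ascii"))
```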
<br />
The Dell laptop only has one USB port, so I added a small hub to allow the USB-to-serial module and the USB webcam to connect at the same time. With hindsight, I would have surfed Ebay for a laptop system board with USB 2.0 instead of 1.1, but it was all I had and it works well for now.<br />
The COM1-to-Sabertooth cable needs a serial-to-TTL converter fitted; you can find these on Ebay for under $10 or make your own, as I did. This converts the RS232 levels down to a 5V level suitable for the speed controller inputs.<br />
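Roborealm's Sabertooth module takes care of the protocol for you, but for the curious, here is a rough sketch of what goes down that serial cable in the Sabertooth's packetized-serial mode: 4-byte packets with a 7-bit checksum. This assumes a controller left at the default address of 128:

```python
def sabertooth_packet(address, command, data):
    """Build a 4-byte Sabertooth packetized-serial command.

    address: 128-135, as set on the Sabertooth's DIP switches.
    command: operation code, e.g. 0 = drive motor 1 forward,
             4 = drive motor 2 forward.
    data: speed value 0-127.
    The checksum is the low 7 bits of the sum of the other bytes.
    """
    if not 128 <= address <= 135:
        raise ValueError("Sabertooth addresses are 128-135")
    if not 0 <= data <= 127:
        raise ValueError("data must be 0-127")
    checksum = (address + command + data) & 0x7F
    return bytes([address, command, data, checksum])

# Hypothetical usage: half speed forward on motor 1, written to a
# pyserial port opened on COM1 (via the serial-to-TTL converter):
#   port.write(sabertooth_packet(128, 0, 64))
```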
<br />
I also added a chunky power on/off switch and soldered a remote pushbutton switch to the power button of the laptop. (I could have just drilled a hole in the case and used a prodder.) A small 12V fan was added to the case to help cool the laptop system board, which isn't used to running without a case to direct the airflow.<br />
<br />
The laptop has a WiFi card, which lets me remotely control the robot from another laptop to make adjustments, etc., without having to hook up a keyboard, mouse and monitor to the robot.<br />
<br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1140.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1141.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/image001.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<br />
<br />
<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Software install</span></span><br />
<br />
I installed Windows XP and the Dell drivers on the laptop, then added a free bit of software called Roborealm, a great visual processing program intended for robotic applications which just so happens to have SSC-32 and Sabertooth modules built in. Joy !!<br />
<br />
<a class="postlink" href="http://www.roborealm.com/" style="border-bottom-color: rgb(54, 138, 210); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">http://www.roborealm.com</a><br />
<br />
There is an example green-ball-follower script on the Roborealm website, which I modified slightly to work with my hardware. I then spent a few hours tinkering with the filter settings until I had it as good as I could get it.<br />
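To give a flavour of what the ball-follower script is doing (the real thing is a Roborealm pipeline, not Python), here is a toy sketch of the underlying idea: steer toward the ball's centre of gravity, and use its apparent size as a stand-in for distance so the robot backs away when the ball gets too close. All the numbers are illustrative, not values from my script:

```python
def follow_ball(cog_x, ball_size, frame_width=320,
                target_size=60, max_speed=40):
    """Toy ball-following controller for a tank-steer robot.

    cog_x: ball centre-of-gravity x position in pixels.
    ball_size: apparent ball width in pixels (proxy for distance).
    Returns (left, right) motor speeds in -max_speed..max_speed.
    """
    # Steering error: how far the ball is off-centre, scaled to -1..1.
    error_x = (cog_x - frame_width / 2) / (frame_width / 2)
    # Distance error: positive if the ball looks small (far away, so
    # drive forward), negative if it looks big (too close, back away).
    error_d = (target_size - ball_size) / target_size
    # Mix drive and steer, then clamp to the speed limits.
    left = max(-max_speed, min(max_speed, max_speed * (error_d + error_x)))
    right = max(-max_speed, min(max_speed, max_speed * (error_d - error_x)))
    return left, right
```

With the ball centred and at the target size, both motors stop; a ball far off to one side makes the tracks counter-rotate so the truck spins on the spot, just as described above.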
<br />
I also installed the free UltraVNC software on both laptops, which allows me to remote into the robot and make adjustments, or just watch what is going on on the screen and webcam.<br />
<br />
<a class="postlink" href="http://www.uvnc.com/" style="border-bottom-color: rgb(54, 138, 210); border-bottom-style: solid; border-bottom-width: 1px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">http://www.uvnc.com/</a><br />
<br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://www.roborealm.com/help/Getting_Started_2.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span><br />
<br />
<br />
<br />
<br />
<br />
<span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: underline;"><span style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Testing</span></span><br />
<br />
The day after finishing it, I took the robot along to a robot event and tested it by gently kicking a green ball along the floor and watching the robot follow it wherever it went. It can go forward, left and right, and it backs away if the ball comes towards it. I kept one finger on the failsafe remote at all times.<br />
<br />
It only shot off once when it liked the look of the green trees through the window. A quick tweak of the color filter's Hue setting fixed that.<br />
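That Hue tweak boils down to narrowing the band of hues that count as "ball". A hypothetical sketch of the idea, using a per-pixel test; the thresholds here are made up for illustration, not the values from my Roborealm filter:

```python
import colorsys

def is_green_ball(r, g, b, hue_lo=100, hue_hi=150,
                  min_sat=0.4, min_val=0.2):
    """Return True if an RGB pixel (0-255 per channel) looks ball-green.

    Tightening hue_lo/hue_hi is the equivalent of the Hue-setting
    tweak: a narrow band keeps the bright green of the ball while
    rejecting the duller, yellower greens of trees through a window.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0  # 0=red, 120=green, 240=blue
    return hue_lo <= hue_deg <= hue_hi and s >= min_sat and v >= min_val
```

A vision pipeline would apply a test like this across the frame, then take the centre of gravity of the surviving pixels as the ball position.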
<br />
Not a finished project by any means, but a great platform to have a bit of fun and test out all my odd ideas for sensors, etc.<br />
<br />
If you have an older laptop lying around (a PIII 1GHz or so) and any kind of PC-driven speed controller, you can set this up fairly quickly. Make it as simple or as complicated as you wish. No real programming is required, just tweaking of scripts, etc.<br />
<br />
My next steps are to add some sensors (ultrasonic and IR) to the robot and swap the heavy Lead-acid battery for a LIPO to ease some of the weight from the suspension.<br />
<br />
I may have a stab at reverse engineering the ball-follow script and adding the sensor inputs from the SSC-32 into the Roborealm program loop.<br />
<br />
Remember the failsafe. This robot can move fast and is quite heavy; it could easily cause injury if it escapes and runs amok.<br />
<br />
If you need any more detail, just ask. I will try to help in any way I can.<br />
<br />
Have fun.<br />
<br />
(For the video below, I modified the script to follow an orange ball, as green balls don't work too well on grass. I wish I had taken a video from the robot event, as they had large open floors where I kicked the ball off slowly and watched it roll a long way with the robot in hot pursuit !!)<br />
<br />
<br />
<embed flashvars="" id="VideoPlayback" src="http://www.youtube.com/v/oLbW9OgLJw4" style="height: 326px; width: 400px;" type="application/x-shockwave-flash" wmode="transparent"></embed><br />
<br />
<br />
<br />
<span class="Apple-style-span" style="border-color: initial; border-style: initial; outline-color: initial; outline-style: initial;"><img alt="Image" src="http://i88.photobucket.com/albums/k199/OracsRevenge/DSCN1143s.jpg" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" /></span></div></div><dl class="postprofile" id="profile8" style="border-bottom-width: 0px; border-color: initial; border-left-color: rgb(255, 255, 255); border-left-style: solid; border-left-width: 1px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: inline; float: right; font-size: 10px; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 5px; min-height: 80px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; width: 155px;"><dt style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><a class="username-coloured" href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=memberlist.php&mode=viewprofile&u=2&sid=6c55dc096d244eb47c13bef249a9a96f" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: inline !important; font-size: 
10px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px !important; padding-left: 0px !important; padding-right: 0px !important; padding-top: 0px !important; text-decoration: none;"><span class="Apple-style-span" style="color: black;">Orac</span></a></dt>
<dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Site Admin</dd><dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"> </dd><dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><strong style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; font-weight: normal; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Posts:</strong> 83</dd><dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; 
outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><strong style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; font-weight: normal; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Joined:</strong> Sat Mar 28, 2009 8:03 pm</dd></dl><div class="back2top" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; clear: both; font-size: 10px; height: 11px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: right;"><a class="top" href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=viewtopic.php&f=3&t=4#wrap" style="background-attachment: initial; background-clip: initial; background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/imageset/icon_back_top.gif); background-origin: initial; background-position: 0% 0%; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; float: right; font-size: 10px; height: 11px; letter-spacing: 1000px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; text-indent: 11px; width: 11px;" 
title="Top"><span class="Apple-style-span" style="color: black;">Top</span></a></div><span class="corners-bottom" style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_left.png); background-position: 0px 100%; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; clear: both; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: -10px; margin-right: -10px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_right.png); background-position: 100% 100%; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"></span></span></div></div><div class="post bg1" id="p28" style="background-position: 100% 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 4px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 10px; padding-right: 10px; padding-top: 0px;"><div class="inner" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; 
border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="corners-top" style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_left.png); background-position: 0px 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: -10px; margin-right: -10px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_right.png); background-position: 100% 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"></span></span><div class="postbody" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; clear: both; float: left; font-size: 10px; line-height: 1.48em; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 
0px; padding-right: 0px; padding-top: 0px; width: 536px;"><h3 style="border-bottom-style: none; border-color: initial; border-color: initial; border-left-style: none; border-right-style: none; border-style: initial; border-top-style: none; border-width: initial; font-family: 'Trebuchet MS', Verdana, Helvetica, Arial, sans-serif; font-size: 1.5em; font-style: normal; font-weight: bold; letter-spacing: -1px; line-height: 18px; margin-bottom: 0.3em !important; margin-left: 0px !important; margin-right: 0px !important; margin-top: 0px !important; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 2px; text-transform: none;"><a href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=viewtopic.php&f=3&t=4#p28" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 15px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><span class="Apple-style-span" style="color: black;">Re: RC truck robot conversion</span></a></h3><div class="author" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Verdana, Helvetica, Arial, sans-serif; font-size: 1em; line-height: 1.2em; margin-bottom: 0.6em; margin-left: 0px; margin-right: 15em; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 5px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="color: black;"><a 
href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=viewtopic.php&p=28&sid=6c55dc096d244eb47c13bef249a9a96f#p28" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><img alt="Post" height="9" src="http://www.ukrobotgroup.com/phpBB3/styles/prosilver/imageset/icon_post_target.gif" style="border-bottom-style: none; border-bottom-width: 0px; border-color: initial; border-color: initial; border-left-style: none; border-left-width: 0px; border-right-style: none; border-right-width: 0px; border-style: initial; border-top-style: none; border-top-width: 0px; border-width: initial; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Post" width="11" /></a>by <strong style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><a href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=memberlist.php&mode=viewprofile&u=70&sid=6c55dc096d244eb47c13bef249a9a96f" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; 
margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;">it monkey</a></strong> » Thu Jun 04, 2009 9:53 pm</span></div><div class="content" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: 'Lucida Grande', 'Trebuchet MS', Verdana, Helvetica, Arial, sans-serif; font-size: 1.3em; line-height: 1.4em; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; min-height: 3em; outline-color: initial; outline-style: initial; outline-width: 0px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Love this bot... so much I've just taken delivery of my own Toyabi Skullcrusher. <img alt=":mrgreen:" src="http://www.ukrobotgroup.com/phpBB3/images/smilies/icon_mrgreen.gif" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Mr. Green" /><br />
<br />
The on/off controls are bloody hard work.<br />
<br />
When switching to the Sabertooth motor controller, how did you know to use a 10 A one? I'm not too clued up on the old amps/volts/ohms stuff... my skills are more computer- and network-based <img alt=":geek:" src="http://www.ukrobotgroup.com/phpBB3/images/smilies/icon_e_geek.gif" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 13px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Geek" /><br />
<br />
At the moment I intend to copy what you've done to some extent. Got Roborealm tracking colours and also just got Microsoft Robotics Studio. So I'm looking forward to playing around with that.</div></div><dl class="postprofile" id="profile28" style="border-bottom-width: 0px; border-color: initial; border-left-color: rgb(255, 255, 255); border-left-style: solid; border-left-width: 1px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: inline; float: right; font-size: 10px; margin-bottom: 10px; margin-left: 0px; margin-right: 0px; margin-top: 5px; min-height: 80px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; width: 155px;"><dt style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><a href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=memberlist.php&mode=viewprofile&u=70&sid=6c55dc096d244eb47c13bef249a9a96f" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><span class="Apple-style-span" style="color: black;">it monkey</span></a></dt>
<dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"> </dd><dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><strong style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; font-weight: normal; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Posts:</strong> 16</dd><dd style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; line-height: 1.2em; margin-bottom: 0px; margin-left: 8px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><strong style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; font-weight: normal; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: 
initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Joined:</strong> Thu Jun 04, 2009 9:24 pm</dd></dl><div class="back2top" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; clear: both; font-size: 10px; height: 11px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-align: right;"><a class="top" href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=viewtopic.php&f=3&t=4#wrap" style="background-attachment: initial; background-clip: initial; background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/imageset/icon_back_top.gif); background-origin: initial; background-position: 0% 0%; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; float: right; font-size: 10px; height: 11px; letter-spacing: 1000px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none; text-indent: 11px; width: 11px;" title="Top"><span class="Apple-style-span" style="color: black;">Top</span></a></div><span class="corners-bottom" style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_left.png); background-position: 0px 100%; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; 
clear: both; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: -10px; margin-right: -10px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_right.png); background-position: 100% 100%; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"></span></span></div></div><div class="post bg2" id="p29" style="background-position: 100% 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 4px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 10px; padding-right: 10px; padding-top: 0px;"><div class="inner" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="corners-top" style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_left.png); 
background-position: 0px 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: -10px; margin-right: -10px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span style="background-image: url(http://www.ukrobotgroup.com/phpBB3/styles/prosilver/theme/images/corners_right.png); background-position: 100% 0px; background-repeat: no-repeat no-repeat; border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: block; font-size: 1px; height: 5px; line-height: 1px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"></span></span><div class="postbody" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; clear: both; float: left; font-size: 10px; line-height: 1.48em; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; width: 536px;"><h3 style="border-bottom-style: none; border-color: initial; border-color: initial; border-left-style: none; border-right-style: none; border-style: initial; border-top-style: none; border-width: initial; font-family: 'Trebuchet MS', Verdana, Helvetica, Arial, sans-serif; font-size: 1.5em; font-style: normal; font-weight: bold; letter-spacing: -1px; line-height: 18px; margin-bottom: 0.3em 
!important; margin-left: 0px !important; margin-right: 0px !important; margin-top: 0px !important; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 2px; text-transform: none;"><a href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=viewtopic.php&f=3&t=4#p29" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 15px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><span class="Apple-style-span" style="color: black;">Re: RC truck robot conversion</span></a></h3><div class="author" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-family: Verdana, Helvetica, Arial, sans-serif; font-size: 1em; line-height: 1.2em; margin-bottom: 0.6em; margin-left: 0px; margin-right: 15em; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 5px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><span class="Apple-style-span" style="color: black;"><a href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=viewtopic.php&p=29&sid=6c55dc096d244eb47c13bef249a9a96f#p29" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px; text-decoration: none;"><img 
alt="Post" height="9" src="http://www.ukrobotgroup.com/phpBB3/styles/prosilver/imageset/icon_post_target.gif" style="border-bottom-style: none; border-bottom-width: 0px; border-color: initial; border-color: initial; border-left-style: none; border-left-width: 0px; border-right-style: none; border-right-width: 0px; border-style: initial; border-top-style: none; border-top-width: 0px; border-width: initial; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; max-width: 100%; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="Post" width="11" /></a>by <strong style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; font-size: 10px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;"><a class="username-coloured" href="http://www.ukrobotgroup.com/joomla/index.php?option=com_jfusion&Itemid=6&jfile=memberlist.php&mode=viewprofile&u=2&sid=6c55dc096d244eb47c13bef249a9a96f" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; border-top-width: 0px; display: inline !important; font-size: 10px; font-weight: bold; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; outline-color: initial; outline-style: initial; outline-width: 0px; padding-bottom: 0px !important; padding-left: 0px !important; padding-right: 0px !important; padding-top: 0px !important; text-decoration: none;">Orac</a></strong> » Fri Jun 05, 2009 12:22 am</span></div><div class="content" style="border-bottom-width: 0px; border-color: initial; border-left-width: 0px; border-right-width: 0px; border-style: initial; 
border-top-width: 0px; font-family: 'Lucida Grande', 'Trebuchet MS', Verdana, Helvetica, Arial, sans-serif; font-size: 1.3em; line-height: 1.4em; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; min-height: 3em; outline-color: initial; outline-style: initial; outline-width: 0px; overflow-x: hidden; overflow-y: hidden; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">Hi, the motors are not going to tax the Sabertooth too much as it's designed for much bigger work. I had the 10 A one already and used it. There's no reason why any other motor controller would not be suitable.<br />
<br />
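For anyone sizing their own controller, the back-of-envelope check is just the controller's rating against the worst-case combined motor draw. The motor figures below are placeholders, since the truck's real currents aren't given in this thread:

```python
def controller_headroom(controller_amps, motors, amps_per_motor):
    """Ratio of controller rating to worst-case combined motor draw.

    A value comfortably above 1.0 means the controller won't be taxed;
    all inputs here are illustrative, not measured from the truck.
    """
    return controller_amps / (motors * amps_per_motor)

# e.g. a 10 A controller driving two small brushed motors at ~3 A each
print(controller_headroom(10.0, 2, 3.0))
```

Stall current (motor held stationary at full throttle) is the figure to use for the worst case, and it can be several times the free-running current on small brushed motors.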
When you get to wiring it all in, don't forget to remove the bigger capacitors that are tacked to the outside of the motors, as they will fry under PWM from the new controller. You can leave the small round ceramic ones in place.<br />
<br />
No doubt you will have some good fun with it. As you say, there's no need to build one just like mine; everyone customises their bot to their own needs and taste.<br />
<br />
Keep us updated on how you progress.<br />
<br />
Thanks</div></div></div></div>Obstacle Avoidance Using Omnidirectional Vision Robot<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><br />
</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">In the previous article on obstacle avoidance (see Obstacle Avoidance in the Real World), our robot used simple sonar and infrared sensors to detect the presence of obstacles, then adjusted its heading accordingly. As we saw, it takes quite a few such sensors to handle all the possible obstacle configurations the robot might encounter. And the reaction of the robot to the presence of an obstacle is always fairly simple: just turn left, right or back up.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><br />
</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">We will now try a different approach using our omnidirectional vision system. The most obvious advantage of 360-degree vision is that you can see your entire surroundings in a single snapshot. We can think of each pixel in the image as a potential obstacle detector analogous to our sonar and infrared sensors, but now we have thousands of them instead of just five or six. Of course, these pixel values don't tell us the distance to objects the way the active range sensors do, but as we shall see, we can still develop an effective navigation algorithm.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><br />
</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The image below is a typical view through the omnidirectional vision setup. The camera's resolution is set to 320x240 pixels and we can see the reflection of the room in the spherical mirror, as well as the ceiling past the edges of the mirror.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><span class="Apple-style-span" style="color: blue;"><a href="http://www.pirobot.org/blog/0004/images/omni-obs-1.jpg" style="color: blue;"><img border="0" height="228" name="graphics1" src="http://www.pirobot.org/blog/0004/images/omni-obs-1.jpg" style="text-align: left;" width="304" /></a></span></div><div style="color: blue; text-align: justify;"><a href="http://www.pirobot.org/blog/0004/images/omni-obs-1.jpg"><br />
</a></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">Using the <a href="http://www.roborealm.com/" style="color: blue;" target="_blank">RoboRealm</a> vision program, we can "unwrap" this image using the <b>Polar module</b>, then crop the artifacts to produce a rectangular panoramic image as shown below:</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><br />
</div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics2" src="http://www.pirobot.org/blog/0004/images/omni-obs-2.jpg" style="text-align: left;" width="373" /></span><br />
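If you would like to experiment with this step outside of RoboRealm, the polar unwrapping can be sketched in a few lines of Python with numpy. This is an illustrative sketch, not RoboRealm's actual implementation; the mirror centre (cx, cy) and the inner/outer sampling radii are assumptions you would calibrate for your own camera and mirror:

```python
import numpy as np

def unwrap(img, cx, cy, r_out, r_in=0, out_w=280, out_h=86):
    """Unwrap an omnidirectional mirror image into a panoramic strip.

    Each output column is a bearing angle around the mirror centre;
    each output row is a radius, with the top row farthest from the
    centre (far away) and the bottom row closest (near the robot).
    """
    thetas = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_out, r_in, out_h)
    # Sample the source image along concentric circles (nearest neighbour).
    xs = np.round(cx + np.outer(radii, np.cos(thetas))).astype(int)
    ys = np.round(cy + np.outer(radii, np.sin(thetas))).astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]
```

For a 320x240 capture you would call something like unwrap(frame, 160, 120, 100), then roll the columns so that straight ahead lands at the mid point of the strip.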
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The resulting rectangular image is now 280x86 pixels, and we must remember that the left edge of the image represents the same place in space as the right edge. In other words, the topology of the rectangular strip is that of a loop with straight ahead at the mid point and straight behind at the left and right edges.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><br />
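That loop topology makes it easy to turn a pixel column into a steering direction. A small helper along these lines (my own convention for illustration, not a RoboRealm module) does the conversion:

```python
def column_to_heading(col, width=280):
    """Convert a panorama column index to a heading in degrees.

    The mid column is straight ahead (0 degrees); negative values
    turn left, positive values turn right, and both the left and
    right edges map to straight behind (about +/-180 degrees).
    """
    return (col - width / 2.0) * 360.0 / width
```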
</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">When looking at such an image, it is obvious to our own eyes where the floor is clear and where there are obstacles. But how do we extract that information from the image so our robot can see the same thing? The folks over at RoboRealm have an awesome tutorial on how to do just this. I would highly recommend checking it out at <a href="http://www.roborealm.com/tutorial/Obstacle_Avoidance/slide010.php" style="color: blue;">http://www.roborealm.com/tutorial/Obstacle_Avoidance/slide010.php</a>. What follows is simply an elaboration of the methods described in that tutorial, applied to 360-degree images.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">One method that works well is based on edge detection. The image below shows the result of applying RoboRealm's <b>Prewitt edge filter</b> to the original image:</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><br />
</div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics3" src="http://www.pirobot.org/blog/0004/images/omni-obs-3.jpg" style="text-align: left;" width="373" /></span><br />
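The Prewitt filter itself is just a pair of 3x3 convolutions, one for horizontal and one for vertical gradients. A plain numpy sketch (equivalent in spirit to RoboRealm's module, though not its exact code) looks like this:

```python
import numpy as np

# Prewitt kernels: horizontal and vertical gradient estimates.
PREWITT_X = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]], dtype=float)
PREWITT_Y = PREWITT_X.T

def prewitt_edges(img):
    """Return the Prewitt gradient magnitude of a greyscale image."""
    img = img.astype(float)
    h, w = img.shape
    p = np.pad(img, 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Correlate each kernel with the padded image, one tap at a time.
    for i in range(3):
        for j in range(3):
            win = p[i:i + h, j:j + w]
            gx += PREWITT_X[i, j] * win
            gy += PREWITT_Y[i, j] * win
    return np.hypot(gx, gy)
```

Thresholding this magnitude image gives the black-and-white edge map shown above: edges light up while the featureless floor stays black.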
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><div style="text-align: justify;">Note how objects are now outlined by their edges whereas the floor is generally a featureless black. In particular, we can see that even the relatively narrow support legs of the chair stand out nicely—something that is often difficult to "see" with infrared or sonar. There is also a false edge beneath the large ball on the left which is due to the ball's shadow. We can eliminate such artifacts by using the <b>Auto Threshold</b> and <b>Clean</b> modules, which results in the following image:</div><div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div></span><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"></div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics4" src="http://www.pirobot.org/blog/0004/images/omni-obs-4.jpg" style="text-align: left;" width="373" /></span><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">Now here comes the key step. RoboRealm has a module called <b>Side Fill</b> that can be applied to any of the four edges of the image. In this case we will apply it to the bottom edge. The Side Fill module paints white pixels upward from the bottom edge until it runs into a white pixel already in the image. Because we are working with a thresholded edge map, this coincides with the first potential obstacle in that vertical slice of the image. The picture below shows the result:</div><span style="color: black;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="1" height="112" name="graphics5" src="http://www.pirobot.org/blog/0004/images/omni-obs-5.jpg" style="text-align: left;" width="369" /></span></span><br />
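The Side Fill idea is simple enough to sketch directly: for each column of the thresholded edge map, fill white from the bottom up until the first edge pixel. Here is a hypothetical numpy version (RoboRealm's module handles all four sides; this shows only the bottom-edge case used here):

```python
import numpy as np

def side_fill_bottom(edges):
    """Fill from the bottom edge upward until the first edge pixel.

    `edges` is a binary (0/1) edge map with row 0 at the top.  The
    returned mask marks the floor pixels believed to be clear, one
    column (that is, one bearing) at a time.
    """
    h, w = edges.shape
    fill = np.zeros_like(edges)
    for col in range(w):
        hits = np.nonzero(edges[:, col])[0]
        # The lowest edge pixel in this column is the nearest obstacle.
        top = hits[-1] + 1 if hits.size else 0
        fill[top:, col] = 1
    return fill
```

An erosion pass afterwards knocks out the narrow spikes, much as the Erode module does in the next step.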
<div style="text-align: justify;"><span style="color: black;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">To eliminate the narrow spikes, we use the <b>Erode</b> module, then smooth the edges with the <b>Smooth Hull</b> module, resulting in our final image:</div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics6" src="http://www.pirobot.org/blog/0004/images/omni-obs-6.jpg" style="text-align: left;" width="373" /></span><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">This image gives us a rough map of the clear areas on the floor. This becomes clearer if we superimpose the image with our original as follows:</div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics7" src="http://www.pirobot.org/blog/0004/images/omni-obs-7.jpg" style="text-align: left;" width="373" /></span><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">Since the image is panoramic, each point along the horizontal axis represents a possible direction of travel for the robot with straight ahead being at the mid point and straight back corresponding to the left/right edges. In this case, we can see that the safest direction to go is about 90 degrees off to the left.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">But how can we get a more precise heading from this image? In the original RoboRealm tutorial, the horizontal coordinate of the highest peak is used. However, in this case that would correspond to the narrow peak on the right which is too narrow for our robot to pass. Instead, we want a gap that is both deep and wide. Unfortunately, there isn't an easy way to get this information directly from RoboRealm. We need a way to label the points along the boundary of our blob, and then analyze these points to give us the best gap. RoboRealm does help us get started by using the <b>Harris Corners</b> module. This module looks for "corner" points in an image and can also be used to trace the boundary of our floor blob. Applying it to our floor map image yields the following set of Harris points:</div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics8" src="http://www.pirobot.org/blog/0004/images/omni-obs-8.jpg" style="text-align: left;" width="373" /></span><br />
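RoboRealm computes these points for us, but for readers working in other environments, a corner response in the spirit of Harris can be sketched with numpy alone. This is an illustrative version; the 3x3 smoothing window and the constant k = 0.04 are conventional textbook choices, not RoboRealm's exact settings:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response: large positive values mark corners."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)                     # image gradients
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box3(a):
        # 3x3 box smoothing of a structure tensor component.
        p = np.pad(a, 1, mode="edge")
        h, w = a.shape
        return sum(p[i:i + h, j:j + w]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace
```

Taking the local maxima of this response along the floor-map contour would give a point set comparable to the green dots above.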
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The Harris Corners are represented by the green points in the image above and correspond to those points where the contour has a noticeable "kink". Some of these kinks are rather subtle and we don't care so much about the kinks themselves but simply the fact that the collection of points nicely traces the contour of our floor map. RoboRealm can return the coordinates of these points as an array to our controlling program where we can analyze them further. The next picture shows the Harris points superimposed on our original image:</div><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><img border="0" height="116" name="graphics9" src="http://www.pirobot.org/blog/0004/images/omni-obs-9.jpg" style="text-align: left;" width="373" /></span><br />
<div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px;"><br />
</span></div><br />
<div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">This image illustrates the power of visual obstacle avoidance over our previous approach using sonar and infrared sensors. From a single snapshot (or frame of a video) we are able to get a 360-degree view of our surroundings with obstacles nicely marked by Harris Corner points. And even though we don't know the distances to these points, we can usually assume that points higher up in the image are further away.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The final step involves using the array of Harris Corners to determine the best direction for our robot to move. The procedure described here looks for a space that is wide enough for our robot to pass and deep enough to allow for some significant forward progress. For example, the big space on the left of the picture would be a good candidate. The smaller space on the right would also allow some movement but we would quickly run up against some obstacles. To assess these spaces we proceed as follows.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">Start at the left edge of the picture and move toward the right, checking each Harris Corner as we go. Set the left boundary of our candidate gap to our starting point. If the vertical coordinate of the current Harris Corner is greater than some threshold, then continue on to the next point. 
If it falls below the threshold, assume we have found the right boundary of the current gap, store the width and average depth of the current gap, and start again moving toward the right.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">Applying this algorithm to the image above yields two gaps. The one on the left has its mid point at 95 degrees to the left, a width of 123 pixels, and an average depth of 48 pixels. The gap on the right is located at 114 degrees right, is 86 pixels wide and 28 pixels deep. In this case, the clear winner is the gap on the left. Our control algorithm rotates the robot 95 degrees left, then heads straight. As the robot moves across the floor, the analysis is repeated for each frame of the video at a rate of about 10 frames per second. If a set of Harris Corners directly ahead appears below our cutoff threshold toward the bottom of the frame, we make a course adjustment toward the middle of a new gap.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">When the path ahead is clear, the placement of obstacles and Harris points will shift as our robot moves forward. We therefore also program the robot to adjust its heading toward the middle of the current gap as it is moving so that it adapts to the changing shape of the gap from one position to the next.</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The following series of video clips demonstrates this algorithm in action. 
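Before the videos, here is a sketch of the gap-scanning pass just described in Python. The point format of (column, depth) pairs, the thresholds, and the width-times-depth ranking are my assumptions for illustration, not the exact code running on the robot:

```python
def find_gaps(points, depth_threshold, min_width):
    """Scan boundary points left to right and collect traversable gaps.

    `points` are (column, depth) pairs, where depth is the height of
    the clear-floor boundary above the bottom of the frame.  Returns
    (mid_column, width, avg_depth) tuples, best gap first.
    """
    pts = sorted(points)
    pts.append((pts[-1][0] + 1, 0.0))            # sentinel closes the last gap
    gaps, start, depths = [], None, []
    for col, depth in pts:
        if depth >= depth_threshold:             # still deep enough: extend gap
            if start is None:
                start = col
            depths.append(depth)
        elif start is not None:                  # too shallow: close current gap
            width = col - start
            if width >= min_width:
                gaps.append(((start + col) / 2.0, width,
                             sum(depths) / len(depths)))
            start, depths = None, []
    # Prefer gaps that are both wide and deep.
    return sorted(gaps, key=lambda g: g[1] * g[2], reverse=True)
```

With a point set like the one in the image above, a scan of this kind would rank the wide, deep region on the left ahead of the shallower one on the right.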
All of the robot's sonar and infrared sensors are turned off so that only vision is used for navigation. Let's see how well it does when confronted with a floor cluttered with various objects:</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><object height="344" width="425"><embed src="http://www.youtube.com/v/URM9TIJfZK4&hl=en&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="425" height="344"></object></div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">Notice how well the robot avoids even the thin chair legs which are often very difficult to detect with fixed sonar and infrared sensors. (A panning range sensor is much better in this regard.)</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The next video shows the view from the robot just after the edge filter is applied. 
Here you can see how nicely objects like chair legs stand out in the image:</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><object height="340" width="560"><embed src="http://www.youtube.com/v/8G9MjRg7HN0&hl=en&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="560" height="340"></object></div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">The last video shows the normal view from the robot, including an overlay of the Harris points indicating the positions of obstacles:</div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;"><object height="340" width="560"><embed src="http://www.youtube.com/v/XudrDDLndPY&hl=en&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="560" height="340"></object></div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; text-align: justify;">In conclusion, we have seen that a simple homemade omnidirectional vision system can be used to highlight obstacles in the path of the robot. The robot can then take evasive action in the direction of the spaces or gaps between obstacles. 
No other sensors are required, although combining the visual system with sonar and infrared sensors would yield an even more robust navigation algorithm.</div><div style="margin-bottom: 0pc; orphans: 2; widows: 2;"></div><div style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; font-style: normal; font-weight: normal; line-height: 1.44pc; text-align: justify;"><span class="Apple-style-span" style="color: black; font-family: 'Times New Roman', serif; font-size: medium;"><b>Hardware List</b></span></div><div style="text-align: justify;"><span class="Apple-style-span" style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; font-family: 'Times New Roman', serif; line-height: 23px;"><br />
</span></div><span style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; font-style: normal; line-height: 1.44pc;"><div style="text-align: justify;"><span class="Apple-style-span" style="color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px;">In case some of you are interested in the hardware used in the robot featured in this tutorial, here is a list:</span></div></span><br />
<ul style="-webkit-border-horizontal-spacing: 2px; -webkit-border-vertical-spacing: 2px; color: #003366; font-family: 'Trebuchet MS', Verdana, Times, Arial, Helvetica, serif; font-size: 15px; line-height: 21px; margin-top: 0px;"><li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><a href="http://www.trossenrobotics.com/sharp-ir-distance-sensor-gp2d12.aspx" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Sharp GP2D12 infrared sensors</span></span></span></span></span></a><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">: useful range is about 4" - 30".</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><a href="http://www.trossenrobotics.com/parallax-ping-ultrasonic-range-sensor.aspx" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Ping(TM) sonar sensor</span></span></span></span></span></a><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">: useful range is less than 1" out to about 9 feet.</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><a href="http://www.trossenrobotics.com/store/p/5196-Robotics-Connection-Serializer-WL.aspx" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Serializer microcontroller</span></span></span></span></span></a><span style="color: #003366;"> </span><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">with built in H-Bridge motor controller and wheel encoder inputs.</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><a href="http://www.trossenrobotics.com/store/p/5199-Bluetooth-Communication-Connector.aspx" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Bluetooth wireless module</span></span></span></span></span></a><span style="color: #003366;"> </span><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">for the Serializer so it can communicate with the host computer.</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><a href="http://www.lynxmotion.com/Product.aspx?productID=96&CategoryID=11" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Gearhead 7.2V motors</span></span></span></span></span></a><span style="color: #003366;"> </span><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">with integrated </span></span></span></span></span><a href="http://www.lynxmotion.com/Product.aspx?productID=448&CategoryID=" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">US Digital encoders</span></span></span></span></span></a><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">.</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><a href="http://www.trossenrobotics.com/store/p/3136-VEX-Wheel-Kit.aspx" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Vex Wheel Kit</span></span></span></span></span></a><span style="color: #003366;"> </span><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">and </span></span></span></span></span><a href="http://www.trossenrobotics.com/store/p/3121-VEX-Hardware-and-Metal-Kit.aspx" style="color: blue;" target="_blank"><span style="color: blue;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Hardware and Metal Kit</span></span></span></span></span></a><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">.</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="line-height: 1.44pc; margin-bottom: 0pc; orphans: 2; text-align: justify; widows: 2;"><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><span style="font-style: normal;"><span style="font-weight: normal;">Linksys Wireless-G webcam. <a href="http://www.amazon.com/Linksys-WVC54GCA-Wireless-Internet-Monitoring/dp/B0010OXEDU" style="color: blue;">Model WVC54GCA</a>.</span></span></span></span></span></div></li>
<li style="font-size: 15px;"><div style="font-style: normal; font-weight: normal; line-height: 1.44pc; orphans: 2; text-align: justify; widows: 2;"><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;">50-cent silver Christmas ornament from Target.</span></span></span></div><div style="text-align: justify;"><span style="color: #003366;"><span style="font-family: Times;"><span style="font-size: small;"><br />
</span></span></span></div></li>
</ul>
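Once the Bluetooth module is paired with the host computer, the Serializer shows up as an ordinary serial port, so talking to it is just a matter of writing ASCII commands and reading replies. Here is a minimal sketch of that idea in Python; the port name, the 19200 baud rate, and the `fw`/`mogo` command syntax are assumptions based on the Serializer's documented ASCII protocol, so check them against the board's manual before relying on this:

```python
# Minimal sketch of driving the Serializer over its Bluetooth serial link.
# The port name, baud rate, and command strings below are assumptions --
# verify them against the Serializer firmware manual for your board.

def mogo_command(left, right):
    """Build a 'mogo' motor-speed command string (assumed Serializer syntax)."""
    return "mogo 1:%d 2:%d\r" % (left, right)

def main():
    # pyserial is a third-party package (pip install pyserial); imported here
    # so the helper above stays usable without it.
    import serial

    # On Linux a paired Bluetooth link typically appears as /dev/rfcomm0
    # (hypothetical path -- substitute the port your system assigns).
    ser = serial.Serial("/dev/rfcomm0", 19200, timeout=1)
    ser.write(b"fw\r")                          # ask for the firmware version
    print(ser.readline().decode().strip())      # e.g. the version string
    ser.write(mogo_command(20, 20).encode())    # drive both motors forward
    ser.close()

if __name__ == "__main__":
    main()
```

Keeping the command-building separate from the serial I/O makes it easy to log or unit-test the exact bytes sent to the board without hardware attached.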