ID2Care

Systems redesign to improve the survival of critically ill patients using data based modeling

News


You've Never Seen a Robot Drive System Like This Before


What you're looking at here is a hemispherical omnidirectional gimbaled wheel, or HOG wheel. It's nothing more than a black rubber hemisphere that rotates like a spinning top, with servos that can tilt it left and right and forwards and backwards. Powered by this simple drive system, the robot it's attached to can scoot around the floor in ways that I would have to characterize as "alarmingly fast."
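To build some intuition for why tilting a spinning hemisphere makes the robot move, here's a back-of-the-envelope model. This is my own simplification, not anything published by the HOG wheel's designers: when the hemisphere spins about a vertical axis, the contact point sits on the axis and nothing moves; tilt it, and the contact point shifts a distance of roughly radius × sin(tilt) off the axis, so it sweeps along the floor.

```python
import math

def hog_ground_speed(spin_rate_rad_s, radius_m, tilt_rad):
    """Rough ground speed of a spinning hemisphere tilted off vertical.

    Upright, the contact point is on the spin axis, so the floor sees no
    motion. Tilting moves the contact point radius * sin(tilt) off-axis,
    giving it a tangential speed of about spin_rate * radius * sin(tilt).
    """
    return spin_rate_rad_s * radius_m * math.sin(tilt_rad)

# Upright: no motion at all, no matter how fast it spins.
print(hog_ground_speed(50.0, 0.05, 0.0))  # 0.0
# Tilted 10 degrees at 50 rad/s on a 5 cm hemisphere: ~0.43 m/s.
print(round(hog_ground_speed(50.0, 0.05, math.radians(10)), 3))
```

Note how the speed scales continuously with tilt angle, which is what lets two servos (one per tilt axis) steer the robot in any direction without a separate speed control.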

Last Updated on Monday, 07 November 2011 15:04

Robots Learn to Take Pictures Better Than You


Humans would like to believe that artistry and creativity can't be distilled down into a set of rules that a robot can follow, but in some cases, it's possible to at least tentatively define some things that work and some things that don't. Raghudeep Gadde, a master's student in computer science at IIIT Hyderabad, in India, has been teaching his Nao robot some general photography rules that he says enable the robot to "capture professional photographs that are in accord with human professional photography."

Essentially, he's taught the robot to do its best to obey both the rule of thirds and the golden ratio when taking pictures with its built-in camera. The rule of thirds says that it's best to compose a picture so that if you chop the scene into nine equal rectangles (i.e. into thirds both vertically and horizontally), your subject sits at one of the four intersections of the dividing lines. And the golden ratio basically says that the best place for a horizon is the line you get when you split the scene into two rectangles, one about 1.62 times the height of the other. This all sounds like math, right? And robots like math.
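Both rules really are just arithmetic on the frame. Here's a minimal Python sketch of that arithmetic; the function names and the distance-based scoring are my own illustration, not Gadde's actual code:

```python
import math

def thirds_points(width, height):
    """The four rule-of-thirds intersections for a width x height frame."""
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(x, y) for x in xs for y in ys]

def golden_horizon_lines(height, phi=1.62):
    """Horizon heights splitting the frame into two rectangles whose
    heights have ratio phi : 1 (measured from the top and the bottom)."""
    smaller = height / (1 + phi)
    return (smaller, height - smaller)

def composition_error(subject_xy, width, height):
    """Distance from the subject to the nearest thirds intersection;
    smaller means closer to 'textbook' composition."""
    sx, sy = subject_xy
    return min(math.hypot(sx - x, sy - y)
               for x, y in thirds_points(width, height))

# A subject exactly on the lower-left thirds intersection scores 0.
print(composition_error((640 / 3, 480 / 3), 640, 480))  # 0.0
# Candidate golden-ratio horizon heights in a 480-pixel-tall frame:
print(tuple(round(h, 1) for h in golden_horizon_lines(480)))
```

A robot photographer can then treat `composition_error` as a cost to minimize: pan or reposition until the detected subject lands near one of those four points.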

These aren't hard-and-fast rules, of course, and brilliant and creative photographers will often completely ignore them. But we don't need robots to be brilliant and creative photographers (and if they were, it would be a serious blow to our egos). It would just be helpful if they were decent at it, which is what this algorithm is meant to provide, although there's certainly no substitute for a human's ability to spot interesting things to take pictures of. That said, I imagine it would be possible to program a robot to look for scenes with lots of color, contrast, or patterns, which (in a very general sense) is what we look for when taking pictures.
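As a toy illustration of that last idea, contrast at least is trivially computable: the standard deviation of pixel intensities in a patch is one crude stand-in for "interestingness." This sketch is my own, not part of the Nao system:

```python
def patch_interest(pixels):
    """Crude 'interestingness' score for a grayscale patch: its contrast,
    measured as the standard deviation of pixel intensities (0-255)."""
    n = len(pixels)
    mean = sum(pixels) / n
    variance = sum((p - mean) ** 2 for p in pixels) / n
    return variance ** 0.5

flat = [128] * 16        # uniform gray: nothing to look at
edges = [0, 255] * 8     # hard black/white edges: maximum contrast
print(patch_interest(flat))   # 0.0
print(patch_interest(edges))  # 127.5
```

A robot scanning a room could score patches like this and point its camera at the highest-scoring region before worrying about composition.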

The other part of all this is that Nao has some idea of what constitutes a "good" picture from a more abstract, human perspective. Using data from an online photo contest in which humans ranked 60,000 pictures, Nao can assign each image it takes a quality score, and if that score falls below a certain threshold, the robot will reposition itself and try to take a better picture.
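That capture-score-reposition behavior can be sketched as a simple retry loop. The callback names, threshold, and retry cap below are all hypothetical; the actual control logic on the robot may differ:

```python
def shoot_until_good(capture, score, reposition, threshold=0.6, max_tries=5):
    """Take a photo, score it with a learned aesthetic model, and move
    to retry if the score is below threshold; keep the best shot seen.

    capture, score, and reposition are hypothetical callbacks standing in
    for the robot's camera, quality model, and motion system.
    """
    best_photo, best_score = None, float("-inf")
    for _ in range(max_tries):
        photo = capture()
        s = score(photo)
        if s > best_score:
            best_photo, best_score = photo, s
        if s >= threshold:
            break  # good enough, stop moving
        reposition()
    return best_photo, best_score
```

Keeping the best shot seen so far means the robot still returns something usable even if it never clears the threshold.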

If this method works out, there are all kinds of applications (beyond just robots) that it could be adapted to. For example, Google image searches could include a new filter that returns only images that don't completely suck. Or maybe the next generation of digital cameras will be able to explain exactly why that picture you just took was absolutely terrible, and then offer suggestions on how to do better next time.