Happy Halloween! At Leap Motion, we’ve seen our fair share of impressive motion-controlled robots that will one day bring about the robocalypse. (We’re looking at you, hexapod!) But the robot revolution is just getting started.

This month, two different robot arms featuring Leap Motion control have vastly overshot their Kickstarter funding targets, promising to bring miniaturized adaptive robotics to your desktop in new and exciting ways. Meanwhile, a high school student’s robotics project is combining the Oculus Rift and motion control to create an experience that takes you wherever the robot goes.

Babbage: The VR Telepresence Rover

Last month, Alex Kerner kicked off the first of a series of videos exploring how he built Babbage, a versatile telepresence robot, from soldering to software. We caught up with Alex earlier this week to ask about his vision for the project.

“What really got me into robotics is that it’s an emerging technology,” he said, “so there’s a lot of room to be innovative without having to make something incredibly sophisticated. I love the idea of building something from scratch on my own, which is why Iron Man is my favorite superhero. Robots in particular are fascinating to me, because of the mechanical sophistication and innovation required to make them function.”

As for the augmented reality side of the equation, Alex sees it as a way to make robotic controls more seamless and intuitive. “Instead of having to learn the controls, or program an AI to interpret commands, it’s as easy as reaching through the screen and doing it myself. It opens up a lot of opportunities for complex systems that would be frustrating to control conventionally, such as the movement of the head.”


Named after computer science pioneer Charles Babbage, Alex’s robot is controlled through a spiderweb of different languages, which he plans to integrate in the months ahead:

  • The motors are controlled with Node.js using the Johnny-Five library.
  • The sensors are read by the Arduinos, which run the Firmata sketch to relay commands from the BeagleBones.
  • A third Arduino board runs custom C++ and communicates via I2C, since Johnny-Five has no library support for multiple sonars.
  • Python is used to capture the web video from the cameras and directly overlay the graphics (see the sketch just after this list).
  • The Oculus Rift’s accelerometer is read with a custom C++ app (with plans to rewrite this into the Python app instead).
  • The Nokia runs C# code for voice recognition (which is still a work in progress).
  • Unlike most VR projects, Babbage doesn’t involve a 3D engine.
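
To make the Python piece of that stack a little more concrete, here’s a minimal sketch of what the video path could look like: grab a frame from a camera and draw a simple overlay on it before it heads out to the headset. OpenCV (cv2), the annotated_frames helper, and the crosshair/status overlay are illustrative assumptions – Alex hasn’t detailed which libraries or graphics Babbage actually uses.

import cv2  # assumption: OpenCV for capture and drawing

def annotated_frames(device_index=0):
    """Yield camera frames with a basic overlay drawn on top."""
    capture = cv2.VideoCapture(device_index)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            h, w = frame.shape[:2]
            # Crosshair at the center of the operator's view.
            cv2.line(frame, (w // 2 - 20, h // 2), (w // 2 + 20, h // 2), (0, 255, 0), 1)
            cv2.line(frame, (w // 2, h // 2 - 20), (w // 2, h // 2 + 20), (0, 255, 0), 1)
            # Placeholder status text; real telemetry would come from the robot's sensors.
            cv2.putText(frame, "BABBAGE", (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
            yield frame
    finally:
        capture.release()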

What’s it like being inside Babbage as he explores the world? “As of right now, the video feed is a little jerky, but it feels immersive,” said Alex. “The idea of a telepresence robot is to make the operator more like a driver than a commander, and that’s exactly what it feels like.” At this stage, he says, bringing the latency down will be an important step in reducing sim sickness.

Where VR and robotics collide, Alex believes that telepresence will be a major step forward in how humans interact with the world. “It’s a technology that could potentially make mundane transportation obsolete. Anything that a human can do with a vehicle, a remotely operated drone, or even on foot could be done using a telepresence robot. Rovers like Babbage will probably see a lot of use in places where it’s too dangerous to go on foot, like rescue or military operations, or even as a way for someone who is disabled or homebound to live an active life.”

Future videos will demonstrate the laser system, sonar, visual system and face recognition, and the Leap Motion input – including a future “snapshot” gesture. We can’t wait to see how Babbage’s journey progresses.

Dobot: A Robotic Arm for Everyone

Dobot is intended to take the industrial robotic arm beyond the maker community and into everyday life. With a 4-axis parallel-mechanism arm connected to an Arduino, the Dobot has seven distinct control methods, including wireless, voice, and Leap Motion controls.

According to one of the creators behind the project, “As industrial robot engineers, we wanted to find a highly functional and agile desktop robot arm, but were unsatisfied with the low-cost, low-precision, poor-functionality desktop robotic arms on the market. The consumer-level robot arms at the time were mostly servo-based. When users bought the robots, they found that the precision wasn’t high enough to replicate the applications shown in Kickstarter demos, like writing or grabbing things, let alone helping them with more complicated tasks.”

From there, the group quit their jobs to develop a high-precision robot based on stepper motors. The Leap Motion Controller was an obvious input choice, given its popularity among makers and developers. “With Leap Motion, we can achieve a natural way to manipulate the robot arm, and an easy approach to understanding how it works. In this case, Dobot is not only a professional tool to work with, but a great desktop platform for everyone to enjoy.”

7Bot: An Arm that Can See, Think, and Learn

Another Kickstarter campaign that recently blew past its funding goal, 7Bot is a 6-axis robot arm designed to be a miniature version of the popular IRB 2400 industrial robot. You can teach it how to move by holding its arm and guiding its movements, control it over the web, or drive it with your own hand movements:

For us, one of the most exciting things about this video was the extremely low latency on display. We caught up with the 7Bot team to ask about their process. According to Eric, one of the developers on the team, “Leap Motion is an essential control method for 7Bot. It allows everyone, including one of our grandfathers, to control 7Bot with ease.”

“Leap Motion can detect hand gestures very accurately, but sometimes there are jitters, which are highly undesirable when controlling the robot. We applied a median filter to eliminate the jitters, and some simple mapping relations were also used to make the application more intuitive for users. The high capture rate of the Leap Motion Controller and the high processing rate of the median filter keep the latency very low – only 0.1 to 0.2 seconds in theory.”
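
Here’s a minimal sketch of that smoothing step in Python, assuming a short sliding window of palm positions and a simple linear mapping into the arm’s workspace. The MedianSmoother class, the five-sample window, and the scale factor are illustrative guesses, not 7Bot’s actual parameters.

from collections import deque
from statistics import median

class MedianSmoother:
    """Per-axis median over the last few palm-position samples."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def update(self, palm_position):
        # palm_position is an (x, y, z) tuple in millimeters from the tracker.
        self.samples.append(palm_position)
        return tuple(median(axis) for axis in zip(*self.samples))

def to_arm_target(palm_mm, scale=0.5):
    """Hypothetical linear map from tracker space into the arm's workspace."""
    x, y, z = palm_mm
    return (x * scale, z * scale, y * scale)  # Leap Motion's y axis points up

smoother = MedianSmoother(window=5)
# In a real control loop, each new tracking frame's palm position is fed in here.
target = to_arm_target(smoother.update((12.0, 210.0, -35.0)))

The tradeoff is straightforward: a wider window smooths out more jitter but adds more lag, so for teleoperation a handful of samples tends to be the sweet spot.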

What’s next for 7Bot’s Leap Motion integration? The team plans to add more end-effectors to 7Bot, including one with five fingers, like those used in prosthetics. This means that a future version could effectively mirror your real-life hand and finger movements.

The world is yours to hack – what will you build? The 2015 3D Jam is running right now with over 25 types of approved hardware, including Arduino, the Parrot AR drone, Lego Mindstorms, Mini Pan-Tilt Kit, OWI Robotic Arm Edge, and more! Bring your hardware dreams to life and register now.

Alex is the head writer and blog editor at Leap Motion, where he stands as the final bulwark against bad grammar. Want to share your Leap Motion project? Email acolgan@leapmotion.com or PM leapmotion_alex on Reddit.
