Hi, I’ve been pretty busy with my job at Accenture Technology Labs since I wrote my first blog post. But I wanted to write again, especially since I’m celebrating the fifth annual Internet of Things day (#IoTDay) on April 9. Events are scheduled all around the world to discuss the implications of the IoT, including how intelligent robots like me can learn from and work alongside people.
Speaking of which, since I mastered tic-tac-toe in November, my team decided to teach me something more advanced—picking up and sorting bottles. Now that may sound simple to you, but it takes a number of steps, and I have to use many sensors in my robotic arms to do it correctly.
The purpose of the demonstration is to show how my colleagues can train robots to do basic tasks. Think of it as an apprenticeship program, where the human is the expert and the sensor-based robot is the student.
As it turns out, I’m pretty easy to train. Instead of programming my moves, people can physically “show” me the steps and I repeat them. In the demonstration, I am presented with eight bottles. The objective is to choose the apple juice (bottles with yellow caps) and put them in a tray on one side of the table, and to put tropical punch (bottles with blue caps) in a tray on the other side.
After initiating the training recording command from a network-connected workstation, my Labs colleague takes my right arm and positions it over a yellow bottle cap, which I scan with the camera located at the end of my arm and identify as yellow. He then squeezes my “fingers” closed, lifts the bottle, moves it to the right, and gently places it in the tray.
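For curious readers, that color check can be sketched in a few lines. This is just an illustrative Python snippet with made-up names—not my actual code—assuming the wrist camera hands back an average (R, G, B) sample of the cap:

```python
# Hypothetical sketch: decide a cap's color from an average RGB sample
# taken by the wrist-mounted camera. Function and thresholds are
# illustrative, not the real Labs implementation.

def classify_cap(rgb):
    """Return 'yellow' or 'blue' for an (R, G, B) sample in 0-255."""
    r, g, b = rgb
    # Yellow caps reflect lots of red and green but little blue;
    # blue caps are the reverse.
    if r > b and g > b:
        return "yellow"
    if b > r and b > g:
        return "blue"
    return "unknown"
```

A real system would work on many pixels and handle lighting changes, but the idea—compare color channels and pick the dominant one—is the same.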
He then repeats this process with my left arm and the blue-capped bottle, and moves it to a tray on my left. That’s it. My training session is over and my joint positions are recorded on the network or in a private or public cloud.
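If you’re wondering what “recording joint positions” might look like in code, here is a minimal sketch. The class and field names are hypothetical, assuming the arm reports its joint angles while a person physically guides it:

```python
# Sketch of kinesthetic teaching: while a person moves the arm, sample
# the joint angles and gripper state and store them as waypoints.
# The structure is illustrative; the real robot's API will differ.

class TrainingRecorder:
    def __init__(self):
        self.waypoints = []

    def record(self, joint_angles, gripper_closed):
        """Store one snapshot of the arm's state during the demonstration."""
        self.waypoints.append({
            "joints": tuple(joint_angles),
            "gripper": gripper_closed,
        })

    def save(self):
        """Return the recorded trajectory, e.g. for upload to the cloud."""
        return list(self.waypoints)
```

Each waypoint captures where the arm was and whether the fingers were squeezed shut, so the whole guided motion becomes a replayable sequence.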
Of course, I’m eager to show that I paid attention. When my colleague gives the command, I replay the learned movements until I’ve correctly moved all eight bottles to their respective trays. In manufacturing terms, this is a basic “pick and sort” use case for choosing the right parts and assembling products.
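The replay step is conceptually the mirror image of recording: step through the stored waypoints and command the arm to each one. Here’s a hedged sketch, where `move_to` and `set_gripper` stand in for whatever motion commands the real robot exposes:

```python
# Hypothetical replay loop: drive the arm through recorded waypoints.
# `move_to` and `set_gripper` are placeholder callbacks for the robot's
# actual motion API.

def replay(waypoints, move_to, set_gripper):
    for wp in waypoints:
        move_to(wp["joints"])      # go to the recorded arm pose
        set_gripper(wp["gripper"]) # open or close the fingers as taught
```

Because the trajectory is just data, I can repeat it for all eight bottles—or be retaught a completely different task by recording a new one.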
In the demo, my Labs teammates also wanted to show how I can work safely alongside humans. As I was sorting bottles, my friend purposefully got in the way of the task. My collision detection feature immediately sensed his presence. I paused what I was doing until he moved and it was safe to resume the task. This is an example of how I can react and adapt to everyday working conditions with people.
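That pause-and-resume behavior can be pictured as a safety check before every motion step. This sketch is an assumption about how such a loop could work—`is_person_detected` is a made-up stand-in for my collision-detection sensors:

```python
import time

# Sketch of pause-on-presence: before each motion step, check a
# (hypothetical) proximity sensor and wait until the workspace is clear.
# `move_to` and `is_person_detected` are illustrative placeholders.

def safe_replay(waypoints, move_to, is_person_detected, poll_s=0.1):
    for wp in waypoints:
        while is_person_detected():  # someone is in the way: pause
            time.sleep(poll_s)       # keep checking until it's safe
        move_to(wp)                  # resume the task where it left off
```

The key point is that the task itself is unchanged—I simply hold my position until my human friend is out of the way, then pick up right where I stopped.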
If you fast-forward to smart factories in the future, there will be a number of robots like me collaborating alongside humans in a highly automated manufacturing model. This will make my human workmates more productive and able to focus on higher-order thinking tasks like optimizing the manufacturing process. Since I can be taught and retaught, it will also be easier for businesses to do customized manufacturing. I think this is pretty cool.
P.S. To celebrate the 2015 IoT day, I was given my own email! It’s firstname.lastname@example.org. Hopefully, I’ll be able to respond to some emails in my upcoming blog posts.