Wednesday, 11 January 2012

Siroomba, can you come here with the mop?

How about incorporating Siri, the iPhone's personal secretary, inside your family's cleaning robot, the Roomba? You could yell, "Yeah Siroomba, come here, Divya just dropped some milk," and he would be there in a few seconds, with the right mop for cleaning up the mess.

A huge array of technologies will have to work in sync before the floor is clean. A sound sensor deduces the origin of your scream. The language recognizer understands the words (or falls back to the Google Translate engine on the web, in case you choose to scream in your mother tongue). The language layer passes the right command to the robot OS. With the care of a fighter pilot selecting the right missile for his mission, the right mop is picked up. The physics engine kicks in to move Siroomba along the most optimal route. Finally, the work done, he stands there and cheekily flashes the clean floor on the HDTV.
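The pipeline above can be sketched in code, purely as illustration. Every function and name here is invented for the sake of the example; no real Roomba or Siri API looks like this.

```python
# Hypothetical sketch of the Siroomba command pipeline: locate the shout,
# parse the command, plan a route. All names are invented for illustration.

def locate_source(mic_delays):
    """Guess which room a shout came from, using per-microphone loudness."""
    # Toy heuristic: the loudest microphone wins.
    return max(mic_delays, key=mic_delays.get)

def parse_command(utterance):
    """Map a spoken phrase to a robot command (translation step omitted)."""
    if "milk" in utterance or "spill" in utterance:
        return {"action": "mop", "tool": "wet_mop"}
    return {"action": "idle", "tool": None}

def plan_route(start, goal):
    """Trivial straight-line route; a real planner would avoid furniture."""
    return [start, goal]

# A shout near the kitchen microphone triggers a wet-mop run.
direction = locate_source({"kitchen": 0.8, "hall": 0.2})
command = parse_command("yeah siroomba, come here, divya just dropped some milk")
route = plan_route("dock", direction)
print(direction, command["tool"], route)
```

The point of the sketch is the layering: sound localization, language understanding, and path planning are independent modules that only meet at the command object in the middle.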

The world is clearly a befuddling place even to our own cerebral processor, which has evolved over millions of years. How will the robot understand the circumstances it is in? No single store can hold all the experiences the robotic processor needs in order to respond. Sensors will take in visual, audio and olfactory inputs and make sense of them against a giant cloud of world experiences. What the brain knows as a result of evolution, the cloud will know as a result of the learning engine.
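One simple way to picture "making sense of inputs against a cloud of experiences" is nearest-neighbour lookup: the robot's sensor reading is matched to the closest stored experience. This is a toy sketch with invented data, not any real system.

```python
# Hypothetical sketch: match a sensor reading (visual, audio, smell scores)
# against a shared store of labeled experiences by Euclidean distance.
import math

EXPERIENCE_CLOUD = [
    ((0.9, 0.1, 0.8), "spilled milk"),
    ((0.2, 0.9, 0.1), "doorbell"),
    ((0.7, 0.2, 0.9), "burnt toast"),
]

def classify(reading):
    """Return the label of the stored experience closest to the reading."""
    return min(EXPERIENCE_CLOUD, key=lambda exp: math.dist(exp[0], reading))[1]

print(classify((0.85, 0.15, 0.75)))  # closest to the "spilled milk" experience
```

A real learning engine would do far more than distance lookup, but the shape is the same: the knowledge lives in the shared store, and each robot only needs a sensor vector and a query.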

Imagine Mr. Cook, your own cooking assistant. Your chicken will smell and look just about right, thanks to the robotic recipe on the cloud from Khana Khazana. Mr. Cook will make it and verify it against the giant cloud of cooking experiences.

And if you had imagined that the robotic Mr. Cook looks like Honda's ASIMO, a biped, you are likely to be mistaken. The cooking robot is more likely to look like a spider, its eight arms tossing the ingredients around. And at the center, of course, is the robotic brain with the positronic pathways that Asimov described.

Instead of the laws of robotics, we will have the most critical experiences embedded in the brain. Physics. Language. Movement, with flexible axes. Color. Smell. Geography. How about robots with a sense of history? A robot that says, "Last time I made the chicken, you just about nibbled at it... are you sure you want this recipe again?" or "How about some pasta? Siri told me about your run tomorrow morning."

Imagine combining the input of full-body scanners and gesture recognition applications with the art of tailoring. Entire industries could shift geographies. You walk through the scanner, select your cut, select your fabric, and somewhere in a tailoring center in Arizona a robot will have stitched and shipped your dress before you leave the store in NY. This is the Dell build-to-order model applied to the apparel industry. I will be shorting the shares of the Bangladesh shipping industry.

Imagine the compression this will cause in the electronics supply chain. LED/LCD, smart/normal, touch/non-touch, all made in a robotic manufacturing center just before you leave the store. The end of labor arbitrage in manufacturing. In a world of near-shore production, the Baltic Dry Index will be rendered meaningless.

As the cloud's ability to respond to real-world experiences improves, as the mechanics incorporate more degrees of freedom inspired by nature, and as we cross-pollinate the possibilities, a whole new exciting world will emerge. Countries and companies that start accumulating IP in these areas will win.

Siroomba will not be cleaning up just your dining table. Entire industries will be wiped clean and rebuilt. 

Next: Universal Translator.