Dr. Jonathan Wu (left) and Siddhant Ahuja from the University of Windsor give the thumbs up to a smart sensor technology they developed (right) that could fully automate automotive manufacturing.
Photo: University of Windsor
Just two tiny cameras connected to a circuit board and a three-minute technology pitch landed Siddhant Ahuja a $10,000 scholarship.
The PhD student from the University of Windsor won over a panel of Dragon’s Den-like judges at the 2010 AUTO21 TestDRIVE competition in Ottawa, a showcase for automotive technologies developed by Canadian university graduate students. He took top prize with his work on smart cameras used for quality control, assembly and robotic guidance.
Working under Dr. Jonathan Wu, a professor and Canada Research Chair in automotive sensors and sensing systems at the university, Ahuja started with a problem description: auto manufacturers need to constantly reconfigure their facilities to produce new car models. The challenge: conventional wireless monitoring systems are unreliable on auto assembly lines because the harsh plant environment generates radio frequency interference.
Currently, auto manufacturers use cameras connected to computers, network equipment and servers on assembly lines, but a single camera cannot perceive depth or the distances between objects.
“A significant part of manufacturing relies on industrial robots, which are pre-programmed by a human operator. If one of the robots slips out of sync or malfunctions, the whole production line shuts down,” says Ahuja. “My idea is to provide these robots with their own set of eyes and connect them to an intelligent network that will enable them to communicate to each other and cooperatively achieve whatever the end goal is.”
If the central node shuts down, everything shuts down, an architecture that Ahuja says drives up infrastructure and maintenance costs.
All of these single-vision cameras send gigabytes of data to a central computer for processing. To streamline this process and increase efficiency on the plant floor, Ahuja built a tiny module with two cameras to give industrial robots a 3D optical view of production. This “sight” enables the robots to detect shapes, colour and distances between objects.
“All of the processing is done onboard and the only data sent to the central computer would be pass or fail criteria or analytics,” says Ahuja.
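To illustrate the idea (this is not the Windsor team's code, and the focal length, camera baseline and tolerance below are invented for the example), a two-camera module can estimate distance from stereo disparity and then report only a pass/fail result, rather than streaming raw images to a central server:

```python
# Illustrative sketch of onboard stereo inspection.
# Assumed values: a 700 px focal length, a 6 cm baseline between the
# two cameras, and a 5 mm tolerance. None of these come from the article.

FOCAL_LENGTH_PX = 700.0   # assumed focal length, in pixels
BASELINE_M = 0.06         # assumed separation of the two cameras, in metres

def depth_from_disparity(disparity_px: float) -> float:
    """Depth for a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

def inspect_part(disparity_px: float, expected_m: float,
                 tol_m: float = 0.005) -> str:
    """Onboard check: is the part at its expected distance?

    Only this small verdict string would be sent to the central
    computer, not the camera frames themselves.
    """
    measured = depth_from_disparity(disparity_px)
    return "pass" if abs(measured - expected_m) <= tol_m else "fail"

# A feature that appears 42 px apart in the two images sits at
# 700 * 0.06 / 42 = 1.0 m, so a part expected at 1 m passes.
print(inspect_part(disparity_px=42.0, expected_m=1.0))  # prints "pass"
```

The key design point the quote describes is that the heavy image processing stays on the module; the network only ever carries compact results like the string above.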
With this new sight and real-time communication over a network, the robots will self-organize to effectively execute tasks on the assembly line.