Industrial robots were conceived in 1954, when George Devol filed a patent application for a programmable article-transfer machine. Approval took seven years, and in 1961 the very first Unimate robot was installed on a General Motors assembly line in New Jersey. Its task was to remove hot castings and stack them. By all accounts the robot did a good job, and it was soon given some spot-welding duties too. The machine was hydraulically driven, large, and somewhat noisy, but it accomplished its tasks.
Unimation, the company behind the Unimate, had been formed by Devol and Joseph Engelberger in 1956. Those first robots sold for $18,000 (about $171,000 in 2022 dollars), and the first one sold lost about $35,000 ($332,000 today), roughly twice as much red ink as black. It took the company until 1974 to turn a healthy profit.
The Unimate generated a lot of talk and interest in the industry, and it had its first brush with consumer-market fame in 1966, when it appeared on The Tonight Show with Johnny Carson and successfully putted a golf ball and poured a beer into a glass. It was, after all, programmable via drum memory: magnetic storage written on the surface of a rotating drum inside the unit.
In 1978 the PUMA (Programmable Universal Machine for Assembly) was born at Unimation. These machines were more precise, more easily controlled, and transmitted power via electric servo motors rather than hydraulic fluid. The control computer was based on DEC’s LSI-11, a microcomputer implementation of the older PDP-11 architecture. In the early ‘80s some developers noted the serendipitous pairing of the programmable arm and a part-recognition system. It’s hard to believe, but it’s true: industrial machine vision robotics has 40-year-old roots.
Now that makes sense
To understand how robots evolved in manufacturing, we need to recall that they were invented to perform tasks previously done by human workers. They could spot weld, stack things, and more. As we say now, robots can take on the dull, dirty, and dangerous jobs.
If you watch the Johnny Carson appearance, you’ll see in the beer-pouring demonstration that the robot’s two-finger gripper end effector takes a few moves to get to the right spot. It then closes on the beer can (which was much sturdier in those days), lifts, and turns its wrist so the beer lands in the glass. If I had to bet, the pressure needed to hold the can was worked out ahead of time. Today, the gripper would adjust the pressure itself, based on feedback.
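As a sketch of what “adjusting the pressure” means in practice, here is a minimal grip-force feedback loop in Python. The slip sensor and force actuator (`read_slip`, `set_force`) are hypothetical stand-ins for illustration, not any real gripper’s API:

```python
# Minimal sketch of closed-loop grip-force control: the kind of feedback a
# modern gripper uses instead of a fixed, pre-programmed pressure.
# The sensor/actuator callables here are hypothetical stand-ins.

def regulate_grip(read_slip, set_force, initial_force=2.0,
                  step=0.5, max_force=10.0, cycles=20):
    """Increase grip force until the slip sensor reports the part is stable.

    read_slip():   returns True while the part is slipping (hypothetical sensor)
    set_force(f):  commands the gripper to f newtons (hypothetical actuator)
    """
    force = initial_force
    for _ in range(cycles):
        set_force(force)
        if not read_slip():
            return force  # part is held securely; stop tightening
        force = min(force + step, max_force)
    raise RuntimeError("could not stabilize part within the force limit")


if __name__ == "__main__":
    # Simulated can that stops slipping once the commanded force reaches 4 N
    state = {"force": 0.0}
    held = regulate_grip(lambda: state["force"] < 4.0,
                         lambda f: state.update(force=f))
    print(f"stable at {held} N")  # prints "stable at 4.0 N"
```

The point of the loop is the 1970s-versus-today contrast from the text: the robot no longer just plays back a stored pressure value; it measures, compares, and corrects.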
If you had to summarize the major difference in robots between the 1970s, say, and today, you would get no argument from me if you said, “Yesterday’s robots were things unto themselves; today’s robots interact with their environment.” I might even go further and say that robots have graduated from using programmed motion and coordinates to being directed at least in part by feedback from “senses.”
In our story about the Universal Robots UR5e cobot and ActiNav software for vision, we learned how a part-picking application works when a cobot plus software interacts with its environment. Even if we limit ourselves to built-in automation that comes standard on machine tools, we would see imaging used to check nozzles on laser cutters, smart cameras that can virtually “true up” a remnant piece of sheet metal thrown onto a cutting bed, and sensors that detect (and, with the press brake’s help, correct) the bend angle on a piece of steel.
It’s easy to understand how vision helps improve manufacturing. Other senses can help too. In fact, that last vision example, measuring bend angles, can be (and is) done in other ways, namely by touch. Some brakes forgo vision in favor of probes that contact the metal at very specific points; those contact positions can be translated into a bend angle. In both the vision and touch cases, the automation system uses a version of a human sense to get feedback about something and then to take, or recommend, an action. That’s what we do: we get feedback and then we act.
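To make the touch approach concrete: three probe contacts on the bent part are enough to recover the bend angle with basic trigonometry. The coordinates below are made-up illustration values, not any real press brake’s interface:

```python
# Sketch of translating probe contact points into a bend angle, as described
# for touch-based press brakes. Probe positions are hypothetical examples.
import math


def bend_angle(p1, p2, p3):
    """Angle in degrees at vertex p2, formed by probe contacts p1, p2, p3.

    p1 and p3 lie on the two flanges of the bend; p2 sits at the bend line.
    """
    v1 = (p1[0] - p2[0], p1[1] - p2[1])
    v2 = (p3[0] - p2[0], p3[1] - p2[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    mag = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / mag))


# Example: probes touching both flanges of what should be a 90-degree bend
angle = bend_angle((0.0, 10.0), (0.0, 0.0), (10.0, 0.0))
print(round(angle, 1))  # prints 90.0
```

With the measured angle in hand, the brake’s controller can compare it to the target and command a corrective re-strike, the same sense-then-act loop the article describes for vision.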
What about the other senses? What about smell? What about taste? These are niche functions at this point, but if you search for “robotic wine tasting” you’ll see that robots can do surprising things, even to the point of pairing wines with foods. Research has also been done on an automobile’s ventilation system sensing volatiles (the “chemical” smell of paint, fuel, tar, and the like) in the ambient air and immediately closing the intake vents to keep them out of the cabin. In the future, a system might detect smoke molecules and trigger a shutdown.
Sound comes into play as well. While it’s not an industrial environment, think of an application like city-street security cameras. There is already a large urban installed base of cameras that can triangulate a sound, locate its source, and point in that direction. In the future, a nasty noise might shut down an automated saw or laser cutter.
What does all of this mean?
- For users of cobots, the flexibility of today’s systems makes them viable in many applications. The challenge may be maintaining focus on what you want the cobot and its associated software to accomplish.
- For designers of systems, think less in terms of coordinates and movements and more in terms of what the job entails and what feedback is useful. Automatic B if you incorporate feedback to direct the next action. Automatic A if you use one or more senses the way a human worker might, to make decisions or accomplish the job. Automatic A+ if you break free from the goal of “replace the human” and move on to “what can this system do?”
- For machine tool vendors, the disparate robotic and automated components should all be directed at accomplishing the machine tool’s tasks. Automatic A if you consider questions like, “Can we insert or remove steps here?” Automatic A+ if you can answer them in the affirmative.
We’re still a few years away from omnipresent AI in manufacturing. Right now, we are using robots and automation to work faster and more efficiently, and frankly, to ease the pain of staffing. Finding people is not getting any easier post-COVID.
Over the course of the fifth wave of manufacturing we will see many changes, but the changes underway now are most obvious in the march of robots into so many phases of manufacturing.