It was only with CNC technology that it became possible to automatically produce workpieces that meet the highest demands on accuracy and complexity. Computer numerical control (CNC) stands for efficient machining with precise, reproducible results and is now indispensable in machining production. Let's take a look at the history of CNC technology, the forces that shaped its development, and the areas in which it is used today.
An order from the US military triggers development
In 1949/50, John Parsons began developing NC technology at the Massachusetts Institute of Technology in Cambridge, USA. Numerical control is based on the idea of steering a machine tool with coded data fed into a dedicated device, making it a pure hardware concept that works without software. The development was triggered by an order from the U.S. Air Force, which required that critical parts of large aircraft no longer be built as riveted and welded assemblies but instead be machined from solid material. Because the templates and models needed for form milling were very time-consuming and costly to produce with conventional technology, the decision was made to develop a control system instead; after all, the contours of these large workpieces could easily be described by mathematical functions. In this way Parsons succeeded in building the first NC-controlled machine, the "Cincinnati Hydrotel".
NC technology with punched cards
In 1954, the US company Bendix took over Parsons' technology and went on to develop the first NC machine equipped with more than 300 electron tubes; its control was realized with punched cards. Just two years later, around 100 NC machines were in use in the US aviation industry, and by the end of the 1950s NC technology had found its way into the industrialized nations of Europe. In the 1960s, numerous existing milling machines were retrofitted with numerical controls, and NC technology was continuously refined. Soon it became possible to automate tool changes and to build the controls with integrated circuits.
CNC technology with CAD/CAM and CIM systems
After the first NC control with program memory became available in the early 1970s, the transition to CNC technology took place in 1978, when the first CNC machine was developed. Until the mid-1990s, CNC programs were laboriously written by hand, which demanded enormous concentration from the programmer: even the smallest programming error could damage the machine. From the turn of the millennium onward, programs began to be generated directly from CAD/CAM systems, which is still the standard today. Meanwhile, work is underway on so-called CIM systems (computer-integrated manufacturing), with which programming is intended to take place entirely without human intervention.
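To make the difference between hand-written and generated programs concrete, here is a minimal sketch in Python that emits standard G-code moves for a simple rectangular contour, the kind of toolpath a programmer once typed line by line and a CAD/CAM system now produces automatically. The function name, coordinates, and feed rate are illustrative assumptions, not taken from any particular controller or CAM package.

```python
# Illustrative sketch: generate simple G-code for a rectangular contour.
# The helper name, dimensions, and feed rate are illustrative assumptions.

def rectangle_gcode(x0, y0, width, height, feed=200):
    """Return G-code lines tracing a rectangle at the given feed rate (mm/min)."""
    corners = [
        (x0, y0),
        (x0 + width, y0),
        (x0 + width, y0 + height),
        (x0, y0 + height),
        (x0, y0),  # return to the start to close the contour
    ]
    lines = ["G21  ; millimetre units", "G90  ; absolute positioning"]
    lines.append(f"G0 X{corners[0][0]:.3f} Y{corners[0][1]:.3f}  ; rapid to start")
    for x, y in corners[1:]:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed}  ; linear cutting move")
    return lines

program = rectangle_gcode(0, 0, 50, 30)
print("\n".join(program))
```

Even for this trivial shape the program runs to seven lines; hand-coding a real part with hundreds of moves made it easy to see why a single mistyped coordinate could wreck a tool or workpiece, and why generating such programs from a CAD model became the standard.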
The development of CNC technology fundamentally changed manufacturing processes in the machining industry. It made it possible to automate the entire production sequence, from clamping the workpiece through milling, tool changes, and quality control to finishing. Because the axes and tools move significantly faster while remaining highly precise, both series and one-off production could be rationalized and made more efficient and cost-effective. Today, CNC technology is used primarily in metalworking, but it is now also widely applied to machining other materials such as plastic and wood.