
Roots of Automation

"Automation" refers more to an ideal for industrial production than any one set of technologies or practices. The word was coined in 1946 by the Ford Motor Company's vice president, Dale S. Harder, who used it to describe the automatic or semiautomatic mechanical equipment then coming into use for the assembly of automobiles, the machining of automobile parts, and the stamping of sheet metal items such as fenders. While the popular press sometimes described these machines as "robots," implying a humanlike flexibility of application, the technologies Harder described were designed to perform a single task. Later, the term automation was often used to describe computer-controlled (usually programmable) machines that did include the potential to work on various different tasks. What Harder described was the culmination of the evolution of machine production underway for at least a century and was an extension of what had previously called "mechanization." This mechanization was largely a nineteenth-century phenomenon, involving the deskilling of work or the outright replacement of craft workers with machines. This movement was reaching its limits at Ford and elsewhere by 1950, just at the time when university and military researchers were investigating a new technology that combined traditional production machinery, especially machine tools, and the newly developed electronic computer. By the early 1950s, there would be a distinction in engineering circles between "Detroit auto-mation," relying on purely mechanical means, and computer automation. The impetus for this development was the military's desire to produce aircraft parts at a high rate of speed and with high quality control. Also, aircraft and missiles were then being developed which used parts that were extremely difficult to make, and it was believed that a machine could do a better job than even the most skilled machinist. The U.S. Air Force, working closely with engineers at MIT and elsewhere, introduced the first "numerically controlled" (NC) machine tools in the late 1940s. These machine tools used technologies derived from the computer to control the motions of the machine in accordance with a predetermined program. An NCequipped machine tool could be conveniently reprogrammed whenever necessary, avoiding the inflexibility that was seen as the major pitfall of Detroit automation. Although the early machines did not completely eliminate human labor, they approached the ideal. Later, engineers distinguished these NC tools from so-called computer numerical control (CNC), which received instructions from a generalpurpose computer, often linked to the tool by wires. CNC is the standard technology used today, although its commercial success was slow in coming. While the aircraft industry, largely because of military support, widely adopted NC and CNC machine tools by the 1960s, few other industries followed suit. Few consumer products were as profitable as aircraft parts, making NC/CNC tools too expensive to justify.
Reaction in the 1950s

There was sustained resistance to the adoption of NC and CNC tools for other reasons as well. Labor unions saw these technologies as a threat and forecast massive technological unemployment. The public's reaction to the threat of automated factories was generally unfavorable, despite attempts by industrialists to provide reassurance. One of the most influential books of the era was John Diebold's Automation (1952), which explained the alleged advantages of the technology to the nonexpert. Countering Diebold, Kurt Vonnegut's 1952 novel Player Piano offered a dystopian vision of what might happen if automation succeeded. So powerful was the idea of automation that the image of the "push-button factory" of the near future became a cliché in movies and the popular press in the 1950s. In the auto industry and elsewhere, unions were able to reach a compromise with managers, allowing automated equipment to be installed in factories while preserving the wages and hours of most workers. Even so, the new factories qualitatively degraded the work experience for many highly skilled machinists and greatly reduced the need for them over the long term. Other types of automated equipment eliminated some of the simplest assembly and materials-handling tasks, leading to some loss of jobs. However, automated production machinery eventually reduced costs and improved the quality of many items.
Other Forms of Automation

Outside the automobile and aircraft industries, automation of another sort also began to emerge in the early twentieth century. Engineers in the chemical industries, where complex, continuously operating processes were common, developed a form of automation beginning in the 1930s. There, large-scale reactions such as the "cracking" of petroleum were monitored and controlled from centralized control rooms. Sensors and actuators, often pneumatically operated devices, connected the control room to the plant itself. Despite great differences between the chemical and metalworking industries, engineers by the 1940s described this as part of the same general automation movement. Similarly, the growing size and complexity of electric power plants in the post-1945 period stimulated experiments with centralized control of the boilers, steam turbines, generators, and switchgear associated with the stations. Relying on pneumatic or electrical controls, the power industry thus developed its own distinctive variety of automation. With the advent of nuclear power in the 1950s, this type of centralized control reached a high state of refinement: the control room of a nuclear plant, filled with switches and dials, became an easily recognized symbol of the industry by the 1970s, when many such plants were in operation.

There were also nonindustrial applications of automation. A prime example is the sorting of mail, which was done almost entirely by hand until the 1950s. The Post Office sponsored a far-reaching program to automate sorting, installing its first semiautomatic mail sorter in Baltimore in 1956. By 1965, it had installed its first optical character recognition device, which allowed a machine to sort some letters according to their city, state, and ZIP code.
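As a rough illustration of the centralized control rooms described above, the following sketch shows the basic closed loop they implemented: read a sensor, compare the reading to a setpoint, drive an actuator. All names, readings, and thresholds are hypothetical, and the on-off control shown is the simplest possible scheme; the pneumatic instruments of the 1930s performed the equivalent function without software.

```c
/* Minimal sketch of centralized closed-loop control: a control room
 * polls a plant sensor and drives an actuator to hold a setpoint.
 * The process, values, and function names are invented. */
#include <stdio.h>

static double read_reactor_temp(int tick)    /* stand-in for a sensor */
{
    return 340.0 + 15.0 * ((tick % 7) - 3);  /* fake fluctuating reading */
}

int main(void)
{
    const double setpoint_c = 350.0;   /* desired process temperature */
    int fuel_valve_open = 0;           /* stand-in for a pneumatic actuator */

    for (int tick = 0; tick < 10; tick++) {
        double temp = read_reactor_temp(tick);
        /* On-off ("bang-bang") control: the simplest closed loop. */
        fuel_valve_open = (temp < setpoint_c);
        printf("t=%d temp=%.1fC valve=%s\n",
               tick, temp, fuel_valve_open ? "OPEN" : "CLOSED");
    }
    return 0;
}
```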
Robotics

An example of the eventual convergence of Detroit-style automation and electronic computing is the development of the industrial robot. Long a feature of science fiction, the first industrial robots were merely armlike mechanical devices, specially designed to handle one particular task. Their utility was limited to applications where high temperature or other hazards made it impossible or dangerous for people to perform the same tasks. However, programmable robots appeared as early as 1954, the year George Devol filed the patent behind the first product of Unimation (a name condensed from "universal automation"), the Unimate robot. Although General Motors installed such a robot on a production line in 1962, sales of robots remained quite limited until the 1970s. During the 1960s, many universities participated in the development of robots, and although many of their concepts carried over into the industrial robotics field, they did not immediately result in commercial adoption. It was Japanese companies that moved rapidly into robot utilization in the 1970s. Kawasaki Heavy Industries licensed the Unimation robot technology, and by 1990 forty companies in Japan were manufacturing industrial robots. The shock accompanying the rapid penetration of the domestic auto market by Japanese automakers led American corporate leaders to adopt Japanese methods, speeding the diffusion of industrial robotics in the United States.
The Microchip's Role in the Success of Automation

A key technical and economic factor in the widespread success of various forms of automation in the 1980s and 1990s was the development of the microprocessor. This tiny electronic device was invented in the United States in the early 1970s, intended for use in calculators and computers, but its utility as an industrial process controller was exploited almost immediately. Less well known to the public is a similar device, the microcontroller, which today outsells the microprocessor. Among the microcontroller's original applications was the electronic replacement of electromechanical process controllers, such as those used in chemical plants. These controllers incorporated logic circuits that were usually not programmable and regulated multistep industrial processes on a timed cycle; a familiar example is the electromechanical switch/timer used on home washing machines for many years. Process controllers built around microprocessors or microcontrollers allowed convenient reprogramming, and eventually they were linked together to provide overall monitoring and control of plant activities from a remote central computer or control room.
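A minimal sketch of the timed-cycle behavior described above, using the washing-machine analogy: the electromechanical controller hard-wired such a sequence into cams and contacts, whereas a microcontroller can hold it as data. The step names and durations below are invented for illustration.

```c
/* Sketch of a timed-cycle process controller. "Reprogramming" the
 * controller is just editing the table, which is the convenience a
 * microcontroller offers over a cam-driven timer. */
#include <stdio.h>

struct cycle_step {
    const char *name;      /* what the machine should be doing */
    int duration_s;        /* how long before advancing */
};

static const struct cycle_step program[] = {
    {"fill",  30}, {"agitate", 120}, {"drain", 20},
    {"rinse", 60}, {"spin",    90},
};

int main(void)
{
    size_t n = sizeof program / sizeof program[0];
    int clock_s = 0;

    for (size_t i = 0; i < n; i++) {       /* advance through the cycle */
        printf("t=%4ds: start %-7s (%ds)\n",
               clock_s, program[i].name, program[i].duration_s);
        clock_s += program[i].duration_s;  /* a timer interrupt would do this */
    }
    printf("t=%4ds: cycle complete\n", clock_s);
    return 0;
}
```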
The Electronics Industry as Automation's Prophet

At the beginning of the twenty-first century, American industries were still in the process of implementing automated production systems. The highest overall level of automation was in the manufacturing of microelectronic devices such as microprocessors and memory chips. The microelectronics industry builds devices on such a small scale, and requires such high levels of cleanliness, that some kind of mechanical handling is necessary if only to keep contamination and breakage to a minimum. Microelectronics companies have pushed forward the development of specialized, computer-controlled equipment for manufacturing, inspecting, and handling chips. The Institute of Radio Engineers held its first conference on the use of automated equipment in the manufacture of electronic parts in 1954. By 1960, the Western Electric Company had constructed a highly automated plant in North Carolina for assembling resistors, which became a showpiece for automated production. Yet after the invention of the integrated circuit in 1958, the scale of chip production did not at first justify robotic handling; chips were simply carried from machine to machine by hand or placed on conveyor belts. Chip manufacturers actually preferred hand labor to automated equipment until the diminishing size of the devices and the extreme attention paid to particulate contamination compelled them to isolate the manufacturing process inside closed "microenvironments" in the 1980s. By this time, the cost of robotic arms and similar products had dropped, and the reliability of the systems had risen from an average of a few thousand hours between failures to over 80,000 hours. While in the 1980s there was considerable talk about "lights-out" chip fabrication facilities completely devoid of humans, that goal has proven less attractive over time, as corporations have continued to rely on some human operators even in this highly automated industry.

