February 22, 1984
Presented at Exxon Research Club
Science and technology became part of our American expectations for social, economic and political progress only a quarter century ago. Our leaders remembered then their role in winning wars, particularly World War II, and in introducing therewith new factors in the power of nations and in the energy and material resources of the world (as shown through nuclear fission and fusion, and the substitution of synthetic materials such as nylon and rubber for the world's natural fibers, metals and minerals). So we began to recognize research and development as essences of V. Bush's “Science, the Endless Frontier.” Accordingly, when 25 years ago our post-war calm was shattered by the rude thrust of the intercontinental ballistic missile-nuclear warhead-Sputnik realities, President Eisenhower, and the talented advisors and administrators whom he recruited, marshalled science and technology as principal agents in winning the peace. Despite the tragedies of local aggression, such peace has been so far sustained for nearly four decades, representing as long a period, or longer, of maintenance of free nations and freedom from global conflict as is known in recorded history.
So this task was levied on our national community of science and engineering, with help from our Allies, and constant and ruthless opposition in the same dimensions from hostile ideologies. Our national science and engineering leaders (academic, industrial and governmental) saw too that vastly more versatile and benign services could also be given to our world through the mechanisms of discovery and application, similar to those for national security, and involving major collaborations of governments, universities, and industry. However, as you would expect from the universal media of science and engineering (atoms and molecules, particles and waves, and organisms and behavior), the results of application of R&D reach into virtually every aspect of society. Thus the public expectations that arose in connection with national R&D became even more extensive.
Accordingly, the Age of Science and its derivatives, the Age of Knowledge and of Information, have surged in this quarter century through these mechanisms, which really had more limited and specialized origins in national security and in certain health and nutrition endeavors. Let us then begin a review and tentative evaluation of how the mechanisms noted (in which many of us here have had intimate and continuous participation) have indeed come to underlie the vast public purposes of science and technology. Let us start this examination, which we hope will be pursued thoughtfully in 1984, with its compelling questions of national policies (security and economic), national election, and national educational and cultural objectives. Let us start it in terms of results, which are put in terms of scientific, technical and educational competence and quality. (The much subtler range of present opportunities unused, and challenging estimates of how much social and humanistic gain has been provided, demand other skills. These, too, we shall be seeing exerted, as we have already for several years, in the press, academically and politically, with greater and greater intensity. These include the work of the several hundred university departments of history and the philosophy of science and technology, the many journals devoted to the matter (including a new one from the National Academies), and the extensive political orations which will be related to such technical matters as public health, environment, food, shelter and transport.)
But rather, let us see what we find about the underlying question of whether the mechanisms for producing science and engineering and practicing them have indeed provided the people of our nation and the world with a reasonable excellence of knowledge and its application, whatever the eventual effects may be. How well have we indeed achieved, in research and development, the potentials which the marvelous and beautiful rationale of nature has offered? How well have we reached the goals and objectives that even our imperfect vision of 1959 provided for the nation's scientists and engineers?
Just a quarter century ago, on December 27, 1958, President Eisenhower issued the report of the President's Science Advisory Committee entitled “Strengthening American Science.” In his statement about the report, Eisenhower noted with characteristic keenness: “I call particular attention to the conclusion of the Science Advisory Committee that the task of further strengthening United States science is so broad that government, industry, universities, foundations and individuals all have essential roles to play. The future of growth and strength of American science will depend upon the efforts of all of these parts of our national community if we are to rise to the demands of our times.” Now you see in that statement, and in the report itself (indeed the second report of the Science Advisory Committee, the first being our findings on information processing in science and technology, since it was felt that knowledge had to be recorded and shared if anything was to come of a new age of learning), that a mechanism of pluralism was adopted. This has often been challenged since, and is still a matter of vigorous debate. But we have consistently and even fiercely defended a pluralistic practice of science and engineering in all of the institutions that the President referred to. This is in contrast to possible concentration in particular Federal, national, or other preemptive and totalitarian centers, even ones as large and effective as the NIH. Even in the delicate arena of nuclear energy and nuclear weapons, where every pressure was to centralize and unitize, significant pluralism was sustained, with a shared mission of the Department of Defense and the Atomic Energy Commission/Department of Energy being one of the principal Federal examples.
Here, however, perhaps because of the deadly nature of the reactions of the nucleus, the R&D mechanisms (which we always can expect to be imperfect, being based on human behavior) seem to have been flawed within the scope of pluralism. The flaw was the fatal factor of technical detachment from the users. Namely, pluralism in nuclear weapon and warhead development has worked quite well, with dispersion among the users and the creators, yet with bridges between those elements being formed by such ventures as the Sandia Laboratories. But the users of nuclear power generation were brought in too little and too late for reactor engineering. Accordingly, we see that (probably indispensable) field still in turmoil and regression, at least in this country. Thus, if the Electric Power Research Institute had been formed and expanded in 1952 instead of 1972, it is quite possible that the nation's electric utilities and the nuclear power agencies would have been able to render successful service to the compelling cause of energy, a central aid of civilization to which you here have made such historic contributions. So there we have an unfinished agenda, which we shall not now pursue further.
However, it does bring us to the next consideration, which is: what are the essences of the mechanisms to which we earlier referred, the mechanisms which are well on the way, in 25 years, to transforming the nature of civilization? For these underlie much of the hope and plan for American advance in the decades to come. These mechanisms seem, in outline, to have come from systems research and development, in government and industry, leading to and coupled with a scale of study of the principles and performances in nature, of matter and energy, which is far more interdisciplinary than even the most brilliant scientists (the Rutherfords, Thomsons, Bohrs, Einsteins, Debyes, Sommerfelds, Heisenbergs, Averys, Tatums, Paulings) of the earlier part of the century had generated or even desired. This seems to have happened through the curious circumstance that the nation had some scattered systems experience that certain sophisticated scientists in the Bush-Conant-Jewett wartime era mobilized. One was in a telecommunications system largely shaped by Jewett and his colleagues following Theodore Vail's concepts of its public purpose. That shaping had already shown the scale of research and engineering necessary for providing systems services for a continent, and the dependence of that capability on basic science, such as the finding of the wave nature of matter by Davisson and Germer in Bell Laboratories, the crucial experiment confirming quantum mechanics.
In a curious and quite different way, there were other inputs to these systems concepts, although not much for the railroads, which ironically had a compelling need and opportunity to follow the idea. Rather, it was food and agriculture, with the Federal and State Governments, their land grant universities and extension services, that formed a loosely knit, but highly productive agricultural science and engineering system, again of dimensions vast enough to do the job. Here industry, unlike the nuclear power case, was early involved through the fertilizer and farm equipment and machinery technologies. These, in a wise and almost miraculous way, and aided enormously by the land grant colleges and universities, integrated the polymorphs of the agricultural endeavor. The extension services had an absolutely crucial systems-field engineering role. One's personal experience with these agents, who spread over the country by the thousands in the early part of the Century, affirms that they were indeed pioneering systems engineers.
Even in this broad brush treatment, which is all we can do this evening, some fascinating perspectives appear. Thus an early challenge of World War II was the materials systems, which in a way had already been stimulants for the war in their implications for Lebensraum and the Greater East Asia Co-Prosperity Sphere. Namely, it was believed that denial to us of natural rubber and other polymer materials would, by the blockade of all the Eastern region, paralyze our war defense. Indeed, there was good evidence that this would have happened in conflict with the modernized Blitzkrieg forces of the Nazis and Imperial Japan. So when the time came to mobilize our national effort in substitute materials, and replacement of natural rubber particularly, the Federal Government and its War Production Board turned to Jewett's organization for the leadership and shaping of all synthetic rubber R&D. This was headed by Dr. R. R. Williams. Well do we remember the morning in the Mayflower Hotel in Akron, Ohio, in which the total technology and research of American polymers was nationalized. One of the strongest elements in that was the participation of Esso Research and Development, which had already obtained from ingenious German laboratories fascinating although quite inadequate methods of polydiene formation. Esso, however, had already created its own extensions of European research. Paramount was the utterly essential Butyl rubber, through the ingenuity of Dr. Sparks and Dr. Thomas, who then played such creative parts in the rest of our national effort.
But the point is that extensive systems research and development, leading in this case from the basic petroleum raw materials through to the finished, fabricated and engineered tires and other structures containing the millions of tons of synthetic rubber quickly provided, was a decisive element in containing the Blitzkrieg. Other cases quickly come to mind, such as radar systems research and engineering. There, in combination with the British, innovation from the most basic physical science to the ultimate factory production was achieved, centered then in the Radiation Laboratory at MIT. This involved also the principal electronic laboratories of industry and again included thousands of individuals, where before only a few hundred had been thought adequate to cover the subject of vacuum electronics.
The Manhattan Project followed the same pattern. Thus were evolved the mechanisms of pluralistic but very large scale combinations of basic science, and development and production engineering, which were called systems R&D. The methodology and strategy were again used for the ballistic missile systems era, so admirably executed by General Bernard Schriever. This in turn, based on the fixation of Assistant Air Force Secretary Trevor Gardner and myself with the usefulness of the systems engineering scheme, led to the formation of the Air Force Systems Command under General Schriever, which moved into the Space Age for national defense with high success and the formation of the Aerospace Corporation.
Now we have cited conventional and somewhat specialized examples of how the mechanisms for national research and development came out, in that demanding period of the late ‘50s and early ‘60s, when the free nations of the world were confronted with the grimmest threat of nuclear blackmail yet conceived. We have come through that so far, but what about the broader issues which we must now evaluate? Namely, in the course of these mechanisms, have we, as we asked above, done reasonably well in the quality of science and engineering? Have we accomplished what is possible, what is worthy of our talents, in which, as we shall see in a moment, the American people have invested so much money as well as expectations? This is what we'd like to ponder in the next minutes. Resolution of this question, like that of so many complex matters, will depend on judgment, and your judgmental roles are respected. Your instincts and tastes about how things are going, in what fields, are important indicators for the future. After all, there are only a million originating engineers and scientists in total in the nation, whose population is approaching a quarter billion. A pitifully small portion of these are enabled and encouraged to do truly creative work, on which systems discovery and engineering must depend. Such ideas, significant for systems advance, have to be a lot better than those for toothpaste tastiness, or, even if one could be forgiven for saying it, for a “tiger in a tank.” More seriously, they can't just involve an absolute rate constant for the abstraction by oxygen of the first hydrogen from iso-octane, but rather have to represent the dynamics of octane combustion with wall effects, impurity atoms and complexes, and methods of identifying intermediates which only now the picosecond/femtosecond spectroscopy of the laser makes possible.
So in fact, we don't have enough of those ideas, and that makes it all the more urgent to see whether we are getting at least some of them for our money. Are we on roughly the right track in the pluralistic mechanisms which we have so ruthlessly advocated?
One response to this that all will note is that the size of spending for research and development, although a somewhat smaller percentage of gross national product than in the pace setting years of the ‘60s, is so large and has grown so steadily that there must be major belief in its values, in both economic and public purpose respects. However, this is unfortunately an inadequate deduction, since even a free market system often fails to indicate what might have been, in comparison to what actually was produced and sold. Nevertheless, before looking at how some of the present and possible mechanisms for doing R&D may work out, let us look briefly at the current investment and its reflection of public commitment.
The familiar Battelle Memorial Institute estimate for 1984 notes total United States spending of $94.2 billions, about 9% over the $86.5 billions that the NSF believes was the 1983 investment. This forecast implies about 3.7% real growth (after about 5% inflation) in 1984 and can be compared to a 10-year average real growth of about 3.3%. Of the $94 billions (which NSF estimates suggest may be as much as $97 billions), industry will provide about $49 billions of support from its own funds (about 10% more than 1983) and the Federal Government will provide about $42.7 billions, or 45.3% of the total. Academic inputs from state and private resources will be about $1.7 billions, or roughly 2% of the total, and about 1% will come from other institutions.
As to actual spending of these funds, industry will account for about $71 billions or 75.2% of the total, up from about 72.5% of 1983. About 11% of the total or $10.3 billions will be used in Federal laboratories and the academic institutions will spend about the same, $10.5 billions or 11.1% of the total.
With respect to Federal funding sources, the DOD, the NASA, the DOE and the Department of Health and Human Services (NIH) will be the source for about 91% of the funding, of which about 64 1/2% will come from the DOD, compared to 59% in 1983. This contains, of course, dominant developmental proportions, in comparison to research. Now as we shall note presently, the major proportion of the usage of the Federal funds is through systems development, with varying proportions of inputs from basic research programs that may not be organized or explicitly connected with a given system technology.
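The arithmetic behind the growth and share figures just quoted can be verified in a few lines. The dollar amounts are those cited above; the calculation itself is only an illustrative sketch, not part of the original address:

```python
# Figures as cited: Battelle's 1984 forecast vs. the NSF's 1983 estimate,
# all in billions of dollars.
total_1984 = 94.2
total_1983 = 86.5
inflation = 0.05  # the ~5% inflation assumed in the text

nominal_growth = total_1984 / total_1983 - 1              # about 9%
real_growth = (1 + nominal_growth) / (1 + inflation) - 1  # about 3.7%

federal_1984 = 42.7
federal_share = federal_1984 / total_1984                 # about 45.3%

print(f"nominal growth: {nominal_growth:.1%}")
print(f"real growth:    {real_growth:.1%}")
print(f"federal share:  {federal_share:.1%}")
```

Deflating the 9% nominal increase by 5% inflation does indeed leave about 3.7% real growth, and the $42.7 billion Federal contribution is 45.3% of the $94.2 billion total, as stated.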
However, these figures confirm in all particulars that American science and technology are supported in highly diverse, pluralistic and dispersed forms, through all the institutions noted, rather than through centralized monolithic agencies or academies. Of course, the industrial portion is particularly widely distributed, with its investment of $48.8 or $49 billions applying in heterogeneous ways to every segment of our economy. Principal concentrations are in $15.8 billions total spending by aerospace (of which 73% is from Federal funds) and electrical machinery and communications using $14.8 billions, of which 40% is Federally funded. Autos and other transportation equipment, with virtually no Federal funding, claim about $6.2 billions, chemicals $8.2, petroleum about $2.9, food and beverage about $1.0. Another way of looking at the industrial part is that manufacturing will spend about $68.5 billion, 68.2% of that being from its own investment and about 32% from Federal funds.
Now these are substantial commitments, all right, but the complexity of associating them directly with the marketplace, and thus with public satisfaction or public purpose, is illustrated by the economics in the electronics industry, for instance. This laps over, of course, into aerospace and communications, and computers and other fields. Namely, the Federal Government is expected to spend $49 billions in development and acquisition of electronics in 1984. So the total spending for R&D for both independent and government support of the total aerospace, communications and electrical machinery industries, expected to be $30 billions, is well supported by a Federal Government fund of $50 billions for combined acquisition and development/engineering. So much for those “market” economics. But market satisfactions and public evaluation are poorly accessible even in so extensively consumer-dominated a technology as automobiles or food and beverages. It is interesting that in food and beverages, the public purpose R&D is primarily the extensive agricultural systems work, although as we have reported to the present and predecessor OSTP in the White House, this should be joined by much heavier science and technology effort in the distribution and processing of food and beverages as well. In contrast, the automotive factors are fragmented, and are only incidentally, for environmental and other regulatory reasons, attached to the transportation system of fuels, roads, traffic control, etc. Likewise, the original systems prototype of telecommunications serving the public purpose is now being heavily revised politically and economically. So the market-driven (non-Federal) examples of science and engineering for the public purpose are declining in systems integration and quality, despite growing capabilities, largely based on computers and large data systems, to move their execution and evaluation in that direction.
This situation is in contrast to certain significant trends internationally, such as the Japanese Fifth Generation and Supercomputer consortia. There, five to eight of their largest electronics and communications industries, combining through the realities of industry coordination as well as MITI policy, have strong systems orientation. Likewise, the long needed cooperative research efforts in the United States, begun forty years ago in textiles but generally inhibited by outworn antitrust and other regulatory policies, are reappearing in the formation of the MCC (microelectronics and computers) combine at Austin, Texas, and the semiconductor industry association's affiliations for cooperative R&D. These two, by the nature of their missions, as well as their structures, require systems research and engineering effort, with compatibility and standards for components and circuits which can then be combined into systems of computers and other products.
Also, bioscience and biotechnology are in many ways a most challenging area for appropriate systems technology, and yet have been among the least organized in that form. This is probably because of the ancient traditions and cultural conditions of the practice of medicine. We have attempted to change this stagnant structure, in the face of brilliant individual advances in bioscience, many of which have themselves come from a systemic combination of physical science and biology in molecular genetics and related fields. The attempted change has been especially in one of the most refractory fields of health care, the treatment of cancer and related diseases. There, when we were tasked by President Nixon, in concert with the work in the Congress led by Mr. Benno Schmidt and Mrs. Mary Lasker, to organize in 1972 a National Cancer Program, we required generation of “a national cancer plan.” This very large, detailed document was regarded at first with dismay and disbelief by the large and skillful biomedical community, both in the government and the independent sectors. It has undergone many revisions and refinements in the years since, and has probably been most distinguished by “benign neglect.” But it has had an impact. It was constructed in the systems engineering mode of orderly categorization of things to be found out scientifically, things to be developed technologically, things to be applied clinically and even industrially. The role of these systems findings, based on some 2500 individual research efforts, mostly academic, has been encouraging. So has the formation of new knowledge centers and treatment centers around the country, which have again honored the theme of pluralism and independent responsibility for cancer therapy and patient care. But principally, they indicate steadily growing abilities to use new knowledge for the public good. And the knowledge being used does exhibit the quality that we have sought in this discussion.
For instance, from this program and its high goals has come the new saga of immune reagents and reactions. Surely these imply the most promising systemic restraints to disease and destruction of the organism so far identified. Antibody/antigen exchanges at last illuminate the miracle of Pasteur and the vaccines which have already rid the world of plagues and pains in many fields. But now the coupling of the molecules of biology with the studies of growth and form sponsored by the National Cancer Program is revealing pathways for new relief of ancient ills. And the mechanisms chosen for this work involve novel combinations of independent autonomous laboratories and industrially-based facilities, as well as abundant industrially-produced computers and information handling, along with governmentally sustained coordination systems. The knowledge product is leading to help for all people.
Our principal theme of the quality of the work, and some idea of its complexity, in the systems program can be seen in the recent studies of Professor Robert Weinberg at MIT. His studies concern oncogenes, which seem to be particular elements of DNA causing cancerous transformations. Thus, of the more than 50,000 genes in a typical DNA string, only relatively few seem to control growth or cellular multiplication, either normally or neoplastically. Yet among these, about ten or fifteen have been identified already as possible cancer generators. Professor Weinberg has shown the difference between one of these oncogenes causing human bladder cancer and its normal growth version. Cloning enabled isolation of the individual genes and identification of their some 6000 nucleotides, just one of which turned out to be out of order in the appropriate nucleic acid entity. This would mean that one anomalous nucleotide, out of about six billion contained overall in human DNA, moves the cell operation toward cancer, through the slightly altered proteins that are synthesized from this displaced nucleotide component. One can see, of course, that immune reactions might then inhibit the action of this distorted gene and correct or inhibit the spread of abnormal cells. In every case, deep knowledge of structure and chemical composition is necessary to understand natural defenses and to seek then to strengthen and supplement them. The task is formidable indeed, and again we show, as evidence that our pluralistic mechanisms of research and development are moving along, the recent computer-generated but X-ray-crystallographically derived model of the enzyme Cu,Zn superoxide dismutase, with particular designations proposed by work at the Scripps Institute in La Jolla, California. This simulation depicts how molecules are accommodated and associated on surfaces and membranes like those of the enzyme.
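The scale of Weinberg's finding can be made concrete with a little arithmetic. The nucleotide counts are those quoted above; the calculation itself is only an illustrative sketch added here for emphasis:

```python
# Counts as cited in the discussion of Weinberg's oncogene work.
gene_nucleotides = 6_000             # nucleotides in the cloned bladder-cancer oncogene
genome_nucleotides = 6_000_000_000   # nucleotides overall in (diploid) human DNA

# The whole oncogene is only about one millionth of the DNA...
gene_fraction = gene_nucleotides / genome_nucleotides

# ...and the single displaced nucleotide is about one part in six billion.
point_fraction = 1 / genome_nucleotides

print(f"gene fraction of the DNA:   {gene_fraction:.0e}")
print(f"single-nucleotide fraction: {point_fraction:.2e}")
```

One altered unit in six billion is enough to turn the cell's operation toward cancer, which is the measure of the precision the systems program's tools must reach.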
Arrows represent the electrostatic field orienting the negative charges on the superoxide substrate to the positive and catalytic “dots” on the molecular surface. Thus the superoxide radical toxicity is countered by its “dismutation” to molecular oxygen and hydrogen peroxide. The colors represent the electrostatic field gradients of the molecular surface.
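The dismutation just described can be written as the standard overall reaction; this is textbook chemistry, stated here for clarity rather than quoted from the address:

```latex
2\,\mathrm{O_2^{\,\cdot-}} + 2\,\mathrm{H^+} \longrightarrow \mathrm{O_2} + \mathrm{H_2O_2}
```

Two superoxide radicals are thus converted, with two protons, into harmless molecular oxygen and hydrogen peroxide, the latter disposed of by other cellular enzymes.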
So we have sought, in the U.S., mechanisms for research and development based on ready access of various disciplines and components in various institutions to each other, in circumstances that simply do not fit single purpose monolithic agencies. These are sure to push their own successes and specialties, such as, for instance, recombinant DNA, to the exclusion of peptide chemistry, or nuclear magnetic resonance physics, or topological crystallography and mathematical analysis. Nevertheless, and crucial to the success of this strategy, is the paradoxical need to have large enough ventures in science and development so that enough people working toward a common goal, whether the conquest of cancer, the moon landing or better engine lubricants, can vigorously assimilate the diversity of knowledge now necessary for the human purposes of science and engineering. That's why we must have large laboratories like Exxon Research and Engineering, coupled to the large objectives of a great corporation such as Exxon. That's where management and citizens and political leaders must come together and recognize the modern scale of research and development requisite for both public and private purposes.
But as we have tried to say throughout, the national mechanisms we have pursued in this past 25 years fit both private and public purposes, through the role of industry and universities. So our challenge now is to pursue ideas and effects of such quality and import that they will fit the systems dimensions that we describe, and not simply float around in the trivia of fragments of those systems.
For instance, membranes are crucial elements of all living systems, man and animal. But their study is only now taking on some of the dimensions we have sought as being systems-worthy. Earlier, we knew from the synthetic rubber era that micelles, the site of polymerization of those and a host of other synthetic materials, provided a medium of control and adjustability. Cell membranes do the same. The recent work of Professor Nicholas J. Turro at Columbia University on NMR tracking of the rate of movement of radicals out of micelles promises valuable information on the influence of the intimate medium of organic reactions. At the same time, many other researchers are looking at simulations of other membranes by different micelles, by polyelectrolytes and other colloidal forms. They are finally bringing together some models of enzymes and cell reactions. As pointed out by Professor J. H. Fendler of the Clarkson Institute of Technology, many separate contributions have enriched this field. For instance, the paper of 25 years ago by my former colleague at Bell Laboratories, Professor Ernest Grunwald, now at Brandeis University, indicated an influence of surface active agents on the association of pH indicator dyes that excited new interest in micellar film catalysis.
Now such important and emerging simulations as thin-film-membrane-surface structure illustrate further our theme of systems-worthy ideas and the mechanisms by which they are being pursued with respect to the national objectives for science and technology. Irving Langmuir, at an industrial laboratory, G.E., in the earlier part of the century discovered particular properties of monolayers and surfaces which stimulated much technology in lubrication, vaporization, cloud formation, and electron emission from oxides. Nevertheless, this field was not much cultivated until it was recognized in the Materials Science and Engineering era supporting the discovery of semiconductor devices and many new insulating structures. Now we realize that surfaces and solids are a continuum - a continuous system wherein individual atomic and molecular qualities can be identified, rather than the assemblies of films or adsorbates which had been about the level of insight before.
And one or two industrial laboratories recognized in the early ‘Sixties that semiconductors and electronic devices really call for a film system, where indeed an integrated circuit would function primarily through surface fields and charges. It was also beginning to emerge that photonics, which we were hotly pursuing following the invention of the laser by Schawlow and Townes, would flourish also in surface and thin film systems, although the needed science was exceedingly tenuous, if not lacking. Still further, it was recognized that the patterns delineating the components of integrated circuits would have to be generated by other thin films, mostly of polymers or, in the famous case of the dominant Derick-Frosch patent, through films of oxides such as silica. These encouraged selective diffusion.
All this was, of course, supplementary to the increasing interest in catalytic surfaces where, nevertheless, film and adsorbate properties remained complex and often obscure.
But now the obvious systemic expanse of this film/surface field has made it important eventually in public health through membrane and cell properties, in public works through the influence of corrosion on the stability of bridges, roads, railroads, buildings, etc., as well as economically in foreign competition for modern electronics and photonics and various chemical processes. But there was no coherent theme for a particular mechanism, such as a global thin film institute, or even an analogy to the British Government's Inmos for semiconductors, or a giant center for examination of surfaces and the superficial. Rather, with dependence on the pluralistic mechanisms, let us see again how the public interest has been served.
First we note that our independent scientific and engineering associations have helped by encouraging conferences and publications on appealing and generally non-proprietary aspects of surface science. The classical colloid chemists, as well as heterogeneous catalysis chemists, and an increasing component of solid state and materials people have pursued this. For instance, in the most fundamental sense of what we call the systems-worthy ideas, our associate Dr. Homer Hagstrum founded a division of surface physics in the American Physical Society. (One remembers vividly the first meeting at a Washington Spring assembly of the Society, at which about four people appeared.) This was reminiscent of the shocked disdain which accompanied the early ventures into solid state physics, where an indecent population of more than two or three atoms made science appear hopeless in view of the traditional quantum physics of the century up to then. Many physicists felt that surfaces were for Sherwin-Williams or Procter and Gamble, but not for them. Nevertheless, Hagstrum and a little band of zealots stuck with it, supported by such accessories as the elegant evolution of low energy electron diffraction (LEED) by Germer and a couple of others during the later, culminating years of Germer's career.
Thus it is comforting to look in the latest “Five Year Outlook on Science and Technology,” issued by the National Science Foundation and prepared by the Committee on Science, Engineering and Public Policy of the National Academy of Sciences (COSEPUP). This report selects the eight most compelling fields for American science and technology, and its chapter on Surface Science and its Applications, by Homer Hagstrum, projects a spirited and high-valued future. The other seven objectives, by the way, to which we shall refer later, range from (naturally) the genetic program of complex organisms and of plants, through cell receptors for neural transmitters and psychobiology, and reach a crescendo on a “genetic theme” of even more dramatic import (although it is not yet so labeled): namely, Professor Jacob Schwartz's prospects on “The Next Generation of Robots.”
Well, you see that in this fast-paced prospect for our public goals, surface science is well based. Its Auger electron spectroscopy yields the chemical composition of surfaces to one atom in a thousand. Electron diffraction and X-ray fine structure yield distance determinations on surfaces to closer than one tenth of an Angstrom, or one hundredth of a nanometer. Further, we have learned, through the industrial impetus implied before and through the work of Gossard, Dingle, Störmer, and others (for which they will receive the Buckley Prize of the American Physical Society in March), how to grow superlattice thin crystals. This is molecular beam epitaxy, which deposits layers a hundred atoms thick, varying in thickness by only one or two atoms over the entire area of the layer. It is these discoveries, along with related ones such as ion implantation and the physical and chemical modification of surface atoms and molecules by laser pulsing (in which structure and composition can be changed so rapidly that only the electron environments are modified by electron-phonon collisions, with relatively little motion of the nuclei), that have now produced systems-worthy science and engineering. New surface science also spreads steadily into biomedical areas and many other elements of public purpose, without ever having to be the title subject of an act of Congress.
Now all this doesn't mean, of course, that we can be complacent or be certain that all necessary new findings and their application can be assured by these multiple and pluralistic mechanisms of academic curiosity and industrial interest. But it does mean that as we look at the outputs of the last 25 years, where these mechanisms have been encouraged, there is at least cause for hope and evidence of quality. But there are warnings too, such as the recognition that the gains we have seen have come mostly from large, integrated, coordinated laboratories and communities. They have come from an increasing realization of the nature of systems worthiness. The extensive consortia and combinations which Japan and various nations of Europe are increasingly forming in their own ambitions for science and technology further suggest that we should not give away or otherwise disrupt our own ventures of appropriate size, of which the Exxon laboratories are a notable example.
Another systems-worthy idea, which occurred to Alexander Graham Bell more than a century ago, was the use of photons, then constrained to the Newtonian description of light waves, for communications. It is thus not surprising that among the eight major public themes in the “Five-Year Outlook” of the NSF-NAS is a chapter on lasers written by Dr. Kumar Patel, the inventor of the carbon dioxide laser. He knows very well the enormous future of photonics in communications and information handling. And by his work on photoacoustic spectroscopy, for example, he has also demonstrated that the vast body of basic science will be enhanced by laser photonics in a myriad of yet unknown forms. Already, selective excitation as well as femtosecond spectroscopic observations are bringing new insight into chemical dynamics and energy exchange.
Now there is, throughout this developing epoch of systems worthiness, a pervasive feature which is becoming intrinsic to our economic, social and governmental operations everywhere. It is the use of digital machines for computing and communications, and for the organization of data and of all knowledge, which is characterizing the final decades of the century. The systems history of these capabilities follows very much the pattern we have described before. There was never an institute or a national center for megacomputing. Rather, the first machines of Stibitz, Eckert and Mauchly, Aiken, and von Neumann reflected systems-worthy ideas which were attached to compelling public interests, largely stimulated by the national defense.
The extent of the systems involved has usually been connected with Federal programs. These have generally recognized that superior computing and modeling capabilities are compelling public needs. Here we have a remarkable combination of the pluralistic mechanisms we have discussed with certain central demands, and overall a heavy dependence, for both hardware and software, on scarce and crucial talents. Accordingly, there is an interesting interdependence of mechanisms in maintaining our present national leadership in one of the most important fields for economic and national security. For we are facing clear and candid challenges from the Japanese commitment to Fifth Generation mainframe computers and also to vastly enhanced so-called “supercomputers,” which are intended to provide a major gain in the popular pursuit of artificial intelligence.
The scale of Japanese aims, which is a bold extension of what we have done or are in the process of obtaining, will provide a useful reference. In our USA venture, from which they took off, about 50 supercomputers (CDC and CRAY) have been made and are distributed worldwide. About 38 are in the United States, 25 of which are in government labs; about 10 are in Europe and 2 in Japan. (We can assume that the two in Japan have had unusually meticulous “reverse engineering.”) These American products will be succeeded in 1984 by improved versions of their makers' lines, which are the CRAY-2 or X-MP and the Control Data Corporation's Cyber 205. The next 1984 shipments of CRAY will do about one billion floating point operations per second, with about 0.25 billion bytes of memory and about one billion bytes per second of bandwidth to that memory. The present CRAY-1 has about a tenth of a billion FLOPS with a 12 nanosecond cycle time, about the same cycle time as the Cyber 205, which operates a little faster at 0.6 - 0.8 billion FLOPS peak speed. Current, although not necessarily available, Japanese machines, like the Hitachi M-200H IAP and the Facom APU of Fujitsu, use high electron mobility transistors (HEMT), created industrially in the U.S. as noted before, of gallium arsenide and gallium indium arsenide phosphide. These enable room-temperature switching of 35-50 x 10^-12 seconds, or in other words about 35 thousandths of a nanosecond; but the forthcoming superspeed goals are a full 10 billion FLOPS, an order of magnitude beyond the best now. They are planned with a billion bytes of memory and with capabilities for parallel processing.
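As a check on the arithmetic in these figures, the quoted quantities can be restated in consistent units. The brief sketch below simply replays the speech's own round numbers; the variable names are ours for illustration, not any machine's specification.

```python
# Sanity check of the quoted figures; all values are the round numbers
# cited in the text, not independent measurements.

PICO = 1e-12   # seconds
NANO = 1e-9    # seconds

# HEMT switching time: 35-50 x 10^-12 s is indeed "thousandths of a nanosecond".
hemt_switch_s = 35 * PICO
print(round(hemt_switch_s / NANO, 3))   # -> 0.035 (35 thousandths of a ns)

# Present machines: CRAY-1 about 0.1 billion FLOPS; Cyber 205 peak about
# 0.6-0.8 billion FLOPS; 1984 CRAY shipments about 1 billion FLOPS.
cray1_flops = 0.1e9
cray_1984_flops = 1.0e9

# Japanese superspeed goal: 10 billion FLOPS, one order of magnitude
# beyond the best 1984 machines.
goal_flops = 10e9
print(goal_flops / cray_1984_flops)     # -> 10.0
```

The exercise confirms that the "order of magnitude beyond the best now" claim is consistent with the one-billion-FLOPS 1984 shipments rather than with the slower CRAY-1.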
Now let us back off and look at what our scientific expectations and technical extrapolations were several decades, and also one decade, ago. We suggest that this is a way to test whether these mechanisms really serve public as well as independent needs. Just about 10 years ago, our colleague Dick Hamming prepared for the Hudson Institute's “Encyclopedia of the Future” (a special venture of the late Herman Kahn) a wise and bold assessment entitled “Fifth Generation Computers and Beyond.” There he expected 10-30 picosecond speeds “in the late 1990's,” so it appears that the field of junction devices (a subject with the strongest scientific inputs of any area of technology outside the nucleus) has readily outstripped our strong expectations. Having also discussed the packing density of electronic circuits, their heat dissipation and other sensible parameters, Hamming suggested that by 1980 there would probably be about 10,000 (or more) components per chip. This we had exceeded by an order of magnitude by that time, and we are headed toward another order of magnitude of gain.
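The order-of-magnitude bookkeeping in that comparison can be made explicit. This minimal sketch assumes only the figures quoted above (Hamming's forecast of about 10,000 components per chip for 1980, and the claimed tenfold gains); the variable names are our own invention.

```python
import math

# Hamming's forecast versus the claimed reality, in explicit powers of ten.
hamming_1980_forecast = 10_000                 # components per chip, predicted for 1980
exceeded_by_1980 = hamming_1980_forecast * 10  # "exceeded by an order of magnitude"
headed_toward = exceeded_by_1980 * 10          # "another order of magnitude of gain"

print(exceeded_by_1980, headed_toward)         # -> 100000 1000000
# Two full orders of magnitude beyond the forecast, once the next gain lands:
print(math.log10(headed_toward / hamming_1980_forecast))  # -> 2.0
```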
The rest of our world contest involves vastly less certain matters, such as the best design of the processors and especially of the software, or operating systems, that will be used with them. The early computers were highly sequential, and in von Neumann's design almost everything went through the arithmetic unit. Then index registers gave some flexibility in routing around the arithmetic unit, and now there is a strong trend away from centralized control, which will permit many ingenious variations in the future. However, we still lack an adequate scientific base for programming and software development. Accordingly, common languages and other civilizing features of operating systems are scarce. In this arena, neither the national mechanisms nor independent productivity have been adequate.
In this regard, let us go back not to the decade of the Hamming foresight, but to the chapter we wrote for the Engineers Joint Council report on “The Nation's Engineering Research Needs, 1965-1985.” This report was published 22 years ago (and, by the way, had a wonderfully foresighted chapter, three pages long, entitled “Engineering Applications of Biological Processes or Systems;” a co-author among the three was Dr. E. E. David, Jr.). But the report also had a very long chapter, with a single author, entitled “Information Handling Systems.” In it we noted bravely the early experiments we had done in Bell Laboratories at no less than 500,000 logic operations per second, and then we opined that the world of science and engineering would probably collapse from its own weight of knowledge and data if it didn't get some improved digital mechanical help. In an orgy of understatement we said, “Large fractions (of the nation's total of scientific and engineering personnel) are engaged in information handling of a kind suitable for progressive mechanization.” We then went on to ask, in some detail, for methods of simulation, for the design of self-reproducing machines a la the Penroses, for adequate pattern recognition and thus product inspection and quality control, and for the automatic reading of cursive script. We pleaded also for some level of speech recognition and language translation.
These were not idle dreams, since each was coupled to active endeavors in that direction in our Laboratories and elsewhere at the time. Nevertheless, in the quarter century since some of these efforts were begun, progress has been inadequate, although substantial and socially and economically revolutionary. Further, our recommendations in that volume included proposals for education and for the linking of information automata, machines which step well beyond conventional robotics and which we believe represent major needs for the recasting of American manufacturing. So it is sobering to find that so many of these functions remain unfulfilled. Even the service of computers in engineers' routine activities, such as is covered in the conference entitled “Engineering Work Stations,” offered by the Institute for Graphic Communication in Monterey, California, January 31 - February 2, 1984, has a program discussing what we hoped would be altogether conventional by now. But all this delay is not cause for despair; rather, it shows that some elements of computer technology take a good deal of time, especially in the absence of a software science. And above all, there is no indication that we have failed to recognize the opportunities, or that a centralized system would have done things faster or better. We are recreating in Washington new national centers, well connected to universities, which will extend what was done in 1958-60 in such efforts as the IDA at Princeton.
As always, we have been confined to our direct knowledge and experience in illustrating this reconsideration of service for public purposes. Many arenas have been neglected, but it appears that the principle is supportable, and it is showing up in various new ways. Thus it is also the essence of one of our more recent ventures in environmental science and engineering. Here, the nature of the regulation and the multiplicity of sources favored central control from the outset. Thus the EPA, in its various research and development programs, with a minimum of external inputs, undertook to lay out the conditions for public health and industrial performance. This single-R&D-source doctrine did not, however, seem essential, and there have been increasingly difficult economic and social reactions. Accordingly, early in 1980 we organized, with the full agreement of the EPA and the automobile and engine manufacturers, an operation called the Health Effects Institute. It was designed once more to parallel the pluralistic mechanisms which had worked in the science and engineering of major systems before. Although it is still early in the history of this national venture, we believe there is evidence that the public purpose of clean air, in the presence of a viable engine economy, is being strongly aided. In the Health Effects Institute we have mobilized some of the best university centers in bioscience and in combustion chemistry and engineering.