I think that a man tries to be better than he thinks he will be. I think that that is his immortality, that he wants to be better, he wants to be braver, he wants to be more honest than he thinks he will be, and sometimes he's not, but then suddenly to his own astonishment he is.

I'm a fan of Faulkner quotes, and that one has always been in the back of my mind, waiting for its time. I think that this Spring, as we celebrate successfully completing (surviving) the eleven-year lifecycle of the NSF Engineering Research Center (ERC) at Mississippi State, and I reflect on the ERC and the changes it has fostered at MSU, the time has come for this quote. Faulkner spoke this one for himself, not through one of his characters. And I think that's significant, for if he could reflect on his own success here like that, I believe that we all can also, without apology. In fact, I believe this only heightens the sense of accomplishment.
And the ERC's time was the 90s, a tremendous decade when high performance computing and the Internet came of age and took us all for a wild ride. It was a time when our innovation may have gotten ahead of our reason.
In the book Art and Physics: Parallel Visions in Space, Time & Light by Leonard Shlain, I read that Newton made reference to "the glory of geometry". The book goes on to point out that the development of perspective in the 15th century was a milestone in the history of art, suddenly opening the 2D canvas to the 3D world. In fact, Renaissance parents urged their children to become professional perspectivists because the skill was in such demand - which reminds me of the present IT workforce situation. Grid generation has analogously moved computational simulation from squares and circles into the real world. The far-reaching effect of Alan Winslow's paper in the inaugural issue of the Journal of Computational Physics, which was reprinted last year, bears strong analogy for me to the 1435 treatise of Leon Battista Alberti on perspective.
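Since the Winslow reference is the one technical anchor in these musings, let me make it concrete with a minimal sketch - not Winslow's own formulation nor anyone's production code, just an illustration under simplifying assumptions (unit spacing in the computational coordinates, boundary points held fixed, and an illustrative function name, winslow_smooth) - of the kind of elliptic grid smoothing his paper inspired:

    import numpy as np

    def winslow_smooth(x, y, iterations=200):
        # x, y: 2D arrays of grid-point coordinates indexed (xi, eta);
        # boundary points are held fixed, interior points are relaxed.
        x = x.copy()
        y = y.copy()
        for _ in range(iterations):
            # central differences of the coordinates on the interior
            x_xi  = 0.5 * (x[2:, 1:-1] - x[:-2, 1:-1])
            x_eta = 0.5 * (x[1:-1, 2:] - x[1:-1, :-2])
            y_xi  = 0.5 * (y[2:, 1:-1] - y[:-2, 1:-1])
            y_eta = 0.5 * (y[1:-1, 2:] - y[1:-1, :-2])

            # coefficients of the Winslow (homogeneous elliptic) system
            alpha = x_eta ** 2 + y_eta ** 2
            beta = x_xi * x_eta + y_xi * y_eta
            gamma = x_xi ** 2 + y_xi ** 2

            def relax(u):
                # one point-Jacobi update of the interior of a coordinate array
                cross = 0.25 * (u[2:, 2:] - u[2:, :-2] - u[:-2, 2:] + u[:-2, :-2])
                numer = (alpha * (u[2:, 1:-1] + u[:-2, 1:-1])
                         + gamma * (u[1:-1, 2:] + u[1:-1, :-2])
                         - 2.0 * beta * cross)
                return numer / (2.0 * (alpha + gamma) + 1.0e-12)

            x_new, y_new = relax(x), relax(y)
            x[1:-1, 1:-1] = x_new
            y[1:-1, 1:-1] = y_new
        return x, y

A few hundred sweeps of this sort will smooth an algebraically generated grid around a curved boundary; real grid generation codes, of course, add control functions, better solvers, and the third dimension.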
Joe Steger, of UC Davis, was one of the earliest users of structured grids in computational fluid dynamics (CFD), and I still remember how good that endorsement felt. At the time of his death, Joe had a paper, with William Chan, in the midst of the review process for Applied Mathematics and Computation, for which I was then a Senior Associate Editor. I moved his paper immediately into publication in tribute to his stature, adding a footnote:
Joe Steger died on May 1, 1992. CFD has lost a giant.

Back in 1988, Joe and I had said in an AGARD survey of grid generation in practice on large aerospace applications in Europe and the US:
Joe was a regular on the staff of the grid short courses at MSU in the 80s, and he was back in Mississippi the summer before his death. I remember sitting and talking with him in the sunshine: there was so much life ahead of him then. The last time I saw him was here at UC Davis that fall. The panel he chaired was "CFD on Complex Grids" - most appropriate.
One more time from Faulkner, in Big Woods:
But you can't be alive forever, and you always wear out life long before you have exhausted the possibilities of living.

With Bernd Hamann here, whom Davis took away from the ERC, I have to mention visualization. Without visualization, the computations are like the proverbial tree falling in the woods - only uninterpreted pressure waves: no sound without an ear to hear. The announcement for the 1993 Technology Transfer Meeting of the Federal Laboratory Consortium carried a quote from Christina Rossetti:
Who has seen the wind?
Neither I nor you:
But when the leaves hang trembling,
The wind is passing through.

It is visualization that shows us the wind, the fields of physics, and that makes the human connection with HPCC.
Recognition of the importance of visualization is not new: Leonard Shlain, in "Art and Physics: Parallel Visions in Space, Time & Light" (Quill, William Morrow, New York, 1991) noted Newton's appreciation:
Newton, the scientist, reduced the visual world to mathematical relationships and yet was not satisfied with his formulations until he could make an easily visualizable geometrical model he could see.

And Leonardo da Vinci understood the link between visualization and understanding. His words, quoted in Shlain:
The eye, which is the window of the soul, is the chief organ whereby the understanding can have the most complete and magnificent view of the infinite works of nature.

The idea of visualization via a 2D graph was once a major advance, and Shlain notes its importance to science:
In the 1360s Nicole d'Oresme, a medieval schoolman, introduced a graphic means to plot scientific functions. The graph, an indispensable tool of science, gave to thinkers the means to express visually the concepts of motion, time, or space on a piece of paper intersected by a horizontal abscissa and vertical ordinate. The ability to make abstract concepts visual was an absolute prerequisite for the scientific discoveries that followed.

And then the development of perspective by the artists enabled visualization in 3D. Again from Shlain:
The beginning development of perspective by Giotto and its elaboration by Alberti and other artists was a revolutionary milestone in the history of art. Using perspective to project a scene upon a two-dimensional surface made the flat canvas become a window that opened upon an illusory world of stereovision.
The introduction of shadow into art came at about the same time, enabling even more effective 3D representation. Cezanne, quoted in Shlain, had this to say:
Nature is more depth than surface, the colours are the expressions on the surface of this depth; they rise up from the roots of the world.

And John Russell, in "The Meanings of Modern Art" (Harper & Row, New York, 1974), made the connection with physics:
Color is energy made visible.

And Monet painted haystacks at various times of day.
Reflecting on the success of the ERC here at MSU, riding the tide of information technology in the 90s, has brought me into some musings, concern, and perhaps resolution - maybe with Faulkner's old farmer. And since the 90s took me away from both teaching and hands-on research, I can't tell you anything new about grid generation. So all I can do is attempt either to entertain or to enlighten. I could try the former, since I like to talk about connections between science and literature, but since this seminar is during working hours, I'll try to raise some thoughts on information technology.
The study of science and the practice of engineering have experienced two monumental paradigm shifts over the ages, and we are in the time of flowering of the second. There was always the experimental mode of scientific investigation and engineering application, from the time that a caveman observed that a club from one tree delivered more blows without breaking than one from another.
The experimental mode was all there was, except perhaps for some classical musing and medieval speculation, until Newton invented the calculus in the 17th century. The calculus brought on the first great paradigm shift, enabling mathematical analysis of physical phenomena and processes, and adding the second mode of investigation: the theoretical.
Then came the invention of the transistor, followed by the integrated circuit and microelectronics - the second of the two great paradigm shifts. Now there are three fundamental modes of scientific investigation and engineering application: the experimental, the theoretical, and the computational, all three standing with both independence and interaction. This recognition has even made it into the Federal budget: thus from the 1995 OSTP report "HPCCS: Foundation for America's Information Future", Supplement to the President's FY 1996 Budget:
Simulation has become recognized as the third paradigm of science.

So now the computational paradigm: the computer has indeed become a new and very powerful device for scientific discovery. The 1990 report "Renewing US Mathematics" from the National Research Council expressed it well at the start of the 90s:
The computer can act as a microscope and a telescope, allowing researchers to model and investigate phenomena ranging from the dynamics of large molecules to gravitational interactions in space.

I've always liked that microscope/telescope analogy; computational simulation of physical phenomena and processes is truly revolutionizing engineering analysis and design in industry, as well as scientific investigation in general.
The 1999 report "Information Technology Research: Investing in Our Future" (http://www.itrd.gov/ac/report/) of the President's Information Technology Advisory Committee (PITAC), on which I am privileged to serve, notes that we have consistently neglected to adequately address the software advances that are required to fully utilize advances in hardware. More powerful machines do not proportionally increase capability in computational simulation applications until software suitable to the hardware architecture is developed. In fact, we know more about how to design hardware today than we do about how to design software. PITAC described software as the new physical infrastructure of the information age, and as among the most complex of human-engineered structures. That poses a daunting challenge.
Major effort is required in both systems software and applications software. Systems software makes the hardware work: such things as compilers, schedulers, file managers, debuggers, security, communications, etc. Applications software performs computational simulations of physical problems. But behind each of these are sorely needed fundamental advances in software design and engineering, so that both systems software and applications software can be developed to be reusable, robust, reliable and secure. Software design must be brought to a higher level, similar to that which now enables complex chip design. We simply cannot continue to tolerate the labor-intensive and error-prone software development that is now the case. The PITAC report notes that neither are we adequately improving the efficiency of software construction nor are we training enough professionals to supply the needed software.
One way to get a grasp of the software problem is to consider the contrast between replacing an entire fleet of trucks with replacing a software system. Replacement of an entire fleet of GM trucks with Ford trucks would pose no great problem either for drivers (users) or mechanics (maintenance). The same could even be said about the Air Force replacing all its Boeing aircraft with aircraft from Lockheed. But today replacing a major software system is a traumatic experience for businesses, universities, or government. While chip design (hardware) is fundamentally done with similar design principles and tools in different companies, such effective design tools for software do not yet exist. Yet we are basing the essential infrastructure of commerce and government on this esoteric medium. Contrast the effect of being denied access to all Ford mechanics and engineers with loss of access to the supporters of a major software system.
And we face a fragility in this software infrastructure.
The Internet was, of course, not designed for the purpose it now serves for business, and we are in considerable danger of the consequences of that fact as commerce becomes ever more dependent on the Internet. The wild successes of the 90s make me think of what Faulkner said:
A mule will work for a man faithfully for ten years just for the pleasure of kicking him once.

Ten years: the 90s.
That is why research in network technology is so important to the country. This is essential research to address fundamental problems with the present Internet, such as security, quality of service, scaling, and management. Such effort is essential to future commercial success, but such long-term effort is not going to attract commercial investment in today's orientation toward short-term launch of start-ups and rapid release of a succession of Internet applications. Never has a particular area of research been so critical to the Nation in such a fundamental and pervasive way.
And there is the matter of response time: we are now faced with almost immediate response time in our fundamental infrastructure - for better or worse. Life never can be perfect and without risk - Faulkner's mule is going to kick - we survive not because we can avoid risk, but because we can understand it and implement damage control to restore equilibrium. The negative and concerning factor in IT is that the time available for damage control is shrinking drastically. And this affects communications, the power grid, financial transactions, and most aspects of commerce and security. We are being placed by information technology into the position of not being able to rely on marshalling our own response to crisis; rather, we are becoming dependent on the reliability and security of the infrastructure system and its own capacity for response to failure. As response time decreases through the advance of hardware and software, our stability is thus a direct function of the software.
And the Internet knows no boundaries. There was a time when we folks in small Mississippi towns felt somewhat insulated from some of the bad guys around the world, but no more. We are now exposed to the entire range of humanity. There was a time when an adversary had to come within striking distance, and expose himself to retaliation, but the Internet has changed that. Even personal identity has become detached and made an abstraction, with financial liabilities being assigned by credit offerors competing to satisfy our desire for instant gratification. And a further concern that has yet to have much expression: We in the US have never felt at a disadvantage being greatly outnumbered by some other countries, because we had destructive weapons that could destroy many as easily as few. But if the measure of strength shifts to the potential for mass e-commerce, there may again be strength in numbers.
Bill Joy, co-founder of Sun Microsystems and a principal architect of Berkeley Unix, raised serious concerns in an article entitled "Why the future doesn't need us" in the April 2000 issue of Wired:
I have always believed that making software more reliable, given its many uses, will make the world a safer and better place; if I were to come to believe the opposite, then I would be morally obligated to stop this work. I can imagine such a day may come.

This is disturbingly reminiscent of Oppenheimer's famous response to the Trinity test. Joy refers to "knowledge-enabled mass destruction", and notes that, while nuclear technology has been under close government control, robotics, genetic engineering, and nanotechnology - all information technology based - are driven by commercial interests.
The coming of ubiquitous wireless attachment to the Internet will brook no shelter from IT, as is noted in an article entitled "Malicious Code Moves to Mobile Devices" in the December 2000 Computer:
Wireless technology has the potential of enabling malicious code to jump off our computer networks and into our everyday lives in a way it never has before.

The Internet is the greatest entropy generator yet devised. TV was one also, but not to the degree. The Internet allows everyone with nothing to say to say it to everyone else. Civilization will end in valueless uniformity - will just stop - rather than in a cataclysm. I am really coming to believe that, although inventiveness always increases, feeding on itself, imagination and creativity do not.
There's an article in the December 2000 Scientific American entitled "The New Uncertainty Principle" that speaks of the "precautionary principle", whereby the effect of new technology on the environment is considered along with the desire for advancement, as a moderator on speed of application. The reference is to the physical environment, noting that:
Governments everywhere are confronted with the need to make decisions in the face of ignorance.

But such concern may well be applied also to the effect of information technology on the human operational environment. PITAC called for research into the socio-economic aspects of information technology - the impact of IT on the human environment, noting similar concern about decisions being made in the dark:
Policy decisions and IT investments are being made on the basis of incomplete research and data concerning the effects of IT on our society.

and continuing:
In fact, it is often the case that the implementation of information technologies has a considerably different set of consequences than were originally intended or anticipated.

It is this concern over inadequate data and unintended consequences that argues for the precautionary principle in regard to IT.
An article entitled "Are We Forgetting the Risks of Information Technology?" in the December 2000 issue of Computer enunciated similar concern, noting adverse national impacts from our rapidly increasing reliance on IT and the Internet, including that decreasing response time:
Reduced operational buffer zone in most infrastructures, and the ever-increasing adherence to the just-in-time philosophy.

This article goes on to express concern that new IT is used immediately after it is introduced, with risk analysis to come later, and puts it bluntly:
The adage that there is no free lunch seems to have escaped our society when it comes to advanced IT use. We continue to accept the economic and other myriad benefits of IT without simultaneously conducting an appropriate, comprehensive cost-risk-benefit analysis. This constitutes a major societal failure.

closing with the admonition:
If we don't begin to answer these questions in a systematic and meaningful way, we will reap the whirlwind of technology that has become indispensable but whose reliability and trustworthiness have become questionable.

These expressions are disturbingly like those that have been expressed in regard to the impacts of other technologies on the future of the planet itself.
So now computational simulation - the third paradigm of scientific investigation - is potentially much more far-reaching than the first two paradigms. Both experiment and theory are inherently limited: experiment by the practical impossibility of testing all possibilities, and theory by the practical inability to solve the equations. But the computational mode is unlimited: only more powerful machines and algorithms are necessary, in principle, to include all possibilities and to achieve any accuracy. And the third mode enables optimization.
This is both exciting and disturbing. Complete automation of the design cycle would allow total customization and optimization, opening the possibility of an infinite variety of products to meet all human needs - and eliminating the need for us engineers in the process. So we come to the stage of having all human desires satisfied with no human involvement: not an entirely comfortable feeling - are we then masters or slaves to cold logic? Software is logical creativity, and the essence of engineering is creativity, but can we come to be so effectively creative as to no longer require creativity or to be allowed to exercise it, assuming we even survive the effects of all our creativity?
Ted Lewis, writing in the September 1995 issue of Computer - "Living in Real Time, Side A (What is the Info Age?)" said:
Civilization is reaching a terminal velocity - a rate of change so voracious that it is limited by the human capacity to absorb it. The major characteristic of living in the Info Age is constant, unrelenting, maximum change.

Faulkner has one of his characters say, in "Light in August":
It is because a fellow is more afraid of the trouble he might have than he ever is of the trouble he's already got. He'll cling to the trouble he's used to before he'll risk a change.

We have to welcome change, but we also have to control it - assuming that is still possible.
The 90s were a fabulous time on the stock market also, with IT driving a productivity increase and transforming commerce, accounting for a third of the national economy. But there has been a persistent decline in the number of students in science and engineering, because the road to riches has perhaps been seen as riding the engine of technology rather than putting out the effort to become part of it. In one of Faulkner's short stories "The Tall Men", a Mississippi country man says:
Life has done got cheap, and life ain't cheap. Life's a pretty darn valuable thing. I don't mean just getting along from one WPA relief check to the next one, but honor and pride and discipline that make a man worth preserving, make him of any value. That's what we got to learn again. Maybe it takes trouble, bad trouble, to teach it back to us.

We may see some of that bad trouble in this present decade. A lot of computer science talent has been siphoned off into selling cat food on the Internet, chasing IPO riches with the company as the commodity rather than the product. There's an old book, published in 1841, that should be required reading: "Extraordinary Popular Delusions and the Madness of Crowds" by Charles MacKay. In his foreword to the 1932 reprint, Bernard Baruch observed:
All economic movements, by their very nature, are motivated by crowd psychology.

Some of the 90s dot.coms aren't too far from MacKay's description of public passion for a stock in the South Sea Bubble in 18th century London:
A company for carrying on an undertaking of great advantage, but nobody to know what it is.

And I have to mention the Mississippi Bubble, in France in the 18th century, when:
The price of shares sometimes rose ten or twenty per cent in the course of a few hours, and many persons in the humbler walks of life, who had risen poor in the morning, went to bed in affluence.

But the tulipomania in 17th century Holland may be an even better analogy:
A golden bait hung temptingly out before the people, and one after the other, they rushed to the tulip-marts, like flies around a honey pot. Every one imagined that the passion for tulips would last for ever, and that the wealthy from every part of the world would send to Holland, and pay whatever prices were asked for them.

In the last issue of 1999, Business Week extrapolated IT into the new decade:
The big story of 2000 is likely to be tech stocks - how far and how fast they will rise.

It may be that the dot.com shakeout is going to redirect some good computer science talent into more essential IT infrastructure research and development, and even into graduate study. This current decade may be the decade of computational science, rather than computer science - substantive effort on science/engineering problems in big companies. There's a good article in the January Atlantic Monthly, "The New Old Economy: Oil, Computers, and the Reinvention of the Earth", which notes that in 1989 only 5% of wells drilled in the Gulf of Mexico made use of 3D seismic data, but by 1996 the figure was 80%, and that the average cost of finding new oil is now a third of what it was 20 years ago, primarily because of IT.
In a speech in mid-1999, Treasury Secretary Summers characterized the New Economy as a fundamental move:
from an economy based on the production of physical goods to an economy based on the production and application of knowledge.

But I don't think he got it quite right. The economy will always be based on the production of physical goods in the end: virtual pottery would still have left our prehistoric ancestors eating off the ground. Rather, it is the application of knowledge in the production of physical goods that is the "new old economy", which will likely emerge into its own, after the transition from the dot.com mania, in the present decade.
IT is making major transformations in all of life, as the PITAC report noted, but we do have to keep a perspective. With Faulkner, in his Nobel Prize speech, real human life is:
the old verities and truths of the heart, the old universal truths lacking which any story is ephemeral and doomed - love and honor and pity and pride and compassion and sacrifice ...

No virtual reality here. And Joel Garreau, relating in the November 28, 2000, Washington Post an interview with Ruzena Bajcsy, the gracious lady who runs the ITR program at NSF, asks - and concludes otherwise, as he recalls the sweetness of real roses:
Are we indeed entering some new realm in which humans are represented more by their electrons than by their flesh and blood? Is being there - face to face - obsolete?

Singing with Jerome Kern, "I want to be there..."
There was major Federal investment in IT research during those 90s, in the High Performance Computing and Communications (HPCC), Next Generation Internet (NGI), and Information Technology Research (ITR) initiatives. The HPCC initiative dramatically increased our ability to attack Grand Challenges in science; the NGI initiative is significantly expanding the reach and power of high-speed network connectivity; and the ITR initiative is addressing some of the concerns over software and the network raised in the PITAC report. And there have been important elements in these initiatives that extend beyond science and engineering into commerce, education, and even entertainment.
However, there has yet to be a focus on the vast potential that information technology has to enhance the humanities. But, back in 1995, William Wulf, now president of the National Academy of Engineering and one of the pioneers of IT research while at the National Science Foundation, said:
Humanists will lead the way to innovative applications of information technology in the university.

in an article entitled "Warning: Information Technology Will Transform the University" in the Summer 1995 issue of Issues in Science and Technology. And Richard Atkinson, president of the University of California system, writing as part of a special segment on "The Future of Higher Education" in the Winter 2000 edition of Issues, noted that:
Multimedia technologies are creating new vehicles and new demand for the arts and humanities.

That microscope/telescope metaphor extends into the humanities in perhaps an even more revolutionary manner: IT has the potential to provide scholars in the humanities with tools undreamed of, tools without physical analogy. And this can carry into great enhancements and new opportunities in all of life, impacting education, understanding, creativity, appreciation and enjoyment, and thus with the potential to unleash another wave of economic impact of IT.
This is not without historical precedent, for as Leonard Shlain notes in his book "Art and Physics: Parallel Visions in Space, Time & Light":
The development of perspective in the 14th Century was a revolutionary milestone in the history of art. Using perspective to project a scene upon a two-dimensional surface made the flat canvas become a window that opened upon an illusory world of stereovision.

Perspective was a surprising and delightful technical advance, embraced as enthusiastically as computer technology is today. Renaissance parents urged their children to become professional perspectivists because this skill was much in demand.
Information technology has, in fact, already greatly impacted the arts, providing new implements and new media of creativity and expression. Yet, as noted by the Computer Science & Telecommunications Board of the National Research Council in the 1995 report "Evolving the High Performance Computing & Communications Initiative to Support the Nation's Information Infrastructure":
The vast implications of computer graphics (what-you-see-is-what-you-get document creation systems, scientific visualization, the entertainment industry, virtual reality) were of course totally unforeseen at the time that this fundamental research was undertaken.

Information technology is causing a convergence that opens particular new opportunities where expertise in science and technology is co-located with that in the arts and humanities. This convergence was noted in the 1995 National Research Council report "Keeping the U.S. Computer & Communications Industry Competitive: Convergence of Computing, Communications, & Entertainment":
Digital convergence - combining computing, communications, and entertainment - is taking shape in the form of new kinds of business and consumer products (goods and services) and new business ventures and alliances. It has become a regular topic in the business, trade, and mass media, yet it remains hard to define and hard to interpret in terms of its ultimate technical, business and societal ramifications. Digital convergence is expected to transform every field and every aspect of U.S. society, from business and education to health care and libraries, from the structure of the nation's economy and laws to the psychology of the individual.
And this convergence has created the need for cross-training across traditional boundaries between science and the humanities, as noted in the 1995 report "America in the Age of Information: A Forum, Committee on Information & Communications" of the National Science & Technology Council:
Increasing numbers of individuals will be cross-trained in technology, literature, and the arts in order to work effectively in the new industries made possible by the convergence of entertainment, computing, and communications.

IT thus enables new approaches and applications for the arts and humanities, but it goes both ways. Thus, Wulf again, this time on "The Image of Engineering" in the Winter 1998 issue of Issues in Science and Technology:
Something's wrong with the public perception of engineering. A recent Gallup poll found that only 2 percent of the respondents associated engineers with the word "invents" and only 3 percent associated them with the word "creative", whereas 5 percent associated them with the phrase "train operator." This perception is not only unappealing, it's profoundly inaccurate! It may help explain why U.S. students are losing interest in engineering.

What do engineers do?
My favorite quick definition of what engineers do is "design under constraint." We design things to solve real problems, but not just any solution will do. Our solutions must satisfy constraints of cost, weight, size, ergonomic factors, environmental impact, reliability, safety, manufacturability, repairability, and so on. Finding a solution that elegantly satisfies all these constraints is one of the most difficult and profoundly creative activities I can imagine. This is work that in some ways has more in common with our artistic colleagues than our scientific ones. Engineers do this creative, challenging work, not the dull, pocket-protector, cubicle stuff of popular myth.
Whatever the reason, we are seen as dull. But it has not always been that way. From the mid-19th to the mid-20th centuries, engineers were heroes in films, novels, and even poetry. Listen to Walt Whitman,
Singing the great achievements of today,
Singing the strong light works of engineers.

In most of the world's countries, engineers still enjoy this type of respect. The dull image of engineering is not preordained!
Bob Liebeck - Boeing Fellow, adjunct professor at Southern Cal, and a member of the National Academy of Engineering - in conversation at the MSU Raspet Lab Anniversary ceremonies during National Engineers Week in February 1999 raised the potential of art courses to enhance creativity in undergraduate engineering students. Liebeck, as an industrial member of the NSF review team for the ERC, has praised MSU's own convergence of engineering and art in his reviews, and has applied animation in the evaluation of the design of aircraft interiors for functionality. So now breaking out and repeating for emphasis one of Wulf's statements above on problem solving:
This is work that in some ways has more in common with our artistic colleagues than our scientific ones.

In fact, the 1997 Regan and Associates report "A Labor Market Analysis of the Interactive Digital Media Industry" cited the core competencies for professionals in interactive digital media as being:
Research enabling the application of information technology in the humanities is important, not simply for the great revolutionary potential that IT has in the humanities, but also for the daunting challenges that are posed by the humanities for IT research. Meeting these challenges in computer science to serve the humanities will result in reverse spin-off of important technological advances into science and engineering.
Five centuries ago, Leonardo da Vinci - artist and engineer - said:
Art is the Queen of all sciences, communicating knowledge to all the generations of the world.

If Leonardo could integrate the two halves of his divided psyche, then perhaps this integration of information technology and the humanities can forge an alliance between humanists and IT scientists, offering undreamed of enhancement for the humanities and unanticipated advances in IT for science and engineering.
Information technology is, in fact, transforming all aspects of life, as noted in the PITAC report:
As we approach the new millennium, it is clear that the "information infrastructure" -- the interconnected networks of computers, devices, and software -- may have a greater impact on worldwide social and economic structures than all networks that have preceded them. The advances in computing and communications technologies of the last decade have already transformed our society in many ways. These advances have transformed the ways in which we view ourselves, our relationships with each other and with other communities, and the ways in which we obtain services, ranging from entertainment and commerce to education and health care. Even so, we have only just begun to grasp the opportunities and experience the transformations that will occur as these technologies mature.

PITAC noted the need for the extension of IT research beyond science and engineering, recommending major effort also in socio-economic areas. And PITAC called for "Expeditions into the 21st Century" - large virtual visionary multidisciplinary research centers to "live in the technological future":
The mission of these expeditions will be to report back to the Nation what could be accomplished by using technologies that are quantitatively and qualitatively more powerful than those available today. In essence, these centers will create "time machines" to enable the early exploration of technologies that would otherwise be beyond reach for many years. Just as the Lewis and Clark expedition opened up our Nation and led to unanticipated expansion and economic growth, the ideas pursued by information technology expeditions could lead to unexpected results and nourish the industry of the future, creating jobs and benefits for the entire Nation.

The Jeffersonian connection is appropriate here because we may well find ourselves in need of Ambrose's "Undaunted Courage" in this Info Age.
These are matters to which we must give attention now. We achieved the capability for world destruction from nuclear technology in mid-century, and now the peaceful use of nuclear power has become strangled by emotion, and nuclear magnetic resonance imaging must be called simply MRI. Information technology - computing and networking - may have similar dark potential. We have found before that the possession of a technology does not necessarily mean that we can afford to use it. As in all times, we have never faced a greater challenge.
In his Nobel Prize acceptance speech, William Faulkner said:
Our tragedy is a general and universal physical fear so long sustained by now that we can even bear it. There are no longer problems of the spirit... I believe that man will not merely endure: he will prevail. He is immortal, not because he alone among the creatures has an inexhaustible voice, but because he has a soul, a spirit capable of compassion and sacrifice and endurance.
That spirit must overlie all paradigms. And engineering creativity must rise to the software challenge, ensuring that we are masters of a robust, secure, and effective information technology.
Dewitt Jones, a National Geographic Photographer, once told me, talking about still looking for the great photograph after the good:
Look for what's wanting to happen.

But that's what engineers do.